Sample records for modified signed log-likelihood

  1. Standardized likelihood ratio test for comparing several log-normal means and confidence interval for the common mean.

    PubMed

    Krishnamoorthy, K; Oral, Evrim

    2017-12-01

    A standardized likelihood ratio test (SLRT) for testing the equality of the means of several log-normal distributions is proposed. The properties of the SLRT, an available modified likelihood ratio test (MLRT) and a generalized variable (GV) test are evaluated by Monte Carlo simulation and compared. The evaluation studies indicate that the SLRT is accurate even for small samples, whereas the MLRT can be quite liberal for some parameter values, and the GV test is in general conservative and less powerful than the SLRT. Furthermore, a closed-form approximate confidence interval for the common mean of several log-normal distributions is developed using the method of variance estimate recovery, and is compared with the generalized confidence interval with respect to coverage probability and precision. Simulation studies indicate that the proposed confidence interval is accurate and better than the generalized confidence interval in terms of coverage probability. The methods are illustrated with two examples.
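The quantity these tests compare is the log-normal mean E[X] = exp(μ + σ²/2). As a minimal sketch (not the paper's SLRT or variance-estimate-recovery interval, and with invented simulation parameters), its maximum likelihood estimate from a single sample might look like:

```python
import math
import random

def lognormal_mean_mle(x):
    """MLE of a log-normal mean E[X] = exp(mu + sigma^2/2),
    computed from the log-scale sample mean and (MLE) variance."""
    logs = [math.log(v) for v in x]
    n = len(logs)
    mu = sum(logs) / n
    s2 = sum((v - mu) ** 2 for v in logs) / n  # divide by n, not n - 1
    return math.exp(mu + s2 / 2)

# hypothetical sample: log-normal with mu = 1.0, sigma = 0.5
random.seed(0)
sample = [random.lognormvariate(1.0, 0.5) for _ in range(20000)]
est = lognormal_mean_mle(sample)
true_mean = math.exp(1.0 + 0.5 ** 2 / 2)
```

With 20,000 draws the estimate should sit very close to the true mean exp(1 + 0.125) ≈ 3.08.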

  2. Computing Maximum Likelihood Estimates of Loglinear Models from Marginal Sums with Special Attention to Loglinear Item Response Theory. [Project Psychometric Aspects of Item Banking No. 53.] Research Report 91-1.

    ERIC Educational Resources Information Center

    Kelderman, Henk

    In this paper, algorithms are described for obtaining the maximum likelihood estimates of the parameters in log-linear models. Modified versions of the iterative proportional fitting and Newton-Raphson algorithms are described that work on the minimal sufficient statistics rather than on the usual counts in the full contingency table. This is…
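Kelderman's modified IPF works on the minimal sufficient statistics rather than the full table; as a rough illustration of ordinary iterative proportional fitting on a full two-way table (the targets below are made up), one could write:

```python
def ipf(table, row_targets, col_targets, iters=100):
    """Iterative proportional fitting: alternately rescale rows and
    columns of a seed table until it matches the target marginal sums."""
    t = [row[:] for row in table]
    for _ in range(iters):
        for i, rt in enumerate(row_targets):            # match row sums
            s = sum(t[i])
            t[i] = [v * rt / s for v in t[i]]
        for j, ct in enumerate(col_targets):            # match column sums
            s = sum(t[i][j] for i in range(len(t)))
            for i in range(len(t)):
                t[i][j] *= ct / s
    return t

# uniform seed table, hypothetical marginal sums
fit = ipf([[1.0, 1.0], [1.0, 1.0]], row_targets=[30, 70], col_targets=[40, 60])
```

For a uniform seed this converges to the independence table (e.g. the (1,1) cell is 30·40/100 = 12).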

  3. 77 FR 40033 - Certain New Chemicals; Receipt and Status Information

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-06

    ... show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an...-(isocyanatomethyl)- alkylcyclohexane. P-10-0378 05/03/2012 04/30/2012 (G) Metal oxide modified with alkyl and vinyl...

  4. Limits of visual communication: the effect of signal-to-noise ratio on the intelligibility of American Sign Language.

    PubMed

    Pavel, M; Sperling, G; Riedl, T; Vanderbeek, A

    1987-12-01

    To determine the limits of human observers' ability to identify visually presented American Sign Language (ASL), the contrast s and the amount of additive noise n in dynamic ASL images were varied independently. Contrast was tested over a 4:1 range; the rms signal-to-noise ratios (s/n) investigated were s/n = 1/4, 1/2, 1, and infinity (which is used to designate the original, uncontaminated images). Fourteen deaf subjects were tested with an intelligibility test composed of 85 isolated ASL signs, each 2-3 sec in length. For these ASL signs (64 x 96 pixels, 30 frames/sec), subjects' performance asymptotes between s/n = 0.5 and 1.0; further increases in s/n do not improve intelligibility. Intelligibility was found to depend only on s/n and not on contrast. A formulation in terms of logistic functions was proposed to derive intelligibility of ASL signs from s/n, sign familiarity, and sign difficulty. Familiarity (ignorance) is represented by additive signal-correlated noise; it represents the likelihood of a subject's knowing a particular ASL sign, and it adds to s/n. Difficulty is represented by a multiplicative difficulty coefficient; it represents the perceptual vulnerability of an ASL sign to noise and it adds to log(s/n).

  5. The Extended Erlang-Truncated Exponential distribution: Properties and application to rainfall data.

    PubMed

    Okorie, I E; Akpanta, A C; Ohakwe, J; Chikezie, D C

    2017-06-01

    The Erlang-Truncated Exponential (ETE) distribution is modified, and the new lifetime distribution is called the Extended Erlang-Truncated Exponential (EETE) distribution. Some statistical and reliability properties of the new distribution are given, and the method of maximum likelihood is proposed for estimating the model parameters. The usefulness and flexibility of the EETE distribution are illustrated with an uncensored data set, and its fit is compared with that of the ETE and three other three-parameter distributions. Results based on the minimized log-likelihood ([Formula: see text]), the Akaike information criterion (AIC), the Bayesian information criterion (BIC) and the generalized Cramér-von Mises [Formula: see text] statistic show that the EETE distribution provides a more reasonable fit than the other competing distributions.
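The AIC and BIC used in such comparisons are simple functions of the maximized log-likelihood and the parameter count; a sketch with made-up log-likelihood values (not the paper's rainfall fits) is:

```python
import math

def aic(loglik, k):
    """Akaike information criterion: 2k - 2*loglik (lower is better)."""
    return 2 * k - 2 * loglik

def bic(loglik, k, n):
    """Bayesian information criterion: k*ln(n) - 2*loglik (lower is better)."""
    return k * math.log(n) - 2 * loglik

# hypothetical maximized log-likelihoods for n = 120 observations
n = 120
candidates = {"ETE (2 params)": (-305.2, 2), "EETE (3 params)": (-298.7, 3)}
scores = {name: (aic(ll, k), bic(ll, k, n)) for name, (ll, k) in candidates.items()}
```

Both criteria penalize the extra parameter, so the richer model wins only if its log-likelihood gain outweighs the penalty.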

  6. Lidar-Incorporated Traffic Sign Detection from Video Log Images of Mobile Mapping System

    NASA Astrophysics Data System (ADS)

    Li, Y.; Fan, J.; Huang, Y.; Chen, Z.

    2016-06-01

    A Mobile Mapping System (MMS) simultaneously collects Lidar points and video log images of a scene with a laser profiler and a digital camera. Besides the textural detail of the video log images, it also captures the 3D geometric shape of the scene as a point cloud. It is widely used by many transportation agencies to survey street views and roadside transportation infrastructure such as traffic signs and guardrails. Although much literature on traffic sign detection is available, it focuses on either the Lidar or the imagery data of traffic signs. Based on the well-calibrated extrinsic parameters of the MMS, 3D Lidar points are, for the first time, incorporated into 2D video log images to enhance the detection of traffic signs both physically and visually. Based on the local elevation, the 3D pavement area is first located. Within a certain distance and height of the pavement, points belonging to overhead and roadside traffic signs can be obtained according to the traffic-sign setup specifications of different transportation agencies. The 3D candidate planes of traffic signs are then fitted to those points using RANSAC plane fitting. By projecting the candidate planes onto the image, regions of interest (ROIs) of traffic signs are found physically from the geometric constraints between laser profiling and camera imaging. Random-forest learning of the visual color and shape features of traffic signs is adopted to validate the sign ROIs in the video log images. The sequential occurrence of a traffic sign among consecutive video log images is defined by the geometric constraints of the imaging geometry and the GPS movement. Candidate ROIs are predicted in this temporal context to double-check the salient traffic signs among the video log images. The proposed algorithm is tested on a diverse set of scenarios on the interstate highway G-4 near Beijing, China under varying lighting conditions and occlusions. Experimental results show that the proposed algorithm enhances the traffic-sign detection rate by incorporating the 3D planar constraint of the Lidar points. It is promising for robust, large-scale surveys of most transportation infrastructure using MMS.
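The RANSAC plane-fitting step can be sketched on synthetic data (this is generic RANSAC, not the authors' implementation, and the "sign panel" point cloud below is simulated):

```python
import numpy as np

def ransac_plane(points, n_iters=200, tol=0.05, seed=0):
    """Fit a plane n.x + d = 0 to 3D points by RANSAC: sample 3 points,
    form the candidate plane, count inliers within tol, keep the best."""
    rng = np.random.default_rng(seed)
    best_plane, best_inliers = None, None
    for _ in range(n_iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-9:
            continue  # degenerate (collinear) sample
        n /= norm
        d = -n @ p0
        inliers = np.abs(points @ n + d) < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_plane, best_inliers = (n, d), inliers
    return best_plane, best_inliers

# synthetic sign panel on the plane z = 2, plus gross outliers
rng = np.random.default_rng(1)
plane_pts = np.column_stack([rng.uniform(0, 1, 300), rng.uniform(0, 1, 300),
                             np.full(300, 2.0) + rng.normal(0, 0.01, 300)])
outliers = rng.uniform(0, 3, (60, 3))
(normal, d), inliers = ransac_plane(np.vstack([plane_pts, outliers]))
```

The recovered normal should point along ±z and nearly all 300 panel points should be flagged as inliers.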

  7. Comparison of robustness to outliers between robust poisson models and log-binomial models when estimating relative risks for common binary outcomes: a simulation study.

    PubMed

    Chen, Wansu; Shi, Jiaxiao; Qian, Lei; Azen, Stanley P

    2014-06-26

    To estimate relative risks or risk ratios for common binary outcomes, the most popular model-based methods are the robust (also known as modified) Poisson regression and the log-binomial regression. Of the two methods, the log-binomial regression is believed to yield more efficient estimators because it is maximum likelihood based, while the robust Poisson model may be less affected by outliers. Evidence to support the robustness of robust Poisson models in comparison with log-binomial models is very limited. In this study, a simulation was conducted to evaluate the performance of the two methods in several scenarios where outliers existed. The findings indicate that for data coming from a population where the relationship between the outcome and the covariate was of a simple form (e.g. log-linear), the two models yielded comparable biases and mean square errors. However, if the true relationship contained a higher-order term, the robust Poisson models consistently outperformed the log-binomial models even when the level of contamination was low. The robust Poisson models are more robust (or less sensitive) to outliers than the log-binomial models when estimating relative risks or risk ratios for common binary outcomes. Users should be aware of the limitations when choosing appropriate models to estimate relative risks or risk ratios.
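For the unadjusted case, the relative risk itself is straightforward to compute from a 2×2 table; a stdlib-only sketch with a Wald interval on the log scale (illustrative counts, not the simulation's data) is:

```python
import math
from statistics import NormalDist

def relative_risk(a, b, c, d):
    """Relative risk from a 2x2 table (exposed: a events out of a+b;
    unexposed: c events out of c+d), with a 95% Wald CI on the log scale."""
    rr = (a / (a + b)) / (c / (c + d))
    se = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
    z = NormalDist().inv_cdf(0.975)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, (lo, hi)

# hypothetical counts: 30/100 events among exposed, 15/100 among unexposed
rr, ci = relative_risk(30, 70, 15, 85)
```

Here the point estimate is 0.30 / 0.15 = 2.0; the regression models in the abstract generalize this to covariate adjustment.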

  8. NLSCIDNT user's guide: maximum likelihood parameter identification computer program with nonlinear rotorcraft model

    NASA Technical Reports Server (NTRS)

    1979-01-01

    A nonlinear, maximum likelihood, parameter identification computer program (NLSCIDNT) is described which evaluates rotorcraft stability and control coefficients from flight test data. The optimal estimates of the parameters (stability and control coefficients) are determined (identified) by minimizing the negative log likelihood cost function. The minimization technique is the Levenberg-Marquardt method, which behaves like the steepest descent method when it is far from the minimum and behaves like the modified Newton-Raphson method when it is nearer the minimum. Twenty-one states and 40 measurement variables are modeled, and any subset may be selected. States which are not integrated may be fixed at an input value, or time history data may be substituted for the state in the equations of motion. Any aerodynamic coefficient may be expressed as a nonlinear polynomial function of selected 'expansion variables'.
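The Levenberg-Marquardt blend described above (steepest-descent-like when far from the minimum, Gauss-Newton-like when near) can be sketched for a toy exponential model — not the rotorcraft code — as:

```python
import numpy as np

def lm_fit(x, y, theta, n_iters=50, lam=1e-2):
    """Minimal Levenberg-Marquardt for residuals r = y - a*exp(b*x).
    A large damping lam behaves like gradient descent; a small lam
    behaves like Gauss-Newton."""
    for _ in range(n_iters):
        a, b = theta
        r = y - a * np.exp(b * x)
        # Jacobian of the model w.r.t. (a, b)
        J = np.column_stack([np.exp(b * x), a * x * np.exp(b * x)])
        step = np.linalg.solve(J.T @ J + lam * np.eye(2), J.T @ r)
        new = theta + step
        a2, b2 = new
        if np.sum((y - a2 * np.exp(b2 * x)) ** 2) < np.sum(r ** 2):
            theta, lam = new, lam / 3   # accept step, move toward Gauss-Newton
        else:
            lam *= 3                    # reject step, add damping
    return theta

# noiseless synthetic data: y = 2 * exp(1.5 * x)
x = np.linspace(0, 1, 40)
y = 2.0 * np.exp(1.5 * x)
theta = lm_fit(x, y, np.array([1.0, 0.5]))
```

From the starting guess (1.0, 0.5) the iteration should recover (2.0, 1.5) closely.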

  9. Preexposure Prophylaxis Modality Preferences Among Men Who Have Sex With Men and Use Social Media in the United States.

    PubMed

    Hall, Eric William; Heneine, Walid; Sanchez, Travis; Sineath, Robert Craig; Sullivan, Patrick

    2016-05-19

    Preexposure prophylaxis (PrEP) is available as a daily pill for preventing infection with the human immunodeficiency virus (HIV). Innovative methods of administering PrEP systemically or topically are being discussed and developed. The objective of our study was to assess attitudes toward different experimental modalities of PrEP administration. From April to July 2015, we recruited 1106 HIV-negative men who have sex with men through online social media advertisements and surveyed them about their likelihood of using different PrEP modalities. Participants responded to 5-point Likert-scale items indicating how likely they were to use each of the following PrEP modalities: a daily oral pill, on-demand pills, periodic injection, penile gel (either before or after intercourse), rectal gel (before/after), and rectal suppository (before/after). We used Wilcoxon signed rank tests to determine whether the stated likelihood of using any modality differed from daily oral PrEP. Related items were combined to assess differences in likelihood of use based on tissue or time of administration. Participants also ranked their interest in using each modality, and we used the modified Borda count method to determine consensual rankings. Most participants indicated they would be somewhat likely or very likely to use PrEP as an on-demand pill (685/1105, 61.99%), daily oral pill (528/1036, 50.97%), injection (575/1091, 52.70%), or penile gel (438/755, 58.01% before intercourse; 408/751, 54.33% after). The stated likelihoods of using on-demand pills (median score 4) and of using a penile gel before intercourse (median 4) were both higher than that of using a daily oral pill (median 4, P<.001 and P=.001, respectively). Compared with a daily oral pill, participants reported a significantly lower likelihood of using any of the 4 rectal modalities (Wilcoxon signed rank test, all P<.001). 
On 10-point Likert scales created by combining application methods, the reported likelihood of using a penile gel (median 7) was higher than that of using a rectal gel (median 6, P<.001), which was higher than the likelihood of using a rectal suppository (median 6, P<.001). The modified Borda count ranked on-demand pills as the most preferred modality. There was no difference in likelihood of use of PrEP (gel or suppository) before or after intercourse. Participants typically prefer systemic PrEP and are less likely to use a modality that is administered rectally. Although most of these modalities are seen as favorable or neutral, attitudes may change as information about efficacy and application becomes available. Further data on modality preference across risk groups will better inform PrEP development.

  10. Estimating relative risks for common outcome using PROC NLP.

    PubMed

    Yu, Binbing; Wang, Zhuoqiao

    2008-05-01

    In cross-sectional or cohort studies with binary outcomes, it is biologically interpretable and of interest to estimate the relative risk or prevalence ratio, especially when the response rates are not rare. Several methods have been used to estimate the relative risk, among which the log-binomial models yield the maximum likelihood estimate (MLE) of the parameters. Because of restrictions on the parameter space, the log-binomial models often run into convergence problems. Some remedies, e.g., the Poisson and Cox regressions, have been proposed. However, these methods may give out-of-bound predicted response probabilities. In this paper, a new computation method using the SAS Nonlinear Programming (NLP) procedure is proposed to find the MLEs. The proposed NLP method was compared to the COPY method, a modified method to fit the log-binomial model. Issues in the implementation are discussed. For illustration, both methods were applied to data on the prevalence of microalbuminuria (micro-protein leakage into urine) for kidney disease patients from the Diabetes Control and Complications Trial. The sample SAS macro for calculating relative risk is provided in the appendix.

  11. An iterative procedure for obtaining maximum-likelihood estimates of the parameters for a mixture of normal distributions

    NASA Technical Reports Server (NTRS)

    Peters, B. C., Jr.; Walker, H. F.

    1975-01-01

    A general iterative procedure is given for determining the consistent maximum likelihood estimates of the parameters of a mixture of normal distributions. In addition, a local maximum of the log-likelihood function, Newton's method, a method of scoring, and modifications of these procedures are discussed.
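A bare-bones version of such an iteration — EM for a two-component normal mixture, run on synthetic data rather than anything from the report — is:

```python
import math
import random

def em_gmm2(x, n_iters=100):
    """EM for a two-component 1-D Gaussian mixture.
    E-step: posterior responsibilities; M-step: weighted MLE updates.
    Each full sweep cannot decrease the log-likelihood."""
    mu, var, w = [min(x), max(x)], [1.0, 1.0], [0.5, 0.5]

    def logpdf(v, m, s2):
        return -0.5 * math.log(2 * math.pi * s2) - (v - m) ** 2 / (2 * s2)

    for _ in range(n_iters):
        resp = []
        for v in x:                                   # E-step
            p = [w[k] * math.exp(logpdf(v, mu[k], var[k])) for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        for k in range(2):                            # M-step
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(x)
            mu[k] = sum(r[k] * v for r, v in zip(resp, x)) / nk
            var[k] = sum(r[k] * (v - mu[k]) ** 2 for r, v in zip(resp, x)) / nk
    return w, mu, var

random.seed(0)
data = ([random.gauss(-2, 1) for _ in range(500)]
        + [random.gauss(3, 1) for _ in range(500)])
w, mu, var = em_gmm2(data)
```

With well-separated components the estimated means should land near -2 and 3 with roughly equal weights.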

  12. Size distribution of Portuguese firms between 2006 and 2012

    NASA Astrophysics Data System (ADS)

    Pascoal, Rui; Augusto, Mário; Monteiro, A. M.

    2016-09-01

    This study describes the size distribution of Portuguese firms, as measured by annual sales and total assets, between 2006 and 2012, giving an economic interpretation of the evolution of the distribution over time. Three distributions are fitted to the data: the lognormal, the Pareto (with Zipf as a particular case) and the Simplified Canonical Law (SCL). We present the main arguments found in the literature to justify the use of these distributions and emphasize the interpretation of the SCL coefficients. The estimation methods include maximum likelihood, modified ordinary least squares in log-log scale, and nonlinear least squares using the Levenberg-Marquardt algorithm. Applying these approaches to the Portuguese firm data, we analyze whether the evolution of the estimated parameters of both the lognormal and the SCL is consistent with the known recession period after 2008. This is confirmed for sales but not for assets, leading to the conclusion that the former variable is a better proxy for firm size.

  13. An efficient algorithm for accurate computation of the Dirichlet-multinomial log-likelihood function.

    PubMed

    Yu, Peng; Shaw, Chad A

    2014-06-01

    The Dirichlet-multinomial (DMN) distribution is a fundamental model for multicategory count data with overdispersion. This distribution has many uses in bioinformatics, including applications to metagenomics data, transcriptomics and alternative splicing. The DMN distribution reduces to the multinomial distribution when the overdispersion parameter ψ is 0. Unfortunately, numerical computation of the DMN log-likelihood function by conventional methods results in instability in the neighborhood of [Formula: see text]. An alternative formulation circumvents this instability, but it leads to long runtimes that make it impractical for the large count data common in bioinformatics. We have developed a new method for computation of the DMN log-likelihood that solves the instability problem without incurring long runtimes. The new approach is composed of a novel formula and an algorithm to extend its applicability. Our numerical experiments show that this new method improves both the accuracy of log-likelihood evaluation and the runtime by several orders of magnitude, especially in the high-count situations that are common in deep sequencing data. Using real metagenomic data, our method achieves a manyfold runtime improvement. Our method increases the feasibility of using the DMN distribution to model many high-throughput problems in bioinformatics. We have included in our work an R package giving access to this method and a vignette applying this approach to metagenomic data. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
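The instability the authors address arises from evaluating the DMN likelihood through ratios of gamma functions; the textbook log-gamma form (a generic sketch, not the paper's new formula) is:

```python
from math import lgamma, log, isfinite

def dmn_loglik(counts, alpha):
    """Dirichlet-multinomial log-likelihood of an unordered count vector,
    computed through log-gamma terms rather than products of ratios."""
    n = sum(counts)
    A = sum(alpha)
    ll = lgamma(n + 1) + lgamma(A) - lgamma(n + A)   # multinomial coeff + normalizer
    for x, a in zip(counts, alpha):
        ll += lgamma(x + a) - lgamma(a) - lgamma(x + 1)
    return ll

# with a symmetric alpha = 1 prior every composition of n is equally likely,
# so for K = 3 and n = 10 the probability is 1/C(12,2) = 1/66
ll = dmn_loglik([5, 3, 2], [1.0, 1.0, 1.0])
big = dmn_loglik([100000, 50000], [0.5, 0.5])        # stays finite at high counts
```

The closed-form check (ll = log(1/66)) and the high-count call illustrate why the log-gamma route avoids overflow.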

  14. Maximum likelihood estimates, from censored data, for mixed-Weibull distributions

    NASA Astrophysics Data System (ADS)

    Jiang, Siyuan; Kececioglu, Dimitri

    1992-06-01

    A new algorithm for estimating the parameters of mixed-Weibull distributions from censored data is presented. The algorithm follows the principle of maximum likelihood estimation (MLE) through the expectation-maximization (EM) algorithm, and it is derived for both postmortem and nonpostmortem time-to-failure data. It is concluded that the concept of the EM algorithm is easy to understand and apply (only elementary statistics and calculus are required). The log-likelihood function cannot decrease after an EM sequence; this important feature was observed in all of the numerical calculations. The MLEs for the nonpostmortem data were obtained successfully for mixed-Weibull distributions with up to 14 parameters in a 5-subpopulation mixed-Weibull distribution. Numerical examples indicate that some of the log-likelihood functions of mixed-Weibull distributions have multiple local maxima; the algorithm should therefore be started from several initial guesses of the parameter set.

  15. Evaluation of post-interchange guide signs

    DOT National Transportation Integrated Search

    2002-12-01

    There are four basic types of guide signs related to tourist and recreational facilities, each with its own requirements and purpose. These include limited supplemental guide signs, cultural and recreational supplemental guide signs, fifth legend log...

  16. Varied applications of a new maximum-likelihood code with complete covariance capability. [FERRET, for data adjustment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmittroth, F.

    1978-01-01

    Applications of a new data-adjustment code are given. The method is based on a maximum-likelihood extension of generalized least-squares methods that allows complete covariance descriptions for the input data and the final adjusted data evaluations. The maximum-likelihood approach is used with a generalized log-normal distribution that provides a way to treat problems with large uncertainties and circumvents the problem of negative values that can occur for physically positive quantities. The computer code, FERRET, is written so that the user can apply it to a large variety of problems by modifying only the input subroutine. The following applications are discussed: a 75-group a priori damage function is adjusted by as much as a factor of two by use of 14 integral measurements in different reactor spectra; reactor spectra and dosimeter cross sections are simultaneously adjusted on the basis of both integral measurements and experimental proton-recoil spectra; measured reaction rates, measured worths, microscopic measurements, and theoretical models are used simultaneously to evaluate dosimeter and fission-product cross sections; applications in the data reduction of neutron cross-section measurements and in the evaluation of reactor after-heat are also considered. 6 figures.

  17. Condition and fate of logged forests in the Brazilian Amazon.

    Treesearch

    Gregory P. Asner; Eben N. Broadbent; Paulo J. C. Oliveira; Michael Keller; David E. Knapp; Jose N. M. Silva

    2006-01-01

    The long-term viability of a forest industry in the Amazon region of Brazil depends on the maintenance of adequate timber volume and growth in healthy forests. Using extensive high-resolution satellite analyses, we studied the forest damage caused by recent logging operations and the likelihood that logged forests would be cleared within 4 years after timber harvest....

  18. On the log-normality of historical magnetic-storm intensity statistics: implications for extreme-event probabilities

    USGS Publications Warehouse

    Love, Jeffrey J.; Rigler, E. Joshua; Pulkkinen, Antti; Riley, Pete

    2015-01-01

    An examination is made of the hypothesis that the statistics of magnetic-storm-maximum intensities are the realization of a log-normal stochastic process. Weighted least-squares and maximum-likelihood methods are used to fit log-normal functions to −Dst storm-time maxima for the years 1957-2012; bootstrap analysis is used to establish confidence limits on the forecasts. Both methods provide fits that are reasonably consistent with the data, and both provide fits superior to those attainable with a power-law function. In general, the maximum-likelihood method provides forecasts having tighter confidence intervals than those of weighted least-squares. From extrapolation of the maximum-likelihood fits, a magnetic storm with intensity exceeding that of the 1859 Carrington event, −Dst ≥ 850 nT, occurs about 1.13 times per century, with a wide 95% confidence interval of [0.42, 2.41] times per century; a 100-yr magnetic storm is identified as having −Dst ≥ 880 nT (greater than Carrington), but with a wide 95% confidence interval of [490, 1187] nT.
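Under the log-normal hypothesis, the per-century exceedance rate follows directly from the fitted parameters; a stdlib sketch on synthetic yearly maxima (the parameters and data below are invented, not the −Dst record) is:

```python
import math
import random
from statistics import NormalDist

def lognormal_fit(x):
    """MLE of a log-normal: mean and standard deviation of the logs."""
    logs = [math.log(v) for v in x]
    mu = sum(logs) / len(logs)
    sigma = math.sqrt(sum((v - mu) ** 2 for v in logs) / len(logs))
    return mu, sigma

def exceedance_prob(x, mu, sigma):
    """P(X >= x) for a log-normal(mu, sigma): 1 - Phi((ln x - mu)/sigma)."""
    return 1 - NormalDist(mu, sigma).cdf(math.log(x))

# hypothetical 56 years of annual storm-intensity maxima
random.seed(42)
storms = [random.lognormvariate(4.5, 0.8) for _ in range(56)]
mu, sigma = lognormal_fit(storms)
p850 = exceedance_prob(850, mu, sigma)          # per-year exceedance probability
rate_per_century = 100 * p850                   # expected exceedances per century
```

A quick sanity check: at the fitted median exp(μ) the exceedance probability is exactly one half.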

  19. Parameter estimation of history-dependent leaky integrate-and-fire neurons using maximum-likelihood methods

    PubMed Central

    Dong, Yi; Mihalas, Stefan; Russell, Alexander; Etienne-Cummings, Ralph; Niebur, Ernst

    2012-01-01

    When a neuronal spike train is observed, what can we say about the properties of the neuron that generated it? A natural way to answer this question is to make an assumption about the type of neuron, select an appropriate model for this type, and then to choose the model parameters as those that are most likely to generate the observed spike train. This is the maximum likelihood method. If the neuron obeys simple integrate and fire dynamics, Paninski, Pillow, and Simoncelli (2004) showed that its negative log-likelihood function is convex and that its unique global minimum can thus be found by gradient descent techniques. The global minimum property requires independence of spike time intervals. Lack of history dependence is, however, an important constraint that is not fulfilled in many biological neurons which are known to generate a rich repertoire of spiking behaviors that are incompatible with history independence. Therefore, we expanded the integrate and fire model by including one additional variable, a variable threshold (Mihalas & Niebur, 2009) allowing for history-dependent firing patterns. This neuronal model produces a large number of spiking behaviors while still being linear. Linearity is important as it maintains the distribution of the random variables and still allows for maximum likelihood methods to be used. In this study we show that, although convexity of the negative log-likelihood is not guaranteed for this model, the minimum of the negative log-likelihood function yields a good estimate for the model parameters, in particular if the noise level is treated as a free parameter. Furthermore, we show that a nonlinear function minimization method (r-algorithm with space dilation) frequently reaches the global minimum. PMID:21851282

  20. Optimal signal constellation design for ultra-high-speed optical transport in the presence of nonlinear phase noise.

    PubMed

    Liu, Tao; Djordjevic, Ivan B

    2014-12-29

    In this paper, we first describe an optimal signal constellation design algorithm suitable for coherent optical channels dominated by linear phase noise. We then modify this algorithm to suit channels dominated by nonlinear phase noise. In the optimization procedure, the proposed algorithm uses the cumulative log-likelihood function instead of the Euclidean distance. Further, an LDPC-coded modulation scheme is proposed for use in combination with the signal constellations obtained by the proposed algorithm. Monte Carlo simulations indicate that the LDPC-coded modulation schemes employing the new constellation sets, obtained by our signal constellation design algorithm, significantly outperform the corresponding QAM constellations in terms of transmission distance and have better nonlinearity tolerance.

  1. Soil moisture assimilation using a modified ensemble transform Kalman filter with water balance constraint

    NASA Astrophysics Data System (ADS)

    Wu, Guocan; Zheng, Xiaogu; Dan, Bo

    2016-04-01

    Shallow soil moisture observations are assimilated into the Common Land Model (CoLM) to estimate soil moisture in different layers. The forecast error is inflated to improve the accuracy of the analysis state, and a water balance constraint is adopted to reduce the water budget residual in the assimilation procedure. The experimental results illustrate that adaptive forecast error inflation can reduce the analysis error, and that the proper inflation layer can be selected based on the −2 log-likelihood of the innovation statistic. The water balance constraint substantially reduces the water budget residual at a low cost in assimilation accuracy. The assimilation scheme can potentially be applied to assimilate remote sensing data.

  2. Effects of Methamphetamine on Vigilance and Tracking during Extended Wakefulness.

    DTIC Science & Technology

    1993-09-01

    the log likelihood ratio (log(p); Green & Swets, 1966; Macmillan & Creelman, 1990), was also derived from hit and false-alarm probabilities...vigilance task. Canadian Journal of Psychology, 19, 104-110. Macmillan, N.E., & Creelman, C.D. (1990). Response bias: Characteristics of detection
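The likelihood-ratio bias measure derived from hit and false-alarm probabilities is standard equal-variance signal detection theory (following the Macmillan & Creelman conventions, not necessarily this report's exact computation) and can be sketched as:

```python
from statistics import NormalDist

def detection_indices(hit_rate, fa_rate):
    """Equal-variance Gaussian signal-detection indices:
    d' = z(H) - z(F); log(beta) is the log likelihood ratio at the
    criterion, (z(F)^2 - z(H)^2) / 2."""
    z = NormalDist().inv_cdf
    zh, zf = z(hit_rate), z(fa_rate)
    d_prime = zh - zf
    log_beta = (zf ** 2 - zh ** 2) / 2
    return d_prime, log_beta

# hypothetical vigilance performance: 85% hits, 20% false alarms
d_prime, log_beta = detection_indices(0.85, 0.20)
```

A negative log(beta) indicates a liberal criterion; when hit and false-alarm rates are symmetric about 0.5 (e.g. 0.8 and 0.2), log(beta) is zero.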

  3. Maximum likelihood solution for inclination-only data in paleomagnetism

    NASA Astrophysics Data System (ADS)

    Arason, P.; Levi, S.

    2010-08-01

    We have developed a new robust maximum likelihood method for estimating the unbiased mean inclination from inclination-only data. In paleomagnetic analysis, the arithmetic mean of inclination-only data is known to introduce a shallowing bias. Several methods have been introduced to estimate the unbiased mean inclination of inclination-only data together with measures of the dispersion. Some inclination-only methods were designed to maximize the likelihood function of the marginal Fisher distribution. However, the exact analytical form of the maximum likelihood function is fairly complicated, and all the methods require various assumptions and approximations that are often inappropriate. For some steep and dispersed data sets, these methods provide estimates that are significantly displaced from the peak of the likelihood function towards systematically shallower inclinations. The problem of locating the maximum of the likelihood function is partly due to difficulties in accurately evaluating the function for all values of interest, because some elements of the likelihood function increase exponentially as the precision parameter increases, leading to numerical instabilities. In this study, we succeeded in analytically cancelling the exponential elements from the log-likelihood function, and we are now able to calculate its value anywhere in the parameter space and for any inclination-only data set. Furthermore, we can now calculate the partial derivatives of the log-likelihood function with the desired accuracy, and locate the maximum likelihood without the assumptions required by previous methods. To assess the reliability and accuracy of our method, we generated large numbers of random Fisher-distributed data sets, for which we calculated mean inclinations and precision parameters. The comparisons show that our new robust Arason-Levi maximum likelihood method is the most reliable, and the mean inclination estimates are the least biased towards shallow values.

  4. Spatio-temporal analysis of Modified Omori law in Bayesian framework

    NASA Astrophysics Data System (ADS)

    Rezanezhad, V.; Narteau, C.; Shebalin, P.; Zoeller, G.; Holschneider, M.

    2017-12-01

    This work presents a study of the spatio-temporal evolution of the modified Omori law parameters in southern California in the time period 1981-2016. A nearest-neighbor approach is applied for earthquake clustering. The study targets small mainshocks and their relatively large aftershocks (2.5 ≤ m_mainshock ≤ 4.5 and 1.8 ≤ m_aftershock ≤ 2.8). We invert for the spatio-temporal behavior of the c and p values (especially c) over the whole area using an MCMC-based maximum likelihood estimator. As parameterizing families we use Voronoi cells with randomly distributed cell centers. Considering that the c value represents a physical characteristic such as stress change, we expect to see a coherent c-value pattern over seismologically co-acting areas. This correlation of c values can indeed be seen for the San Andreas, San Jacinto and Elsinore faults. Moreover, the depth dependency of the c value is studied, which shows a linear behavior of log(c) with respect to aftershock depth between 5 and 15 km.

  5. [Evaluation of estimation of prevalence ratio using bayesian log-binomial regression model].

    PubMed

    Gao, W L; Lin, H; Liu, X N; Ren, X W; Li, J S; Shen, X P; Zhu, S L

    2017-03-10

    To evaluate the estimation of the prevalence ratio (PR) using a Bayesian log-binomial regression model and its application, we estimated the PR of medical care-seeking prevalence relative to caregivers' recognition of risk signs of diarrhea in their infants using a Bayesian log-binomial regression model in the OpenBUGS software. The results showed that caregivers' recognition of an infant's risk signs of diarrhea was significantly associated with a 13% increase in medical care-seeking. Meanwhile, we compared the point and interval estimates of the PR, and the convergence, of three models (model 1: not adjusting for covariates; model 2: adjusting for duration of caregivers' education; model 3: adjusting for the distance between village and township and for child month-age, based on model 2) between the Bayesian and the conventional log-binomial regression model. All three Bayesian log-binomial regression models converged, with estimated PRs of 1.130 (95%CI: 1.005-1.265), 1.128 (95%CI: 1.001-1.264) and 1.132 (95%CI: 1.004-1.267), respectively. Conventional log-binomial regression models 1 and 2 converged, with PRs of 1.130 (95%CI: 1.055-1.206) and 1.126 (95%CI: 1.051-1.203), respectively, but model 3 failed to converge, so the COPY method was used to estimate the PR, which was 1.125 (95%CI: 1.051-1.200). In addition, the point and interval estimates of the PRs from the three Bayesian log-binomial regression models differed slightly from those of the conventional log-binomial regression models, but the two approaches were highly consistent in estimating the PR. Therefore, the Bayesian log-binomial regression model can effectively estimate the PR with fewer convergence problems, and it has advantages in application over the conventional log-binomial regression model.

  6. Non-Asymptotic Oracle Inequalities for the High-Dimensional Cox Regression via Lasso.

    PubMed

    Kong, Shengchun; Nan, Bin

    2014-01-01

    We consider finite sample properties of the regularized high-dimensional Cox regression via lasso. Existing literature focuses on linear models or generalized linear models with Lipschitz loss functions, where the empirical risk functions are the summations of independent and identically distributed (iid) losses. The summands in the negative log partial likelihood function for censored survival data, however, are neither iid nor Lipschitz. We first approximate the negative log partial likelihood function by a sum of iid non-Lipschitz terms, then derive the non-asymptotic oracle inequalities for the lasso penalized Cox regression using pointwise arguments to tackle the difficulties caused by lacking iid Lipschitz losses.

  7. Non-Asymptotic Oracle Inequalities for the High-Dimensional Cox Regression via Lasso

    PubMed Central

    Kong, Shengchun; Nan, Bin

    2013-01-01

    We consider finite sample properties of the regularized high-dimensional Cox regression via lasso. Existing literature focuses on linear models or generalized linear models with Lipschitz loss functions, where the empirical risk functions are the summations of independent and identically distributed (iid) losses. The summands in the negative log partial likelihood function for censored survival data, however, are neither iid nor Lipschitz. We first approximate the negative log partial likelihood function by a sum of iid non-Lipschitz terms, then derive the non-asymptotic oracle inequalities for the lasso penalized Cox regression using pointwise arguments to tackle the difficulties caused by lacking iid Lipschitz losses. PMID:24516328

  8. ELASTIC NET FOR COX'S PROPORTIONAL HAZARDS MODEL WITH A SOLUTION PATH ALGORITHM.

    PubMed

    Wu, Yichao

    2012-01-01

    For least squares regression, Efron et al. (2004) proposed an efficient solution path algorithm, the least angle regression (LAR). They showed that a slight modification of the LAR leads to the whole LASSO solution path. Both the LAR and LASSO solution paths are piecewise linear. Recently Wu (2011) extended the LAR to generalized linear models and the quasi-likelihood method. In this work we extend the LAR further to handle Cox's proportional hazards model. The goal is to develop a solution path algorithm for the elastic net penalty (Zou and Hastie (2005)) in Cox's proportional hazards model. This goal is achieved in two steps. First we extend the LAR to optimizing the log partial likelihood plus a fixed small ridge term. Then we define a path modification, which leads to the solution path of the elastic net regularized log partial likelihood. Our solution path is exact and piecewise determined by ordinary differential equation systems.
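
    The object optimized throughout, with ridge or elastic net terms added on top, is the log partial likelihood. A hedged NumPy sketch of the Breslow negative log partial likelihood (no ties) and its analytic gradient, checked against finite differences on invented data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy right-censored survival data (hypothetical), sorted by time so that
# the risk set of subject i is simply subjects i..n-1.
n, p = 40, 2
X = rng.normal(size=(n, p))
time = rng.exponential(scale=np.exp(-(X @ np.array([0.5, -0.3]))))
event = rng.integers(0, 2, n).astype(bool)
order = np.argsort(time)
X, time, event = X[order], time[order], event[order]

def neg_log_partial_lik(beta):
    """Breslow negative log partial likelihood for Cox's model."""
    eta = X @ beta
    w = np.exp(eta)
    log_risk = np.log(np.cumsum(w[::-1])[::-1])  # log of sum over each risk set
    return -np.sum(eta[event] - log_risk[event])

def grad_neg_log_partial_lik(beta):
    """Each event contributes x_i minus the risk-set weighted covariate mean."""
    eta = X @ beta
    w = np.exp(eta)
    S0 = np.cumsum(w[::-1])[::-1]                         # risk-set weight sums
    S1 = np.cumsum((w[:, None] * X)[::-1], axis=0)[::-1]  # weighted covariate sums
    return -np.sum(X[event] - S1[event] / S0[event, None], axis=0)

# Verify the gradient with central finite differences.
beta0 = np.array([0.2, -0.1])
g = grad_neg_log_partial_lik(beta0)
eps = 1e-6
g_num = np.array([
    (neg_log_partial_lik(beta0 + eps * e) - neg_log_partial_lik(beta0 - eps * e)) / (2 * eps)
    for e in np.eye(p)
])
print(g, g_num)
```

    Adding `0.5 * lam * beta @ beta` to the objective gives the "log partial likelihood plus a fixed small ridge term" of the first step above.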

  9. Shape, Illumination, and Reflectance from Shading

    DTIC Science & Technology

    2013-05-29

    (Abstract recovered only in fragments:) The model combines 2) a prior on the global entropy of log-reflectance, and 3) an "absolute" prior on reflectance which prefers to paint the scene with some colors (white, gray, green, ...). Smoothness is scored by c(·; α, σ), the negative log-likelihood of a discrete univariate Gaussian scale mixture (GSM) of the difference in log-RGB from pixel i to pixel j; in the color model, g_s(R) = Σ_i Σ_{j∈N(i)} C(R_i − R_j; α_R, σ_R, Σ_R), where R_i − R_j is a 3-vector of log-RGB differences and α, σ are the mixture parameters.

  10. Performance of the modified Poisson regression approach for estimating relative risks from clustered prospective data.

    PubMed

    Yelland, Lisa N; Salter, Amy B; Ryan, Philip

    2011-10-15

    Modified Poisson regression, which combines a log Poisson regression model with robust variance estimation, is a useful alternative to log binomial regression for estimating relative risks. Previous studies have shown both analytically and by simulation that modified Poisson regression is appropriate for independent prospective data. This method is often applied to clustered prospective data, despite a lack of evidence to support its use in this setting. The purpose of this article is to evaluate the performance of the modified Poisson regression approach for estimating relative risks from clustered prospective data, by using generalized estimating equations to account for clustering. A simulation study is conducted to compare log binomial regression and modified Poisson regression for analyzing clustered data from intervention and observational studies. Both methods generally perform well in terms of bias, type I error, and coverage. Unlike log binomial regression, modified Poisson regression is not prone to convergence problems. The methods are contrasted by using example data sets from 2 large studies. The results presented in this article support the use of modified Poisson regression as an alternative to log binomial regression for analyzing clustered prospective data when clustering is taken into account by using generalized estimating equations.
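
    The two ingredients of the modified Poisson approach, a Poisson working model with log link applied to a binary outcome plus a robust sandwich variance, can be sketched in plain NumPy for independent data; the clustering/GEE adjustment the article evaluates is omitted here, and the data and true relative risk of exp(0.4) are invented:

```python
import numpy as np

rng = np.random.default_rng(2)

# Binary outcome with true relative risk exp(0.4) for x=1 vs x=0.
n = 5000
x = rng.integers(0, 2, n)
X = np.column_stack([np.ones(n), x])
p = np.exp(-1.5 + 0.4 * x)          # P(y=1) = exp(Xb), stays below 1
y = rng.binomial(1, p)

# Fit Poisson regression (log link) to the binary outcome by Newton's method.
beta = np.zeros(2)
for _ in range(25):
    mu = np.exp(X @ beta)
    score = X.T @ (y - mu)
    info = X.T @ (X * mu[:, None])
    beta = beta + np.linalg.solve(info, score)

# Robust (sandwich) variance A^{-1} B A^{-1}: valid even though the Poisson
# variance function is wrong for binary data.
mu = np.exp(X @ beta)
A = X.T @ (X * mu[:, None])
B = X.T @ (X * ((y - mu) ** 2)[:, None])
robust_cov = np.linalg.solve(A, np.linalg.solve(A, B).T)
rr_hat = np.exp(beta[1])            # relative risk = exp(slope)
se_log_rr = np.sqrt(robust_cov[1, 1])
print(rr_hat, se_log_rr)
```

    For clustered data, the article's approach replaces the independence score equations with generalized estimating equations and a cluster-level sandwich; statsmodels and SAS/R GEE routines provide this.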

  11. Effect of advanced age and vital signs on admission from an ED observation unit.

    PubMed

    Caterino, Jeffrey M; Hoover, Emily M; Moseley, Mark G

    2013-01-01

    The primary objective was to determine the relationship between advanced age and need for admission from an emergency department (ED) observation unit. The secondary objective was to determine the relationship between initial ED vital signs and admission. We conducted a prospective, observational cohort study of ED patients placed in an ED-based observation unit. Multivariable penalized maximum likelihood logistic regression was used to identify independent predictors of need for hospital admission. Age was examined continuously and at a cutoff of 65 years or more. Vital signs were examined continuously and at commonly accepted cutoffs. We additionally controlled for demographics, comorbid conditions, laboratory values, and observation protocol. Three hundred patients were enrolled, 12% (n = 35) were 65 years or older, and 11% (n = 33) required admission. Admission rates were 2.9% (95% confidence interval [CI], 0.07%-14.9%) in older adults and 12.1% (95% CI, 8.4%-16.6%) in younger adults. In multivariable analysis, age was not associated with admission (odds ratio [OR], 0.30; 95% CI, 0.05-1.67). Predictors of admission included systolic pressure 180 mm Hg or greater (OR, 4.19; 95% CI, 1.08-16.30), log Charlson comorbidity score (OR, 2.93; 95% CI, 1.57-5.46), and white blood cell count 14,000/mm³ or greater (OR, 11.35; 95% CI, 3.42-37.72). Among patients placed in an ED observation unit, age 65 years or more is not associated with need for admission. Older adults can successfully be discharged from these units. Systolic pressure 180 mm Hg or greater was the only predictive vital sign. In determining appropriateness of patients selected for an ED observation unit, advanced age should not be an automatic disqualifying criterion. Copyright © 2013 Elsevier Inc. All rights reserved.

  12. Effect of advanced age and vital signs on admission from an emergency department observation unit

    PubMed Central

    Caterino, Jeffrey M.; Hoover, Emily; Moseley, Mark G.

    2012-01-01

    Objectives The primary objective was to determine the relationship between advanced age and need for admission from an emergency department (ED) observation unit. The secondary objective was to determine the relationship between initial ED vital signs and admission. Methods We conducted a prospective, observational cohort study of ED patients placed in an ED-based observation unit. Multivariable penalized maximum likelihood logistic regression was used to identify independent predictors of need for hospital admission. Age was examined continuously and at a cutoff of ≥65 years. Vital signs were examined continuously and at commonly accepted cutoffs. We additionally controlled for demographics, co-morbid conditions, laboratory values, and observation protocol. Results Three hundred patients were enrolled, 12% (n=35) ≥65 years old and 11% (n=33) requiring admission. Admission rates were 2.9% (95% confidence interval [CI], 0.07-14.9%) in older adults and 12.1% (95% CI, 8.4-16.6%) in younger adults. In multivariable analysis, age was not associated with admission (odds ratio [OR] 0.30, 95% CI 0.05-1.67). Predictors of admission included: systolic pressure ≥180 mmHg (OR 4.19, 95% CI 1.08-16.30), log Charlson co-morbidity score (OR 2.93, 95% CI 1.57-5.46), and white blood cell count ≥14,000/mm³ (OR 11.35, 95% CI 3.42-37.72). Conclusions Among patients placed in an ED observation unit, age ≥65 years is not associated with need for admission. Older adults can successfully be discharged from these units. Systolic pressure ≥180 mmHg was the only predictive vital sign. In determining appropriateness of patients selected for an ED observation unit, advanced age should not be an automatic disqualifying criterion. PMID:22386358

  13. Analysis of the observed and intrinsic durations of Swift/BAT gamma-ray bursts

    NASA Astrophysics Data System (ADS)

    Tarnopolski, Mariusz

    2016-07-01

    The duration distribution of 947 GRBs observed by Swift/BAT is examined, together with its subsample of 347 events with measured redshift, which allows the durations to be studied in both the observer and rest frames. Using a maximum log-likelihood method, mixtures of two and three standard Gaussians are fitted to each sample, and the adequate model is chosen based on the difference in the log-likelihoods, the Akaike information criterion and the Bayesian information criterion. It is found that a two-Gaussian mixture describes the data better than a three-Gaussian one, and that the presumed intermediate-duration class is unlikely to be present in the Swift duration data.
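
    A comparable analysis can be sketched with scikit-learn's GaussianMixture (assumed available), fitting one-, two- and three-component mixtures to synthetic log-durations and comparing BIC; the component means, widths and sample sizes below are invented, not the Swift values:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)

# Synthetic log10(T90)-like sample: a short-duration and a long-duration class.
logT = np.concatenate([
    rng.normal(-0.3, 0.4, 150),   # short bursts
    rng.normal(1.5, 0.5, 800),    # long bursts
]).reshape(-1, 1)

fits = {k: GaussianMixture(n_components=k, random_state=0).fit(logT)
        for k in (1, 2, 3)}
bic = {k: m.bic(logT) for k, m in fits.items()}
best_k = min(bic, key=bic.get)
print(bic, best_k)
```

    BIC (and AIC, via `m.aic(logT)`) penalizes the extra parameters of the three-component fit, so on genuinely two-component data the two-Gaussian model wins, mirroring the paper's conclusion.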

  14. On the Log-Normality of Historical Magnetic-Storm Intensity Statistics: Implications for Extreme-Event Probabilities

    NASA Astrophysics Data System (ADS)

    Love, J. J.; Rigler, E. J.; Pulkkinen, A. A.; Riley, P.

    2015-12-01

    An examination is made of the hypothesis that the statistics of magnetic-storm-maximum intensities are the realization of a log-normal stochastic process. Weighted least-squares and maximum-likelihood methods are used to fit log-normal functions to -Dst storm-time maxima for years 1957-2012; bootstrap analysis is used to establish confidence limits on forecasts. Both methods provide fits that are reasonably consistent with the data, and both are superior to fits that can be made with a power-law function. In general, the maximum-likelihood method provides forecasts having tighter confidence intervals than those provided by weighted least-squares. From extrapolation of maximum-likelihood fits: a magnetic storm with intensity exceeding that of the 1859 Carrington event, -Dst > 850 nT, occurs about 1.13 times per century, with a wide 95% confidence interval of [0.42, 2.41] times per century; a 100-yr magnetic storm is identified as having -Dst > 880 nT (greater than Carrington), with a wide 95% confidence interval of [490, 1187] nT. This work is partially motivated by United States National Science and Technology Council and Committee on Space Research and International Living with a Star priorities and strategic plans for the assessment and mitigation of space-weather hazards.
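
    A minimal sketch of the maximum-likelihood part with SciPy, on invented stand-in data (the log-normal parameters and sample size are hypothetical, not the 1957-2012 -Dst record), plus a nonparametric bootstrap for the exceedance probability of a Carrington-class 850 nT storm:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Hypothetical stand-in for yearly -Dst storm maxima (nT); parameters invented.
dst_max = stats.lognorm.rvs(s=0.6, scale=150.0, size=56, random_state=rng)

# Maximum-likelihood log-normal fit with the location pinned at zero.
shape, loc, scale = stats.lognorm.fit(dst_max, floc=0)

# Probability that a yearly maximum exceeds a Carrington-class 850 nT,
# with a bootstrap percentile interval on that probability.
p850 = stats.lognorm.sf(850.0, shape, loc=0, scale=scale)
boot = []
for _ in range(200):
    resamp = rng.choice(dst_max, size=dst_max.size, replace=True)
    s_b, _, sc_b = stats.lognorm.fit(resamp, floc=0)
    boot.append(stats.lognorm.sf(850.0, s_b, loc=0, scale=sc_b))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(p850, (lo, hi))
```

    Multiplying the yearly exceedance probability by 100 gives an events-per-century figure comparable in form (not in value) to the abstract's 1.13 per century.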

  15. GLOBAL RATES OF CONVERGENCE OF THE MLES OF LOG-CONCAVE AND s-CONCAVE DENSITIES

    PubMed Central

    Doss, Charles R.; Wellner, Jon A.

    2017-01-01

    We establish global rates of convergence for the Maximum Likelihood Estimators (MLEs) of log-concave and s-concave densities on ℝ. The main finding is that the rate of convergence of the MLE in the Hellinger metric is no worse than n^(−2/5) when −1 < s < ∞, where s = 0 corresponds to the log-concave case. We also show that the MLE does not exist for the classes of s-concave densities with s < −1. PMID:28966409

  16. Effect of radiance-to-reflectance transformation and atmosphere removal on maximum likelihood classification accuracy of high-dimensional remote sensing data

    NASA Technical Reports Server (NTRS)

    Hoffbeck, Joseph P.; Landgrebe, David A.

    1994-01-01

    Many analysis algorithms for high-dimensional remote sensing data require that the remotely sensed radiance spectra be transformed to approximate reflectance to allow comparison with a library of laboratory reflectance spectra. In maximum likelihood classification, however, the remotely sensed spectra are compared to training samples, thus a transformation to reflectance may or may not be helpful. The effect of several radiance-to-reflectance transformations on maximum likelihood classification accuracy is investigated in this paper. We show that the empirical line approach, LOWTRAN7, flat-field correction, single spectrum method, and internal average reflectance are all non-singular affine transformations, and that non-singular affine transformations have no effect on discriminant analysis feature extraction and maximum likelihood classification accuracy. (An affine transformation is a linear transformation with an optional offset.) Since the Atmosphere Removal Program (ATREM) and the log residue method are not affine transformations, experiments with Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) data were conducted to determine the effect of these transformations on maximum likelihood classification accuracy. The average classification accuracy of the data transformed by ATREM and the log residue method was slightly less than the accuracy of the original radiance data. Since the radiance-to-reflectance transformations allow direct comparison of remotely sensed spectra with laboratory reflectance spectra, they can be quite useful in labeling the training samples required by maximum likelihood classification, but these transformations have only a slight effect or no effect at all on discriminant analysis and maximum likelihood classification accuracy.
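
    The affine-invariance claim can be verified directly: for Gaussian maximum-likelihood (quadratic discriminant) classification, a non-singular affine map x → Ax + b shifts every class score by the same −log|det A|, which cancels when classes are compared, so decisions are unchanged. A small NumPy check on synthetic two-class data (all data here are invented, not AVIRIS spectra):

```python
import numpy as np

rng = np.random.default_rng(5)

# Two well-separated "spectral classes"; hypothetical stand-ins for training pixels.
d = 4
X0 = rng.normal(0.0, 1.0, (200, d))
X1 = rng.normal(2.5, 1.2, (200, d))

def ml_score_diff(A, b):
    """Gaussian ML discriminant difference (class 0 minus class 1), computed
    after transforming every sample by the affine map x -> A x + b."""
    T0, T1 = X0 @ A.T + b, X1 @ A.T + b
    test = np.vstack([T0[:50], T1[:50]])
    scores = []
    for Tc in (T0, T1):
        mu = Tc.mean(axis=0)
        S = np.cov(Tc, rowvar=False, bias=True)      # ML covariance estimate
        diff = test - mu
        maha = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(S), diff)
        scores.append(-0.5 * (np.linalg.slogdet(S)[1] + maha))
    return scores[0] - scores[1]

identity = ml_score_diff(np.eye(d), np.zeros(d))
A = rng.normal(size=(d, d)) + 4.0 * np.eye(d)        # random non-singular map
transformed = ml_score_diff(A, rng.normal(size=d))
# The -log|det A| terms cancel in the difference, so decisions are unchanged.
print(np.allclose(identity, transformed, atol=1e-6))
```

    This is exactly why the empirical line, flat-field and similar affine corrections cannot change maximum likelihood classification accuracy, while non-affine corrections such as ATREM can.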

  17. Consolidation of Military Pay and Personnel Functions (Copper). Volume II

    DTIC Science & Technology

    1978-05-01


  18. Tardive Dyskinesia

    MedlinePlus


  19. ELASTIC NET FOR COX’S PROPORTIONAL HAZARDS MODEL WITH A SOLUTION PATH ALGORITHM

    PubMed Central

    Wu, Yichao

    2012-01-01

    For least squares regression, Efron et al. (2004) proposed an efficient solution path algorithm, the least angle regression (LAR). They showed that a slight modification of the LAR leads to the whole LASSO solution path. Both the LAR and LASSO solution paths are piecewise linear. Recently Wu (2011) extended the LAR to generalized linear models and the quasi-likelihood method. In this work we extend the LAR further to handle Cox’s proportional hazards model. The goal is to develop a solution path algorithm for the elastic net penalty (Zou and Hastie (2005)) in Cox’s proportional hazards model. This goal is achieved in two steps. First we extend the LAR to optimizing the log partial likelihood plus a fixed small ridge term. Then we define a path modification, which leads to the solution path of the elastic net regularized log partial likelihood. Our solution path is exact and piecewise determined by ordinary differential equation systems. PMID:23226932

  20. Maximum likelihood density modification by pattern recognition of structural motifs

    DOEpatents

    Terwilliger, Thomas C.

    2004-04-13

    An electron density for a crystallographic structure having protein regions and solvent regions is improved by maximizing the log-likelihood of a set of structure factors {F_h} using a local log-likelihood function: ln[p(ρ(x)|PROT)p_PROT(x) + p(ρ(x)|SOLV)p_SOLV(x) + p(ρ(x)|H)p_H(x)], where p_PROT(x) is the probability that x is in the protein region, p(ρ(x)|PROT) is the conditional probability for ρ(x) given that x is in the protein region, and p_SOLV(x) and p(ρ(x)|SOLV) are the corresponding quantities for the solvent region; p_H(x) refers to the probability that there is a structural motif at a known location, with a known orientation, in the vicinity of the point x; and p(ρ(x)|H) is the probability distribution for the electron density at this point given that the structural motif actually is present. One appropriate structural motif is a helical structure within the crystallographic structure.

  1. 47 CFR 1.550 - Requests for new or modified call sign assignments.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    47 CFR § 1.550 (Telecommunication; Federal Communications Commission; General Practice and ...): Requests for new or modified call sign assignments. See § 73.3550.

  2. Performance of blend sign in predicting hematoma expansion in intracerebral hemorrhage: A meta-analysis.

    PubMed

    Yu, Zhiyuan; Zheng, Jun; Guo, Rui; Ma, Lu; Li, Mou; Wang, Xiaoze; Lin, Sen; Li, Hao; You, Chao

    2017-12-01

    Hematoma expansion is independently associated with poor outcome in intracerebral hemorrhage (ICH). Blend sign is a simple predictor for hematoma expansion on non-contrast computed tomography. However, its accuracy for predicting hematoma expansion is inconsistent in previous studies. This meta-analysis is aimed to systematically assess the performance of blend sign in predicting hematoma expansion in ICH. A systematic literature search was conducted. Original studies about predictive accuracy of blend sign for hematoma expansion in ICH were included. Pooled sensitivity, specificity, positive and negative likelihood ratios were calculated. Summary receiver operating characteristics curve was constructed. Publication bias was assessed by Deeks' funnel plot asymmetry test. A total of 5 studies with 2248 patients were included in this meta-analysis. The pooled sensitivity, specificity, positive and negative likelihood ratios of blend sign for predicting hematoma expansion were 0.28, 0.92, 3.4 and 0.78, respectively. The area under the curve (AUC) was 0.85. No significant publication bias was found. This meta-analysis demonstrates that blend sign is a useful predictor with high specificity for hematoma expansion in ICH. Further studies with larger sample size are still necessary to verify the accuracy of blend sign for predicting hematoma expansion. Copyright © 2017 Elsevier B.V. All rights reserved.
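
    Given the pooled sensitivity and specificity, the likelihood ratios follow from the standard identities LR+ = sens/(1 − spec) and LR− = (1 − sens)/spec. (A bivariate meta-analysis pools the LRs jointly rather than from these point estimates, which is why the paper reports 3.4 where the direct ratio gives 3.5.)

```python
# Pooled accuracy figures reported in the meta-analysis.
sens, spec = 0.28, 0.92

lr_plus = sens / (1 - spec)     # how much a positive blend sign raises the odds
lr_minus = (1 - sens) / spec    # how little its absence lowers them
print(round(lr_plus, 2), round(lr_minus, 2))
```

    The pattern (high specificity, low sensitivity) is what makes the blend sign a useful rule-in marker but a poor rule-out test.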

  3. Compatibility of pedigree-based and marker-based relationship matrices for single-step genetic evaluation.

    PubMed

    Christensen, Ole F

    2012-12-03

    Single-step methods provide a coherent and conceptually simple approach to incorporate genomic information into genetic evaluations. An issue with single-step methods is compatibility between the marker-based relationship matrix for genotyped animals and the pedigree-based relationship matrix. Therefore, it is necessary to adjust the marker-based relationship matrix to the pedigree-based relationship matrix. Moreover, with data from routine evaluations, this adjustment should in principle be based on both observed marker genotypes and observed phenotypes, but until now this has been overlooked. In this paper, I propose a new method to address this issue by 1) adjusting the pedigree-based relationship matrix to be compatible with the marker-based relationship matrix instead of the reverse and 2) extending the single-step genetic evaluation using a joint likelihood of observed phenotypes and observed marker genotypes. The performance of this method is then evaluated using two simulated datasets. The method derived here is a single-step method in which the marker-based relationship matrix is constructed assuming all allele frequencies equal to 0.5 and the pedigree-based relationship matrix is constructed using the unusual assumption that animals in the base population are related and inbred with a relationship coefficient γ and an inbreeding coefficient γ / 2. Taken together, this γ parameter and a parameter that scales the marker-based relationship matrix can handle the issue of compatibility between marker-based and pedigree-based relationship matrices. The full log-likelihood function used for parameter inference contains two terms. The first term is the REML-log-likelihood for the phenotypes conditional on the observed marker genotypes, whereas the second term is the log-likelihood for the observed marker genotypes. 
Analyses of the two simulated datasets with this new method showed that 1) the parameters involved in adjusting marker-based and pedigree-based relationship matrices can depend on both observed phenotypes and observed marker genotypes and 2) a strong association between these two parameters exists. Finally, this method performed at least as well as a method based on adjusting the marker-based relationship matrix. Using the full log-likelihood and adjusting the pedigree-based relationship matrix to be compatible with the marker-based relationship matrix provides a new and interesting approach to handle the issue of compatibility between the two matrices in single-step genetic evaluation.
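
    The marker-based relationship matrix the method assumes, a VanRaden-style G with every allele frequency fixed at 0.5, is easy to sketch; the genotype data below are invented, and the γ-adjustment of the pedigree-based matrix is not shown:

```python
import numpy as np

rng = np.random.default_rng(6)

# Genotypes coded 0/1/2 for 8 animals at 500 loci (toy data).
n_animals, m = 8, 500
M = rng.integers(0, 3, (n_animals, m)).astype(float)

# VanRaden-style genomic relationship matrix with all allele frequencies 0.5:
# G = (M - 1)(M - 1)' / (2 * sum_j p_j (1 - p_j)), with p_j = 0.5 for every locus.
Z = M - 1.0
G = Z @ Z.T / (2.0 * m * 0.5 * 0.5)
print(G.shape)
```

    In the full method, a scaling parameter on G and the base-population parameter γ (relationship γ, inbreeding γ/2) are then inferred from the joint likelihood of phenotypes and marker genotypes.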

  4. Pest risk assessment of the importation into the United States of unprocessed Pinus logs and chips from Australia

    Treesearch

    John T Kliejunas; Harold H. Burdsall; Gregg A. DeNitto; Andris Eglitis; Dennis A. Haugen; Michael I. Haverty; Jessie A. Micales-Glaeser

    2006-01-01

    The unmitigated pest risk potential for the importation of unprocessed logs and chips of species of Pinus (Pinus radiata, P. elliottii Engelm. var. elliottii, P. taeda L., and P. caribaea var. hondurensis, principally) from Australia into the United States was assessed by estimating the likelihood and consequences of introduction of representative insects and pathogens...

  5. Pest risk assessment of the importation into the United States of unprocessed Eucalyptus logs and chips from South America

    Treesearch

    John T. Kliejunas; Harold H. Burdsall; Gregg A. DeNitto; Andris Eglitis; Dennis A. Haugen; William E. Wallner

    2001-01-01

    In this report, we assess the unmitigated pest risk potential of importing Eucalyptus logs and chips from South America into the United States. To do this, we estimated the likelihood and consequences of introducing representative insects and pathogens of concern. Nineteen individual pest risk assessments were prepared, eleven dealing with insects and eight with...

  6. Dual Diagnosis: Substance Abuse and Mental Illness

    MedlinePlus


  7. Exponential series approaches for nonparametric graphical models

    NASA Astrophysics Data System (ADS)

    Janofsky, Eric

    Markov Random Fields (MRFs) or undirected graphical models are parsimonious representations of joint probability distributions. This thesis studies high-dimensional, continuous-valued pairwise Markov Random Fields. We are particularly interested in approximating pairwise densities whose logarithm belongs to a Sobolev space. For this problem we propose the method of exponential series which approximates the log density by a finite-dimensional exponential family with the number of sufficient statistics increasing with the sample size. We consider two approaches to estimating these models. The first is regularized maximum likelihood. This involves optimizing the sum of the log-likelihood of the data and a sparsity-inducing regularizer. We then propose a variational approximation to the likelihood based on tree-reweighted, nonparametric message passing. This approximation allows for upper bounds on risk estimates, leverages parallelization and is scalable to densities on hundreds of nodes. We show how the regularized variational MLE may be estimated using a proximal gradient algorithm. We then consider estimation using regularized score matching. This approach uses an alternative scoring rule to the log-likelihood, which obviates the need to compute the normalizing constant of the distribution. For general continuous-valued exponential families, we provide parameter and edge consistency results. As a special case we detail a new approach to sparse precision matrix estimation which has statistical performance competitive with the graphical lasso and computational performance competitive with the state-of-the-art glasso algorithm. We then describe results for model selection in the nonparametric pairwise model using exponential series. The regularized score matching problem is shown to be a convex program; we provide scalable algorithms based on consensus alternating direction method of multipliers (ADMM) and coordinate-wise descent. 
We use simulations to compare our method to others in the literature as well as the aforementioned TRW estimator.
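
    For the sparse precision-matrix special case, the graphical lasso baseline the thesis benchmarks against can be run via scikit-learn's GraphicalLasso (assumed available); the chain-graph precision matrix below is invented for illustration:

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(7)

# True sparse precision: a chain graph (only adjacent variables interact).
d = 5
theta = np.eye(d)
for i in range(d - 1):
    theta[i, i + 1] = theta[i + 1, i] = -0.4
cov = np.linalg.inv(theta)
X = rng.multivariate_normal(np.zeros(d), cov, size=2000)

# L1-penalized Gaussian maximum likelihood for the precision matrix.
model = GraphicalLasso(alpha=0.05).fit(X)
prec = model.precision_
print(np.round(prec, 2))
```

    Chain edges show up as large off-diagonal entries of the estimated precision, while non-adjacent pairs are shrunk toward zero; the nonparametric pairwise models above generalize this beyond the Gaussian case.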

  8. Drilling, logging, and testing information from borehole UE-25 UZ#16, Yucca Mountain, Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thamir, F.; Thordarson, W.; Kume, J.

    Borehole UE-25 UZ#16 is the first of two boreholes that may be used to determine the subsurface structure at Yucca Mountain by using vertical seismic profiling. This report contains information collected while this borehole was being drilled, logged, and tested from May 27, 1992, to April 22, 1994. It does not contain the vertical seismic profiling data. This report is intended to be used as: (1) a reference for drilling similar boreholes in the same area, (2) a data source on this borehole, and (3) a reference for other information that is available from this borehole. The reference information includes drilling chronology, equipment, parameters, coring methods, penetration rates, completion information, drilling problems, and corrective actions. The data sources include lithology, fracture logs, a list of available borehole logs, and depths at which water was recorded. Other information is listed in an appendix that includes studies done after April 22, 1994.

  9. Modified signed-digit arithmetic based on redundant bit representation.

    PubMed

    Huang, H; Itoh, M; Yatagai, T

    1994-09-10

    Fully parallel modified signed-digit arithmetic operations are realized based on the proposed redundant bit representation of the digits. A new truth-table minimization technique based on redundant-bit-representation coding is presented. It is shown that only 34 minterms are needed to implement one-step modified signed-digit addition and subtraction with this new representation. Two optical implementation schemes, correlation and matrix multiplication, are described. Experimental demonstrations of the correlation architecture are presented. Both architectures use fixed minterm masks for arbitrary-length operands, taking full advantage of the parallelism of the modified signed-digit number system and of optics.
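
    The redundancy the modified signed-digit (MSD) system exploits, each radix-2 digit drawn from {−1, 0, 1} so that a value has several encodings, can be illustrated in a few lines; this shows only the number representation, not the paper's optical one-step adder:

```python
# Digits of a modified signed-digit (MSD) number are drawn from {-1, 0, 1},
# most significant first; the radix is 2.
def msd_value(digits):
    """Evaluate a signed-digit string as a plain integer."""
    v = 0
    for d in digits:
        v = 2 * v + d
    return v

# Redundancy: distinct digit strings encode the same value, which is what
# lets carry chains be absorbed locally during carry-free addition.
print(msd_value([1, 0, 1]), msd_value([1, 1, -1]))   # two encodings of 5
print(msd_value([1, -1]), msd_value([0, 1]))         # two encodings of 1
```

    It is this freedom in choosing digits that allows an adder to limit carry propagation to a fixed number of positions, independent of operand length.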

  10. Posterior propriety for hierarchical models with log-likelihoods that have norm bounds

    DOE PAGES

    Michalak, Sarah E.; Morris, Carl N.

    2015-07-17

    Statisticians often use improper priors to express ignorance or to provide good frequency properties, requiring that posterior propriety be verified. Our paper addresses generalized linear mixed models, GLMMs, when Level I parameters have Normal distributions, with many commonly-used hyperpriors. It provides easy-to-verify sufficient posterior propriety conditions based on dimensions, matrix ranks, and exponentiated norm bounds, ENBs, for the Level I likelihood. Since many familiar likelihoods have ENBs, which is often verifiable via log-concavity and MLE finiteness, our novel use of ENBs permits unification of posterior propriety results and posterior MGF/moment results for many useful Level I distributions, including those commonly used with multilevel generalized linear models, e.g., GLMMs and hierarchical generalized linear models, HGLMs. Furthermore, those who need to verify existence of posterior distributions or of posterior MGFs/moments for a multilevel generalized linear model given a proper or improper multivariate F prior as in Section 1 should find the required results in Sections 1 and 2 and Theorem 3 (GLMMs), Theorem 4 (HGLMs), or Theorem 5 (posterior MGFs/moments).

  11. A Poisson Log-Normal Model for Constructing Gene Covariation Network Using RNA-seq Data.

    PubMed

    Choi, Yoonha; Coram, Marc; Peng, Jie; Tang, Hua

    2017-07-01

    Constructing expression networks using transcriptomic data is an effective approach for studying gene regulation. A popular approach for constructing such a network is based on the Gaussian graphical model (GGM), in which an edge between a pair of genes indicates that the expression levels of these two genes are conditionally dependent, given the expression levels of all other genes. However, GGMs are not appropriate for non-Gaussian data, such as those generated in RNA-seq experiments. We propose a novel statistical framework that maximizes a penalized likelihood, in which the observed count data follow a Poisson log-normal distribution. To overcome the computational challenges, we use Laplace's method to approximate the likelihood and its gradients, and apply the alternating directions method of multipliers to find the penalized maximum likelihood estimates. The proposed method is evaluated and compared with GGMs using both simulated and real RNA-seq data. The proposed method shows improved performance in detecting edges that represent covarying pairs of genes, particularly for edges connecting low-abundant genes and edges around regulatory hubs.
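
    The generative model is easy to sketch: counts are Poisson draws whose log-rates are jointly normal, which produces both overdispersion (variance above the mean, unlike pure Poisson) and covariation between genes. A NumPy simulation with invented parameters:

```python
import numpy as np

rng = np.random.default_rng(8)

# Latent log-rates for 3 "genes"; genes 0 and 1 covary, gene 2 is independent.
mu = np.array([2.0, 2.0, 2.0])
sigma = np.array([[0.30, 0.15, 0.00],
                  [0.15, 0.30, 0.00],
                  [0.00, 0.00, 0.30]])
z = rng.multivariate_normal(mu, sigma, size=5000)
counts = rng.poisson(np.exp(z))     # observed RNA-seq-like counts

# Overdispersion: for Poisson log-normal, Var(Y) > E(Y).
print(counts.mean(axis=0), counts.var(axis=0))
print(np.round(np.corrcoef(counts.T), 2))
```

    The inference problem the paper solves is the reverse direction: recovering the sparsity pattern of the inverse of sigma from `counts` alone, via a Laplace-approximated penalized likelihood.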

  12. The Association between Environmental Factors and Scarlet Fever Incidence in Beijing Region: Using GIS and Spatial Regression Models

    PubMed Central

    Mahara, Gehendra; Wang, Chao; Yang, Kun; Chen, Sipeng; Guo, Jin; Gao, Qi; Wang, Wei; Wang, Quanyi; Guo, Xiuhua

    2016-01-01

    (1) Background: Evidence regarding scarlet fever and its relationship with meteorological and air pollution factors is scarce. This study aimed to examine the relationship between ambient air pollutants and meteorological factors and the occurrence of scarlet fever in Beijing, China. (2) Methods: A retrospective ecological study was carried out to characterize the epidemic features of scarlet fever incidence in Beijing districts from 2013 to 2014. Daily incidence and corresponding air pollutant and meteorological data were used to develop the model. Global Moran's I and Anselin's local Moran's I (LISA) statistics were applied to detect spatial autocorrelation (spatial dependency) and clusters of scarlet fever incidence. The spatial lag model (SLM), the spatial error model (SEM), and an ordinary least squares (OLS) model were then applied to probe the association between scarlet fever incidence and meteorological and air pollution factors. (3) Results: Among the 5491 cases, more than half (62%) were male and more than one-third (37.8%) were female, with an annual average incidence rate of 14.64 per 100,000 population. Spatial autocorrelation analysis indicated spatial dependence, so spatial regression models were applied. Comparing R², log-likelihood, and the Akaike information criterion (AIC) across the OLS model (R² = 0.0741, log-likelihood = -1819.69, AIC = 3665.38), the SLM (R² = 0.0786, log-likelihood = -1819.04, AIC = 3665.08), and the SEM (R² = 0.0743, log-likelihood = -1819.67, AIC = 3665.36) identified the SLM as the best-fitting regression model. In the SLM there were significant positive associations for nitrogen oxide (p = 0.027), rainfall (p = 0.036), and sunshine hours (p = 0.048), while relative humidity (p = 0.034) was negatively associated with scarlet fever incidence. (4) Conclusions: Our findings indicate that meteorological as well as air pollutant factors may increase the incidence of scarlet fever; these findings may help guide scarlet fever control programs and target interventions. PMID:27827946
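The model-comparison step above (choosing among OLS, SLM, and SEM by log-likelihood and AIC) can be sketched as follows; the log-likelihoods and parameter counts below are illustrative, not the study's values:

```python
def aic(log_likelihood, n_params):
    """Akaike information criterion: AIC = 2k - 2*lnL (lower is better)."""
    return 2 * n_params - 2 * log_likelihood

# Hypothetical log-likelihoods and parameter counts for three candidate models
models = {
    "OLS": aic(-100.0, 3),
    "SLM": aic(-98.5, 4),   # spatial lag adds one parameter (rho)
    "SEM": aic(-99.8, 4),   # spatial error adds one parameter (lambda)
}
best = min(models, key=models.get)
print(best)  # -> SLM
```

With these numbers the SLM attains the smallest AIC, mirroring the selection logic reported in the abstract.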

  13. The Association between Environmental Factors and Scarlet Fever Incidence in Beijing Region: Using GIS and Spatial Regression Models.

    PubMed

    Mahara, Gehendra; Wang, Chao; Yang, Kun; Chen, Sipeng; Guo, Jin; Gao, Qi; Wang, Wei; Wang, Quanyi; Guo, Xiuhua

    2016-11-04

    (1) Background: Evidence regarding scarlet fever and its relationship with meteorological and air pollution factors is scarce. This study aimed to examine the relationship between ambient air pollutants and meteorological factors and the occurrence of scarlet fever in Beijing, China. (2) Methods: A retrospective ecological study was carried out to characterize the epidemic features of scarlet fever incidence in Beijing districts from 2013 to 2014. Daily incidence and corresponding air pollutant and meteorological data were used to develop the model. Global Moran's I and Anselin's local Moran's I (LISA) statistics were applied to detect spatial autocorrelation (spatial dependency) and clusters of scarlet fever incidence. The spatial lag model (SLM), the spatial error model (SEM), and an ordinary least squares (OLS) model were then applied to probe the association between scarlet fever incidence and meteorological and air pollution factors. (3) Results: Among the 5491 cases, more than half (62%) were male and more than one-third (37.8%) were female, with an annual average incidence rate of 14.64 per 100,000 population. Spatial autocorrelation analysis indicated spatial dependence, so spatial regression models were applied. Comparing R², log-likelihood, and the Akaike information criterion (AIC) across the OLS model (R² = 0.0741, log-likelihood = -1819.69, AIC = 3665.38), the SLM (R² = 0.0786, log-likelihood = -1819.04, AIC = 3665.08), and the SEM (R² = 0.0743, log-likelihood = -1819.67, AIC = 3665.36) identified the SLM as the best-fitting regression model. In the SLM there were significant positive associations for nitrogen oxide (p = 0.027), rainfall (p = 0.036), and sunshine hours (p = 0.048), while relative humidity (p = 0.034) was negatively associated with scarlet fever incidence. (4) Conclusions: Our findings indicate that meteorological as well as air pollutant factors may increase the incidence of scarlet fever; these findings may help guide scarlet fever control programs and target interventions.

  14. The Frequency of Fitness Peak Shifts Is Increased at Expanding Range Margins Due to Mutation Surfing

    PubMed Central

    Burton, Olivia J.; Travis, Justin M. J.

    2008-01-01

    Dynamic species' ranges, those that are either invasive or shifting in response to environmental change, are the focus of much recent interest in ecology, evolution, and genetics. Understanding how range expansions can shape evolutionary trajectories requires the consideration of nonneutral variability and genetic architecture, yet the majority of empirical and theoretical work to date has explored patterns of neutral variability. Here we use forward computer simulations of population growth, dispersal, and mutation to explore how range-shifting dynamics can influence evolution on rugged fitness landscapes. We employ a two-locus model, incorporating sign epistasis, and find that there is an increased likelihood of fitness peak shifts during a period of range expansion. Maladapted valley genotypes can accumulate at an expanding range front through a phenomenon called mutation surfing, which increases the likelihood that a mutation leading to a higher peak will occur. Our results indicate that most peak shifts occur close to the expanding front. We also demonstrate that periods of range shifting are especially important for peak shifting in species with narrow geographic distributions. Our results imply that trajectories on rugged fitness landscapes can be modified substantially when ranges are dynamic. PMID:18505864

  15. Linear solvation energy relationships regarding sorption and retention properties of hydrophobic organic compounds in soil leaching column chromatography.

    PubMed

    Xu, Feng; Liang, Xinmiao; Lin, Bingcheng; Su, Fan; Schramm, Karl-Werner; Kettrup, Antonius

    2002-08-01

    The capacity factors of a series of hydrophobic organic compounds (HOCs) were measured in soil leaching column chromatography (SLCC) on a soil column, and in reversed-phase liquid chromatography on a C18 column with different volumetric fractions (φ) of methanol in methanol-water mixtures. A general linear solvation energy relationship, log(XYZ) = XYZ0 + mV_I/100 + sπ* + bβ_m + aα_m, was applied to analyze capacity factors (k'), soil organic partition coefficients (Koc), and octanol-water partition coefficients (P). The analyses exhibited high accuracy. The chief solute factors that control log Koc, log P, and log k' (on soil and on C18) are the solute size (V_I/100) and hydrogen-bond basicity (β_m). Less important solute factors are the dipolarity/polarizability (π*) and hydrogen-bond acidity (α_m). Log k' on soil and log Koc have similar signs in the four fitting coefficients (m, s, b, and a) and similar ratios (m:s:b:a), while log k' on C18 and log P likewise have similar signs and ratios. Consequently, log k' values on C18 correlate well with log P (r > 0.97), while log k' values on soil correlate well with log Koc (r > 0.98). Two Koc estimation methods were developed, one through solute solvatochromic parameters and the other through correlations with k' on soil. For HOCs, a linear relationship between the logarithmic capacity factor and methanol composition in methanol-water mixtures could also be derived in SLCC.
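Fitting the general LSER equation above is an ordinary least-squares problem; a minimal sketch in pure Python, where the descriptor values and coefficients are synthetic rather than the study's data:

```python
import random

def lstsq(X, y):
    """Ordinary least squares via the normal equations (Gaussian elimination
    with partial pivoting); fine for a handful of LSER descriptors."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for i in range(k):                      # forward elimination
        p = max(range(i, k), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        b[i], b[p] = b[p], b[i]
        for r in range(i + 1, k):
            f = A[r][i] / A[i][i]
            for c in range(i, k):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    beta = [0.0] * k
    for i in reversed(range(k)):            # back substitution
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, k))) / A[i][i]
    return beta

# Hypothetical intercept plus descriptors (V_I/100, pi*, beta_m, alpha_m)
random.seed(4)
true = [0.2, 3.1, -0.5, -2.9, -0.3]         # XYZ0, m, s, b, a (illustrative)
X = [[1.0] + [random.random() for _ in range(4)] for _ in range(40)]
y = [sum(c * v for c, v in zip(true, row)) for row in X]
est = lstsq(X, y)
print(all(abs(e - t) < 1e-6 for e, t in zip(est, true)))
```

With noise-free synthetic data the fit recovers the generating coefficients exactly, which is a quick sanity check on the solver.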

  16. Indicators of Terrorism Vulnerability in Africa

    DTIC Science & Technology

    2015-03-26

    ...the terror threat and vulnerabilities across Africa. Key words: Terrorism, Africa, Negative Binomial Regression, Classification Tree, Log-likelihood.

  17. Image transmission system using adaptive joint source and channel decoding

    NASA Astrophysics Data System (ADS)

    Liu, Weiliang; Daut, David G.

    2005-03-01

    In this paper, an adaptive joint source and channel decoding method is designed to accelerate the convergence of the iterative log-domain sum-product decoding procedure of LDPC codes as well as to improve the reconstructed image quality. Error resilience modes are used in the JPEG2000 source codec, which makes it possible to provide useful source decoded information to the channel decoder. After each iteration, a tentative decoding is made and the channel decoded bits are then sent to the JPEG2000 decoder. Due to the error resilience modes, some bits are known to be either correct or in error. The positions of these bits are then fed back to the channel decoder. The log-likelihood ratios (LLR) of these bits are then modified by a weighting factor for the next iteration. By observing the statistics of the decoding procedure, the weighting factor is designed as a function of the channel condition: for lower channel SNR, a larger factor is assigned, and vice versa. Results show that the proposed joint decoding method can greatly reduce the number of iterations, and thereby reduce the decoding delay considerably. At the same time, this method always outperforms the non-source-controlled decoding method by up to 5 dB in terms of PSNR for various reconstructed images.
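The LLR-modification step described above might be sketched as follows; the specific rule (amplify bits flagged correct, damp bits flagged in error) and the weighting factor are assumptions here, since the paper's exact function of channel SNR is not reproduced:

```python
def reweight_llrs(llrs, known_correct, known_error, weight=2.0):
    """Adjust channel LLRs using feedback from the source decoder.
    `weight` stands in for the paper's SNR-dependent factor (hypothetical)."""
    out = []
    for i, llr in enumerate(llrs):
        if i in known_correct:
            out.append(llr * weight)    # boost confidence in verified bits
        elif i in known_error:
            out.append(llr / weight)    # damp confidence in flagged bits
        else:
            out.append(llr)
    return out

print(reweight_llrs([1.0, -2.0, 0.5], known_correct={0}, known_error={2}))
# -> [2.0, -2.0, 0.25]
```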

  18. Interpretation of diagnostic data: 5. How to do it with simple maths.

    PubMed

    1983-11-01

    The use of simple maths with the likelihood ratio strategy fits in nicely with our clinical views. By making the most out of the entire range of diagnostic test results (i.e., several levels, each with its own likelihood ratio, rather than a single cut-off point and a single ratio) and by permitting us to keep track of the likelihood that a patient has the target disorder at each point along the diagnostic sequence, this strategy allows us to place patients at an extremely high or an extremely low likelihood of disease. Thus, the numbers of patients with ultimately false-positive results (who suffer the slings of labelling and the arrows of needless therapy) and of those with ultimately false-negative results (who therefore miss their chance for diagnosis and, possibly, efficacious therapy) will be dramatically reduced. The following guidelines will be useful in interpreting signs, symptoms and laboratory tests with the likelihood ratio strategy: Seek out, and demand from the clinical or laboratory experts who ought to know, the likelihood ratios for key symptoms and signs, and several levels (rather than just the positive and negative results) of diagnostic test results. Identify, when feasible, the logical sequence of diagnostic tests. Estimate the pretest probability of disease for the patient, and, using either the nomogram or the conversion formulas, apply the likelihood ratio that corresponds to the first diagnostic test result. While remembering that the resulting post-test probability or odds from the first test becomes the pretest probability or odds for the next diagnostic test, repeat the process for all the pertinent symptoms, signs and laboratory studies that pertain to the target disorder. However, these combinations may not be independent, and convergent diagnostic tests, if treated as independent, will combine to overestimate the final post-test probability of disease. 
You are now far more sophisticated in interpreting diagnostic tests than most of your teachers. In the last part of our series we will show you some rather complex strategies that combine diagnosis and therapy, quantify our as yet nonquantified ideas about use, and require the use of at least a hand calculator.
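The likelihood ratio strategy above reduces to simple odds arithmetic: convert the pretest probability to odds, multiply by each test's likelihood ratio in sequence, and convert back. A minimal sketch with hypothetical pretest probability and likelihood ratios (remembering the abstract's caveat that treating convergent tests as independent overestimates the final probability):

```python
def post_test_probability(pretest_prob, likelihood_ratio):
    """Convert pretest probability to odds, apply the LR, convert back."""
    pretest_odds = pretest_prob / (1 - pretest_prob)
    post_odds = pretest_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# Hypothetical sequence: 25% pretest probability, then two test results
p = 0.25
p = post_test_probability(p, 6.0)   # strongly positive first test, LR = 6
p = post_test_probability(p, 2.5)   # mildly positive second test, LR = 2.5
print(round(p, 3))  # -> 0.833
```

The post-test probability of each step becomes the pretest probability of the next, exactly as the guidelines describe.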

  19. Interpretation of diagnostic data: 5. How to do it with simple maths.

    PubMed Central

    1983-01-01

    The use of simple maths with the likelihood ratio strategy fits in nicely with our clinical views. By making the most out of the entire range of diagnostic test results (i.e., several levels, each with its own likelihood ratio, rather than a single cut-off point and a single ratio) and by permitting us to keep track of the likelihood that a patient has the target disorder at each point along the diagnostic sequence, this strategy allows us to place patients at an extremely high or an extremely low likelihood of disease. Thus, the numbers of patients with ultimately false-positive results (who suffer the slings of labelling and the arrows of needless therapy) and of those with ultimately false-negative results (who therefore miss their chance for diagnosis and, possibly, efficacious therapy) will be dramatically reduced. The following guidelines will be useful in interpreting signs, symptoms and laboratory tests with the likelihood ratio strategy: Seek out, and demand from the clinical or laboratory experts who ought to know, the likelihood ratios for key symptoms and signs, and several levels (rather than just the positive and negative results) of diagnostic test results. Identify, when feasible, the logical sequence of diagnostic tests. Estimate the pretest probability of disease for the patient, and, using either the nomogram or the conversion formulas, apply the likelihood ratio that corresponds to the first diagnostic test result. While remembering that the resulting post-test probability or odds from the first test becomes the pretest probability or odds for the next diagnostic test, repeat the process for all the pertinent symptoms, signs and laboratory studies that pertain to the target disorder. However, these combinations may not be independent, and convergent diagnostic tests, if treated as independent, will combine to overestimate the final post-test probability of disease. 
You are now far more sophisticated in interpreting diagnostic tests than most of your teachers. In the last part of our series we will show you some rather complex strategies that combine diagnosis and therapy, quantify our as yet nonquantified ideas about use, and require the use of at least a hand calculator. PMID:6671182

  20. Transfer Entropy as a Log-Likelihood Ratio

    NASA Astrophysics Data System (ADS)

    Barnett, Lionel; Bossomaier, Terry

    2012-09-01

    Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the analysis of complex stochastic dynamics in diverse fields, including the neurosciences, ecology, climatology, and econometrics. We show that for a broad class of predictive models, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. For finite Markov chains, furthermore, no explicit model is required. In the general case, an asymptotic χ2 distribution is established for the transfer entropy estimator. The result generalizes the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression, and establishes a fundamental connection between directed information transfer and causality in the Wiener-Granger sense.

  1. Transfer entropy as a log-likelihood ratio.

    PubMed

    Barnett, Lionel; Bossomaier, Terry

    2012-09-28

    Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the analysis of complex stochastic dynamics in diverse fields, including the neurosciences, ecology, climatology, and econometrics. We show that for a broad class of predictive models, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. For finite Markov chains, furthermore, no explicit model is required. In the general case, an asymptotic χ2 distribution is established for the transfer entropy estimator. The result generalizes the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression, and establishes a fundamental connection between directed information transfer and causality in the Wiener-Granger sense.
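In the finite-Markov-chain case above, transfer entropy can be estimated directly from counts as the plug-in conditional mutual information I(X_t ; Y_{t-1} | X_{t-1}); a minimal sketch for binary sequences, where the synthetic data and noise level are illustrative:

```python
import math
import random
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of TE(Y -> X) = I(X_t ; Y_{t-1} | X_{t-1}), in nats."""
    n = len(x) - 1
    c_xyx = Counter((x[t + 1], y[t], x[t]) for t in range(n))
    c_xx = Counter((x[t + 1], x[t]) for t in range(n))
    c_yx = Counter((y[t], x[t]) for t in range(n))
    c_x = Counter(x[t] for t in range(n))
    te = 0.0
    for (x1, y0, x0), c in c_xyx.items():
        # p(x1,y0,x0) * log[ p(x1,y0,x0) p(x0) / (p(x1,x0) p(y0,x0)) ]
        te += (c / n) * math.log(c * c_x[x0] / (c_xx[(x1, x0)] * c_yx[(y0, x0)]))
    return te

random.seed(0)
y = [random.randint(0, 1) for _ in range(5000)]
# x copies the previous y with 10% noise, so information flows Y -> X
x = [0] + [y[t - 1] if random.random() < 0.9 else 1 - y[t - 1]
           for t in range(1, 5000)]
print(transfer_entropy(x, y) > transfer_entropy(y, x))  # asymmetric, as expected
```

Per the abstract, 2n times this estimator behaves asymptotically like a χ² log-likelihood ratio statistic under the null of zero transfer entropy.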

  2. Evaluation of Guide Sign Fonts

    DOT National Transportation Integrated Search

    2014-04-01

    Researchers at Texas A&M Transportation Institute completed a study of E-modified, Enhanced E-Modified, and Clearview 5W for overhead and shoulder-mounted guide signs. The overhead guide signs consisted of three six-letter words stacked over each ot...

  3. Modifying the communicative effectiveness of fire prevention signs

    Treesearch

    William S. Folkman

    1966-01-01

    Two versions of a commonly used U.S. Forest Service sign ('America Needs Productive Forests') were tested on four adult special-interest groups in Butte County, California. Half the members were shown the regularly used sign; the other half, a modified sign that included the Smokey Bear symbol. Responses to questionnaires by both groups suggested that each...

  4. A function accounting for training set size and marker density to model the average accuracy of genomic prediction.

    PubMed

    Erbe, Malena; Gredler, Birgit; Seefried, Franz Reinhold; Bapst, Beat; Simianer, Henner

    2013-01-01

    Prediction of genomic breeding values is of major practical relevance in dairy cattle breeding. Deterministic equations have been suggested to predict the accuracy of genomic breeding values in a given design which are based on training set size, reliability of phenotypes, and the number of independent chromosome segments ([Formula: see text]). The aim of our study was to find a general deterministic equation for the average accuracy of genomic breeding values that also accounts for marker density and can be fitted empirically. Two data sets of 5'698 Holstein Friesian bulls genotyped with 50 K SNPs and 1'332 Brown Swiss bulls genotyped with 50 K SNPs and imputed to ∼600 K SNPs were available. Different k-fold (k = 2-10, 15, 20) cross-validation scenarios (50 replicates, random assignment) were performed using a genomic BLUP approach. A maximum likelihood approach was used to estimate the parameters of different prediction equations. The highest likelihood was obtained when using a modified form of the deterministic equation of Daetwyler et al. (2010), augmented by a weighting factor (w) based on the assumption that the maximum achievable accuracy is [Formula: see text]. The proportion of genetic variance captured by the complete SNP sets ([Formula: see text]) was 0.76 to 0.82 for Holstein Friesian and 0.72 to 0.75 for Brown Swiss. When modifying the number of SNPs, w was found to be proportional to the log of the marker density up to a limit which is population and trait specific and was found to be reached with ∼20'000 SNPs in the Brown Swiss population studied.
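The base deterministic equation referenced above (Daetwyler et al. 2010, before the study's weighting-factor modification) can be sketched directly; the heritability, training set sizes, and segment count below are illustrative values, not the Holstein Friesian or Brown Swiss estimates:

```python
import math

def expected_accuracy(n_train, h2, m_e):
    """Daetwyler et al. (2010): expected accuracy of genomic breeding values
    from training set size, reliability of phenotypes (h2), and the number
    of independent chromosome segments (m_e)."""
    return math.sqrt(n_train * h2 / (n_train * h2 + m_e))

# Illustrative values only: accuracy rises with training set size
for n in (1000, 5000, 20000):
    print(n, round(expected_accuracy(n, h2=0.5, m_e=1000), 3))
```

Accuracy increases toward an asymptote governed by m_e, which is the quantity the study's weighting factor w then rescales as a function of marker density.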

  5. Functional form and risk adjustment of hospital costs: Bayesian analysis of a Box-Cox random coefficients model.

    PubMed

    Hollenbeak, Christopher S

    2005-10-15

    While risk-adjusted outcomes are often used to compare the performance of hospitals and physicians, the most appropriate functional form for the risk adjustment process is not always obvious for continuous outcomes such as costs. Semi-log models are used most often to correct skewness in cost data, but there has been limited research to determine whether the log transformation is sufficient or whether another transformation is more appropriate. This study explores the most appropriate functional form for risk-adjusting the cost of coronary artery bypass graft (CABG) surgery. Data included patients undergoing CABG surgery at four hospitals in the midwest and were fit to a Box-Cox model with random coefficients (BCRC) using Markov chain Monte Carlo methods. Marginal likelihoods and Bayes factors were computed to perform model comparison of alternative model specifications. Rankings of hospital performance were created from the simulation output and the rankings produced by Bayesian estimates were compared to rankings produced by standard models fit using classical methods. Results suggest that, for these data, the most appropriate functional form is not logarithmic, but corresponds to a Box-Cox transformation of -1. Furthermore, Bayes factors overwhelmingly rejected the natural log transformation. However, the hospital ranking induced by the BCRC model was not different from the ranking produced by maximum likelihood estimates of either the linear or semi-log model. Copyright (c) 2005 John Wiley & Sons, Ltd.
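The transformation family the study compares can be sketched as follows; the cost values are hypothetical, λ = 0 corresponds to the log transformation the Bayes factors rejected, and λ = -1 is the form the data favored:

```python
import math

def box_cox(y, lam):
    """Box-Cox power transform; lam = 0 reduces to the natural log."""
    if lam == 0:
        return math.log(y)
    return (y ** lam - 1) / lam

costs = [1200.0, 5400.0, 23000.0]          # hypothetical CABG costs
semi_log = [box_cox(c, 0) for c in costs]   # semi-log specification
inverse = [box_cox(c, -1) for c in costs]   # lambda = -1, favored above
```

Note that λ = -1 is (up to location/scale) a reciprocal transform, a far stronger correction for right skew than the log.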

  6. Factors associated with willingness to participate in a vaccine clinical trial among elderly Hispanic patients.

    PubMed

    Rikin, Sharon; Shea, Steven; LaRussa, Philip; Stockwell, Melissa

    2017-09-01

    A population specific understanding of barriers and facilitators to participation in clinical trials could improve recruitment of elderly and minority populations. We investigated how prior exposure to clinical trials and incentives were associated with likelihood of participation in a vaccine clinical trial through a questionnaire administered to 200 elderly patients in an academic general internal medicine clinic. Wilcoxon signed rank sum test compared likelihood of participation with and without monetary incentives. Logistic regression evaluated characteristics associated with intent to participate in an influenza vaccine trial, adjusted for age, gender, language, and education history. When asked about likelihood of participation if there was monetary compensation, there was a 12.2% absolute increase in those reporting that they would not participate, with a significant difference in the distribution of likelihood before and after mentioning a monetary incentive (Wilcoxon signed rank test, p = 0.001). Those with previous knowledge of clinical trials (54.4%) were more likely to report they would participate vs. those without prior knowledge (OR 2.5, 95% CI [1.2, 5.2]). The study highlights the importance of pre-testing recruitment materials and incentives in key group populations prior to implementing clinical trials.

  7. Informativeness of Early Huntington Disease Signs about Gene Status.

    PubMed

    Oster, Emily; Eberly, Shirley W; Dorsey, E Ray; Kayson-Rubin, Elise; Oakes, David; Shoulson, Ira

    2015-01-01

    The cohort-level risk of Huntington disease (HD) is related to the age and symptom level of the cohort, but this relationship has not been made precise. To predict the evolving likelihood of carrying the Huntington disease (HD) gene for at-risk adults using age and sign level. Using data from adults with early signs and symptoms of HD linked to information on genetic status, we use Bayes' theorem to calculate the probability that an undiagnosed individual of a certain age and sign level has an expanded CAG repeat. Both age and sign levels have substantial influence on the likelihood of HD onset, and the probability of eventual diagnosis changes as those at risk age and exhibit (or fail to exhibit) symptoms. For example, our data suggest that in a cohort of individuals age 26 with a Unified Huntington's Disease Rating Scale (UHDRS) motor score of 7-10, 70% will carry the HD mutation. For individuals age 56, the same motor score suggests only a 40% chance of carrying the mutation. Early motor signs of HD, overall and the chorea subscore, were highly predictive of disease onset at any age. However, body mass index (BMI) and cognitive performance scores were not as highly predictive. These results suggest that if researchers or clinicians are looking for early clues of HD, it may be more foretelling to look at motor rather than cognitive signs. Application of similar approaches could be used with other adult-onset genetic conditions.
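The Bayes'-theorem update described above can be sketched with hypothetical inputs; none of these probabilities are the study's estimates:

```python
def carrier_probability(prior, p_signs_given_carrier, p_signs_given_noncarrier):
    """Bayes' theorem: probability of carrying the expanded CAG repeat given
    the observed sign level (all inputs hypothetical)."""
    num = prior * p_signs_given_carrier
    return num / (num + (1 - prior) * p_signs_given_noncarrier)

# Illustrative only: a 50% at-risk prior, motor signs far likelier in carriers
print(round(carrier_probability(0.5, 0.6, 0.15), 2))  # -> 0.8
```

The age dependence in the study enters through the sign-level likelihoods: the same motor score is weaker evidence at older ages, lowering the posterior.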

  8. General Multidecision Theory: Hypothesis Testing and Changepoint Detection with Applications to Homeland Security

    DTIC Science & Technology

    2014-10-06

    to a subset Θ̃ of ℓ-dimensional Euclidean space. The sub-σ-algebra F_n = F_n^X = σ(X_1^n) of F is generated by the stochastic process X_1^n = (X_1, ..., X_n)...developed asymptotic hypothesis testing theory is based on the SLLN and rates of convergence in the strong law for the LLR processes, specifically by...Write λ_n(θ, θ̃) = log(dP_θ^n / dP_θ̃^n) = Σ_{k=1}^{n} log[ p_θ(X_k | X_1^{k-1}) / p_θ̃(X_k | X_1^{k-1}) ] for the log-likelihood ratio (LLR) process. Assume that there
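For independent observations the LLR process in this record reduces to a running sum of per-sample log-ratios; a sketch with two hypothetical Gaussian hypotheses, illustrating the SLLN drift under the true model:

```python
import math
import random

def llr_process(xs, pdf_theta, pdf_theta_alt):
    """Cumulative LLR lambda_n = sum_k log p_theta(x_k) / p_theta~(x_k),
    for independent observations (a special case of the conditional form)."""
    llr, path = 0.0, []
    for x in xs:
        llr += math.log(pdf_theta(x) / pdf_theta_alt(x))
        path.append(llr)
    return path

def gauss_pdf(mu):
    return lambda x: math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)

random.seed(1)
data = [random.gauss(1.0, 1.0) for _ in range(500)]  # truth is N(1, 1)
path = llr_process(data, gauss_pdf(1.0), gauss_pdf(0.0))
print(path[-1] > 0)  # LLR drifts upward under the true hypothesis (SLLN)
```

The per-step drift equals the Kullback-Leibler divergence between the hypotheses (here 0.5 nats), which is what the strong-law convergence rates in the record control.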

  9. ANALYZING COHORT MORTALITY DATA

    EPA Science Inventory

    Several methods for analyzing data from mortality studies of occupationally or environmentally exposed cohorts are shown to be special cases of a single procedure. The procedure assumes a proportional hazards model for exposure effects and represents the log-likelihood kernel for...

  10. MIXOR: a computer program for mixed-effects ordinal regression analysis.

    PubMed

    Hedeker, D; Gibbons, R D

    1996-03-01

    MIXOR provides maximum marginal likelihood estimates for mixed-effects ordinal probit, logistic, and complementary log-log regression models. These models can be used for analysis of dichotomous and ordinal outcomes from either a clustered or longitudinal design. For clustered data, the mixed-effects model assumes that data within clusters are dependent. The degree of dependency is jointly estimated with the usual model parameters, thus adjusting for dependence resulting from clustering of the data. Similarly, for longitudinal data, the mixed-effects approach can allow for individual-varying intercepts and slopes across time, and can estimate the degree to which these time-related effects vary in the population of individuals. MIXOR uses marginal maximum likelihood estimation, utilizing a Fisher-scoring solution. For the scoring solution, the Cholesky factor of the random-effects variance-covariance matrix is estimated, along with the effects of model covariates. Examples illustrating usage and features of MIXOR are provided.
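The three link functions MIXOR supports can be compared directly; a minimal sketch (using the standard definitions, evaluated at a cumulative probability of 0.5):

```python
import math
from statistics import NormalDist

def logit(p):
    return math.log(p / (1 - p))

def probit(p):
    return NormalDist().inv_cdf(p)

def cloglog(p):
    # complementary log-log: asymmetric, unlike the logit and probit
    return math.log(-math.log(1 - p))

p = 0.5
print(round(logit(p), 3), round(probit(p), 3), round(cloglog(p), 3))
# -> 0.0 0.0 -0.367
```

The asymmetry of the complementary log-log link is visible at p = 0.5, where the symmetric logit and probit links are both zero.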

  11. M-dwarf exoplanet surface density distribution. A log-normal fit from 0.07 to 400 AU

    NASA Astrophysics Data System (ADS)

    Meyer, Michael R.; Amara, Adam; Reggiani, Maddalena; Quanz, Sascha P.

    2018-04-01

    Aims: We fit a log-normal function to the M-dwarf orbital surface density distribution of gas giant planets, over the mass range 1-10 times that of Jupiter, from 0.07 to 400 AU. Methods: We used a Markov chain Monte Carlo approach to explore the likelihoods of various parameter values consistent with point estimates of the data given our assumed functional form. Results: This fit is consistent with radial velocity, microlensing, and direct-imaging observations, is well-motivated from theoretical and phenomenological points of view, and predicts results of future surveys. We present probability distributions for each parameter and a maximum likelihood estimate solution. Conclusions: We suggest that this function makes more physical sense than other widely used functions, and we explore the implications of our results on the design of future exoplanet surveys.
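As a simpler stand-in for the MCMC exploration described above, the maximum likelihood estimates of a log-normal's location and width have a closed form; the synthetic "separations" below are illustrative, not survey data:

```python
import math
import random
from statistics import mean, pstdev

def fit_lognormal(samples):
    """Closed-form ML estimates of a log-normal's (mu, sigma): the sample
    mean and population std of the logs."""
    logs = [math.log(s) for s in samples]
    return mean(logs), pstdev(logs)

random.seed(2)
# Hypothetical orbital separations (AU) drawn from a known log-normal
true_mu, true_sigma = math.log(3.0), 1.2
a = [math.exp(random.gauss(true_mu, true_sigma)) for _ in range(20000)]
mu_hat, sigma_hat = fit_lognormal(a)
print(abs(mu_hat - true_mu) < 0.05, abs(sigma_hat - true_sigma) < 0.05)
```

An MCMC approach, as in the paper, additionally yields full posterior distributions for the parameters rather than point estimates.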

  12. Role of transvaginal sonography and magnetic resonance imaging in the diagnosis of uterine adenomyosis.

    PubMed

    Bazot, Marc; Daraï, Emile

    2018-03-01

    The aim of the present review, conducted according to PRISMA statement recommendations, was to evaluate the contribution of transvaginal sonography (TVS) and magnetic resonance imaging (MRI) to diagnose adenomyosis. Although there is a lack of consensus on adenomyosis classification, three subtypes are described, internal, external adenomyosis, and adenomyomas. Using TVS, whatever the subtype, pooled sensitivities, pooled specificities, and pooled positive likelihood ratios are 0.72-0.82, 0.85-0.81, and 4.67-3.7, respectively, but with a high heterogeneity between the studies. MRI has a pooled sensitivity of 0.77, specificity of 0.89, positive likelihood ratio of 6.5, and negative likelihood ratio of 0.2 for all subtypes. Our results suggest that MRI is more useful than TVS in the diagnosis of adenomyosis. Further studies are required to determine the performance of direct signs (cystic component) and indirect signs (characteristics of junctional zone) to avoid misdiagnosis of adenomyosis. Copyright © 2018 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  13. Greater stroke severity predominates over all other factors for the worse outcome of cardioembolic stroke.

    PubMed

    Hong, Keun-Sik; Lee, Juneyoung; Bae, Hee-Joon; Lee, Ji Sung; Kang, Dong-Wha; Yu, Kyung-Ho; Han, Moon-Ku; Cho, Yong-Jin; Song, Pamela; Park, Jong-Moo; Oh, Mi-Sun; Koo, Jaseong; Lee, Byung-Chul

    2013-11-01

    Cardioembolic (CE) strokes are more disabling and more fatal than non-CE strokes. Multiple prognostic factors have been recognized, but the magnitude of their relative contributions has not been well explored. Using a prospective stroke outcome database, we compared the 3-month outcomes of CE and non-CE strokes. We assessed the relative contribution of each prognostic factor of initial stroke severity, poststroke complications, and baseline characteristics with multivariable analyses and model fitness improvement using -2 log-likelihood and Nagelkerke R2. This study included 1233 patients with acute ischemic stroke: 193 CE strokes and 1040 non-CE strokes. Compared with the non-CE group, the CE group had fewer modified Rankin Scale (mRS) 0-2 outcomes (47.2% versus 68.5%; odds ratio [95% confidence interval], .41 [.30-.56]), fewer mRS 0-1 outcomes (33.7% versus 53.5%; .44 [.32-.61]), more mRS 5-6 outcomes (32.1% versus 10.9%; 3.88 [2.71-5.56]), and higher mortality (19.2% versus 5.2%; 4.33 [2.76-6.80]) at 3 months. When adjusting for either baseline characteristics or poststroke complications, the outcome differences between the 2 groups remained significant. However, adjusting for initial National Institutes of Health Stroke Scale (NIHSS) score alone abolished all outcome differences except for mortality. For mRS 0-2 outcomes, the decrement of -2 log-likelihood and the Nagelkerke R2 of the model adjusting for initial NIHSS score alone approached 70.2% and 76.7% of the fully adjusted model. Greater stroke severity predominates over all other factors for the worse outcome of CE stroke. Primary prevention and more efficient acute therapy for stroke victims should be given top priorities to reduce the burden of CE strokes. Copyright © 2013 National Stroke Association. Published by Elsevier Inc. All rights reserved.
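Nagelkerke's R², used above to gauge model fitness improvement, can be computed from the null and fitted log-likelihoods; the log-likelihood values below are hypothetical, not the study's:

```python
import math

def nagelkerke_r2(ll_null, ll_model, n):
    """Nagelkerke (rescaled Cox-Snell) R^2 from the null and fitted
    log-likelihoods of a model on n observations."""
    cox_snell = 1 - math.exp((2 / n) * (ll_null - ll_model))
    max_r2 = 1 - math.exp((2 / n) * ll_null)   # upper bound of Cox-Snell R^2
    return cox_snell / max_r2

# Hypothetical: -2 log-likelihood drops from 1500 to 900 on n = 1233 patients
print(round(nagelkerke_r2(-750.0, -450.0, 1233), 3))
```

The rescaling by the maximum attainable Cox-Snell value is what lets Nagelkerke's R² reach 1, making it comparable across models on the same data.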

  14. Reducing Sugar-Sweetened Beverage Consumption by Providing Caloric Information: How Black Adolescents Alter Their Purchases and Whether the Effects Persist

    PubMed Central

    Barry, Colleen L.; Gary-Webb, Tiffany L.; Herring, Bradley J.

    2014-01-01

    Objectives. We examined the ways in which adolescents altered the type and size of their purchases of sugar-sweetened beverages (SSBs), together with whether the effects persisted after removing caloric information signs in stores. Methods. We used a case-crossover design with 6 stores located in low-income Black neighborhoods in Baltimore, Maryland, from 2012 to 2013. The intervention used 1 of 4 randomly posted signs with caloric information: absolute calories, number of teaspoons of sugar, and number of minutes of running or miles of walking necessary to burn off a beverage. We collected data for 4516 purchases by Black adolescents, including both baseline and postintervention periods with no signs posted. Results. We found that providing caloric information significantly reduced the number of total beverage calories purchased, the likelihood of buying an SSB, and the likelihood of buying an SSB greater than 16 ounces (P < .05). After removing the signs, the quantity, volume, and number of calories from SSB purchases remained lower than baseline (P < .05). Conclusions. Providing caloric information was associated with purchasing a smaller SSB, switching to a beverage with no calories, or opting to not purchase a beverage; there was a persistent effect on reducing SSB purchases after signs were removed. PMID:25322298

  15. Reducing sugar-sweetened beverage consumption by providing caloric information: how Black adolescents alter their purchases and whether the effects persist.

    PubMed

    Bleich, Sara N; Barry, Colleen L; Gary-Webb, Tiffany L; Herring, Bradley J

    2014-12-01

    We examined the ways in which adolescents altered the type and size of their purchases of sugar-sweetened beverages (SSBs), together with whether the effects persisted after removing caloric information signs in stores. We used a case-crossover design with 6 stores located in low-income Black neighborhoods in Baltimore, Maryland, from 2012 to 2013. The intervention used 1 of 4 randomly posted signs with caloric information: absolute calories, number of teaspoons of sugar, and number of minutes of running or miles of walking necessary to burn off a beverage. We collected data for 4516 purchases by Black adolescents, including both baseline and postintervention periods with no signs posted. We found that providing caloric information significantly reduced the number of total beverage calories purchased, the likelihood of buying an SSB, and the likelihood of buying an SSB greater than 16 ounces (P < .05). After removing the signs, the quantity, volume, and number of calories from SSB purchases remained lower than baseline (P < .05). Providing caloric information was associated with purchasing a smaller SSB, switching to a beverage with no calories, or opting to not purchase a beverage; there was a persistent effect on reducing SSB purchases after signs were removed.

  16. A complete X-ray sample of the high latitude sky from HEAO-1 A-2: log N-log S and luminosity functions

    NASA Technical Reports Server (NTRS)

    Piccinotti, G.; Mushotzky, R. F.; Boldt, E. A.; Holt, S. S.; Marshall, F. E.; Serlemitsos, P. J.; Shafer, R. A.

    1981-01-01

    A complete X-ray survey was performed of the 8.2 steradians of the sky at galactic latitudes where the absolute value of b exceeds 20 deg, down to a limiting sensitivity of 3.1 x 10(-11) ergs/sq cm sec in the 2-10 keV band. Of the 85 detected sources, 17 were identified with galactic objects, 61 with extragalactic objects, and 7 remain unidentified. The log N-log S relation for the non-galactic objects is well fit by the Euclidean relationship. The X-ray spectra of these objects were used to construct log N-log S in physical units. The complete sample of identified sources was used to construct X-ray luminosity functions, using the absolute maximum likelihood method, for clusters of galaxies and active galactic nuclei.
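
    The Euclidean log N-log S relation corresponds to cumulative counts N(>S) proportional to S**(-3/2). As a hedged illustration (not the authors' analysis), the maximum-likelihood estimator of such a power-law slope for fluxes above a completeness limit has a simple closed form, here checked on synthetic fluxes:

```python
import math
import random

def ml_slope(fluxes, s_min):
    """Maximum-likelihood estimate of the slope a in the cumulative
    counts N(>S) ~ S**(-a), for fluxes above s_min (Pareto MLE)."""
    return len(fluxes) / sum(math.log(s / s_min) for s in fluxes)

# Synthetic fluxes drawn from a Euclidean (a = 3/2) distribution by inversion:
# U uniform on (0, 1)  =>  S = s_min * U**(-1/a)
random.seed(0)
a_true, s_min = 1.5, 3.1e-11
sample = [s_min * random.random() ** (-1.0 / a_true) for _ in range(20000)]
a_hat = ml_slope(sample, s_min)
```

    The estimator n / sum(log(S_i/s_min)) is the standard Pareto MLE; the survey's quoted sensitivity limit is reused here only as an illustrative s_min.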

  17. Channel incision and suspended sediment delivery at Caspar Creek, Mendocino County, California

    Treesearch

    Nicholas J. Dewey; Thomas E. Lisle; Leslie M. Reid

    2003-01-01

    Tributary and headwater valleys in the Caspar Creek watershed, in coastal Mendocino County, California, show signs of incision along much of their lengths. An episode of incision followed initial-entry logging, which took place between 1860 and 1906. Another episode of incision cut into skid trails created for second-entry logging in the 1970s.

  18. SciLinks

    Science.gov Websites

    SciLinks: Targeted, Grade-Specific Web Content for your Books. Free Web content to extend and expand student learning through sites in the SciLinks program.

  19. A flexible system for vital signs monitoring in hospital general care wards based on the integration of UNIX-based workstations, standard networks and portable vital signs monitors.

    PubMed

    Welch, J P; Sims, N; Ford-Carlton, P; Moon, J B; West, K; Honore, G; Colquitt, N

    1991-01-01

    The article describes a study conducted on general surgical and thoracic surgical floors of a 1000-bed hospital to assess the impact of a new network for portable patient care devices. This network was developed to address the needs of hospital patients who need constant, multi-parameter, vital signs surveillance, but do not require intensive nursing care. Bedside wall jacks were linked to UNIX-based workstations using standard digital network hardware, creating a flexible system (for general care floors of the hospital) that allowed the number of monitored locations to increase and decrease as patient census and acuity levels varied. It also allowed the general care floors to provide immediate, centralized vital signs monitoring for patients who unexpectedly became unstable, and permitted portable monitors to travel with patients as they were transferred between hospital departments. A disk-based log within the workstation automatically collected performance data, including patient demographics, monitor alarms, and network status for analysis. The log has allowed the developers to evaluate the use and performance of the system.

  20. Determination of drug lipophilicity by phosphatidylcholine-modified microemulsion high-performance liquid chromatography.

    PubMed

    Xuan, Xueyi; Xu, Liyuan; Li, Liangxing; Gao, Chongkai; Li, Ning

    2015-07-25

    A new biomembrane-mimetic liquid chromatographic method using a C8 stationary phase and a phosphatidylcholine-modified (PC-modified) microemulsion mobile phase was used to estimate the lipophilicity of unionized and ionized drugs, expressed as n-octanol/water partition coefficients (logP and logD). The introduction of PC into the sodium dodecyl sulfate (SDS) microemulsion yielded a good correlation between logk and logD (R(2)=0.8). The optimal composition of the PC-modified microemulsion liquid chromatography (PC-modified MELC) mobile phase was 0.2% PC-3.0% SDS-6.0% n-butanol-0.8% ethyl acetate-90.0% water (pH 7.0) for neutral and ionized molecules. The interactions between the analytes and the system described by this chromatographic method are more similar to those with a biological membrane than those in the n-octanol/water partition system. These results suggest that PC-modified MELC can serve as a possible alternative to the shake-flask method for high-throughput determination of the lipophilicity of unionized and ionized drugs and for simulation of biological processes. Copyright © 2015 Elsevier B.V. All rights reserved.
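
    The reported correlation between logk and logD (R(2)=0.8) is an ordinary least-squares fit of one log quantity against the other. A minimal sketch with hypothetical (log k', log D) pairs, not the paper's measurements:

```python
def linear_fit(xs, ys):
    """Ordinary least squares: returns (slope, intercept, R**2)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - slope * x - intercept) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return slope, intercept, 1.0 - ss_res / ss_tot

# Hypothetical retention/partition data (illustration only):
log_k = [-0.6, -0.2, 0.1, 0.5, 0.9]
log_d = [0.4, 1.2, 1.7, 2.5, 3.4]
slope, intercept, r2 = linear_fit(log_k, log_d)
```

    Once fitted, the logD of a new analyte is predicted from its measured log k' as slope * log_k + intercept.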

  1. Racial/ethnic disparities in self-reported short sleep duration among US-born and foreign-born adults.

    PubMed

    Cunningham, Timothy J; Wheaton, Anne G; Ford, Earl S; Croft, Janet B

    2016-12-01

    Racial/ethnic health disparities are infrequently considered by nativity status in the United States, although the immigrant population has practically doubled since 1990. We investigated the modifying role of nativity status (US- vs. foreign-born) on racial/ethnic disparities in short sleep duration (<7 h), which has serious health consequences. Cross-sectional data from 23,505 US-born and 4,326 foreign-born adults aged ≥ 18 years from the 2012 National Health Interview Survey and multivariable log-linear regression were used to estimate prevalence ratios (PR) for reporting short sleep duration and their corresponding 95% confidence intervals (CI). After controlling for sociodemographic covariates, short sleep was more prevalent among blacks (PR 1.29, 95% CI: 1.21-1.37), Hispanics (PR 1.18, 95% CI: 1.08, 1.29), and Asians (PR 1.37, 95% CI: 1.16-1.61) than whites among US-born adults. Short sleep was more prevalent among blacks (PR 1.71, 95% CI: 1.38, 2.13) and Asians (PR 1.23, 95% CI: 1.02, 1.47) than whites among the foreign-born. Among both US- and foreign-born adults, blacks and Asians had a higher likelihood of short sleep compared to whites. US-born Hispanics, but not foreign-born Hispanics, had a higher likelihood than their white counterparts. Future research should aim to uncover mechanisms underlying these disparities.
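
    Prevalence ratios with 95% confidence intervals of the kind reported above can be illustrated, for a single unadjusted 2x2 table, via the large-sample standard error of the log ratio; the paper's estimates additionally adjust for sociodemographic covariates, and the counts below are hypothetical:

```python
import math

def prevalence_ratio(a, n1, c, n0):
    """Unadjusted prevalence ratio (a/n1) / (c/n0) with a 95% CI based on
    the large-sample standard error of the log ratio."""
    pr = (a / n1) / (c / n0)
    se = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n0)
    lo = math.exp(math.log(pr) - 1.96 * se)
    hi = math.exp(math.log(pr) + 1.96 * se)
    return pr, lo, hi

# Hypothetical counts: 450/1000 vs 350/1000 reporting short sleep.
pr, lo, hi = prevalence_ratio(450, 1000, 350, 1000)
```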

  2. Socioeconomic status and the likelihood of antibiotic treatment for signs and symptoms of pulmonary exacerbation in children with cystic fibrosis.

    PubMed

    Schechter, Michael S; McColley, Susanna A; Regelmann, Warren; Millar, Stefanie J; Pasta, David J; Wagener, Jeffrey S; Konstan, Michael W; Morgan, Wayne J

    2011-11-01

    To determine whether socioeconomic status (SES) influences the likelihood of antibiotic treatment of pulmonary exacerbations in patients with cystic fibrosis (CF). We used data on 9895 patients ≤ 18 years old from the Epidemiologic Study of CF. After establishing an individual baseline of clinical signs and symptoms, we ascertained whether antibiotics were prescribed when new signs/symptoms suggested a pulmonary exacerbation, adjusting for sex, presence of Pseudomonas aeruginosa, the number of new signs/symptoms, and baseline disease severity. In a 12-month period, 20.0% of patients <6 years of age, 33.8% of patients 6 to 12 years of age, and 41.4% of patients 13 to 18 years of age were treated with any (oral, intravenous (IV), or inhaled) antibiotics; the percentage receiving IV antibiotics was 7.3%, 15.2%, and 20.9%, respectively. SES had little effect on treatment for pulmonary exacerbation with any antibiotics, but IV antibiotics were prescribed more frequently for patients with lower SES. SES-related disparities in CF health outcomes do not appear to be explained by differential treatment of pulmonary exacerbations. Copyright © 2011 Mosby, Inc. All rights reserved.

  3. The Diagnostic Accuracy of Special Tests for Rotator Cuff Tear: The ROW Cohort Study

    PubMed Central

    Jain, Nitin B.; Luz, Jennifer; Higgins, Laurence D.; Dong, Yan; Warner, Jon J.P.; Matzkin, Elizabeth; Katz, Jeffrey N.

    2016-01-01

    Objective The aim was to assess diagnostic accuracy of 15 shoulder special tests for rotator cuff tears. Design From 02/2011 to 12/2012, 208 participants with shoulder pain were recruited in a cohort study. Results Among tests for supraspinatus tears, Jobe’s test had a sensitivity of 88% (95% CI=80% to 96%), specificity of 62% (95% CI=53% to 71%), and likelihood ratio of 2.30 (95% CI=1.79 to 2.95). The full can test had a sensitivity of 70% (95% CI=59% to 82%) and a specificity of 81% (95% CI=74% to 88%). Among tests for infraspinatus tears, external rotation lag signs at 0° had a specificity of 98% (95% CI=96% to 100%) and a likelihood ratio of 6.06 (95% CI=1.30 to 28.33), and the Hornblower’s sign had a specificity of 96% (95% CI=93% to 100%) and likelihood ratio of 4.81 (95% CI=1.60 to 14.49). Conclusions Jobe’s test and full can test had high sensitivity and specificity for supraspinatus tears and Hornblower’s sign performed well for infraspinatus tears. In general, special tests described for subscapularis tears have high specificity but low sensitivity. These data can be used in clinical practice to diagnose rotator cuff tears and may reduce the reliance on expensive imaging. PMID:27386812
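
    A positive likelihood ratio is computed from sensitivity and specificity as sensitivity / (1 - specificity); plugging in the rounded values for Jobe's test gives a value close to the reported 2.30 (the paper used unrounded estimates):

```python
def likelihood_ratios(sensitivity, specificity):
    """Positive and negative likelihood ratios for a diagnostic test."""
    return sensitivity / (1.0 - specificity), (1.0 - sensitivity) / specificity

# Jobe's test, rounded values from the abstract: sensitivity 88%, specificity 62%.
lr_pos, lr_neg = likelihood_ratios(0.88, 0.62)
```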

  4. Make the most of your samples: Bayes factor estimators for high-dimensional models of sequence evolution.

    PubMed

    Baele, Guy; Lemey, Philippe; Vansteelandt, Stijn

    2013-03-06

    Accurate model comparison requires extensive computation times, especially for parameter-rich models of sequence evolution. In the Bayesian framework, model selection is typically performed through the evaluation of a Bayes factor, the ratio of two marginal likelihoods (one for each model). Recently introduced techniques to estimate (log) marginal likelihoods, such as path sampling and stepping-stone sampling, offer increased accuracy over the traditional harmonic mean estimator at an increased computational cost. Most often, each model's marginal likelihood will be estimated individually, which leads the resulting Bayes factor to suffer from errors associated with each of these independent estimation processes. We here assess the original 'model-switch' path sampling approach for direct Bayes factor estimation in phylogenetics, as well as an extension that uses more samples, to construct a direct path between two competing models, thereby eliminating the need to calculate each model's marginal likelihood independently. Further, we provide a competing Bayes factor estimator using an adaptation of the recently introduced stepping-stone sampling algorithm and set out to determine appropriate settings for accurately calculating such Bayes factors, with context-dependent evolutionary models as an example. While we show that modest efforts are required to roughly identify the increase in model fit, only drastically increased computation times ensure the accuracy needed to detect more subtle details of the evolutionary process. We show that our adaptation of stepping-stone sampling for direct Bayes factor calculation outperforms the original path sampling approach as well as an extension that exploits more samples. Our proposed approach for Bayes factor estimation also has preferable statistical properties over the use of individual marginal likelihood estimates for both models under comparison. 
Assuming a sigmoid function to determine the path between two competing models, we provide evidence that a single well-chosen sigmoid shape value requires less computational efforts in order to approximate the true value of the (log) Bayes factor compared to the original approach. We show that the (log) Bayes factors calculated using path sampling and stepping-stone sampling differ drastically from those estimated using either of the harmonic mean estimators, supporting earlier claims that the latter systematically overestimate the performance of high-dimensional models, which we show can lead to erroneous conclusions. Based on our results, we argue that highly accurate estimation of differences in model fit for high-dimensional models requires much more computational effort than suggested in recent studies on marginal likelihood estimation.
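
    The path-sampling (thermodynamic integration) identity underlying these estimators is that the log marginal likelihood equals the integral, over inverse temperature b in [0, 1], of the expected log likelihood under the power posterior. A toy conjugate-normal sketch (an illustrative assumption, not the paper's phylogenetic models) where that expectation is available in closed form, so the numeric path estimate can be checked against the exact log marginal:

```python
import math

# Toy model: prior theta ~ N(0, 1), one datum y with likelihood N(y | theta, 1).
# The power posterior at inverse temperature b is N(b*y/(1+b), 1/(1+b)), so
# E_b[log L] is in closed form and log Z = integral_0^1 E_b[log L] db.
y = 1.3

def expected_loglik(b):
    m, v = b * y / (1 + b), 1 / (1 + b)
    return -0.5 * math.log(2 * math.pi) - 0.5 * ((y - m) ** 2 + v)

K = 1000                                   # number of path steps
vals = [expected_loglik(k / K) for k in range(K + 1)]
log_z_path = sum((vals[i] + vals[i + 1]) / 2 for i in range(K)) / K

# Exact marginal likelihood: y ~ N(0, 2)
log_z_exact = -0.5 * math.log(4 * math.pi) - y * y / 4
```

    In real phylogenetic applications the per-temperature expectations are estimated by MCMC rather than computed exactly, which is where the computational cost discussed in the abstract arises.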

  5. Make the most of your samples: Bayes factor estimators for high-dimensional models of sequence evolution

    PubMed Central

    2013-01-01

    Background Accurate model comparison requires extensive computation times, especially for parameter-rich models of sequence evolution. In the Bayesian framework, model selection is typically performed through the evaluation of a Bayes factor, the ratio of two marginal likelihoods (one for each model). Recently introduced techniques to estimate (log) marginal likelihoods, such as path sampling and stepping-stone sampling, offer increased accuracy over the traditional harmonic mean estimator at an increased computational cost. Most often, each model’s marginal likelihood will be estimated individually, which leads the resulting Bayes factor to suffer from errors associated with each of these independent estimation processes. Results We here assess the original ‘model-switch’ path sampling approach for direct Bayes factor estimation in phylogenetics, as well as an extension that uses more samples, to construct a direct path between two competing models, thereby eliminating the need to calculate each model’s marginal likelihood independently. Further, we provide a competing Bayes factor estimator using an adaptation of the recently introduced stepping-stone sampling algorithm and set out to determine appropriate settings for accurately calculating such Bayes factors, with context-dependent evolutionary models as an example. While we show that modest efforts are required to roughly identify the increase in model fit, only drastically increased computation times ensure the accuracy needed to detect more subtle details of the evolutionary process. Conclusions We show that our adaptation of stepping-stone sampling for direct Bayes factor calculation outperforms the original path sampling approach as well as an extension that exploits more samples. Our proposed approach for Bayes factor estimation also has preferable statistical properties over the use of individual marginal likelihood estimates for both models under comparison. 
Assuming a sigmoid function to determine the path between two competing models, we provide evidence that a single well-chosen sigmoid shape value requires less computational efforts in order to approximate the true value of the (log) Bayes factor compared to the original approach. We show that the (log) Bayes factors calculated using path sampling and stepping-stone sampling differ drastically from those estimated using either of the harmonic mean estimators, supporting earlier claims that the latter systematically overestimate the performance of high-dimensional models, which we show can lead to erroneous conclusions. Based on our results, we argue that highly accurate estimation of differences in model fit for high-dimensional models requires much more computational effort than suggested in recent studies on marginal likelihood estimation. PMID:23497171

  6. Identification of contemporary selection signatures using composite log likelihood and their associations with marbling score in Korean cattle.

    PubMed

    Ryu, Jihye; Lee, Chaeyoung

    2014-12-01

    Positive selection not only increases beneficial allele frequency but also causes augmentation of allele frequencies of sequence variants in close proximity. Signals for positive selection were detected by the statistical differences in subsequent allele frequencies. To identify selection signatures in Korean cattle, we applied a composite log-likelihood (CLL)-based method, which calculates a composite likelihood of the allelic frequencies observed across sliding windows of five adjacent loci and compares the value with the critical statistic estimated by 50,000 permutations. Data for a total of 11,799 nucleotide polymorphisms were used with 71 Korean cattle and 209 foreign beef cattle. As a result, 147 signals were identified for Korean cattle based on CLL estimates (P < 0.01). The signals might be candidate genetic factors for meat quality by which the Korean cattle have been selected. Further genetic association analysis with 41 intragenic variants in the selection signatures with the greatest CLL for each chromosome revealed that marbling score was associated with five variants. Intensive association studies with all the selection signatures identified in this study are required to exclude signals associated with other phenotypes or signals falsely detected and thus to identify genetic markers for meat quality. © 2014 Stichting International Foundation for Animal Genetics.
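
    The flavor of a composite log likelihood over sliding windows of five adjacent loci can be sketched as the sum of per-locus binomial log-likelihoods within each window. This is a hedged simplification: the paper's statistic and its 50,000-permutation critical values are more involved, and the counts and frequencies below are hypothetical:

```python
import math

def binom_loglik(k, n, p):
    """Log-likelihood of k derived alleles out of n chromosomes at frequency p."""
    return (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
            + k * math.log(p) + (n - k) * math.log(1 - p))

def window_cll(counts, n, freqs, width=5):
    """Composite log likelihood: per-locus log-likelihoods summed over
    sliding windows of `width` adjacent loci."""
    per_locus = [binom_loglik(k, n, p) for k, p in zip(counts, freqs)]
    return [sum(per_locus[i:i + width])
            for i in range(len(per_locus) - width + 1)]

# Hypothetical derived-allele counts at 8 loci among 142 chromosomes (2 x 71),
# with illustrative reference frequencies:
counts = [100, 120, 95, 130, 110, 60, 70, 65]
freqs = [0.70, 0.82, 0.66, 0.90, 0.75, 0.45, 0.50, 0.48]
cll = window_cll(counts, 142, freqs)
```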

  7. Using the Logarithm of Odds to Define a Vector Space on Probabilistic Atlases

    PubMed Central

    Pohl, Kilian M.; Fisher, John; Bouix, Sylvain; Shenton, Martha; McCarley, Robert W.; Grimson, W. Eric L.; Kikinis, Ron; Wells, William M.

    2007-01-01

    The Logarithm of the Odds ratio (LogOdds) is frequently used in areas such as artificial neural networks, economics, and biology, as an alternative representation of probabilities. Here, we use LogOdds to place probabilistic atlases in a linear vector space. This representation has several useful properties for medical imaging. For example, it not only encodes the shape of multiple anatomical structures but also captures some information concerning uncertainty. We demonstrate that the resulting vector space operations of addition and scalar multiplication have natural probabilistic interpretations. We discuss several examples for placing label maps into the space of LogOdds. First, we relate signed distance maps, a widely used implicit shape representation, to LogOdds and compare it to an alternative that is based on smoothing by spatial Gaussians. We find that the LogOdds approach better preserves shapes in a complex multiple object setting. In the second example, we capture the uncertainty of boundary locations by mapping multiple label maps of the same object into the LogOdds space. Third, we define a framework for non-convex interpolations among atlases that capture different time points in the aging process of a population. We evaluate the accuracy of our representation by generating a deformable shape atlas that captures the variations of anatomical shapes across a population. The deformable atlas is the result of a principal component analysis within the LogOdds space. This atlas is integrated into an existing segmentation approach for MR images. We compare the performance of the resulting implementation in segmenting 20 test cases to a similar approach that uses a more standard shape model that is based on signed distance maps. On this data set, the Bayesian classification model with our new representation outperformed the other approaches in segmenting subcortical structures. PMID:17698403
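
    The LogOdds map and its inverse make the vector-space operations concrete: adding log-odds multiplies odds, and scalar multiplication raises odds to a power. A minimal sketch for scalar probabilities (the atlas setting applies this voxelwise):

```python
import math

def logodds(p):
    """Map a probability in (0, 1) to the real line (the LogOdds transform)."""
    return math.log(p / (1.0 - p))

def inv_logodds(t):
    """Inverse map: the logistic function."""
    return 1.0 / (1.0 + math.exp(-t))

# Vector-space operations on probabilities via LogOdds:
p, q = 0.8, 0.3
added = inv_logodds(logodds(p) + logodds(q))   # adding log-odds multiplies odds
halved = inv_logodds(0.5 * logodds(p))         # scaling takes odds to a power
```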

  8. Condition and fate of logged forests in the Brazilian Amazon.

    PubMed

    Asner, Gregory P; Broadbent, Eben N; Oliveira, Paulo J C; Keller, Michael; Knapp, David E; Silva, José N M

    2006-08-22

    The long-term viability of a forest industry in the Amazon region of Brazil depends on the maintenance of adequate timber volume and growth in healthy forests. Using extensive high-resolution satellite analyses, we studied the forest damage caused by recent logging operations and the likelihood that logged forests would be cleared within 4 years after timber harvest. Across 2,030,637 km2 of the Brazilian Amazon from 1999 to 2004, at least 76% of all harvest practices resulted in high levels of canopy damage sufficient to leave forests susceptible to drought and fire. We found that 16 ± 1% of selectively logged areas were deforested within 1 year of logging, with a subsequent annual deforestation rate of 5.4% for 4 years after timber harvests. Nearly all logging occurred within 25 km of main roads, and within that area, the probability of deforestation for a logged forest was up to four times greater than for unlogged forests. In combination, our results show that logging in the Brazilian Amazon is dominated by highly damaging operations, often followed rapidly by deforestation decades before forests can recover sufficiently to produce timber for a second harvest. Under the management regimes in effect at the time of our study in the Brazilian Amazon, selective logging would not be sustained.

  9. Condition and fate of logged forests in the Brazilian Amazon

    PubMed Central

    Asner, Gregory P.; Broadbent, Eben N.; Oliveira, Paulo J. C.; Keller, Michael; Knapp, David E.; Silva, José N. M.

    2006-01-01

    The long-term viability of a forest industry in the Amazon region of Brazil depends on the maintenance of adequate timber volume and growth in healthy forests. Using extensive high-resolution satellite analyses, we studied the forest damage caused by recent logging operations and the likelihood that logged forests would be cleared within 4 years after timber harvest. Across 2,030,637 km2 of the Brazilian Amazon from 1999 to 2004, at least 76% of all harvest practices resulted in high levels of canopy damage sufficient to leave forests susceptible to drought and fire. We found that 16 ± 1% of selectively logged areas were deforested within 1 year of logging, with a subsequent annual deforestation rate of 5.4% for 4 years after timber harvests. Nearly all logging occurred within 25 km of main roads, and within that area, the probability of deforestation for a logged forest was up to four times greater than for unlogged forests. In combination, our results show that logging in the Brazilian Amazon is dominated by highly damaging operations, often followed rapidly by deforestation decades before forests can recover sufficiently to produce timber for a second harvest. Under the management regimes in effect at the time of our study in the Brazilian Amazon, selective logging would not be sustained. PMID:16901980

  10. Individual stem value recovery of modified and conventional tree-length systems in the southeastern United States

    Treesearch

    Amanda H. Lang; Shawn A. Baker; W. Dale Greene; Glen E. Murphy

    2010-01-01

    We compared value recovery of a modified treelength (MTL) logging system that measures product diameter and length using a Waratah 626 harvester head to that of a treelength (TL) system that estimates dimensions. A field test compared the actual value cut to the maximum potential value suggested by the log bucking optimization program Assessment of Value by Individual...

  11. Modified-Signed-Digit Optical Computing Using Fan-Out

    NASA Technical Reports Server (NTRS)

    Liu, Hua-Kuang; Zhou, Shaomin; Yeh, Pochi

    1996-01-01

    An experimental optical computing system containing optical fan-out elements implements modified-signed-digit (MSD) arithmetic and logic. In comparison with previous optical implementations of MSD arithmetic, this one is characterized by larger throughput, greater flexibility, and simpler optics.
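
    MSD arithmetic uses the digit set {-1, 0, 1}, which is what bounds carry propagation during addition. A closely related, easily verified signed-digit representation is the binary non-adjacent form (NAF); this sketch illustrates the digit set and its redundancy, not the optical implementation:

```python
def to_naf(n):
    """Binary non-adjacent form: digits in {-1, 0, 1}, least significant
    first, with no two adjacent nonzero digits."""
    digits = []
    while n:
        d = (2 - (n & 3)) if n & 1 else 0   # pick -1/0/+1 from the low bits
        n = (n - d) >> 1
        digits.append(d)
    return digits

def from_digits(digits):
    """Evaluate a signed-digit expansion back to an integer."""
    return sum(d << i for i, d in enumerate(digits))
```

    For example, 7 becomes [-1, 0, 0, 1] (that is, -1 + 8), with no two adjacent nonzero digits, which is the property that limits how far carries can travel.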

  12. LogLines. September-October 2009

    DTIC Science & Technology

    2009-10-01

    troop support team jumps on Pacific Partnership bandwagon, supplies aid to mission. Building Blocks 34 Logistics support forms backbone for...floors, pillars and bare walls. A massive sign over the main entrance identifies the location as belonging to DLA. The sign, and the branding of DLA...provide warfighters and other government agencies with comprehensive energy solutions in the most wartime effective and peacetime efficient manner

  13. A flexible system for vital signs monitoring in hospital general care wards based on the integration of UNIX-based workstations, standard networks and portable vital signs monitors.

    PubMed Central

    Welch, J. P.; Sims, N.; Ford-Carlton, P.; Moon, J. B.; West, K.; Honore, G.; Colquitt, N.

    1991-01-01

    The article describes a study conducted on general surgical and thoracic surgical floors of a 1000-bed hospital to assess the impact of a new network for portable patient care devices. This network was developed to address the needs of hospital patients who need constant, multi-parameter, vital signs surveillance, but do not require intensive nursing care. Bedside wall jacks were linked to UNIX-based workstations using standard digital network hardware, creating a flexible system (for general care floors of the hospital) that allowed the number of monitored locations to increase and decrease as patient census and acuity levels varied. It also allowed the general care floors to provide immediate, centralized vital signs monitoring for patients who unexpectedly became unstable, and permitted portable monitors to travel with patients as they were transferred between hospital departments. A disk-based log within the workstation automatically collected performance data, including patient demographics, monitor alarms, and network status for analysis. The log has allowed the developers to evaluate the use and performance of the system. PMID:1807720

  14. An Informative Interpretation of Decision Theory: The Information Theoretic Basis for Signal-to-Noise Ratio and Log Likelihood Ratio

    DOE PAGES

    Polcari, J.

    2013-08-16

    The signal processing concept of signal-to-noise ratio (SNR), in its role as a performance measure, is recast within the more general context of information theory, leading to a series of useful insights. Establishing generalized SNR (GSNR) as a rigorous information theoretic measure inherent in any set of observations significantly strengthens its quantitative performance pedigree while simultaneously providing a specific definition under general conditions. This directly leads to consideration of the log likelihood ratio (LLR): first, as the simplest possible information-preserving transformation (i.e., signal processing algorithm) and subsequently, as an absolute, comparable measure of information for any specific observation exemplar. Furthermore, the information accounting methodology that results permits practical use of both GSNR and LLR as diagnostic scalar performance measurements, directly comparable across alternative system/algorithm designs, applicable at any tap point within any processing string, in a form that is also comparable with the inherent performance bounds due to information conservation.
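
    For the canonical Gaussian mean-shift detection problem, the LLR is linear in the observation and its separation between hypotheses is governed by the SNR, which makes the GSNR/LLR connection concrete. A textbook sketch, not the report's derivation:

```python
def llr_gaussian(x, mu, sigma):
    """Log-likelihood ratio for H1: x ~ N(mu, sigma**2) versus
    H0: x ~ N(0, sigma**2); linear in the observation x."""
    return (mu / sigma**2) * (x - mu / 2.0)

# With mu = 2, sigma = 1 the deflection SNR is d**2 = (mu/sigma)**2 = 4:
# the LLR has mean +d**2/2 under H1, -d**2/2 under H0, and variance d**2.
mu, sigma = 2.0, 1.0
d2 = (mu / sigma) ** 2
```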

  15. Simple, Efficient Estimators of Treatment Effects in Randomized Trials Using Generalized Linear Models to Leverage Baseline Variables

    PubMed Central

    Rosenblum, Michael; van der Laan, Mark J.

    2010-01-01

    Models, such as logistic regression and Poisson regression models, are often used to estimate treatment effects in randomized trials. These models leverage information in variables collected before randomization, in order to obtain more precise estimates of treatment effects. However, there is the danger that model misspecification will lead to bias. We show that certain easy to compute, model-based estimators are asymptotically unbiased even when the working model used is arbitrarily misspecified. Furthermore, these estimators are locally efficient. As a special case of our main result, we consider a simple Poisson working model containing only main terms; in this case, we prove the maximum likelihood estimate of the coefficient corresponding to the treatment variable is an asymptotically unbiased estimator of the marginal log rate ratio, even when the working model is arbitrarily misspecified. This is the log-linear analog of ANCOVA for linear models. Our results demonstrate one application of targeted maximum likelihood estimation. PMID:20628636
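
    The closed form behind the special case above: with only a treatment main term, the Poisson maximum-likelihood coefficient for treatment reduces to the log ratio of arm means, i.e., the marginal log rate ratio. A sketch with hypothetical count outcomes:

```python
import math

def poisson_log_rate_ratio(y_treated, y_control):
    """ML coefficient of treatment in the main-terms Poisson working model
    log E[Y] = a + b*Z, which reduces to log(mean treated / mean control)."""
    m1 = sum(y_treated) / len(y_treated)
    m0 = sum(y_control) / len(y_control)
    return math.log(m1 / m0)

# Hypothetical count outcomes by randomized arm:
b_hat = poisson_log_rate_ratio([3, 5, 4, 6], [2, 1, 3, 2])
```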

  16. Quantification of residual dose estimation error on log file-based patient dose calculation.

    PubMed

    Katsuta, Yoshiyuki; Kadoya, Noriyuki; Fujita, Yukio; Shimizu, Eiji; Matsunaga, Kenichi; Matsushita, Haruo; Majima, Kazuhiro; Jingu, Keiichi

    2016-05-01

    The log file-based patient dose estimation includes a residual dose estimation error caused by leaf miscalibration, which cannot be reflected in the estimated dose. The purpose of this study is to determine this residual dose estimation error. Modified log files for seven head-and-neck and prostate volumetric modulated arc therapy (VMAT) plans simulating leaf miscalibration were generated by shifting both leaf banks (systematic leaf gap errors: ±2.0, ±1.0, and ±0.5mm in opposite directions and systematic leaf shifts: ±1.0mm in the same direction) using MATLAB-based (MathWorks, Natick, MA) in-house software. The generated modified and non-modified log files were imported back into the treatment planning system and recalculated. Subsequently, the generalized equivalent uniform dose (gEUD) was quantified for the planning target volume (PTV) and organs at risk. For MLC leaves calibrated within ±0.5mm, the residual dose estimation errors, obtained from the slope of the linear regression of gEUD changes between non-modified and modified log file doses per unit leaf gap error, are 1.32±0.27% and 0.82±0.17Gy for the PTV and spinal cord, respectively, in head-and-neck plans, and 1.22±0.36%, 0.95±0.14Gy, and 0.45±0.08Gy for the PTV, rectum, and bladder, respectively, in prostate plans. In this work, we determine the residual dose estimation errors for VMAT delivery using log file-based patient dose calculation according to the MLC calibration accuracy. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
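
    The gEUD quantified above is conventionally computed from a differential dose-volume histogram as (sum_i v_i * d_i**a)**(1/a): a = 1 recovers the mean dose, and large a weights hot spots. A sketch in which the DVH values and exponents are illustrative assumptions (the exponent a is organ-specific):

```python
def geud(doses, volumes, a):
    """Generalized equivalent uniform dose over a differential DVH:
    (sum_i v_i * d_i**a) ** (1/a) for fractional volumes v_i at dose d_i."""
    assert abs(sum(volumes) - 1.0) < 1e-9, "fractional volumes must sum to 1"
    return sum(v * d**a for d, v in zip(doses, volumes)) ** (1.0 / a)

# Hypothetical DVH (doses in Gy):
doses = [60.0, 66.0, 70.0]
volumes = [0.2, 0.5, 0.3]
mean_dose = geud(doses, volumes, 1.0)     # a = 1 recovers the mean dose
serial_like = geud(doses, volumes, 12.0)  # large a is pulled toward max dose
```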

  17. Accuracy of maximum likelihood estimates of a two-state model in single-molecule FRET

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gopich, Irina V.

    2015-01-21

    Photon sequences from single-molecule Förster resonance energy transfer (FRET) experiments can be analyzed using a maximum likelihood method. Parameters of the underlying kinetic model (FRET efficiencies of the states and transition rates between conformational states) are obtained by maximizing the appropriate likelihood function. In addition, the errors (uncertainties) of the extracted parameters can be obtained from the curvature of the likelihood function at the maximum. We study the standard deviations of the parameters of a two-state model obtained from photon sequences with recorded colors and arrival times. The standard deviations can be obtained analytically in a special case when the FRET efficiencies of the states are 0 and 1 and in the limiting cases of fast and slow conformational dynamics. These results are compared with the results of numerical simulations. The accuracy and, therefore, the ability to predict model parameters depend on how fast the transition rates are compared to the photon count rate. In the limit of slow transitions, the key parameters that determine the accuracy are the number of transitions between the states and the number of independent photon sequences. In the fast transition limit, the accuracy is determined by the small fraction of photons that are correlated with their neighbors. The relative standard deviation of the relaxation rate has a “chevron” shape as a function of the transition rate in the log-log scale. The location of the minimum of this function dramatically depends on how well the FRET efficiencies of the states are separated.

  18. Accuracy of maximum likelihood estimates of a two-state model in single-molecule FRET

    PubMed Central

    Gopich, Irina V.

    2015-01-01

    Photon sequences from single-molecule Förster resonance energy transfer (FRET) experiments can be analyzed using a maximum likelihood method. Parameters of the underlying kinetic model (FRET efficiencies of the states and transition rates between conformational states) are obtained by maximizing the appropriate likelihood function. In addition, the errors (uncertainties) of the extracted parameters can be obtained from the curvature of the likelihood function at the maximum. We study the standard deviations of the parameters of a two-state model obtained from photon sequences with recorded colors and arrival times. The standard deviations can be obtained analytically in a special case when the FRET efficiencies of the states are 0 and 1 and in the limiting cases of fast and slow conformational dynamics. These results are compared with the results of numerical simulations. The accuracy and, therefore, the ability to predict model parameters depend on how fast the transition rates are compared to the photon count rate. In the limit of slow transitions, the key parameters that determine the accuracy are the number of transitions between the states and the number of independent photon sequences. In the fast transition limit, the accuracy is determined by the small fraction of photons that are correlated with their neighbors. The relative standard deviation of the relaxation rate has a “chevron” shape as a function of the transition rate in the log-log scale. The location of the minimum of this function dramatically depends on how well the FRET efficiencies of the states are separated. PMID:25612692
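
    The simplest instance of the likelihood machinery described here: for a single state, the maximum-likelihood estimate of the FRET efficiency from recorded photon colors is the acceptor fraction, and the curvature of the binomial log-likelihood at its maximum gives the standard error. A hedged sketch, not the paper's two-state photon-by-photon likelihood:

```python
import math

def fret_mle(n_acceptor, n_total):
    """Single-state ML estimate of the FRET efficiency from photon colors:
    E_hat = n_acceptor / n_total, with the standard error obtained from
    the curvature of the binomial log-likelihood at its maximum."""
    e = n_acceptor / n_total
    return e, math.sqrt(e * (1.0 - e) / n_total)

# Hypothetical photon counts: 300 acceptor photons out of 1000 detected.
e_hat, se = fret_mle(300, 1000)
```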

  19. Correlation of the Capacity Factor in Vesicular Electrokinetic Chromatography with the Octanol:Water Partition Coefficient for Charged and Neutral Analytes

    PubMed Central

    Razak, J. L.; Cutak, B. J.; Larive, C. K.; Lunte, C. E.

    2008-01-01

    Purpose The aim of this study was to develop a method based upon electrokinetic chromatography (EKC) using oppositely charged surfactant vesicles as a buffer modifier to estimate hydrophobicity (log P) for a range of neutral and charged compounds. Methods Vesicles were formed from cetyltrimethylammonium bromide (CTAB) and sodium n-octyl sulfate (SOS). The size and polydispersity of the vesicles were characterized by electron microscopy, dynamic light scattering, and pulsed-field gradient NMR (PFG-NMR). PFG-NMR was also used to determine if ion-pairing between cationic analytes and free SOS monomer occurred. The CTAB/SOS vesicles were used as a buffer modifier in capillary electrophoresis (CE). The capacity factor (log k′) was calculated by determining the mobility of the analytes both in the presence and absence of vesicles. Log k′ was determined for 29 neutral and charged analytes. Results There was a linear relationship between the log of capacity factor (log k′) and octanol/water partition coefficient (log P) for both neutral and basic species at pH 6.0, 7.3, and 10.2. This indicated that interaction between the cation and vesicle was dominated by hydrophobic forces. At pH 4.3, the log k′ values for the least hydrophobic basic analytes were higher than expected, indicating that electrostatic attraction as well as hydrophobic forces contributed to the overall interaction between the cation and vesicle. Anionic compounds could not be evaluated using this system. Conclusion Vesicular electrokinetic chromatography (VEKC) using surfactant vesicles as buffer modifiers is a promising method for the estimation of hydrophobicity. PMID:11336344

  20. Analysis of survival in breast cancer patients by using different parametric models

    NASA Astrophysics Data System (ADS)

    Enera Amran, Syahila; Asrul Afendi Abdullah, M.; Kek, Sie Long; Afiqah Muhamad Jamil, Siti

    2017-09-01

    In biomedical applications and clinical trials, right censoring often arises when studying time-to-event data: some individuals are still alive at the end of the study or are lost to follow-up at a certain time. Handling censored data properly is important to prevent biased conclusions in the analysis. Therefore, this study analyzed right-censored data with three different parametric models: the exponential, Weibull, and log-logistic models. Data on breast cancer patients from Hospital Sultan Ismail, Johor Bahru, from 30 December 2008 until 15 February 2017 were used to illustrate right-censored data. The covariates included in this study are the survival time t, the age of each patient X1, and the treatment given to the patient X2. In order to determine the best parametric model for analysing the survival of breast cancer patients, the performance of each model was compared based on the Akaike Information Criterion (AIC), the Bayesian Information Criterion (BIC), and the log-likelihood value, using the statistical software R. When analysing the breast cancer data, all three distributions showed consistency with the data, with the line graph of the cumulative hazard function resembling a straight line through the origin. As a result, the log-logistic model was the best-fitting parametric model compared with the exponential and Weibull models, since it had the smallest AIC and BIC values and the largest log-likelihood.
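
    The model comparison described here can be sketched with censored log-likelihoods and AIC. The snippet below is an illustrative stand-in for the R analysis (not the study's code): the exponential rate has a closed-form censored MLE, and the Weibull scale is profiled out in closed form given the shape, which is then found by a coarse grid search; the log-logistic case would follow the same pattern.

```python
import math

def exp_fit(times, events):
    """Exponential model with right censoring: MLE rate = events / time at risk.
    Events contribute log f(t), censored observations contribute log S(t)."""
    d = sum(events)
    T = sum(times)
    lam = d / T
    logL = d * math.log(lam) - lam * T
    return logL, 1                         # (log-likelihood, #parameters)

def weibull_fit(times, events, shapes=None):
    """Weibull model: profile out the scale (closed form given shape a),
    then grid-search the shape. A coarse sketch, not a production optimiser."""
    shapes = shapes or [0.2 + 0.05 * i for i in range(60)]
    d = sum(events)
    best = -math.inf
    for a in shapes:
        b = (sum(t ** a for t in times) / d) ** (1.0 / a)   # profiled scale
        logL = sum(math.log(a) + (a - 1) * math.log(t) - a * math.log(b)
                   for t, e in zip(times, events) if e) \
               - sum((t / b) ** a for t in times)
        best = max(best, logL)
    return best, 2

def aic(logL, k):
    return 2 * k - 2 * logL
```

Since the exponential is the Weibull with shape 1, the Weibull's maximized log-likelihood is never worse, and AIC's parameter penalty decides whether the extra flexibility is warranted.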

  1. Local Influence Analysis of Nonlinear Structural Equation Models

    ERIC Educational Resources Information Center

    Lee, Sik-Yum; Tang, Nian-Sheng

    2004-01-01

    By regarding the latent random vectors as hypothetical missing data and based on the conditional expectation of the complete-data log-likelihood function in the EM algorithm, we investigate assessment of local influence of various perturbation schemes in a nonlinear structural equation model. The basic building blocks of local influence analysis…

  2. On the Nature of SEM Estimates of ARMA Parameters.

    ERIC Educational Resources Information Center

    Hamaker, Ellen L.; Dolan, Conor V.; Molenaar, Peter C. M.

    2002-01-01

    Reexamined the nature of structural equation modeling (SEM) estimates of autoregressive moving average (ARMA) models, replicated the simulation experiments of P. Molenaar, and examined the behavior of the log-likelihood ratio test. Simulation studies indicate that estimates of ARMA parameters observed with SEM software are identical to those…

  3. Displacement chromatography on cyclodextrin silicas. IV. Separation of the enantiomers of ibuprofen.

    PubMed

    Farkas, G; Irgens, L H; Quintero, G; Beeson, M D; al-Saeed, A; Vigh, G

    1993-08-13

    A displacement chromatographic method has been developed for the preparative separation of the enantiomers of ibuprofen using a beta-cyclodextrin silica stationary phase. The retention behavior of ibuprofen was studied in detail: the log k' vs. polar organic modifier concentration, the log k' vs. pH, the log k' vs. buffer concentration and the log k' vs. 1/T relationships; also, the alpha vs. polar organic modifier concentration, the alpha vs. pH, the alpha vs. buffer concentration and the log alpha vs. 1/T relationships have been determined in order to find the carrier solution composition which results in maximum chiral selectivity and sufficient, but not excessive solute retention (1 < k' < 30). 4-tert.-Butylcyclohexanol, a structurally similar but more retained compound than ibuprofen, was selected as displacer for the separation. Even with an alpha value as small as 1.08, good preparative chiral separations were observed both in the displacement mode and in the overloaded elution mode, up to a sample load of 0.5 mg.

  4. Likelihood-based confidence intervals for estimating floods with given return periods

    NASA Astrophysics Data System (ADS)

    Martins, Eduardo Sávio P. R.; Clarke, Robin T.

    1993-06-01

    This paper discusses aspects of the calculation of likelihood-based confidence intervals for T-year floods, with particular reference to (1) the two-parameter gamma distribution; (2) the Gumbel distribution; (3) the two-parameter log-normal distribution, and other distributions related to the normal by Box-Cox transformations. Calculation of the confidence limits is straightforward using the Nelder-Mead algorithm with a constraint incorporated, although care is necessary to ensure convergence either of the Nelder-Mead algorithm, or of the Newton-Raphson calculation of maximum-likelihood estimates. Methods are illustrated using records from 18 gauging stations in the basin of the River Itajai-Acu, State of Santa Catarina, southern Brazil. A small and restricted simulation compared likelihood-based confidence limits with those given by use of the central limit theorem; for the same confidence probability, the confidence limits of the simulation were wider than those of the central limit theorem, which failed more frequently to contain the true quantile being estimated. The paper discusses possible applications of likelihood-based confidence intervals in other areas of hydrological analysis.
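
    The likelihood-ratio recipe behind such intervals is easy to sketch. The toy below uses the exponential mean, where the log-likelihood is closed-form, purely to show the mechanics: the interval is the set of parameter values whose deviance 2(logL(MLE) - logL(theta)) stays below the chi-square(1) cutoff. For the gamma, Gumbel, or log-normal T-year quantiles discussed in the paper, the same recipe applies but the profiling must be done numerically (e.g. with Nelder-Mead), as the authors describe.

```python
import math

def profile_ci_exp_mean(x, cutoff=3.841):
    """95% likelihood-ratio confidence interval for an exponential mean,
    found by grid search outward from the MLE (illustrative sketch only)."""
    n, s = len(x), sum(x)
    mle = s / n
    def ll(theta):
        return -n * math.log(theta) - s / theta
    lmax = ll(mle)
    lo = hi = mle
    for i in range(1, 2000):               # walk down from the MLE
        th = mle * (0.99 ** i)
        if 2 * (lmax - ll(th)) <= cutoff:
            lo = th
        else:
            break
    for i in range(1, 2000):               # walk up from the MLE
        th = mle * (1.01 ** i)
        if 2 * (lmax - ll(th)) <= cutoff:
            hi = th
        else:
            break
    return lo, hi
```

Unlike central-limit-theorem intervals, these need not be symmetric about the MLE, which is part of why the paper finds them better behaved for skewed flood distributions.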

  5. The Diagnostic Value of the Clarke Sign in Assessing Chondromalacia Patella

    PubMed Central

    Doberstein, Scott T; Romeyn, Richard L; Reineke, David M

    2008-01-01

    Context: Various techniques have been described for assessing conditions that cause pain at the patellofemoral (PF) joint. The Clarke sign is one such test, but the diagnostic value of this test in assessing chondromalacia patella is unknown. Objective: To (1) investigate the diagnostic value of the Clarke sign in assessing the presence of chondromalacia patella using arthroscopic examination of the PF joint as the “gold standard,” and (2) provide a historical perspective of the Clarke sign as a clinical diagnostic test. Design: Validation study. Setting: All patients of one of the investigators who had knee pain or injuries unrelated to the patellofemoral joint and were scheduled for arthroscopic surgery were recruited for this study. Patients or Other Participants: A total of 106 otherwise healthy individuals with no history of patellofemoral pain or dysfunction volunteered. Main Outcome Measure(s): The Clarke sign was performed on the surgical knee by a single investigator in the clinic before surgery. A positive test was indicated by the presence of pain sufficient to prevent the patient from maintaining a quadriceps muscle contraction against manual resistance for longer than 2 seconds. The preoperative result was compared with visual evidence of chondromalacia patella during arthroscopy. Results: Sensitivity was 0.39, specificity was 0.67, likelihood ratio for a positive test was 1.18, likelihood ratio for a negative test was 0.91, positive predictive value was 0.25, and negative predictive value was 0.80. Conclusions: Diagnostic validity values for the use of the Clarke sign in assessing chondromalacia patella were unsatisfactory, supporting suggestions that it has poor diagnostic value as a clinical examination technique. Additionally, an extensive search of the available literature for the Clarke sign reveals multiple problems with the test, causing significant confusion for clinicians. Therefore, the use of the Clarke sign as a routine part of a knee examination is not beneficial, and its use should be discontinued. PMID:18345345

  6. The diagnostic value of the Clarke sign in assessing chondromalacia patella.

    PubMed

    Doberstein, Scott T; Romeyn, Richard L; Reineke, David M

    2008-01-01

    Various techniques have been described for assessing conditions that cause pain at the patellofemoral (PF) joint. The Clarke sign is one such test, but the diagnostic value of this test in assessing chondromalacia patella is unknown. To (1) investigate the diagnostic value of the Clarke sign in assessing the presence of chondromalacia patella using arthroscopic examination of the PF joint as the "gold standard," and (2) provide a historical perspective of the Clarke sign as a clinical diagnostic test. Validation study. All patients of one of the investigators who had knee pain or injuries unrelated to the patellofemoral joint and were scheduled for arthroscopic surgery were recruited for this study. A total of 106 otherwise healthy individuals with no history of patellofemoral pain or dysfunction volunteered. The Clarke sign was performed on the surgical knee by a single investigator in the clinic before surgery. A positive test was indicated by the presence of pain sufficient to prevent the patient from maintaining a quadriceps muscle contraction against manual resistance for longer than 2 seconds. The preoperative result was compared with visual evidence of chondromalacia patella during arthroscopy. Sensitivity was 0.39, specificity was 0.67, likelihood ratio for a positive test was 1.18, likelihood ratio for a negative test was 0.91, positive predictive value was 0.25, and negative predictive value was 0.80. Diagnostic validity values for the use of the Clarke sign in assessing chondromalacia patella were unsatisfactory, supporting suggestions that it has poor diagnostic value as a clinical examination technique. Additionally, an extensive search of the available literature for the Clarke sign reveals multiple problems with the test, causing significant confusion for clinicians. Therefore, the use of the Clarke sign as a routine part of a knee examination is not beneficial, and its use should be discontinued.
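
    The diagnostic summaries reported here follow directly from a 2x2 table. The function below shows the standard definitions; the counts in the test are hypothetical numbers chosen to reproduce the reported sensitivity (0.39) and specificity (0.67), not the study's raw data. Note that the positive and negative likelihood ratios depend only on sensitivity and specificity (0.39/0.33 ≈ 1.18 and 0.61/0.67 ≈ 0.91, matching the abstract), whereas PPV and NPV also depend on the prevalence in the sample.

```python
def diagnostic_values(tp, fn, fp, tn):
    """Standard diagnostic-accuracy summaries from a 2x2 table."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "LR+": sens / (1 - spec),      # how much a positive test raises the odds
        "LR-": (1 - sens) / spec,      # how much a negative test lowers the odds
        "PPV": tp / (tp + fp),         # prevalence-dependent
        "NPV": tn / (tn + fn),         # prevalence-dependent
    }
```

An LR+ of 1.18 barely shifts the pretest odds, which is what makes the abstract's conclusion about the Clarke sign's poor diagnostic value quantitative rather than impressionistic.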

  7. Manual signing in adults with intellectual disability: influence of sign characteristics on functional sign vocabulary.

    PubMed

    Meuris, Kristien; Maes, Bea; De Meyer, Anne-Marie; Zink, Inge

    2014-06-01

    The purpose of this study was to investigate the influence of sign characteristics in a key word signing (KWS) system on the functional use of those signs by adults with intellectual disability (ID). All 507 signs from a Flemish KWS system were characterized in terms of phonological, iconic, and referential characteristics. Phonological and referential characteristics were assigned to the signs by speech-language pathologists. The iconicity of the signs (i.e., transparency, assessed by guessing the meaning of the sign, and translucency, rated on a 6-point scale) was tested in 467 students. Sign functionality was studied in 119 adults with ID (mean mental age of 50.54 months) by means of a questionnaire filled out by a support worker. A generalized linear model with a negative binomial distribution (with log-link) showed that semantic category was the factor with the strongest influence on sign functionality, with grammatical class, referential concreteness, and translucency also playing a part. No phonological characteristics were found to have a significant influence on sign use. The meaning of a sign is the most important factor regarding its functionality (i.e., whether a sign is used in everyday communication). Phonological characteristics seem only of minor importance.

  8. Fast Component Pursuit for Large-Scale Inverse Covariance Estimation.

    PubMed

    Han, Lei; Zhang, Yu; Zhang, Tong

    2016-08-01

    The maximum likelihood estimation (MLE) for the Gaussian graphical model, which is also known as the inverse covariance estimation problem, has gained increasing interest recently. Most existing works assume that inverse covariance estimators contain sparse structure and then construct models with ℓ1 regularization. In this paper, different from existing works, we study the inverse covariance estimation problem from another perspective by efficiently modeling the low-rank structure in the inverse covariance, which is assumed to be a combination of a low-rank part and a diagonal matrix. One motivation for this assumption is that low-rank structure is common in many applications, including climate and financial analysis; another is that such an assumption can reduce the computational complexity of computing the inverse. Specifically, we propose an efficient COmponent Pursuit (COP) method to obtain the low-rank part, where each component can be sparse. For optimization, the COP method greedily learns a rank-one component in each iteration by maximizing the log-likelihood. Moreover, the COP algorithm enjoys several appealing properties, including the existence of an efficient solution in each iteration and a theoretical guarantee on the convergence of this greedy approach. Experiments on large-scale synthetic and real-world datasets, including ones with thousands of millions of variables, show that the COP method is faster than state-of-the-art techniques for the inverse covariance estimation problem while achieving comparable log-likelihood on test data.
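
    The computational advantage of the low-rank-plus-diagonal assumption can be illustrated with the Gaussian log-likelihood itself. The sketch below (an illustration of the structure COP exploits, not the COP algorithm) evaluates the average log-likelihood for precision Theta = diag(d) + V V^T using the matrix determinant lemma for log det and the factor V directly for the trace term, so no p x p inverse or decomposition is ever formed.

```python
import numpy as np

def gaussian_loglik_lowrank(X, d, V):
    """Average Gaussian log-likelihood per sample (mean-zero data X, n x p)
    with precision Theta = diag(d) + V V^T, V being p x r with r << p."""
    n, p = X.shape
    r = V.shape[1]
    Dinv = 1.0 / d
    M = np.eye(r) + V.T @ (Dinv[:, None] * V)     # r x r capacitance matrix
    logdet = np.sum(np.log(d)) + np.linalg.slogdet(M)[1]
    S_diag = np.sum(X * X, axis=0) / n            # diagonal of sample covariance
    trace_term = np.sum(d * S_diag) + np.sum((X @ V) ** 2) / n
    return 0.5 * (logdet - trace_term - p * np.log(2 * np.pi))
```

The dominant costs are O(n p r) and O(p r^2), which is the kind of saving that makes greedy rank-one updates attractive at the scales the paper targets.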

  9. Combining counts and incidence data: an efficient approach for estimating the log-normal species abundance distribution and diversity indices.

    PubMed

    Bellier, Edwige; Grøtan, Vidar; Engen, Steinar; Schartau, Ann Kristin; Diserud, Ola H; Finstad, Anders G

    2012-10-01

    Obtaining accurate estimates of diversity indices is difficult because the number of species encountered in a sample increases with sampling intensity. We introduce a novel method that requires the presence of species in a sample to be assessed, while counts of the number of individuals per species are required for only a small part of the sample. To account for species included as incidence data in the species abundance distribution, we modify the likelihood function of the classical Poisson log-normal distribution. Using simulated community assemblages, we contrast diversity estimates based on a community sample, a subsample randomly extracted from the community sample, and a mixture sample where incidence data are added to a subsample. We show that the mixture sampling approach provides more accurate estimates than the subsample, at little extra cost. Diversity indices estimated from a freshwater zooplankton community sampled using the mixture approach show the same pattern of results as the simulation study. Our method efficiently increases the accuracy of diversity estimates and comprehension of the left tail of the species abundance distribution. We show how to choose the scale of sample size needed for a compromise between information gained, accuracy of the estimates and cost expended when assessing biological diversity. The sample size estimates are obtained from key community characteristics, such as the expected number of species in the community, the expected number of individuals in a sample and the evenness of the community.
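
    The likelihood modification can be sketched concretely. Below, the Poisson log-normal pmf is computed by simple trapezoidal quadrature over the latent log-abundance, and the mixed likelihood lets counted species contribute log P(N = n) while presence-only species contribute log P(N >= 1) = log(1 - P(0)). This is a coarse illustration of the idea, not the paper's estimator, and the quadrature settings are arbitrary.

```python
import math

def pln_pmf(n, mu, sigma, grid=400, span=8.0):
    """Poisson log-normal P(N = n): Poisson(n; lam) averaged over
    lam = exp(mu + sigma * z), z standard normal (trapezoid rule over z)."""
    total = 0.0
    step = 2 * span / grid
    for i in range(grid + 1):
        z = -span + i * step
        lam = math.exp(mu + sigma * z)
        log_pois = n * math.log(lam) - lam - math.lgamma(n + 1)
        w = 0.5 if i in (0, grid) else 1.0
        total += w * math.exp(log_pois) * math.exp(-0.5 * z * z)
    return total * step / math.sqrt(2 * math.pi)

def mixed_loglik(counts, n_incidence_only, mu, sigma):
    """Counts enter through the PLN pmf; presence-only species enter
    through P(N >= 1) = 1 - P(0)."""
    ll = sum(math.log(pln_pmf(n, mu, sigma)) for n in counts)
    ll += n_incidence_only * math.log(1.0 - pln_pmf(0, mu, sigma))
    return ll
```

Maximizing this mixed likelihood over (mu, sigma) is what lets cheap incidence records sharpen the estimate of the distribution's left tail.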

  10. A new discriminative kernel from probabilistic models.

    PubMed

    Tsuda, Koji; Kawanabe, Motoaki; Rätsch, Gunnar; Sonnenburg, Sören; Müller, Klaus-Robert

    2002-10-01

    Recently, Jaakkola and Haussler (1999) proposed a method for constructing kernel functions from probabilistic models. Their so-called Fisher kernel has been combined with discriminative classifiers such as support vector machines and applied successfully in, for example, DNA and protein analysis. Whereas the Fisher kernel is calculated from the marginal log-likelihood, we propose the TOP kernel, derived from tangent vectors of posterior log-odds. Furthermore, we develop a theoretical framework on feature extractors from probabilistic models and use it for analyzing the TOP kernel. In experiments, our new discriminative TOP kernel compares favorably to the Fisher kernel.
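
    The Fisher kernel construction is compact enough to sketch. Using a one-dimensional Gaussian as a toy generative model (a stand-in for the sequence models used in DNA/protein work), the features are the Fisher scores U(x) = d/d(theta) log p(x | theta), and the kernel is K(x, y) = U(x)^T I^{-1} U(y) with I an empirical Fisher information. Names and the model choice are illustrative.

```python
import numpy as np

def fisher_scores(x, mu, sigma):
    """Fisher scores of N(mu, sigma^2): gradients of log p(x) w.r.t. (mu, sigma)."""
    d_mu = (x - mu) / sigma**2
    d_sigma = ((x - mu) ** 2 - sigma**2) / sigma**3
    return np.stack([d_mu, d_sigma], axis=1)

def fisher_kernel(x, y, mu, sigma):
    """K(x, y) = U_x^T I^{-1} U_y, with I estimated empirically from the data."""
    U_all = fisher_scores(np.concatenate([x, y]), mu, sigma)
    I = U_all.T @ U_all / len(U_all)          # empirical Fisher information
    Ux = fisher_scores(x, mu, sigma)
    Uy = fisher_scores(y, mu, sigma)
    return Ux @ np.linalg.solve(I, Uy.T)
```

The TOP kernel proposed in this paper replaces the marginal-likelihood scores with tangent vectors of the posterior log-odds, but the plumbing is the same: a fixed generative model turns each example into a gradient feature vector.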

  11. Assessing the feasibility and profitability of cable logging in southern upland hardwood forests

    Treesearch

    Chris B. LeDoux; Dennis M. May; Tony Johnson; Richard H. Widmann

    1995-01-01

    Procedures developed to assess available timber supplies from upland hardwood forest statistics reported by the USDA Forest Services' Forest Inventory and Analysis unit were modified to assess the feasibility and profitability of cable logging in southern upland hardwood forests. Depending on the harvest system and yarding distance used, cable logging can be...

  12. Automation and Evaluation of the SOWH Test with SOWHAT.

    PubMed

    Church, Samuel H; Ryan, Joseph F; Dunn, Casey W

    2015-11-01

    The Swofford-Olsen-Waddell-Hillis (SOWH) test evaluates statistical support for incongruent phylogenetic topologies. It is commonly applied to determine if the maximum likelihood tree in a phylogenetic analysis is significantly different from an alternative hypothesis. The SOWH test compares the observed difference in log-likelihood between two topologies to a null distribution of differences in log-likelihood generated by parametric resampling. The test is a well-established phylogenetic method for topology testing, but it is sensitive to model misspecification, it is computationally burdensome to perform, and its implementation requires the investigator to make several decisions that each have the potential to affect the outcome of the test. We analyzed the effects of multiple factors using seven data sets to which the SOWH test was previously applied. These factors include the number of sample replicates, the likelihood software used, the introduction of gaps to simulated data, the use of distinct models of evolution for data simulation and likelihood inference, and a suggested test correction wherein an unresolved "zero-constrained" tree is used to simulate sequence data. To facilitate these analyses and future applications of the SOWH test, we wrote SOWHAT, a program that automates the SOWH test. We find that inadequate bootstrap sampling can change the outcome of the SOWH test. The results also show that using a zero-constrained tree for data simulation can result in a wider null distribution and higher p-values, but does not change the outcome of the SOWH test for most of the data sets tested here. These results will help others implement and evaluate the SOWH test and allow us to provide recommendations for future applications of the SOWH test. SOWHAT is available for download from https://github.com/josephryan/SOWHAT. © The Author(s) 2015. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
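
    The parametric-resampling logic of the SOWH test can be shown on a deliberately tiny toy model. Here a unit-variance normal mean stands in for the phylogenetic likelihood (the real test simulates sequence alignments under the null tree and refits trees): the observed statistic is the log-likelihood gap between the unconstrained MLE and the null hypothesis, and its null distribution comes from data simulated under the null. Everything below is a hypothetical sketch, not SOWHAT's code.

```python
import math
import random

def sowh_like_test(data, null_mean, n_boot=500, seed=1):
    """Toy SOWH-style parametric bootstrap for H0: mean == null_mean,
    data ~ Normal(mean, 1). Returns (observed delta logL, p-value)."""
    random.seed(seed)
    n = len(data)

    def delta(xs):
        m = sum(xs) / len(xs)                  # unconstrained MLE
        # logL(MLE) - logL(null) reduces to n * (m - mu0)^2 / 2 here
        return len(xs) * (m - null_mean) ** 2 / 2

    observed = delta(data)
    null_deltas = [delta([random.gauss(null_mean, 1) for _ in range(n)])
                   for _ in range(n_boot)]
    p = sum(d >= observed for d in null_deltas) / n_boot
    return observed, p
```

The paper's finding about inadequate bootstrap sampling maps directly onto `n_boot`: with too few replicates, the tail of the null distribution, and hence the p-value, is poorly resolved.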

  13. Patient Eye Examinations - Adults

    MedlinePlus


  14. Anatomy of the Eye

    MedlinePlus


  15. Wildlife Warning Signs: Public Assessment of Components, Placement and Designs to Optimise Driver Response.

    PubMed

    Bond, Amy R F; Jones, Darryl N

    2013-12-17

    Wildlife warning signs are the most commonly used and widespread form of road impact mitigation, aimed at reducing the incidence of wildlife-vehicle collisions. Evidence of the effectiveness of currently used signs is rare and often indicates minimal change in driver behaviour. Improving the design of these signs to increase the likelihood of appropriate driver response has the potential to reduce the incidence of wildlife-vehicle collisions. This study aimed to examine and assess the opinions of drivers on wildlife warning sign designs through a public opinion survey. Three currently used sign designs and five alternative sign designs were compared in the survey. A total of 134 drivers were surveyed. The presence of temporal specifications and an updated count of road-killed animals on wildlife warning signs were assessed, as well as the position of the sign. Drivers' responses to the eight signs were scaled separately at three speed limits and participants indicated the sign to which they were most likely to respond. Three signs consistently ranked high. The messages conveyed by these signs and their prominent features were explored. Animal-activated and vehicle speed-activated signs were ranked very highly by participants. Extensive field trials of various sign designs are needed to further this research into optimizing wildlife warning sign designs.

  16. Cumulative risk and AIDS-orphanhood: interactions of stigma, bullying and poverty on child mental health in South Africa.

    PubMed

    Cluver, Lucie; Orkin, Mark

    2009-10-01

    Research shows that AIDS-orphaned children are more likely to experience clinical-range psychological problems. Little is known about possible interactions between factors mediating these high distress levels. We assessed how food insecurity, bullying, and AIDS-related stigma interacted with each other and with likelihood of experiencing clinical-range disorder. In South Africa, 1025 adolescents completed standardised measures of depression, anxiety and post-traumatic stress. 52 potential mediators were measured, including AIDS-orphanhood status. Logistic regressions and hierarchical log-linear modelling were used to identify interactions among significant risk factors. Food insecurity, stigma and bullying all independently increased likelihood of disorder. Poverty and stigma were found to interact strongly, and with both present, likelihood of disorder rose from 19% to 83%. Similarly, bullying interacted with AIDS-orphanhood status, and with both present, likelihood of disorder rose from 12% to 76%. Approaches to alleviating psychological distress amongst AIDS-affected children must address cumulative risk effects.
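
    The size of the reported interactions can be made concrete by converting the probabilities to odds ratios; the arithmetic below is purely illustrative, using only the figures quoted in the abstract.

```python
def odds(p):
    """Convert a probability to odds."""
    return p / (1 - p)

def interaction_or(p_base, p_both):
    """Odds ratio implied by the jump in disorder probability when
    both risk factors are present versus the baseline rate."""
    return odds(p_both) / odds(p_base)

# Poverty x stigma: likelihood of disorder rose from 19% to 83%.
# Bullying x AIDS-orphanhood: likelihood rose from 12% to 76%.
```

Both combinations correspond to odds ratios above 20, which is why the authors frame the risks as cumulative rather than merely additive.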

  17. The word frequency effect during sentence reading: A linear or nonlinear effect of log frequency?

    PubMed

    White, Sarah J; Drieghe, Denis; Liversedge, Simon P; Staub, Adrian

    2016-10-20

    The effect of word frequency on eye movement behaviour during reading has been reported in many experimental studies. However, the vast majority of these studies compared only two levels of word frequency (high and low). Here we assess whether the effect of log word frequency on eye movement measures is linear, in an experiment in which a critical target word in each sentence was at one of three approximately equally spaced log frequency levels. Separate analyses treated log frequency as a categorical or a continuous predictor. Both analyses showed only a linear effect of log frequency on the likelihood of skipping a word, and on first fixation duration. Ex-Gaussian analyses of first fixation duration showed similar effects on distributional parameters in comparing high- and medium-frequency words, and medium- and low-frequency words. Analyses of gaze duration and the probability of a refixation suggested a nonlinear pattern, with a larger effect at the lower end of the log frequency scale. However, the nonlinear effects were small, and Bayes Factor analyses favoured the simpler linear models for all measures. The possible roles of lexical and post-lexical factors in producing nonlinear effects of log word frequency during sentence reading are discussed.
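
    The linear-versus-nonlinear comparison described here is essentially a nested-model comparison scored by an information criterion. The sketch below uses fabricated, clearly labeled toy data (a deterministic alternating perturbation standing in for noise) to show the mechanics: fit degree-1 and degree-2 polynomials in log frequency and compare BIC, which approximates the Bayes factor analysis the authors report.

```python
import numpy as np

def bic_ls(y, yhat, k):
    """BIC for a least-squares fit via the profiled Gaussian log-likelihood:
    n * log(RSS / n) + k * log(n), with k the number of fitted coefficients."""
    n = len(y)
    rss = float(np.sum((y - yhat) ** 2))
    return n * np.log(rss / n) + k * np.log(n)

# Illustrative data only: a response falling linearly in log frequency,
# plus a small alternating perturbation in place of random noise.
logf = np.linspace(0.5, 4.5, 60)
y = 0.9 - 0.12 * logf + 0.02 * (-1.0) ** np.arange(60)

lin = np.polyval(np.polyfit(logf, y, 1), logf)
quad = np.polyval(np.polyfit(logf, y, 2), logf)
```

On truly linear data the quadratic fit always has (weakly) smaller residuals, but BIC's log(n) penalty per extra coefficient favors the simpler linear model, mirroring the paper's conclusion for skipping and first fixation duration.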

  18. Equivalence between Step Selection Functions and Biased Correlated Random Walks for Statistical Inference on Animal Movement.

    PubMed

    Duchesne, Thierry; Fortin, Daniel; Rivest, Louis-Paul

    2015-01-01

    Animal movement has a fundamental impact on population and community structure and dynamics. Biased correlated random walks (BCRW) and step selection functions (SSF) are commonly used to study movements. Because no studies have contrasted the parameters and the statistical properties of their estimators for models constructed under these two Lagrangian approaches, it remains unclear whether or not they allow for similar inference. First, we used the Weak Law of Large Numbers to demonstrate that the log-likelihood function for estimating the parameters of BCRW models can be approximated by the log-likelihood of SSFs. Second, we illustrated the link between the two approaches by fitting BCRW with maximum likelihood and with SSF to simulated movement data in virtual environments and to the trajectory of bison (Bison bison L.) trails in natural landscapes. Using simulated and empirical data, we found that the parameters of a BCRW estimated directly from maximum likelihood and by fitting an SSF were remarkably similar. Movement analysis is increasingly used as a tool for understanding the influence of landscape properties on animal distribution. In the rapidly developing field of movement ecology, management and conservation biologists must decide which method they should implement to accurately assess the determinants of animal movement. We showed that BCRW and SSF can provide similar insights into the environmental features influencing animal movements. Both techniques have advantages. BCRW has already been extended to allow for multi-state modeling. Unlike BCRW, however, SSF can be estimated using most statistical packages, it can simultaneously evaluate habitat selection and movement biases, and can easily integrate a large number of movement taxes at multiple scales. SSF thus offers a simple, yet effective, statistical technique to identify movement taxis.
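
    The SSF side of the equivalence is a conditional-logit likelihood, which is short enough to sketch. Each observed step is scored against its set of available alternative steps; names and the covariate layout below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def ssf_loglik(beta, chosen, alternatives):
    """Conditional-logit log-likelihood of a step selection function.
    chosen[i]: covariate vector of the step the animal took;
    alternatives[i]: matrix of available steps (taken step included)."""
    ll = 0.0
    for x, A in zip(chosen, alternatives):
        scores = A @ beta
        ll += x @ beta - np.log(np.sum(np.exp(scores)))
    return ll
```

With movement-related covariates (step length, turning angle) included alongside habitat covariates, maximizing this function recovers parameters close to the BCRW maximum likelihood estimates, which is the equivalence the paper demonstrates.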

  19. Improvement of mannitol-yolk-polymyxin B agar by supplementing with trimethoprim for quantitative detection of Bacillus cereus in foods.

    PubMed

    Chon, Jung-Whan; Hyeon, Ji-Yeon; Park, Jun-Ho; Song, Kwang-Young; Kim, Jong-Hyun; Seo, Kun-Ho

    2012-07-01

    Mannitol-yolk-polymyxin B agar (MYPA) was modified by supplementation with trimethoprim. The ability of the supplemented medium to select for and recover Bacillus cereus from pure cultures and food samples with high background microflora was compared with MYPA. For evaluation of the modified MYPA (mMYPA) in food samples with high background microflora, B. cereus was experimentally spiked into red pepper powder, fermented soybean paste, vegetable salad, and radish sprouts, and then it was recovered on MYPA and mMYPA for comparison. In all food samples, there was no difference in recoverability (P > 0.05) between mMYPA (red pepper powder, 3.34 ± 0.24 log CFU/g; fermented soybean paste, 3.52 ± 0.47 log CFU/g; vegetable salad, 3.51 ± 0.23 log CFU/g; radish sprouts, 3.32 ± 0.40 log CFU/g) and MYPA (red pepper powder, 3.18 ± 0.20 log CFU/g; fermented soybean paste, 3.33 ± 0.43 log CFU/g; vegetable salad, 3.36 ± 0.19 log CFU/g; radish sprouts, 3.33 ± 0.31 log CFU/g). However, mMYPA exhibited better selectivity than MYPA, because additional trimethoprim made the differentiation of suspected colonies easier by inhibiting competing flora. The addition of trimethoprim to conventional media could be a useful option to improve selectivity in foods with high background microflora.

  20. Performance Analysis for Joint Target Parameter Estimation in UMTS-Based Passive Multistatic Radar with Antenna Arrays Using Modified Cramér-Rao Lower Bounds.

    PubMed

    Shi, Chenguang; Wang, Fei; Salous, Sana; Zhou, Jianjiang

    2017-10-18

    In this study, the modified Cramér-Rao lower bounds (MCRLBs) on the joint estimation of target position and velocity are investigated for a universal mobile telecommunication system (UMTS)-based passive multistatic radar system with antenna arrays. First, we analyze the log-likelihood function of the received signal for a complex Gaussian extended target. Then, due to the non-deterministic transmitted data symbols, analytically closed-form expressions of the MCRLBs on the Cartesian coordinates of target position and velocity are derived for a multistatic radar system with Nt UMTS-based transmit stations of Lt antenna elements and Nr receive stations of Lr antenna elements. With the aid of numerical simulations, it is shown that increasing the number of receiving elements in each receive station can reduce the estimation errors. In addition, it is demonstrated that the MCRLB is not only a function of the signal-to-noise ratio (SNR), the number of receiving antenna elements and the properties of the transmitted UMTS signals, but also a function of the relative geometric configuration between the target and the multistatic radar system. The analytical expressions for the MCRLB open up a new dimension for passive multistatic radar systems by aiding the optimal placement of receive stations to improve target parameter estimation performance.

  1. Performance Analysis for Joint Target Parameter Estimation in UMTS-Based Passive Multistatic Radar with Antenna Arrays Using Modified Cramér-Rao Lower Bounds

    PubMed Central

    Wang, Fei; Salous, Sana; Zhou, Jianjiang

    2017-01-01

    In this study, the modified Cramér-Rao lower bounds (MCRLBs) on the joint estimation of target position and velocity are investigated for a universal mobile telecommunication system (UMTS)-based passive multistatic radar system with antenna arrays. First, we analyze the log-likelihood function of the received signal for a complex Gaussian extended target. Then, due to the non-deterministic transmitted data symbols, analytically closed-form expressions of the MCRLBs on the Cartesian coordinates of target position and velocity are derived for a multistatic radar system with Nt UMTS-based transmit stations of Lt antenna elements and Nr receive stations of Lr antenna elements. With the aid of numerical simulations, it is shown that increasing the number of receiving elements in each receive station can reduce the estimation errors. In addition, it is demonstrated that the MCRLB is not only a function of the signal-to-noise ratio (SNR), the number of receiving antenna elements and the properties of the transmitted UMTS signals, but also a function of the relative geometric configuration between the target and the multistatic radar system. The analytical expressions for the MCRLB open up a new dimension for passive multistatic radar systems by aiding the optimal placement of receive stations to improve target parameter estimation performance. PMID:29057805

  2. Case-Deletion Diagnostics for Nonlinear Structural Equation Models

    ERIC Educational Resources Information Center

    Lee, Sik-Yum; Lu, Bin

    2003-01-01

    In this article, a case-deletion procedure is proposed to detect influential observations in a nonlinear structural equation model. The key idea is to develop the diagnostic measures based on the conditional expectation of the complete-data log-likelihood function in the EM algorithm. A one-step pseudo approximation is proposed to reduce the…

  3. Emitter Number Estimation by the General Information Theoretic Criterion from Pulse Trains

    DTIC Science & Technology

    2002-12-01

    …negative log-likelihood function plus a penalty function. The general information criteria by Yin and Krishnaiah [11] are different from the regular…

  4. Detection of Person Misfit in Computerized Adaptive Tests with Polytomous Items.

    ERIC Educational Resources Information Center

    van Krimpen-Stoop, Edith M. L. A.; Meijer, Rob R.

    2002-01-01

    Compared the nominal and empirical null distributions of the standardized log-likelihood statistic for polytomous items for paper-and-pencil (P&P) and computerized adaptive tests (CATs). Results show that the empirical distribution of the statistic differed from the assumed standard normal distribution for both P&P tests and CATs. Also…

  5. The Sequential Probability Ratio Test and Binary Item Response Models

    ERIC Educational Resources Information Center

    Nydick, Steven W.

    2014-01-01

    The sequential probability ratio test (SPRT) is a common method for terminating item response theory (IRT)-based adaptive classification tests. To decide whether a classification test should stop, the SPRT compares a simple log-likelihood ratio, based on the classification bound separating two categories, to prespecified critical values. As has…
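The SPRT mechanics described above can be sketched in a few lines. This is a minimal illustration of Wald's test with Bernoulli observations, not the IRT-specific classification test; `p0` and `p1` are hypothetical response probabilities just below and above the classification bound:

```python
import math

def sprt_decision(llr_increments, alpha=0.05, beta=0.05):
    # Accumulate log-likelihood ratio terms and stop at Wald's boundaries.
    # Returns ("upper" | "lower" | "continue", final LLR).
    upper = math.log((1 - beta) / alpha)   # decide for H1 ("above cutoff")
    lower = math.log(beta / (1 - alpha))   # decide for H0 ("below cutoff")
    llr = 0.0
    for term in llr_increments:
        llr += term
        if llr >= upper:
            return "upper", llr
        if llr <= lower:
            return "lower", llr
    return "continue", llr

# Bernoulli responses: LLR term log(p1/p0) for a correct answer and
# log((1-p1)/(1-p0)) for an incorrect one.
p0, p1 = 0.4, 0.7
responses = [1, 1, 1, 0, 1, 1, 1, 1]
terms = [math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
         for x in responses]
decision, llr = sprt_decision(terms)
```

With these hypothetical numbers the accumulated LLR crosses the upper boundary on the eighth response, so the test would terminate and classify the examinee above the bound.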

  6. Global Convergence of the EM Algorithm for Unconstrained Latent Variable Models with Categorical Indicators

    ERIC Educational Resources Information Center

    Weissman, Alexander

    2013-01-01

    Convergence of the expectation-maximization (EM) algorithm to a global optimum of the marginal log likelihood function for unconstrained latent variable models with categorical indicators is presented. The sufficient conditions under which global convergence of the EM algorithm is attainable are provided in an information-theoretic context by…
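The record concerns latent variable models with categorical indicators; as a generic illustration of the EM property at issue (the marginal log-likelihood never decreases across iterations), here is a minimal sketch for a two-component one-dimensional Gaussian mixture instead:

```python
import math, random

def loglik(data, w, mu, var):
    # Marginal log-likelihood of a two-component 1-D Gaussian mixture.
    total = 0.0
    for x in data:
        total += math.log(sum(
            w[k] * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
            / math.sqrt(2 * math.pi * var[k]) for k in (0, 1)))
    return total

def em_gmm_1d(data, iters=25):
    w, mu, var = [0.5, 0.5], [min(data), max(data)], [1.0, 1.0]
    trace = [loglik(data, w, mu, var)]
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point.
        resp = []
        for x in data:
            p = [w[k] * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                 / math.sqrt(2 * math.pi * var[k]) for k in (0, 1)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M-step: weighted maximum-likelihood updates of the parameters.
        for k in (0, 1):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, data)) / nk, 1e-6)
        trace.append(loglik(data, w, mu, var))
    return trace

rng = random.Random(1)
data = ([rng.gauss(-2, 1) for _ in range(100)]
        + [rng.gauss(3, 1) for _ in range(100)])
trace = em_gmm_1d(data)
```

The log-likelihood trace is non-decreasing by construction; whether the limit is a global optimum is exactly the question the record addresses.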

  7. Social Networking Sites and Language Learning

    ERIC Educational Resources Information Center

    Brick, Billy

    2011-01-01

    This article examines a study of seven learners who logged their experiences on the language learning social networking site Livemocha over a period of three months. The features of the site are described and the likelihood of their future success is considered. The learners were introduced to the Social Networking Site (SNS) and asked to learn a…

  8. Estimating the population density of the Asian tapir (Tapirus indicus) in a selectively logged forest in Peninsular Malaysia.

    PubMed

    Rayan, D Mark; Mohamad, Shariff Wan; Dorward, Leejiah; Aziz, Sheema Abdul; Clements, Gopalasamy Reuben; Christopher, Wong Chai Thiam; Traeholt, Carl; Magintan, David

    2012-12-01

    The endangered Asian tapir (Tapirus indicus) is threatened by large-scale habitat loss, forest fragmentation and increased hunting pressure. Conservation planning for this species, however, is hampered by a severe paucity of information on its ecology and population status. We present the first Asian tapir population density estimate from a camera trapping study targeting tigers in a selectively logged forest within Peninsular Malaysia using a spatially explicit capture-recapture maximum-likelihood-based framework. With a trap effort of 2496 nights, 17 individuals were identified, corresponding to a density (standard error) estimate of 9.49 (2.55) adult tapirs/100 km². Although our results include several caveats, we believe that our density estimate still serves as an important baseline to facilitate the monitoring of tapir population trends in Peninsular Malaysia. Our study also highlights the potential of extracting vital ecological and population information for other cryptic individually identifiable animals from tiger-centric studies, especially with the use of a spatially explicit capture-recapture maximum-likelihood-based framework. © 2012 Wiley Publishing Asia Pty Ltd, ISZS and IOZ/CAS.

  9. Information-theoretic model selection for optimal prediction of stochastic dynamical systems from data

    NASA Astrophysics Data System (ADS)

    Darmon, David

    2018-03-01

    In the absence of mechanistic or phenomenological models of real-world systems, data-driven models become necessary. The discovery of various embedding theorems in the 1980s and 1990s motivated a powerful set of tools for analyzing deterministic dynamical systems via delay-coordinate embeddings of observations of their component states. However, in many branches of science, the condition of operational determinism is not satisfied, and stochastic models must be brought to bear. For such stochastic models, the tool set developed for delay-coordinate embedding is no longer appropriate, and a new toolkit must be developed. We present an information-theoretic criterion, the negative log-predictive likelihood, for selecting the embedding dimension for a predictively optimal data-driven model of a stochastic dynamical system. We develop a nonparametric estimator for the negative log-predictive likelihood and compare its performance to a recently proposed criterion based on active information storage. Finally, we show how the output of the model selection procedure can be used to compare candidate predictors for a stochastic system to an information-theoretic lower bound.
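A simplified stand-in for the selection criterion described above, assuming a Gaussian predictive model rather than the paper's nonparametric estimator: fit candidate models of "embedding dimension" 0 and 1 on a training segment, then keep whichever has the lower negative log predictive likelihood on held-out data.

```python
import math, random

def gaussian_nll(residuals, sigma):
    # Negative log-likelihood of residuals under N(0, sigma^2).
    return sum(0.5 * math.log(2 * math.pi * sigma ** 2)
               + r ** 2 / (2 * sigma ** 2) for r in residuals)

def heldout_nll_order0(train, test):
    # Memoryless baseline: i.i.d. Gaussian fitted on the training half.
    mu = sum(train) / len(train)
    sigma = math.sqrt(sum((x - mu) ** 2 for x in train) / len(train))
    return gaussian_nll([x - mu for x in test[1:]], sigma)

def heldout_nll_order1(train, test):
    # One-lag model x_t = a * x_{t-1} + noise, fitted by least squares.
    a = (sum(train[t - 1] * train[t] for t in range(1, len(train)))
         / sum(x ** 2 for x in train[:-1]))
    res = [train[t] - a * train[t - 1] for t in range(1, len(train))]
    sigma = math.sqrt(sum(r ** 2 for r in res) / len(res))
    return gaussian_nll([test[t] - a * test[t - 1]
                         for t in range(1, len(test))], sigma)

rng = random.Random(0)
x, series = 0.0, []
for _ in range(400):                 # AR(1) sample path, coefficient 0.9
    x = 0.9 * x + rng.gauss(0, 1)
    series.append(x)
train, test = series[:200], series[200:]
nll0 = heldout_nll_order0(train, test)
nll1 = heldout_nll_order1(train, test)
```

For data with one step of memory, the one-lag model scores a markedly lower held-out negative log predictive likelihood, which is the selection signal the criterion exploits.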

  10. Forest fragmentation and selective logging have inconsistent effects on multiple animal-mediated ecosystem processes in a tropical forest.

    PubMed

    Schleuning, Matthias; Farwig, Nina; Peters, Marcell K; Bergsdorf, Thomas; Bleher, Bärbel; Brandl, Roland; Dalitz, Helmut; Fischer, Georg; Freund, Wolfram; Gikungu, Mary W; Hagen, Melanie; Garcia, Francisco Hita; Kagezi, Godfrey H; Kaib, Manfred; Kraemer, Manfred; Lung, Tobias; Naumann, Clas M; Schaab, Gertrud; Templin, Mathias; Uster, Dana; Wägele, J Wolfgang; Böhning-Gaese, Katrin

    2011-01-01

    Forest fragmentation and selective logging are two main drivers of global environmental change and modify biodiversity and environmental conditions in many tropical forests. The consequences of these changes for the functioning of tropical forest ecosystems have rarely been explored in a comprehensive approach. In a Kenyan rainforest, we studied six animal-mediated ecosystem processes and recorded species richness and community composition of all animal taxa involved in these processes. We used linear models and a formal meta-analysis to test whether forest fragmentation and selective logging affected ecosystem processes and biodiversity and used structural equation models to disentangle direct from biodiversity-related indirect effects of human disturbance on multiple ecosystem processes. Fragmentation increased decomposition and reduced antbird predation, while selective logging consistently increased pollination, seed dispersal and army-ant raiding. Fragmentation modified species richness or community composition of five taxa, whereas selective logging did not affect any component of biodiversity. Changes in the abundance of functionally important species were related to lower predation by antbirds and higher decomposition rates in small forest fragments. The positive effects of selective logging on bee pollination, bird seed dispersal and army-ant raiding were direct, i.e. not related to changes in biodiversity, and were probably due to behavioural changes of these highly mobile animal taxa. We conclude that animal-mediated ecosystem processes respond in distinct ways to different types of human disturbance in Kakamega Forest. Our findings suggest that forest fragmentation affects ecosystem processes indirectly by changes in biodiversity, whereas selective logging influences processes directly by modifying local environmental conditions and resource distributions. The positive to neutral effects of selective logging on ecosystem processes show that the functionality of tropical forests can be maintained in moderately disturbed forest fragments. Conservation concepts for tropical forests should thus include not only remaining pristine forests but also functionally viable forest remnants.

  11. Forest Fragmentation and Selective Logging Have Inconsistent Effects on Multiple Animal-Mediated Ecosystem Processes in a Tropical Forest

    PubMed Central

    Schleuning, Matthias; Farwig, Nina; Peters, Marcell K.; Bergsdorf, Thomas; Bleher, Bärbel; Brandl, Roland; Dalitz, Helmut; Fischer, Georg; Freund, Wolfram; Gikungu, Mary W.; Hagen, Melanie; Garcia, Francisco Hita; Kagezi, Godfrey H.; Kaib, Manfred; Kraemer, Manfred; Lung, Tobias; Schaab, Gertrud; Templin, Mathias; Uster, Dana; Wägele, J. Wolfgang; Böhning-Gaese, Katrin

    2011-01-01

    Forest fragmentation and selective logging are two main drivers of global environmental change and modify biodiversity and environmental conditions in many tropical forests. The consequences of these changes for the functioning of tropical forest ecosystems have rarely been explored in a comprehensive approach. In a Kenyan rainforest, we studied six animal-mediated ecosystem processes and recorded species richness and community composition of all animal taxa involved in these processes. We used linear models and a formal meta-analysis to test whether forest fragmentation and selective logging affected ecosystem processes and biodiversity and used structural equation models to disentangle direct from biodiversity-related indirect effects of human disturbance on multiple ecosystem processes. Fragmentation increased decomposition and reduced antbird predation, while selective logging consistently increased pollination, seed dispersal and army-ant raiding. Fragmentation modified species richness or community composition of five taxa, whereas selective logging did not affect any component of biodiversity. Changes in the abundance of functionally important species were related to lower predation by antbirds and higher decomposition rates in small forest fragments. The positive effects of selective logging on bee pollination, bird seed dispersal and army-ant raiding were direct, i.e. not related to changes in biodiversity, and were probably due to behavioural changes of these highly mobile animal taxa. We conclude that animal-mediated ecosystem processes respond in distinct ways to different types of human disturbance in Kakamega Forest. Our findings suggest that forest fragmentation affects ecosystem processes indirectly by changes in biodiversity, whereas selective logging influences processes directly by modifying local environmental conditions and resource distributions. The positive to neutral effects of selective logging on ecosystem processes show that the functionality of tropical forests can be maintained in moderately disturbed forest fragments. Conservation concepts for tropical forests should thus include not only remaining pristine forests but also functionally viable forest remnants. PMID:22114695

  12. Development of a risk prediction model among professional hockey players with visible signs of concussion.

    PubMed

    Bruce, Jared M; Echemendia, Ruben J; Meeuwisse, Willem; Hutchison, Michael G; Aubry, Mark; Comper, Paul

    2017-04-04

    Little research examines how to best identify concussed athletes. The purpose of the present study was to develop a preliminary risk decision model that uses visible signs (VS) and mechanisms of injury (MOI) to predict the likelihood of subsequent concussion diagnosis. Coders viewed and documented VS and associated MOI for all NHL games over the course of the 2013-2014 and 2014-2015 regular seasons. After coding was completed, player concussions were identified from the NHL injury surveillance system and it was determined whether players exhibiting VS were subsequently diagnosed with concussions by club medical staff as a result of the coded event. Among athletes exhibiting VS, suspected loss of consciousness, motor incoordination or balance problems, being in a fight, having an initial hit from another player's shoulder and having a secondary hit on the ice were all associated with increased risk of subsequent concussion diagnosis. In contrast, having an initial hit with a stick was associated with decreased risk of subsequent concussion diagnosis. A risk prediction model using a combination of the above VS and MOI was superior to approaches that relied on individual VS and associated MOI (sensitivity=81%, specificity=72%, positive predictive value=26%). Combined use of VS and MOI significantly improves a clinician's ability to identify players who need to be evaluated for possible concussion. A preliminary concussion prediction log has been developed from these data. Pending prospective validation, the use of these methods may improve early concussion detection and evaluation. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  13. Clinical Diagnosis of Bordetella Pertussis Infection: A Systematic Review.

    PubMed

    Ebell, Mark H; Marchello, Christian; Callahan, Maria

    2017-01-01

    Bordetella pertussis (BP) is a common cause of prolonged cough. Our objective was to perform an updated systematic review of the clinical diagnosis of BP without restriction by patient age. We identified prospective cohort studies of patients with cough or suspected pertussis and assessed study quality using QUADAS-2. We performed bivariate meta-analysis to calculate summary estimates of accuracy and created summary receiver operating characteristic curves to explore heterogeneity by vaccination status and age. Of 381 studies initially identified, 22 met our inclusion criteria, of which 14 had a low risk of bias. The overall clinical impression was the most accurate predictor of BP (positive likelihood ratio [LR+], 3.3; negative likelihood ratio [LR-], 0.63). The presence of whooping cough (LR+, 2.1) and posttussive vomiting (LR+, 1.7) somewhat increased the likelihood of BP, whereas the absence of paroxysmal cough (LR-, 0.58) and the absence of sputum (LR-, 0.63) decreased it. Whooping cough and posttussive vomiting have lower sensitivity in adults. Clinical criteria defined by the Centers for Disease Control and Prevention were sensitive (0.90) but nonspecific. Typical signs and symptoms of BP may be more sensitive but less specific in vaccinated patients. The clinician's overall impression was the most accurate way to determine the likelihood of BP infection when a patient initially presented. Clinical decision rules that combine signs, symptoms, and point-of-care tests have not yet been developed or validated. © Copyright 2017 by the American Board of Family Medicine.
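The likelihood ratios reported above are applied with Bayes' theorem in odds form. A minimal sketch, assuming a hypothetical 20% pretest probability of pertussis combined with the review's LR+ = 3.3 and LR- = 0.63 for the overall clinical impression:

```python
def posttest_probability(pretest_prob, lr):
    # Bayes' theorem in odds form: posttest odds = pretest odds * LR.
    pretest_odds = pretest_prob / (1 - pretest_prob)
    posttest_odds = pretest_odds * lr
    return posttest_odds / (1 + posttest_odds)

# Hypothetical 20% pretest probability of pertussis.
p_after_positive = posttest_probability(0.20, 3.3)    # ~0.45
p_after_negative = posttest_probability(0.20, 0.63)   # ~0.14
```

So a positive overall impression would roughly double the probability of pertussis, while a negative one would lower it only modestly, which matches the review's characterization of LR- = 0.63 as a weak rule-out.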

  14. Mixed effect Poisson log-linear models for clinical and epidemiological sleep hypnogram data

    PubMed Central

    Swihart, Bruce J.; Caffo, Brian S.; Crainiceanu, Ciprian; Punjabi, Naresh M.

    2013-01-01

    Bayesian Poisson log-linear multilevel models scalable to epidemiological studies are proposed to investigate population variability in sleep state transition rates. Hierarchical random effects are used to account for pairings of subjects and repeated measures within those subjects, since comparing diseased with non-diseased subjects while minimizing bias is of importance. Essentially, non-parametric piecewise-constant hazards are estimated and smoothed, allowing for time-varying covariates and segment-of-the-night comparisons. The Bayesian Poisson regression is justified through a re-derivation of a classical algebraic likelihood equivalence between Poisson regression with a log(time) offset and survival regression assuming exponentially distributed survival times. This re-derivation allows synthesis of two methods currently used to analyze sleep transition phenomena: stratified multi-state proportional hazards models and log-linear models with GEE for transition counts. An example data set from the Sleep Heart Health Study is analyzed. Supplementary material includes the analyzed data set as well as the code for a reproducible analysis. PMID:22241689
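The algebraic equivalence mentioned in the abstract can be verified in a few lines: with all events observed, the Poisson log-likelihood with a log(time) offset differs from the exponential survival log-likelihood only by a constant, so the two share a maximizer. A minimal sketch (the event times below are made up):

```python
import math

def ll_exp(lam, times):
    # Exponential survival log-likelihood, all subjects experiencing events.
    return sum(math.log(lam) - lam * t for t in times)

def ll_pois(lam, times):
    # Poisson log-likelihood for event indicators y_i = 1 with offset
    # log(t_i), i.e. mean mu_i = lam * t_i (the log y_i! term vanishes).
    return sum(math.log(lam * t) - lam * t for t in times)

def exponential_mle(times):
    return len(times) / sum(times)   # closed form: n / sum(t)

times = [0.5, 1.2, 2.0, 0.8, 3.1]
lam_hat = exponential_mle(times)
# The two log-likelihoods differ by the constant sum(log t_i), so the
# gap between them is the same at every value of lam.
gap_a = ll_pois(0.7, times) - ll_exp(0.7, times)
gap_b = ll_pois(1.3, times) - ll_exp(1.3, times)
```

Because the gap is constant in the rate parameter, maximizing either objective yields the same estimate, which is the justification the abstract re-derives.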

  15. 14 CFR 121.709 - Airworthiness release or aircraft log entry.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... aircraft is in condition for safe operation; and (3) Be signed by an authorized certificated mechanic or... mechanic or repairman constitutes that certification. [Doc. No. 6258, 29 FR 19226, Dec. 31, 1964, as...

  16. 14 CFR 121.709 - Airworthiness release or aircraft log entry.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... aircraft is in condition for safe operation; and (3) Be signed by an authorized certificated mechanic or... mechanic or repairman constitutes that certification. [Doc. No. 6258, 29 FR 19226, Dec. 31, 1964, as...

  17. 14 CFR 121.709 - Airworthiness release or aircraft log entry.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... aircraft is in condition for safe operation; and (3) Be signed by an authorized certificated mechanic or... mechanic or repairman constitutes that certification. [Doc. No. 6258, 29 FR 19226, Dec. 31, 1964, as...

  18. 14 CFR 121.709 - Airworthiness release or aircraft log entry.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... aircraft is in condition for safe operation; and (3) Be signed by an authorized certificated mechanic or... mechanic or repairman constitutes that certification. [Doc. No. 6258, 29 FR 19226, Dec. 31, 1964, as...

  19. 14 CFR 121.709 - Airworthiness release or aircraft log entry.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... aircraft is in condition for safe operation; and (3) Be signed by an authorized certificated mechanic or... mechanic or repairman constitutes that certification. [Doc. No. 6258, 29 FR 19226, Dec. 31, 1964, as...

  20. 77 FR 58665 - Significant New Use Rules on Certain Chemical Substances

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-21

    ... show photographic identification, pass through a metal detector, and sign the EPA visitor log. All... clearly as possible, avoiding the use of profanity or personal threats. viii. Make sure to submit your...

  1. 76 FR 59699 - Receipt of Request for Waiver from Testing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-27

    ... are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will...

  2. 77 FR 10512 - Certain New Chemicals; Receipt and Status Information

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-22

    ... show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an...

  3. 77 FR 21769 - Certain New Chemicals; Receipt and Status Information

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-11

    ... show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an...

  4. Modified signed-digit trinary arithmetic by using optical symbolic substitution.

    PubMed

    Awwal, A A; Islam, M N; Karim, M A

    1992-04-10

    Carry-free addition and borrow-free subtraction of modified signed-digit trinary numbers with optical symbolic substitution are presented. The proposed two-step and three-step algorithms can be easily implemented by using phase-only holograms, optical content-addressable memories, a multichannel correlator, or a polarization-encoded optical shadow-casting system.
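The optical symbolic-substitution algorithms themselves are not reproduced in the record; as a hedged illustration of the transfer/weight decomposition that signed-digit arithmetic builds on, here is a sketch of radix-2 signed-digit (digits -1, 0, 1) addition, with out-of-range digits resolved iteratively rather than by the constant-time two- or three-step rules of the paper:

```python
def sd_value(digits):
    # digits[i] is the coefficient of 2**i; signed digits are -1, 0 or 1.
    return sum(d * (2 ** i) for i, d in enumerate(digits))

def sd_add(a, b):
    # Digit-wise sum, then repeatedly decompose any out-of-range digit s
    # as s = 2t + w, leaving the weight w in place and passing the
    # transfer t to the next position (value is preserved at every step).
    n = max(len(a), len(b)) + 2
    d = [(a[i] if i < len(a) else 0) + (b[i] if i < len(b) else 0)
         for i in range(n)]
    while any(abs(x) > 1 for x in d):
        for i in range(n - 1):
            if abs(d[i]) > 1:
                t = 1 if d[i] > 0 else -1
                d[i] -= 2 * t
                d[i + 1] += t
    return d
```

The optical schemes apply the same decomposition to every digit position in parallel with lookahead rules so that transfers never chain, which is what makes the addition carry-free in constant time.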

  5. Modified signed-digit trinary arithmetic by using optical symbolic substitution

    NASA Astrophysics Data System (ADS)

    Awwal, A. A. S.; Islam, M. N.; Karim, M. A.

    1992-04-01

    Carry-free addition and borrow-free subtraction of modified signed-digit trinary numbers with optical symbolic substitution are presented. The proposed two-step and three-step algorithms can be easily implemented by using phase-only holograms, optical content-addressable memories, a multichannel correlator, or a polarization-encoded optical shadow-casting system.

  6. Gaussian Mixture Models of Between-Source Variation for Likelihood Ratio Computation from Multivariate Data

    PubMed Central

    Franco-Pedroso, Javier; Ramos, Daniel; Gonzalez-Rodriguez, Joaquin

    2016-01-01

    In forensic science, trace evidence found at a crime scene and on a suspect has to be evaluated from the measurements performed on it, usually in the form of multivariate data (for example, several chemical compounds or physical characteristics). In order to assess the strength of that evidence, the likelihood ratio framework is being increasingly adopted. Several methods have been derived to obtain likelihood ratios directly from univariate or multivariate data by modelling both the variation appearing between observations (or features) coming from the same source (within-source variation) and that appearing between observations coming from different sources (between-source variation). In the widely used multivariate kernel likelihood ratio, the within-source distribution is assumed to be normally distributed and constant across sources, and the between-source variation is modelled through a kernel density function (KDF). In order to better fit the observed distribution of the between-source variation, this paper presents a different approach in which a Gaussian mixture model (GMM) is used instead of a KDF. As will be shown, this approach provides better-calibrated likelihood ratios as measured by the log-likelihood-ratio cost (Cllr) in experiments performed on freely available forensic datasets involving different types of trace evidence: inks, glass fragments and car paints. PMID:26901680
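The Cllr metric used above has a standard closed form (due to Brümmer and du Preez); a minimal sketch:

```python
import math

def cllr(same_source_lrs, diff_source_lrs):
    # Log-likelihood-ratio cost: penalises same-source comparisons that
    # receive low LRs and different-source comparisons that receive high LRs.
    c_ss = sum(math.log2(1 + 1 / lr) for lr in same_source_lrs)
    c_ds = sum(math.log2(1 + lr) for lr in diff_source_lrs)
    return 0.5 * (c_ss / len(same_source_lrs) + c_ds / len(diff_source_lrs))

# A non-informative system that always reports LR = 1 scores Cllr = 1;
# well-calibrated, discriminating LRs drive Cllr toward 0.
uninformative = cllr([1.0, 1.0], [1.0, 1.0])      # = 1.0
informative = cllr([10.0, 20.0], [0.1, 0.05])
```

Lower Cllr therefore captures both discrimination and calibration at once, which is why the paper uses it to compare the GMM and KDF between-source models.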

  7. Assessment of advanced warning signs for flagging operations.

    DOT National Transportation Integrated Search

    1999-05-01

    The Virginia Department of Transportation (VDOT) and several other state departments of transportation have expressed interest in modifying the advanced warning sign for work zone flagging operations. The advanced warning sign is intended to aler...

  8. Bundling Logging Residues with a Modified John Deere B-380 Slash Bundler

    Treesearch

    Dana Mitchell

    2011-01-01

    A basic problem with processing biomass in the woods is that the machinery must be matched to the final product. If a logging business owner invests in a machine to produce a specific type of biomass product for a limited market, the opportunity for that logging business owner to diversify products to take advantage of market opportunities may also be limited. When...

  9. Repose time and cumulative moment magnitude: A new tool for forecasting eruptions?

    USGS Publications Warehouse

    Thelen, W.A.; Malone, S.D.; West, M.E.

    2010-01-01

    During earthquake swarms on active volcanoes, one of the primary challenges facing scientists is determining the likelihood of an eruption. Here we present the relation between repose time and the cumulative moment magnitude (CMM) as a tool to aid in differentiating between an eruption and a period of unrest. In several case studies, the CMM is lower at shorter repose times than it is at longer repose times. The relationship between repose time and CMM may be linear in log-log space, particularly at Mount St. Helens. We suggest that the volume and competence of the plug within the conduit drive the strength of the precursory CMM.
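The record does not spell out its CMM computation; assuming the standard Hanks-Kanamori moment-magnitude relation, a cumulative moment magnitude for a swarm can be sketched as:

```python
import math

def moment_from_magnitude(mw):
    # Hanks-Kanamori relation: log10(M0) = 1.5 * Mw + 16.1 (M0 in dyne-cm).
    return 10 ** (1.5 * mw + 16.1)

def cumulative_moment_magnitude(mags):
    # Sum the seismic moments of all events in the swarm, then convert the
    # total moment back to a single equivalent magnitude.
    total = sum(moment_from_magnitude(m) for m in mags)
    return (2.0 / 3.0) * (math.log10(total) - 16.1)

cmm_single = cumulative_moment_magnitude([4.0])        # round-trips to 4.0
cmm_double = cumulative_moment_magnitude([4.0, 4.0])   # 4.0 + (2/3)*log10(2)
```

Because moments add linearly while magnitudes are logarithmic, doubling the number of equal-sized events raises the CMM by only about 0.2 magnitude units, which is why CMM grows slowly over a swarm and can be tracked against repose time.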

  10. Estimation Methods for Non-Homogeneous Regression - Minimum CRPS vs Maximum Likelihood

    NASA Astrophysics Data System (ADS)

    Gebetsberger, Manuel; Messner, Jakob W.; Mayr, Georg J.; Zeileis, Achim

    2017-04-01

    Non-homogeneous regression models are widely used to statistically post-process numerical weather prediction models. Such regression models correct for errors in mean and variance and are capable of forecasting a full probability distribution. In order to estimate the corresponding regression coefficients, CRPS minimization has been performed in many meteorological post-processing studies over the last decade. In contrast to maximum likelihood estimation, CRPS minimization is claimed to yield more calibrated forecasts. Theoretically, both scoring rules used as an optimization score should be able to locate a similar and unknown optimum. Discrepancies might result from a wrong distributional assumption about the observed quantity. To address this theoretical concept, this study compares maximum likelihood and minimum CRPS estimation for different distributional assumptions. First, a synthetic case study shows that, for an appropriate distributional assumption, both estimation methods yield similar regression coefficients. The log-likelihood estimator is slightly more efficient. A real-world case study for surface temperature forecasts at different sites in Europe confirms these results but shows that surface temperature does not always follow the classical assumption of a Gaussian distribution. KEYWORDS: ensemble post-processing, maximum likelihood estimation, CRPS minimization, probabilistic temperature forecasting, distributional regression models
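The CRPS being minimized has a well-known closed form for a Gaussian predictive distribution (Gneiting and Raftery); a minimal sketch of the scoring rule itself, separate from any particular regression model:

```python
import math

def norm_pdf(z):
    return math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)

def norm_cdf(z):
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def crps_gaussian(y, mu, sigma):
    # Closed-form CRPS of a Gaussian forecast N(mu, sigma^2) at outcome y:
    # sigma * [ z*(2*Phi(z) - 1) + 2*phi(z) - 1/sqrt(pi) ], z = (y-mu)/sigma.
    z = (y - mu) / sigma
    return sigma * (z * (2 * norm_cdf(z) - 1) + 2 * norm_pdf(z)
                    - 1 / math.sqrt(math.pi))

val_center = crps_gaussian(0.0, 0.0, 1.0)   # ~0.234 for a centered forecast
val_miss = crps_gaussian(3.0, 0.0, 1.0)     # larger when the outcome is far out
```

Minimum-CRPS estimation fits (mu, sigma) by minimizing the mean of this score over training cases, in place of maximizing the Gaussian log-likelihood; the study above compares the two objectives under different distributional assumptions.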

  11. Nonparametric evaluation of birth cohort trends in disease rates.

    PubMed

    Tarone, R E; Chu, K C

    2000-01-01

    Although interpretation of age-period-cohort analyses is complicated by the non-identifiability of maximum likelihood estimates, changes in the slope of the birth-cohort effect curve are identifiable and have potential aetiologic significance. A nonparametric test for a change in the slope of the birth-cohort trend has been developed. The test is a generalisation of the sign test and is based on permutational distributions. A method for identifying interactions between age and calendar-period effects is also presented. The nonparametric method is shown to be powerful in detecting changes in the slope of the birth-cohort trend, although its power can be reduced considerably by calendar-period patterns of risk. The method identifies a previously unidentified decrease in the birth-cohort risk of lung-cancer mortality from 1912 to 1919, which appears to reflect a reduction in the initiation of smoking by young men at the beginning of the Great Depression (1930s). The method also detects an interaction between age and calendar period in leukemia mortality rates, reflecting the better response of children to chemotherapy. The proposed nonparametric method provides a data analytic approach, which is a useful adjunct to log-linear Poisson analysis of age-period-cohort models, either in the initial model building stage, or in the final interpretation stage.

  12. S-Nitrosothiol-Modified Nitric Oxide-Releasing Chitosan Oligosaccharides as Antibacterial Agents

    PubMed Central

    Lu, Yuan; Shah, Anand; Hunter, Rebecca A.; Soto, Robert J.; Schoenfisch, Mark H.

    2017-01-01

    S-nitrosothiol-modified chitosan oligosaccharides were synthesized by reaction with 2-iminothiolane hydrochloride and 3-acetamido-4,4-dimethylthietan-2-one, followed by the thiol nitrosation. The resulting nitric oxide (NO)-releasing chitosan oligosaccharides stored ~0.3 μmol NO/mg chitosan. Both the chemical structure of the nitrosothiol (i.e., primary and tertiary) and the use of ascorbic acid as a trigger for NO donor decomposition were used to control the NO-release kinetics. With ascorbic acid, the S-nitrosothiol-modified chitosan oligosaccharides elicited a 4-log reduction in Pseudomonas aeruginosa (P. aeruginosa) viability. Confocal microscopy indicated that the primary S-nitrosothiol-modified chitosan oligosaccharides associated more with the bacteria relative to the tertiary S-nitrosothiol system. The primary S-nitrosothiol-modified chitosan oligosaccharides elicited minimal toxicity towards L929 mouse fibroblast cells at the concentration necessary for a 4-log reduction in bacterial viability, further demonstrating the potential of S-nitrosothiol-modified chitosan oligosaccharides as NO-release therapeutics. PMID:25449913

  13. 14 CFR 135.443 - Airworthiness release or aircraft maintenance log entry.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... mechanic or repairman, except that a certificated repairman may sign the release or entry only for the work... certificate holder may state in its manual that the signature of an authorized certificated mechanic or...

  14. 14 CFR 135.443 - Airworthiness release or aircraft maintenance log entry.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... mechanic or repairman, except that a certificated repairman may sign the release or entry only for the work... certificate holder may state in its manual that the signature of an authorized certificated mechanic or...

  15. 14 CFR 135.443 - Airworthiness release or aircraft maintenance log entry.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... mechanic or repairman, except that a certificated repairman may sign the release or entry only for the work... certificate holder may state in its manual that the signature of an authorized certificated mechanic or...

  16. 14 CFR 135.443 - Airworthiness release or aircraft maintenance log entry.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... mechanic or repairman, except that a certificated repairman may sign the release or entry only for the work... certificate holder may state in its manual that the signature of an authorized certificated mechanic or...

  17. 14 CFR 135.443 - Airworthiness release or aircraft maintenance log entry.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... mechanic or repairman, except that a certificated repairman may sign the release or entry only for the work... certificate holder may state in its manual that the signature of an authorized certificated mechanic or...

  18. 77 FR 12284 - Access to Confidential Business Information; Protection Strategies Incorporated

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-29

    ... (202) 566-0280. Docket visitors are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject...

  19. 77 FR 21096 - Agency Information Collection Activities; Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-09

    ... show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an...

  20. 76 FR 77224 - Access to Confidential Business Information by Primus Solutions, Inc.

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-12

    ... are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will...

  1. 77 FR 47640 - Agency Information Collection Activities; Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-09

    ... (202) 566-0280. Docket visitors are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject...

  2. 78 FR 70037 - Agency Information Collection Activities; Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-22

    ... show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an...

  3. 78 FR 44560 - Agency Information Collection Activities; Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-24

    ... are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will...

  4. 76 FR 70443 - Decision on Waiver Application From 3M

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-14

    ... (202) 566-0280. Docket visitors are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject...

  5. 78 FR 60867 - Agency Information Collection Activities; Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-02

    ... show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an...

  6. 76 FR 46794 - Agency Information Collection Activities; Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-03

    ... show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an...

  7. 75 FR 29429 - Revocation of Significant New Use Rule on a Certain Chemical Substance

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-26

    ... detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject... is readily biodegradable, mitigating concerns for chronic toxicity to aquatic organisms. Therefore...

  8. Final Report on Video Log Data Mining Project

    DOT National Transportation Integrated Search

    2012-06-01

This report describes the development of an automated computer vision system that identifies and inventories road signs from imagery acquired from the Kansas Department of Transportation's road profiling system, which captures images every 26.4 feet...

  9. Salvage logging, ecosystem processes, and biodiversity conservation.

    PubMed

    Lindenmayer, D B; Noss, R F

    2006-08-01

We summarize the documented and potential impacts of salvage logging--a form of logging that removes trees and other biological material from sites after natural disturbance. Such operations may reduce or eliminate biological legacies, modify rare postdisturbance habitats, influence populations, alter community composition, impair natural vegetation recovery, facilitate the colonization of invasive species, alter soil properties and nutrient levels, increase erosion, modify hydrological regimes and aquatic ecosystems, and alter patterns of landscape heterogeneity. These impacts can be assigned to three broad and interrelated effects: (1) altered stand structural complexity; (2) altered ecosystem processes and functions; and (3) altered populations of species and community composition. Some impacts may be different from or additional to the effects of traditional logging that is not preceded by a large natural disturbance because the conditions before, during, and after salvage logging may differ from those that characterize traditional timber harvesting. The potential impacts of salvage logging often have been overlooked, partly because the processes of ecosystem recovery after natural disturbance are still poorly understood and partly because potential cumulative effects of natural and human disturbance have not been well documented. Ecologically informed policies regarding salvage logging are needed prior to major natural disturbances so that when they occur, ad hoc and crisis-mode decision making can be avoided. These policies should lead to salvage-exemption zones and limits on the amounts of disturbance-derived biological legacies (e.g., burned trees, logs) that are removed where salvage logging takes place. Finally, we believe new terminology is needed. The word salvage implies that something is being saved or recovered, whereas from an ecological perspective this is rarely the case.

  10. An Empirical Comparison of DDF Detection Methods for Understanding the Causes of DIF in Multiple-Choice Items

    ERIC Educational Resources Information Center

    Suh, Youngsuk; Talley, Anna E.

    2015-01-01

    This study compared and illustrated four differential distractor functioning (DDF) detection methods for analyzing multiple-choice items. The log-linear approach, two item response theory-model-based approaches with likelihood ratio tests, and the odds ratio approach were compared to examine the congruence among the four DDF detection methods.…

  11. APPROXIMATION AND ESTIMATION OF s-CONCAVE DENSITIES VIA RÉNYI DIVERGENCES.

    PubMed

    Han, Qiyang; Wellner, Jon A

    2016-01-01

In this paper, we study the approximation and estimation of s-concave densities via Rényi divergence. We first show that the approximation of a probability measure Q by an s-concave density exists and is unique via the procedure of minimizing a divergence functional proposed by [Ann. Statist. 38 (2010) 2998-3027] if and only if Q admits full-dimensional support and a first moment. We also show continuity of the divergence functional in Q: if Qn → Q in the Wasserstein metric, then the projected densities converge in weighted L1 metrics and uniformly on closed subsets of the continuity set of the limit. Moreover, directional derivatives of the projected densities also enjoy local uniform convergence. This contains both on-the-model and off-the-model situations, and entails strong consistency of the divergence estimator of an s-concave density under mild conditions. One interesting and important feature for the Rényi divergence estimator of an s-concave density is that the estimator is intrinsically related with the estimation of log-concave densities via maximum likelihood methods. In fact, we show that for d = 1 at least, the Rényi divergence estimators for s-concave densities converge to the maximum likelihood estimator of a log-concave density as s ↗ 0. The Rényi divergence estimator shares similar characterizations as the MLE for log-concave distributions, which allows us to develop pointwise asymptotic distribution theory assuming that the underlying density is s-concave.

  12. APPROXIMATION AND ESTIMATION OF s-CONCAVE DENSITIES VIA RÉNYI DIVERGENCES

    PubMed Central

    Han, Qiyang; Wellner, Jon A.

    2017-01-01

    In this paper, we study the approximation and estimation of s-concave densities via Rényi divergence. We first show that the approximation of a probability measure Q by an s-concave density exists and is unique via the procedure of minimizing a divergence functional proposed by [Ann. Statist. 38 (2010) 2998–3027] if and only if Q admits full-dimensional support and a first moment. We also show continuity of the divergence functional in Q: if Qn → Q in the Wasserstein metric, then the projected densities converge in weighted L1 metrics and uniformly on closed subsets of the continuity set of the limit. Moreover, directional derivatives of the projected densities also enjoy local uniform convergence. This contains both on-the-model and off-the-model situations, and entails strong consistency of the divergence estimator of an s-concave density under mild conditions. One interesting and important feature for the Rényi divergence estimator of an s-concave density is that the estimator is intrinsically related with the estimation of log-concave densities via maximum likelihood methods. In fact, we show that for d = 1 at least, the Rényi divergence estimators for s-concave densities converge to the maximum likelihood estimator of a log-concave density as s ↗ 0. The Rényi divergence estimator shares similar characterizations as the MLE for log-concave distributions, which allows us to develop pointwise asymptotic distribution theory assuming that the underlying density is s-concave. PMID:28966410

  13. Power calculations for likelihood ratio tests for offspring genotype risks, maternal effects, and parent-of-origin (POO) effects in the presence of missing parental genotypes when unaffected siblings are available.

    PubMed

    Rampersaud, E; Morris, R W; Weinberg, C R; Speer, M C; Martin, E R

    2007-01-01

Genotype-based likelihood-ratio tests (LRT) of association that examine maternal and parent-of-origin effects have been previously developed in the framework of log-linear and conditional logistic regression models. In the situation where parental genotypes are missing, the expectation-maximization (EM) algorithm has been incorporated in the log-linear approach to allow incomplete triads to contribute to the LRT. We present an extension to this model, which we call the Combined_LRT, that incorporates additional information from the genotypes of unaffected siblings to improve assignment of incompletely typed families to mating type categories, thereby improving inference of missing parental data. Using simulations involving a realistic array of family structures, we demonstrate the validity of the Combined_LRT under the null hypothesis of no association and provide power comparisons under varying levels of missing data and using sibling genotype data. We demonstrate the improved power of the Combined_LRT compared with the family-based association test (FBAT), another widely used association test. Lastly, we apply the Combined_LRT to a candidate gene analysis in autism families, some of which have missing parental genotypes. We conclude that the proposed log-linear model will be an important tool for future candidate gene studies, for many complex diseases where unaffected siblings can often be ascertained and where epigenetic factors such as imprinting may play a role in disease etiology.
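At its core, any such genotype-based LRT compares maximized log-likelihoods under the null and alternative models. The sketch below is a generic likelihood-ratio test, not the authors' Combined_LRT; the function name is ours, and the chi-square critical values are hardcoded for the 5% level.

```python
def likelihood_ratio_test(loglik_null, loglik_alt, df=1):
    """Generic LRT: the statistic 2 * (ll_alt - ll_null) is compared with a
    chi-square critical value (3.841 for df=1 at the 5% level).
    Illustrative sketch only, not the Combined_LRT from the paper."""
    stat = 2.0 * (loglik_alt - loglik_null)
    critical = {1: 3.841, 2: 5.991, 3: 7.815}[df]
    return stat, stat > critical
```

For example, maximized log-likelihoods of -120.5 (null) and -117.0 (alternative) give a statistic of 7.0, which exceeds 3.841 and rejects the null at the 5% level with one degree of freedom.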

  14. A Maximum Likelihood Approach to Functional Mapping of Longitudinal Binary Traits

    PubMed Central

    Wang, Chenguang; Li, Hongying; Wang, Zhong; Wang, Yaqun; Wang, Ningtao; Wang, Zuoheng; Wu, Rongling

    2013-01-01

Despite their importance in biology and biomedicine, genetic mapping of binary traits that change over time has not been well explored. In this article, we develop a statistical model for mapping quantitative trait loci (QTLs) that govern longitudinal responses of binary traits. The model is constructed within the maximum likelihood framework by which the association between binary responses is modeled in terms of conditional log odds-ratios. With this parameterization, the maximum likelihood estimates (MLEs) of marginal mean parameters are robust to the misspecification of time dependence. We implement an iterative procedure to obtain the MLEs of QTL genotype-specific parameters that define longitudinal binary responses. The usefulness of the model was validated by analyzing a real example in rice. Simulation studies were performed to investigate the statistical properties of the model, showing that the model has power to identify and map specific QTLs responsible for the temporal pattern of binary traits. PMID:23183762
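The log odds-ratio parameterization used to model association between binary responses rests on the basic 2x2-table odds ratio. A minimal sketch (the function name and example counts are ours):

```python
import math

def log_odds_ratio(a, b, c, d):
    """Log odds-ratio of a 2x2 contingency table [[a, b], [c, d]]:
    log((a*d) / (b*c)). This is the association measure that, in
    conditional form, links pairs of binary responses over time."""
    return math.log((a * d) / (b * c))
```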

  15. Methods of generating synthetic acoustic logs from resistivity logs for gas-hydrate-bearing sediments

    USGS Publications Warehouse

    Lee, Myung W.

    1999-01-01

Methods of predicting acoustic logs from resistivity logs for hydrate-bearing sediments are presented. Modified time average equations derived from the weighted equation provide a means of relating the velocity of the sediment to its resistivity. These methods can be used to transform resistivity logs into acoustic logs with or without using the gas hydrate concentration in the pore space. All the parameters necessary for predicting an acoustic log from a resistivity log, except the unconsolidation constants, can be estimated from a cross plot of resistivity versus porosity values. Unconsolidation constants in the equations may be assumed without introducing significant errors in the prediction. These methods were applied to the acoustic and resistivity logs acquired at the Mallik 2L-38 gas hydrate research well drilled in the Mackenzie Delta, northern Canada. The results indicate that the proposed method is simple and accurate.

  16. Higher-Order Asymptotics and Its Application to Testing the Equality of the Examinee Ability Over Two Sets of Items.

    PubMed

    Sinharay, Sandip; Jensen, Jens Ledet

    2018-06-27

    In educational and psychological measurement, researchers and/or practitioners are often interested in examining whether the ability of an examinee is the same over two sets of items. Such problems can arise in measurement of change, detection of cheating on unproctored tests, erasure analysis, detection of item preknowledge, etc. Traditional frequentist approaches that are used in such problems include the Wald test, the likelihood ratio test, and the score test (e.g., Fischer, Appl Psychol Meas 27:3-26, 2003; Finkelman, Weiss, & Kim-Kang, Appl Psychol Meas 34:238-254, 2010; Glas & Dagohoy, Psychometrika 72:159-180, 2007; Guo & Drasgow, Int J Sel Assess 18:351-364, 2010; Klauer & Rettig, Br J Math Stat Psychol 43:193-206, 1990; Sinharay, J Educ Behav Stat 42:46-68, 2017). This paper shows that approaches based on higher-order asymptotics (e.g., Barndorff-Nielsen & Cox, Inference and asymptotics. Springer, London, 1994; Ghosh, Higher order asymptotics. Institute of Mathematical Statistics, Hayward, 1994) can also be used to test for the equality of the examinee ability over two sets of items. The modified signed likelihood ratio test (e.g., Barndorff-Nielsen, Biometrika 73:307-322, 1986) and the Lugannani-Rice approximation (Lugannani & Rice, Adv Appl Prob 12:475-490, 1980), both of which are based on higher-order asymptotics, are shown to provide some improvement over the traditional frequentist approaches in three simulations. Two real data examples are also provided.
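The modified signed likelihood ratio test referenced above can be illustrated in a simple one-parameter setting. The sketch below is ours, not the authors' implementation: it computes the signed likelihood root r and Barndorff-Nielsen's r* = r - (1/r) log(r/u) for testing the rate of an i.i.d. exponential sample, with u the Wald-type statistic based on the observed information.

```python
import math

def signed_lrt_exponential(x, lam0):
    """Signed likelihood root r and modified version r* for H0: rate = lam0,
    assuming i.i.d. exponential data. Illustrative sketch only; note that
    r* = r - log(r/u)/r is numerically unstable as r approaches 0."""
    n, s = len(x), sum(x)
    lam_hat = n / s                                  # MLE of the rate
    loglik = lambda lam: n * math.log(lam) - lam * s
    r = math.copysign(math.sqrt(2 * (loglik(lam_hat) - loglik(lam0))),
                      lam_hat - lam0)                # signed likelihood root
    u = (lam_hat - lam0) * math.sqrt(n) / lam_hat    # Wald-type statistic
    r_star = r - math.log(r / u) / r                 # Barndorff-Nielsen's r*
    return r, r_star
```

Both r and r* are compared with standard normal quantiles; r* typically tracks the N(0,1) reference more closely in small samples.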

  17. 77 FR 68769 - Access to Confidential Business Information by Eastern Research Group

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-16

    ... (202) 566-0280. Docket visitors are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject...

  18. 76 FR 69722 - Access to Confidential Business Information by Protection Strategies Incorporated

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-09

    ... visitors are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will...

  19. 75 FR 78238 - Access to Confidential Business Information by Science Applications International Corporation and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-15

    ... show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an...

  20. 77 FR 71415 - Agency Information Collection Activities; Proposed Collection; Comment Request; Notification of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-30

    ... are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will...

  1. 76 FR 23586 - Access to Confidential Business Information by Syracuse Research Corporation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-27

    ... (202) 566-0280. Docket visitors are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject...

  2. 78 FR 64435 - Extension of Comment Period for the NPDES Electronic Reporting Rule

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-29

    ...-1752. Docket visitors are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and are subject...

  3. 77 FR 69824 - Certain New Chemicals; Receipt and Status Information

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-21

    ... identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an EPA/DC badge that must be...

  4. 78 FR 66696 - Access to Confidential Business Information by Arcadis U.S., Inc.

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-06

    ... (202) 566-0280. Docket visitors are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject...

  5. 75 FR 44249 - Proposed Acute Exposure Guideline Levels for Hazardous Substances; Notice of Availability

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-28

    ... show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an...

  6. 78 FR 48431 - Agency Information Collection Activities; Proposed Collection of Several Currently Approved...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-08

    ... (202) 566-0280. Docket visitors are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject...

  7. 78 FR 71603 - Agency Information Collection Activities; Proposed Renewal of Several Currently Approved...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-29

    ... (202) 566-0280. Docket visitors are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject...

  8. Competency-based residency training and the web log: modeling practice-based learning and enhancing medical knowledge.

    PubMed

    Hollon, Matthew F

    2015-01-01

    By using web-based tools in medical education, there are opportunities to innovatively teach important principles from the general competencies of graduate medical education. Postulating that faculty transparency in learning from uncertainties in clinical work could help residents to incorporate the principles of practice-based learning and improvement (PBLI) in their professional development, faculty in this community-based residency program modeled the steps of PBLI on a weekly basis through the use of a web log. The program confidentially surveyed residents before and after this project about actions consistent with PBLI and knowledge acquired through reading the web log. The frequency that residents encountered clinical situations where they felt uncertain declined over the course of the 24 weeks of the project from a mean frequency of uncertainty of 36% to 28% (Wilcoxon signed rank test, p=0.008); however, the frequency with which residents sought answers when faced with uncertainty did not change (Wilcoxon signed rank test, p=0.39), remaining high at approximately 80%. Residents answered a mean of 52% of knowledge questions correct when tested prior to faculty posts to the blog, rising to a mean of 65% of questions correct when tested at the end of the project (paired t-test, p=0.001). Faculty role modeling of PBLI behaviors and posting clinical questions and answers to a web log led to modest improvements in medical knowledge but did not alter behavior that was already taking place frequently among residents.

  9. Trueness and precision of the real-time RT-PCR method for quantifying the chronic bee paralysis virus genome in bee homogenates evaluated by a comparative inter-laboratory study.

    PubMed

    Schurr, Frank; Cougoule, Nicolas; Rivière, Marie-Pierre; Ribière-Chabert, Magali; Achour, Hamid; Ádám, Dán; Castillo, Carlos; de Graaf, Dirk C; Forsgren, Eva; Granato, Anna; Heinikainen, Sirpa; Jurovčíková, Júlia; Kryger, Per; Manson, Christine; Ménard, Marie-Françoise; Perennes, Stéphane; Schäfer, Marc O; Ibañez, Elena San Miguel; Silva, João; Gajger, Ivana Tlak; Tomkies, Victoria; Toplak, Ivan; Viry, Alain; Zdańska, Dagmara; Dubois, Eric

    2017-10-01

The Chronic bee paralysis virus (CBPV) is the aetiological agent of chronic bee paralysis, a contagious disease associated with nervous disorders in adult honeybees leading to massive mortalities in front of the hives. Some of the clinical signs frequently reported, such as trembling, may be confused with intoxication syndromes. Therefore, laboratory diagnosis using real-time PCR to quantify CBPV loads is used to confirm disease. Clinical signs of chronic paralysis are usually associated with viral loads higher than 10^8 copies of the CBPV genome per bee (8 log10 CBPV/bee). This threshold is used by the European Union Reference Laboratory for Bee Health to diagnose the disease. In 2015, the accuracy of measurements of three CBPV loads (5, 8 and 9 log10 CBPV/bee) was assessed through an inter-laboratory study. Twenty-one participants, including 16 European National Reference Laboratories, received 13 homogenates of CBPV-infected bees adjusted to the three loads. Participants were requested to use the method usually employed for routine diagnosis. The quantitative results (n=270) were analysed according to international standards NF ISO 13528 (2015) and NF ISO 5725-2 (1994). The standard deviations of measurement reproducibility (S_R) were 0.83, 1.06 and 1.16 at viral loads of 5, 8 and 9 log10 CBPV/bee, respectively. The inter-laboratory confidence of viral quantification (+/- 1.96 S_R) at the diagnostic threshold (8 log10 CBPV/bee) was +/- 2.08 log10 CBPV/bee. These results highlight the need to take into account the confidence of measurements in epidemiological studies using results from different laboratories. Considering this confidence, viral loads over 6 log10 CBPV/bee may be considered to indicate probable cases of chronic paralysis. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
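The threshold arithmetic in the abstract is simple to encode. A minimal sketch (function name and labels are ours; the 8 log10 diagnostic threshold and the 6 log10 probable-case level come from the abstract):

```python
import math

def classify_cbpv_load(copies_per_bee):
    """Convert a CBPV load (genome copies per bee) to log10 scale and
    classify it: >= 8 log10/bee is the EU reference lab diagnostic
    threshold; > 6 log10/bee indicates a probable case, reflecting the
    +/- 2.08 log10 inter-laboratory confidence reported in the study."""
    load = math.log10(copies_per_bee)
    if load >= 8.0:
        return load, "above diagnostic threshold"
    if load > 6.0:
        return load, "probable case"
    return load, "below probable-case level"
```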

  10. Modeling of 2D diffusion processes based on microscopy data: parameter estimation and practical identifiability analysis.

    PubMed

    Hock, Sabrina; Hasenauer, Jan; Theis, Fabian J

    2013-01-01

Diffusion is a key component of many biological processes such as chemotaxis, developmental differentiation and tissue morphogenesis. Recently, the spatial gradients caused by diffusion have become assessable in vitro and in vivo using microscopy-based imaging techniques. The resulting time series of two-dimensional, high-resolution images, in combination with mechanistic models, enable the quantitative analysis of the underlying mechanisms. However, such a model-based analysis is still challenging due to measurement noise and sparse observations, which result in uncertainties of the model parameters. We introduce a likelihood function for image-based measurements with log-normally distributed noise. Based upon this likelihood function we formulate the maximum likelihood estimation problem, which is solved using PDE-constrained optimization methods. To assess the uncertainty and practical identifiability of the parameters we introduce profile likelihoods for diffusion processes. As proof of concept, we model certain aspects of the guidance of dendritic cells towards lymphatic vessels, an example of haptotaxis. Using a realistic set of artificial measurement data, we estimate the five kinetic parameters of this model and compute profile likelihoods. Our novel approach for the estimation of model parameters from image data, as well as the proposed identifiability analysis approach, is widely applicable to diffusion processes. The profile-likelihood-based method provides more rigorous uncertainty bounds than local approximation methods.
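A log-normal measurement-noise likelihood of the general kind described can be sketched as follows. This is a generic illustration with names of our choosing, not the paper's PDE-constrained implementation:

```python
import math

def lognormal_nll(observations, mu, sigma):
    """Negative log-likelihood of i.i.d. log-normally distributed,
    strictly positive measurements with log-scale mean mu and
    log-scale standard deviation sigma."""
    nll = 0.0
    for y in observations:
        z = (math.log(y) - mu) / sigma
        # LogNormal density: includes the log(y) Jacobian term that
        # distinguishes it from a plain Gaussian on log(y)
        nll += (math.log(y) + math.log(sigma)
                + 0.5 * math.log(2 * math.pi) + 0.5 * z * z)
    return nll
```

Minimizing this over the model parameters (with mu given by the diffusion model's prediction at each pixel) yields the maximum likelihood estimate; profiling it over one parameter at a time yields the profile likelihoods used for identifiability analysis.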

  11. 77 FR 34777 - Seventieth Report of the TSCA Interagency Testing Committee to the Administrator of the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-11

    ... show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an...

  12. 75 FR 8330 - Access to Confidential Business Information by Eastern Research Group

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-24

    ... identification, pass through a metal detector, and sign the EPA visitor log. All visitors' bags are processed through an X-ray machine and subject to search. Visitors will be provided an EPA/DC badge that must be...

  13. 77 FR 32633 - Approval of Test Marketing Exemptions for Certain New Chemicals

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-01

    ... identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an EPA/DC badge that must be...

  14. 77 FR 68769 - Access to Confidential Business Information by Abt Associates, Inc.

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-16

    ... identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an EPA/DC badge that must be...

  15. 75 FR 14153 - National Advisory Committee for Acute Exposure Guideline Levels for Hazardous Substances; Notice...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-24

    ... show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an...

  16. 78 FR 67139 - Access to Confidential Business Information by Eastern Research Group and Its Identified...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-08

    ... (202) 566-0280. Docket visitors are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject...

  17. 75 FR 24688 - Access to Confidential Business Information by Guident Technologies Inc.’s Identified Subcontractor

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-05

    ... show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an...

  18. 77 FR 26750 - Agency Information Collection Activities; Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-07

    ... identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an EPA/DC badge that must be...

  19. 77 FR 21766 - Access to Confidential Business Information by CGI Federal Inc.

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-11

    ... identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an EPA/DC badge that must be...

  20. 76 FR 38170 - Toxic Substances Control Act Chemical Testing; Receipt of Test Data

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-29

    ... photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an EPA/DC badge that...

  1. 77 FR 71417 - Agency Information Collection Activities; Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-30

    ... identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an EPA/DC badge that must be...

  2. 76 FR 77816 - Access to Confidential Business Information by Guident Technologies, Inc. and Subcontractor...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-14

    ... (202) 566-0280. Docket visitors are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject...

  3. 75 FR 56096 - Access to Confidential Business Information by Industrial Economics Incorporated

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-15

    ... photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an EPA/DC badge that...

  4. 78 FR 20101 - Access to Confidential Business Information by Chemical Abstract Services

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-03

    ... identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an EPA/DC badge that must be...

  5. 77 FR 13506 - Modification of Significant New Uses of Tris Carbamoyl Triazine; Technical Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-07

    ... (202) 566-0280. Docket visitors are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject...

  6. Real-Time Population Health Detector

    DTIC Science & Technology

    2004-11-01

military and civilian populations. General Dynamics (then Veridian Systems Division), in cooperation with Stanford University, won a competitive DARPA...via the sequence of one-step-ahead forecast errors from the Kalman recursions: e_t = y_t - H_t mu_{t|t-1}. The log-likelihood then follows by treating the... parking in the transient parking structure. Norfolk Area Military Treatment Facility Patient Files GDAIS received historic CHCS data from all
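The one-step-ahead forecast errors from the Kalman recursions drive the prediction-error decomposition of the Gaussian log-likelihood. A minimal sketch for a scalar local-level model, with parameter values that are purely illustrative and not from the report:

```python
import math

def kalman_loglik(y, H=1.0, Q=0.1, R=1.0, mu0=0.0, P0=1.0):
    """Gaussian log-likelihood via the prediction-error decomposition:
    e_t = y_t - H * mu_{t|t-1}, with S_t its variance, accumulated over
    the Kalman recursions of a scalar random-walk-plus-noise model."""
    mu, P, ll = mu0, P0, 0.0
    for yt in y:
        e = yt - H * mu                # one-step-ahead forecast error
        S = H * P * H + R              # forecast error variance
        ll += -0.5 * (math.log(2 * math.pi * S) + e * e / S)
        K = P * H / S                  # Kalman gain
        mu = mu + K * e                # filtered state mean
        P = (1 - K * H) * P + Q        # variance predicted for next step
    return ll
```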

  7. Model-based estimation with boundary side information or boundary regularization [cardiac emission CT].

    PubMed

    Chiao, P C; Rogers, W L; Fessler, J A; Clinthorne, N H; Hero, A O

    1994-01-01

    The authors have previously developed a model-based strategy for joint estimation of myocardial perfusion and boundaries using ECT (emission computed tomography). They have also reported difficulties with boundary estimation in low contrast and low count rate situations. Here they propose using boundary side information (obtainable from high resolution MRI and CT images) or boundary regularization to improve both perfusion and boundary estimation in these situations. To fuse boundary side information into the emission measurements, the authors formulate a joint log-likelihood function to include auxiliary boundary measurements as well as ECT projection measurements. In addition, they introduce registration parameters to align auxiliary boundary measurements with ECT measurements and jointly estimate these parameters with other parameters of interest from the composite measurements. In simulated PET O-15 water myocardial perfusion studies using a simplified model, the authors show that the joint estimation improves perfusion estimation performance and gives boundary alignment accuracy of <0.5 mm even at 0.2 million counts. They implement boundary regularization through formulating a penalized log-likelihood function. They also demonstrate in simulations that simultaneous regularization of the epicardial boundary and myocardial thickness gives comparable perfusion estimation accuracy with the use of boundary side information.
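A penalized log-likelihood of the general form described, i.e. a data-fit term plus a regularization term on the boundary parameters, can be sketched as below. The quadratic smoothness penalty and all names are hypothetical, not the authors' formulation:

```python
def penalized_nll(data_nll, boundary_params, beta=1.0):
    """Penalized negative log-likelihood: data misfit plus beta times a
    quadratic smoothness penalty on adjacent boundary parameters
    (hypothetical penalty form, for illustration only)."""
    penalty = sum((boundary_params[i + 1] - boundary_params[i]) ** 2
                  for i in range(len(boundary_params) - 1))
    return data_nll + beta * penalty
```

The weight beta trades off fidelity to the emission measurements against boundary smoothness, in the same spirit as regularizing the epicardial boundary and myocardial thickness.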

  8. THE DIAGNOSTIC ACCURACY OF THE LEVER SIGN FOR DETECTING ANTERIOR CRUCIATE LIGAMENT INJURY

    PubMed Central

    Anderson, Amanda; Watson, Seth; Dimeff, Robert J.

    2017-01-01

    Background An alternative physical examination procedure for evaluating the integrity of the anterior cruciate ligament (ACL) has been proposed in the literature but has not been validated in a broad population of patients with a symptomatic complaint of knee pain for its diagnostic value. Purpose To investigate the diagnostic accuracy of the Lever Sign to detect ACL tears and compare the results to Lachman testing in both supine and prone positions. Study design Prospective, blinded, diagnostic accuracy study. Methods Sixty-two consecutive patients with a complaint of knee pain were independently evaluated for the status of the ACL's integrity with the Lever Sign and the Lachman test in prone and supine positions by a blinded examiner before any other diagnostic assessments were completed. Results Twenty-four of the 60 patients included in the analysis had a torn ACL, resulting in a prevalence of 40%. The sensitivities of the Lever Sign, prone Lachman, and supine Lachman tests were 38%, 83%, and 67%, respectively, and the specificities were 72%, 89%, and 97%, resulting in positive likelihood ratios of 1.4, 7.5, and 24 and negative likelihood ratios of 0.86, 0.19, and 0.34, respectively. The positive predictive values were 47%, 83%, and 94% and the negative predictive values were 63%, 89%, and 81%, respectively. The diagnostic odds ratios were 1.6, 40, and 70, with a number needed to diagnose of 10.3, 1.4, and 1.6, respectively. Conclusions The results of this study suggest that the Lever Sign, in isolation, does not accurately detect the status of the ACL. During the clinical examination, the Lever Sign should be used as an adjunct to the gold standard assessment of anterior tibial translation as employed in the Lachman test in either the prone or supine position. Level of Evidence 2 PMID:29234557
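The accuracy measures reported above all derive from one 2×2 table of test result versus disease status. A minimal sketch (the counts in the test are illustrative, not the study's data, so the exact published figures are not reproduced here):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic accuracy measures from 2x2 counts:
    tp/fp/fn/tn = true positive, false positive, false negative, true negative."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "LR+": sens / (1 - spec),      # positive likelihood ratio
        "LR-": (1 - sens) / spec,      # negative likelihood ratio
        "PPV": tp / (tp + fp),         # positive predictive value
        "NPV": tn / (tn + fn),         # negative predictive value
        "DOR": (tp * tn) / (fp * fn),  # diagnostic odds ratio
    }
```

Note that the likelihood ratios and diagnostic odds ratio are prevalence-independent, whereas the predictive values are not.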

  9. Computing Maximum Likelihood Estimates of Loglinear Models from Marginal Sums with Special Attention to Loglinear Item Response Theory.

    ERIC Educational Resources Information Center

    Kelderman, Henk

    1992-01-01

    Describes algorithms used in the computer program LOGIMO for obtaining maximum likelihood estimates of the parameters in loglinear models. These algorithms are also useful for the analysis of loglinear item-response theory models. Presents modified versions of the iterative proportional fitting and Newton-Raphson algorithms. Simulated data…
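The iterative proportional fitting step mentioned above alternately rescales a table so its margins match target marginal sums. A minimal two-way sketch (generic IPF, not the LOGIMO implementation):

```python
def ipf(table, row_targets, col_targets, tol=1e-10, max_iter=1000):
    """Iterative proportional fitting: rescale a 2-D table of positive
    cells until its row and column sums match the target marginal sums."""
    t = [row[:] for row in table]
    for _ in range(max_iter):
        # Fit row margins
        for i, target in enumerate(row_targets):
            s = sum(t[i])
            t[i] = [x * target / s for x in t[i]]
        # Fit column margins
        for j, target in enumerate(col_targets):
            s = sum(row[j] for row in t)
            for row in t:
                row[j] *= target / s
        # Converged when the row margins survive the column step
        if all(abs(sum(t[i]) - row_targets[i]) < tol for i in range(len(t))):
            break
    return t
```

Starting from a uniform table, the fitted cells are the maximum likelihood estimates of the loglinear independence model given those margins.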

  10. The nurse response to abnormal vital sign recording in the emergency department.

    PubMed

    Johnson, Kimberly D; Mueller, Lindsey; Winkelman, Chris

    2017-01-01

    To examine what occurs after a recorded observation of at least one abnormal vital sign in the emergency department. The aims were to determine how often abnormal vital signs were recorded, what interventions were documented, and what factors were associated with documented follow-up for abnormal vital signs. Monitoring quality of care, and preventing or intervening before harm occurs to patients are central to nurses' roles. Abnormal vital signs have been associated with poor patient outcomes and require follow-up after the observation of abnormal readings to prevent patient harm related to a deteriorating status. This documentation is important to quality and safety of care. Observational, retrospective chart review. Modified Early Warning Score was calculated for all recorded vital signs for 195 charts. Comparisons were made between groups: (1) no abnormal vital signs, (2) abnormal vital sign present, but normal Modified Early Warning Score and (3) critically abnormal Modified Early Warning Score. About 62·1% of charts had an abnormal vital sign documented. Critically abnormal values were present in 14·9%. No documentation was present in 44·6% of abnormal cases. When interventions were documented, it was usually to notify the physician. The timing within the emergency department visit when the abnormalities were observed and the degree of abnormality had significant relationships to the presence of documentation. It is doubtful that nurses do not recognise abnormalities because more severely abnormal vital signs were more likely to have documented follow-up. Perhaps the interruptive nature of the emergency department or the prioritised actions of the nurse impacted documentation within this study. Further research is required to determine why follow-up is not being documented. To ensure safety and quality of patient care, accurate documentation of responses to abnormal vital signs is required. © 2016 John Wiley & Sons Ltd.

  11. Do warning signs on electronic gaming machines influence irrational cognitions?

    PubMed

    Monaghan, Sally; Blaszczynski, Alex; Nower, Lia

    2009-08-01

    Electronic gaming machines are popular among problem gamblers; in response, governments have introduced "responsible gaming" legislation incorporating the mandatory display of warning signs on or near electronic gaming machines. These signs are designed to correct irrational and erroneous beliefs through the provision of accurate information on probabilities of winning and the concept of randomness. There is minimal empirical data evaluating the effectiveness of such signs. In this study, 93 undergraduate students were randomly allocated to standard and informative messages displayed on an electronic gaming machine during play in a laboratory setting. Results revealed that a majority of participants incorrectly estimated gambling odds and reported irrational gambling-related cognitions prior to play. In addition, there were no significant between-group differences, and few participants recalled the content of messages or modified their gambling-related cognitions. Signs placed on electronic gaming machines may not modify irrational beliefs or alter gambling behaviour.

  12. 76 FR 71018 - Access to Confidential Business Information by the U.S. Consumer Product Safety Commission

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-16

    ... visitors are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will...

  13. 75 FR 57768 - Access to Confidential Business Information by Eastern Research Group and Its Identified...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-22

    ... photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an EPA/DC badge that...

  14. 75 FR 51734 - Testing of Certain High Production Volume Chemical Substances; Third Group of Chemical Substances...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-23

    ... visitors are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will...

  15. 77 FR 10451 - Fishing Tackle Containing Lead; Disposition of Petition Filed Pursuant to TSCA Section 21

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-22

    ... visitors are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will...

  16. 75 FR 80665 - Sixty-Seventh Report of the TSCA Interagency Testing Committee to the Administrator of the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-22

    ... are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will...

  17. 76 FR 9012 - Access to Confidential Business Information by Electronic Consulting Services, Inc.

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-16

    ... identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an EPA/DC badge that must be...

  18. 75 FR 42441 - Sixty-Sixth Report of the TSCA Interagency Testing Committee to the Administrator of the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-21

    ... show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an...

  19. 76 FR 57734 - Certain New Chemicals; Receipt and Status Information

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-16

    ... are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will... metals and nutrients in water.

  20. Basal jawed vertebrate phylogeny inferred from multiple nuclear DNA-coded genes

    PubMed Central

    Kikugawa, Kanae; Katoh, Kazutaka; Kuraku, Shigehiro; Sakurai, Hiroshi; Ishida, Osamu; Iwabe, Naoyuki; Miyata, Takashi

    2004-01-01

    Background Phylogenetic analyses of jawed vertebrates based on mitochondrial sequences often result in confusing inferences which are obviously inconsistent with generally accepted trees. In particular, in a hypothesis by Rasmussen and Arnason based on mitochondrial trees, cartilaginous fishes have a terminal position in a paraphyletic cluster of bony fishes. No previous analysis based on nuclear DNA-coded genes could significantly reject the mitochondrial trees of jawed vertebrates. Results We have cloned and sequenced seven nuclear DNA-coded genes from 13 vertebrate species. These sequences, together with sequences available from databases including 13 jawed vertebrates from eight major groups (cartilaginous fishes, bichir, chondrosteans, gar, bowfin, teleost fishes, lungfishes and tetrapods) and an outgroup (a cyclostome and a lancelet), have been subjected to phylogenetic analyses based on the maximum likelihood method. Conclusion Cartilaginous fishes have been inferred to be basal to other jawed vertebrates, which is consistent with the generally accepted view. The minimum log-likelihood difference between the maximum likelihood tree and trees not supporting the basal position of cartilaginous fishes is 18.3 ± 13.1. The hypothesis by Rasmussen and Arnason has been significantly rejected with the minimum log-likelihood difference of 123 ± 23.3. Our tree has also shown that living holosteans, comprising bowfin and gar, form a monophyletic group which is the sister group to teleost fishes. This is consistent with a formerly prevalent view of vertebrate classification, although inconsistent with both of the current morphology-based and mitochondrial sequence-based trees. Furthermore, the bichir has been shown to be the basal ray-finned fish. Tetrapods and lungfish have formed a monophyletic cluster in the tree inferred from the concatenated alignment, being consistent with the currently prevalent view. It also remains possible that tetrapods are more closely related to ray-finned fishes than to lungfishes. PMID:15070407

  1. Statistical methods of fracture characterization using acoustic borehole televiewer log interpretation

    NASA Astrophysics Data System (ADS)

    Massiot, Cécile; Townend, John; Nicol, Andrew; McNamara, David D.

    2017-08-01

    Acoustic borehole televiewer (BHTV) logs provide measurements of fracture attributes (orientations, thickness, and spacing) at depth. Orientation, censoring, and truncation sampling biases similar to those described for one-dimensional outcrop scanlines, and other logging or drilling artifacts specific to BHTV logs, can affect the interpretation of fracture attributes from BHTV logs. K-means, fuzzy K-means, and agglomerative clustering methods provide transparent means of separating fracture groups on the basis of their orientation. Fracture spacing is calculated for each of these fracture sets. Maximum likelihood estimation using truncated distributions permits the fitting of several probability distributions to the fracture attribute data sets within truncation limits, which can then be extrapolated over the entire range where they naturally occur. Akaike Information Criterion (AIC) and Schwarz Bayesian Criterion (SBC) statistical information criteria rank the distributions by how well they fit the data. We demonstrate these attribute analysis methods with a data set derived from three BHTV logs acquired from the high-temperature Rotokawa geothermal field, New Zealand. Varying BHTV log quality reduces the number of input data points, but careful selection of the quality levels where fractures are deemed fully sampled increases the reliability of the analysis. Spacing data analysis comprising up to 300 data points and spanning three orders of magnitude can be approximated similarly well (similar AIC rankings) with several distributions. Several clustering configurations and probability distributions can often characterize the data at similar levels of statistical criteria. Thus, several scenarios should be considered when using BHTV log data to constrain numerical fracture models.
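The information criteria named above are simple functions of the maximized log-likelihood, and fitting within truncation limits just means renormalizing the density by the probability mass inside those limits. A minimal sketch using a left-truncated exponential as a stand-in distribution (not the distributions fitted in the study):

```python
import math

def aic(loglik, k):
    """Akaike Information Criterion for k free parameters; lower is better."""
    return 2 * k - 2 * loglik

def sbc(loglik, k, n):
    """Schwarz Bayesian Criterion (BIC) for n observations; lower is better."""
    return k * math.log(n) - 2 * loglik

def trunc_exp_loglik(xs, lam, a):
    """Log-likelihood of an exponential left-truncated at a: the density
    lam*exp(-lam*x) is divided by its mass exp(-lam*a) above the limit."""
    return sum(math.log(lam) - lam * (x - a) for x in xs)
```

Candidate distributions are each fitted by maximum likelihood and then ranked by their AIC (or SBC) values.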

  2. Log-normal frailty models fitted as Poisson generalized linear mixed models.

    PubMed

    Hirsch, Katharina; Wienke, Andreas; Kuss, Oliver

    2016-12-01

    The equivalence of a survival model with a piecewise constant baseline hazard function and a Poisson regression model has been known for decades. As shown in recent studies, this equivalence carries over to clustered survival data: A frailty model with a log-normal frailty term can be interpreted and estimated as a generalized linear mixed model with a binary response, a Poisson likelihood, and a specific offset. Proceeding this way, statistical theory and software for generalized linear mixed models are readily available for fitting frailty models. This gain in flexibility comes at the small price of (1) having to fix the number of pieces for the baseline hazard in advance and (2) having to "explode" the data set by the number of pieces. In this paper we extend the simulations of former studies by using a more realistic baseline hazard (Gompertz) and by comparing the model under consideration with competing models. Furthermore, the SAS macro %PCFrailty is introduced to apply the Poisson generalized linear mixed approach to frailty models. The simulations show good results for the shared frailty model. Our new %PCFrailty macro provides proper estimates, especially in the case of 4 events per piece. The suggested Poisson generalized linear mixed approach for log-normal frailty models based on the %PCFrailty macro provides several advantages in the analysis of clustered survival data with respect to more flexible modelling of fixed and random effects, exact (in the sense of non-approximate) maximum likelihood estimation, and standard errors and different types of confidence intervals for all variance parameters. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
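The "explode" step mentioned above splits each survival time into one record per baseline-hazard piece; the Poisson model then takes the event indicator as response and log(exposure time) as offset. A minimal sketch of that data expansion (generic piecewise-exponential setup, not the %PCFrailty macro):

```python
def explode(time, event, cuts):
    """Split one survival record into per-piece exposure records.

    cuts are the right endpoints of the baseline-hazard pieces.
    Returns (piece_index, exposure, event_indicator) triples; a Poisson
    GLMM on the indicator with offset log(exposure) and piece-specific
    intercepts recovers the piecewise-exponential frailty model."""
    rows, start = [], 0.0
    for j, end in enumerate(cuts):
        if time <= start:
            break  # subject already left observation before this piece
        exposure = min(time, end) - start
        d = int(bool(event) and time <= end)  # event falls in this piece
        rows.append((j, exposure, d))
        start = end
    return rows
```

This is why the data set grows by (up to) the number of pieces: every subject contributes one row per piece survived.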

  3. Association between the findings on magnetic resonance imaging screening for syringomyelia in asymptomatic Cavalier King Charles spaniels and observation of clinical signs consistent with syringomyelia in later life.

    PubMed

    Ives, E J; Doyle, L; Holmes, M; Williams, T L; Vanhaesebrouck, A E

    2015-01-01

    A questionnaire-based study was used to investigate the association between the findings on magnetic resonance imaging (MRI) screening for syringomyelia (SM) in 79 asymptomatic Cavalier King Charles spaniels (CKCS) and the subsequent development of clinical signs consistent with SM in later life. Owners reported clinical signs consistent with SM in 13/79 (16%) of dogs at the time of the questionnaire. A significantly greater proportion of CKCS with a syrinx visible on MRI screening showed clinical signs in later life (9/25, 36%) than dogs without a visible syrinx (4/54, 7%; odds ratio 6.9). Whether the findings of MRI screening can be used to indicate the likelihood of an asymptomatic CKCS developing clinical signs consistent with SM in later life warrants further prospective study in a larger cohort of dogs. Copyright © 2014 Elsevier Ltd. All rights reserved.
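The odds ratio above comes from a 2×2 table of syrinx status versus later clinical signs. Computing it directly from the counts in the abstract gives roughly 7.0, consistent with the reported 6.9 up to rounding or adjustment in the published analysis:

```python
def odds_ratio(a, b, c, d):
    """Odds ratio of a 2x2 table:
                 signs+   signs-
    syrinx+        a        b
    syrinx-        c        d
    """
    return (a * d) / (b * c)

# Counts from the abstract: 9 of 25 dogs with a visible syrinx showed
# clinical signs, versus 4 of 54 dogs without a visible syrinx.
or_syrinx = odds_ratio(9, 25 - 9, 4, 54 - 4)
```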

  4. Selective Logging, Fire, and Biomass in Amazonia

    NASA Technical Reports Server (NTRS)

    Houghton, R. A.

    1999-01-01

    Biomass and rates of disturbance are major factors in determining the net flux of carbon between terrestrial ecosystems and the atmosphere, and neither of them is well known for most of the earth's surface. Satellite data over large areas are beginning to be used systematically to measure rates of two of the most important types of disturbance, deforestation and reforestation, but these are not the only types of disturbance that affect carbon storage. Other examples include selective logging and fire. In northern mid-latitude forests, logging and subsequent regrowth of forests have, in recent decades, contributed more to the net flux of carbon between terrestrial ecosystems and the atmosphere than any other type of land use. In the tropics logging is also becoming increasingly important. According to the FAO/UNEP assessment of tropical forests, about 25% of total area of productive forests have been logged one or more times in the 60-80 years before 1980. The fraction must be considerably greater at present. Thus, deforestation by itself accounts for only a portion of the emissions carbon from land. Furthermore, as rates of deforestation become more accurately measured with satellites, uncertainty in biomass will become the major factor accounting for the remaining uncertainty in estimates of carbon flux. An approach is needed for determining the biomass of terrestrial ecosystems. 3 Selective logging is increasingly important in Amazonia, yet it has not been included in region-wide, satellite-based assessments of land-cover change, in part because it is not as striking as deforestation. Nevertheless, logging affects terrestrial carbon storage both directly and indirectly. Besides the losses of carbon directly associated with selective logging, logging also increases the likelihood of fire.

  5. Food insecurity and malnutrition in Chinese elementary school students.

    PubMed

    Shen, Xiuhua; Gao, Xiang; Tang, Wenjing; Mao, Xuanxia; Huang, Jingyan; Cai, Wei

    2015-09-28

    It has been shown that food insecurity is associated with poor diet quality and unfavourable health outcomes. However, little is known about the potential effects of food insecurity on the overall malnutrition status among children. In this study, we investigated the prevalence of food insecurity among 1583 elementary school students, aged 6-14 years, living in Chinese rural areas and examined its association with four malnutrition signs, including rickets sequelae, anaemia, stunting and wasting. Information on food security was collected via questionnaires. Rickets sequelae were assessed by an experienced paediatrician during the interview. Anaemia was determined by the WHO Hb thresholds adjusted by the local altitude. Weight and height were measured during the interview. Stunting and wasting were then evaluated according to WHO child growth standards (2007). We examined the association between food insecurity and the number of malnutrition signs (total number = 4), and the likelihood of having severe malnutrition (presence of 3+ signs), after adjusting for potential confounders, such as age, socioeconomic status and dietary intakes. During the previous 12 months, the overall prevalence of food insecurity was 6.1% in the entire studied population and 16.3% in participants with severe malnutrition. Participants with food insecurity had a slightly higher number of malnutrition signs (1.14 v. 0.96; P=0.043) relative to those who were food secure, after adjusting for potential confounders. Food insecurity was also associated with increased likelihood of having severe malnutrition (adjusted OR 3.08; 95% CI 1.47, 6.46; P=0.003). In conclusion, food insecurity is significantly associated with malnutrition among Chinese children in this community.

  6. Child Modifiability as a Predictor of Language Abilities in Deaf Children Who Use American Sign Language.

    PubMed

    Mann, Wolfgang; Peña, Elizabeth D; Morgan, Gary

    2015-08-01

    This research explored the use of dynamic assessment (DA) for language-learning abilities in signing deaf children from deaf and hearing families. Thirty-seven deaf children, aged 6 to 11 years, were identified as either stronger (n = 26) or weaker (n = 11) language learners according to teacher or speech-language pathologist report. All children received 2 scripted, mediated learning experience sessions targeting vocabulary knowledge—specifically, the use of semantic categories that were carried out in American Sign Language. Participant responses to learning were measured in terms of an index of child modifiability. This index was determined separately at the end of the 2 individual sessions. It combined ratings reflecting each child's learning abilities and responses to mediation, including social-emotional behavior, cognitive arousal, and cognitive elaboration. Group results showed that modifiability ratings were significantly better for stronger language learners than for weaker language learners. The strongest predictors of language ability were cognitive arousal and cognitive elaboration. Mediator ratings of child modifiability (i.e., combined score of social-emotional factors and cognitive factors) are highly sensitive to language-learning abilities in deaf children who use sign language as their primary mode of communication. This method can be used to design targeted interventions.

  7. 77 FR 24697 - Access to Confidential Business Information by CGI Federal Inc. and Its Identified Subcontractor...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-25

    ... identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an EPA/DC badge that must be...

  8. 75 FR 70672 - Access to Confidential Business Information by Computer Sciences Corporation and Its Identified...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-18

    ... identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an EPA/DC badge that must be...

  9. 76 FR 37111 - Access to Confidential Business Information by Computer Sciences Corporation and Its Identified...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-24

    ... identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an EPA/DC badge that must be...

  10. 76 FR 10360 - Access to Confidential Business Information by Guident Technologies Inc. and Its Identified...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-24

    ... identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an EPA/DC badge that must be...

  11. 77 FR 10506 - Access to Confidential Business Information by Syracuse Research Corporation, Inc., and Its...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-22

    ... identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an EPA/DC badge that must be...

  12. 76 FR 77817 - Access to Confidential Business Information by CGI Federal, Inc. and Subcontractor, Innovate, Inc.

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-14

    ... identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an EPA/DC badge that must be...

  13. 77 FR 69820 - Access to Confidential Business Information by Electronic Consulting Services, Inc., and Its...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-21

    ... identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an EPA/DC badge that must be...

  14. 78 FR 59679 - Antimony Trioxide TSCA Chemical Risk Assessment; Notice of Public Meetings and Opportunity To...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-27

    ... identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an EPA/DC badge that must be...

  15. 20 CFR 655.201 - Temporary labor certification applications.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Temporary labor certification applications... applications. (a)(1) An employer who anticipates a labor shortage of workers for agricultural or logging... an agent file, in duplicate, a temporary labor certification application, signed by the employer...

  16. Competency-based residency training and the web log: modeling practice-based learning and enhancing medical knowledge†

    PubMed Central

    Hollon, Matthew F.

    2015-01-01

    Background By using web-based tools in medical education, there are opportunities to innovatively teach important principles from the general competencies of graduate medical education. Objectives Postulating that faculty transparency in learning from uncertainties in clinical work could help residents to incorporate the principles of practice-based learning and improvement (PBLI) in their professional development, faculty in this community-based residency program modeled the steps of PBLI on a weekly basis through the use of a web log. Method The program confidentially surveyed residents before and after this project about actions consistent with PBLI and knowledge acquired through reading the web log. Results The frequency that residents encountered clinical situations where they felt uncertain declined over the course of the 24 weeks of the project from a mean frequency of uncertainty of 36% to 28% (Wilcoxon signed rank test, p=0.008); however, the frequency with which residents sought answers when faced with uncertainty did not change (Wilcoxon signed rank test, p=0.39), remaining high at approximately 80%. Residents answered a mean of 52% of knowledge questions correct when tested prior to faculty posts to the blog, rising to a mean of 65% of questions correct when tested at the end of the project (paired t-test, p=0.001). Conclusions Faculty role modeling of PBLI behaviors and posting clinical questions and answers to a web log led to modest improvements in medical knowledge but did not alter behavior that was already taking place frequently among residents. PMID:26653701

  17. The presence of Waddell signs depends on age and gender, not diagnosis.

    PubMed

    Yoo, J U; McIver, T C; Hiratzka, J; Carlson, H; Carlson, N; Radoslovich, S S; Gernhart, T; Boshears, E; Kane, M S

    2018-02-01

    The aim of this study was to determine if positive Waddell signs were related to patients' demographics or to perception of their quality of life. This prospective cross-sectional study included 479 adult patients with back pain from a university spine centre. Each completed SF-12 and Oswestry Disability Index (ODI) questionnaires and underwent standard spinal examinations to elicit Waddell signs. The relationship between Waddell signs and age, gender, ODI, Mental Component Score (MCS), and Physical Component Score (PCS) scores was determined. Of the 479 patients, 128 (27%) had at least one positive Waddell sign. There were significantly more women with two or more Waddell signs than men. The proportion of patients with at least one positive Waddell sign increased with age until 55 years, and then declined rapidly; none had a positive sign over the age of 75 years. Functional outcome scores were significantly worse in those with a single Waddell sign (p < 0.01). With one or more Waddell signs, patients' PCS and ODI scores indicated a perception of severe disability; with three or more Waddell signs, patients' MCS scores indicated severe disability. With five Waddell signs, ODI scores indicated that patients perceived themselves as crippled. Positive Waddell signs, a potential indicator of central sensitization, indicated a likelihood of having functional limitations and an impaired quality of life, particularly in young women. Cite this article: Bone Joint J 2018;100-B:219-25. ©2018 The British Editorial Society of Bone & Joint Surgery.

  18. Removal of polycyclic aromatic hydrocarbons from aqueous solution by raw and modified plant residue materials as biosorbents.

    PubMed

    Xi, Zemin; Chen, Baoliang

    2014-04-01

    Removal of polycyclic aromatic hydrocarbons (PAHs), e.g., naphthalene, acenaphthene, phenanthrene and pyrene, from aqueous solution by raw and modified plant residues was investigated to develop low cost biosorbents for organic pollutant abatement. Bamboo wood, pine wood, pine needles and pine bark were selected as plant residues, and acid hydrolysis was used as a simple modification method. The raw and modified biosorbents were characterized by elemental analysis, Fourier transform infrared spectroscopy and scanning electron microscopy. The sorption isotherms of PAHs to raw biosorbents were apparently linear, and were dominated by a partitioning process. In comparison, the isotherms of the hydrolyzed biosorbents displayed nonlinearity, which was controlled by partitioning and the specific interaction mechanism. The sorption kinetic curves of PAHs to the raw and modified plant residues fit well with the pseudo-second-order kinetics model. The sorption rates were faster for the raw biosorbents than the corresponding hydrolyzed biosorbents, which was attributed to the latter having more condensed domains (i.e., exposed aromatic core). By the consumption of the amorphous cellulose component under acid hydrolysis, the sorption capability of the hydrolyzed biosorbents was notably enhanced, i.e., 6-18 fold for phenanthrene, 6-8 fold for naphthalene and pyrene and 5-8 fold for acenaphthene. The sorption coefficients (Kd) were negatively correlated with the polarity index [(O+N)/C], and positively correlated with the aromaticity of the biosorbents. For a given biosorbent, a positive linear correlation between logKoc and logKow for different PAHs was observed. Interestingly, the linear plots of logKoc-logKow were parallel for different biosorbents. These observations suggest that the raw and modified plant residues have great potential as biosorbents to remove PAHs from wastewater. Copyright © 2014 The Research Centre for Eco-Environmental Sciences, Chinese Academy of Sciences. Published by Elsevier B.V. All rights reserved.

  19. Patient and Parent-Reported Signs and Symptoms for Group A Streptococcal Pharyngitis.

    PubMed

    Lindgren, Christina; Neuman, Mark I; Monuteaux, Michael C; Mandl, Kenneth D; Fine, Andrew M

    2016-07-01

    Identifying symptomatic patients who are at low risk for group A streptococcal (GAS) pharyngitis could reduce unnecessary visits and antibiotic use. The accuracy with which patients and parents report signs and symptoms of GAS has not been studied. Our objectives were to measure agreement between patient or parent and physician-reported signs and symptoms of GAS and to evaluate the performance of a modified Centor score, based on patient or parent and physician reports, for identifying patients at low risk for GAS pharyngitis. Children 3 to 21 years old presenting to a single tertiary care emergency department between October 2013 and January 2015 were included if they complained of a sore throat and were tested for GAS. Patients or parents and physicians completed surveys assessing signs and symptoms to determine a modified age-adjusted Centor score for GAS. We evaluated the overall agreement and κ between patient or parent and physician-reported signs and symptoms and compared the performance of the scores based on assessments by patients or parents and physicians and the risk of GAS. Of 320 patients enrolled, 107 (33%) tested GAS positive. Agreement was higher for symptoms (fever [agreement = 82%, κ = 0.64] and cough [72%, 0.45]) than for signs (exudate [80%, 0.41] and tender cervical nodes [73%, 0.18]). Agreement was highest when no signs and symptoms contained in the Centor score were present (94%, κ = 0.61). The proportion of patients testing GAS positive rose as the modified Centor score increased. For identifying GAS pharyngitis, patients or parents and physicians showed moderate to substantial agreement for 3 of 4 key pharyngitis signs and symptoms. Copyright © 2016 by the American Academy of Pediatrics.
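The agreement figures above pair a raw percent agreement with Cohen's kappa, which discounts the agreement expected by chance. A minimal sketch for a binary sign rated by two observers (the counts in the test are illustrative, not the study's data):

```python
def cohens_kappa(a, b, c, d):
    """Cohen's kappa for two raters on a binary sign:
                    physician yes   physician no
    patient yes          a               b
    patient no           c               d
    """
    n = a + b + c + d
    po = (a + d) / n                        # observed agreement
    p_yes = ((a + b) / n) * ((a + c) / n)   # chance both report the sign
    p_no = ((c + d) / n) * ((b + d) / n)    # chance both report its absence
    pe = p_yes + p_no                       # total chance agreement
    return (po - pe) / (1 - pe)
```

Kappa of 1 means perfect agreement; 0 means agreement no better than chance, which is why a high percent agreement can still carry a modest kappa when one answer dominates.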

  20. Validating Emergency Department Vital Signs Using a Data Quality Engine for Data Warehouse

    PubMed Central

    Genes, N; Chandra, D; Ellis, S; Baumlin, K

    2013-01-01

Background: Vital signs in our emergency department information system were entered into free-text fields for heart rate, respiratory rate, blood pressure, temperature and oxygen saturation. Objective: We sought to convert these text entries into a more useful form, for research and QA purposes, upon entry into a data warehouse. Methods: We derived a series of rules and assigned quality scores to the transformed values, conforming to physiologic parameters for vital signs across the age range and spectrum of illness seen in the emergency department. Results: Validating these entries revealed that 98% of free-text data had perfect quality scores, conforming to established vital sign parameters. Average vital signs varied as expected by age. Degradations in quality scores were most commonly attributed to logging temperature in Fahrenheit instead of Celsius; vital signs with this error could still be transformed for use. Errors occurred more frequently during periods of high triage, though error rates did not correlate with triage volume. Conclusions: In developing a method for importing free-text vital sign data from our emergency department information system, we now have a data warehouse with a broad array of quality-checked vital signs, permitting analysis and correlation with demographics and outcomes. PMID:24403981

  1. Validating emergency department vital signs using a data quality engine for data warehouse.

    PubMed

    Genes, N; Chandra, D; Ellis, S; Baumlin, K

    2013-01-01

Vital signs in our emergency department information system were entered into free-text fields for heart rate, respiratory rate, blood pressure, temperature and oxygen saturation. We sought to convert these text entries into a more useful form, for research and QA purposes, upon entry into a data warehouse. We derived a series of rules and assigned quality scores to the transformed values, conforming to physiologic parameters for vital signs across the age range and spectrum of illness seen in the emergency department. Validating these entries revealed that 98% of free-text data had perfect quality scores, conforming to established vital sign parameters. Average vital signs varied as expected by age. Degradations in quality scores were most commonly attributed to logging temperature in Fahrenheit instead of Celsius; vital signs with this error could still be transformed for use. Errors occurred more frequently during periods of high triage, though error rates did not correlate with triage volume. In developing a method for importing free-text vital sign data from our emergency department information system, we now have a data warehouse with a broad array of quality-checked vital signs, permitting analysis and correlation with demographics and outcomes.

  2. Maternal dietary intake of polyunsaturated fatty acids modifies association between prenatal DDT exposure and child neurodevelopment: A cohort study.

    PubMed

    Ogaz-González, Rafael; Mérida-Ortega, Ángel; Torres-Sánchez, Luisa; Schnaas, Lourdes; Hernández-Alcaraz, César; Cebrián, Mariano E; Rothenberg, Stephen J; García-Hernández, Rosa María; López-Carrillo, Lizbeth

    2018-07-01

Maternal 1,1-dichloro-2,2-bis(p-chlorophenyl)ethylene (DDE) serum levels during pregnancy have been negatively linked to child neurodevelopment in contrast to intake of omega-3 and -6 (ω-3 and ω-6) fatty acids. To assess whether maternal dietary intake of ω-3 and ω-6 during pregnancy modifies the association between exposure to DDE and child neurodevelopment at ages 42-60 months. Prospective cohort study with 142 mother-child pairs performed in Mexico. DDE serum levels were determined by electron capture gas chromatography. Dietary ω-3 and ω-6 intake was estimated by questionnaire. Child neurodevelopment was assessed by McCarthy Scales. Docosahexaenoic (DHA) fatty acid intake significantly modified the association between DDE and motor component: increased maternal DDE was associated with lower motor development in children whose mothers had lower DHA intake (β log2DDE = -1.25; 95% CI: -2.62, 0.12), in contrast to the non-significant increase among children whose mothers had higher DHA intake (β log2DDE-motor = 0.50; 95% CI: 0.55, 1.56). Likewise, arachidonic fatty acid (ARA) intake modified the association between DDE and memory component: increased maternal DDE was associated with a significantly larger reduction in the memory component in children whose mothers had lower ARA intake (β log2DDE = -1.31; 95% CI: -2.29, -0.32) than children whose mothers had higher ARA intake (β log2DDE-memory = 0.17; 95% CI: -0.78, 1.11). Dietary intake of DHA and ARA during pregnancy may protect against child neurodevelopment damage associated with prenatal maternal DDE levels. Copyright © 2018 Elsevier Ltd. All rights reserved.

  3. A parametric method for determining the number of signals in narrow-band direction finding

    NASA Astrophysics Data System (ADS)

    Wu, Qiang; Fuhrmann, Daniel R.

    1991-08-01

    A novel and more accurate method to determine the number of signals in the multisource direction finding problem is developed. The information-theoretic criteria of Yin and Krishnaiah (1988) are applied to a set of quantities which are evaluated from the log-likelihood function. Based on proven asymptotic properties of the maximum likelihood estimation, these quantities have the properties required by the criteria. Since the information-theoretic criteria use these quantities instead of the eigenvalues of the estimated correlation matrix, this approach possesses the advantage of not requiring a subjective threshold, and also provides higher performance than when eigenvalues are used. Simulation results are presented and compared to those obtained from the nonparametric method given by Wax and Kailath (1985).
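For contrast with the parametric approach described above, the eigenvalue-based baseline of Wax and Kailath (1985) mentioned at the end can be sketched via the MDL criterion. A sketch under standard assumptions (white noise, descending eigenvalues of the sample covariance), not the authors' method:

```python
import numpy as np

def mdl_num_signals(eigvals, n_snapshots):
    """Estimate the number of signals with the Wax-Kailath MDL criterion.

    eigvals: eigenvalues of the p x p sample covariance matrix.
    Returns the k (0 <= k < p) minimizing the MDL cost.
    """
    lam = np.sort(np.asarray(eigvals, dtype=float))[::-1]
    p = lam.size
    costs = []
    for k in range(p):
        tail = lam[k:]                       # smallest p - k eigenvalues
        # log of (geometric mean / arithmetic mean) of the tail
        log_ratio = np.mean(np.log(tail)) - np.log(np.mean(tail))
        costs.append(-n_snapshots * (p - k) * log_ratio
                     + 0.5 * k * (2 * p - k) * np.log(n_snapshots))
    return int(np.argmin(costs))
```

With two strong sources among eight sensors, the cost is minimized at k = 2; the parametric method in the record replaces the eigenvalues with log-likelihood-derived quantities.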

  4. Patch-based image reconstruction for PET using prior-image derived dictionaries

    NASA Astrophysics Data System (ADS)

    Tahaei, Marzieh S.; Reader, Andrew J.

    2016-09-01

    In PET image reconstruction, regularization is often needed to reduce the noise in the resulting images. Patch-based image processing techniques have recently been successfully used for regularization in medical image reconstruction through a penalized likelihood framework. Re-parameterization within reconstruction is another powerful regularization technique in which the object in the scanner is re-parameterized using coefficients for spatially-extensive basis vectors. In this work, a method for extracting patch-based basis vectors from the subject’s MR image is proposed. The coefficients for these basis vectors are then estimated using the conventional MLEM algorithm. Furthermore, using the alternating direction method of multipliers, an algorithm for optimizing the Poisson log-likelihood while imposing sparsity on the parameters is also proposed. This novel method is then utilized to find sparse coefficients for the patch-based basis vectors extracted from the MR image. The results indicate the superiority of the proposed methods to patch-based regularization using the penalized likelihood framework.
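The conventional MLEM update used above to estimate the basis coefficients has a simple generic form for a linear Poisson model. A minimal sketch (plain MLEM, not the paper's patch-basis parameterization or its ADMM sparsity step):

```python
import numpy as np

def mlem(A, y, n_iter=50):
    """Plain MLEM for a Poisson model y ~ Poisson(A @ x).

    A: nonnegative system matrix (n_bins x n_coeffs)
    y: measured counts. Returns a nonnegative estimate of x.
    """
    x = np.ones(A.shape[1])
    sens = A.sum(axis=0)              # sensitivity term A^T 1
    for _ in range(n_iter):
        proj = A @ x                  # forward projection
        ratio = np.where(proj > 0, y / proj, 0.0)
        x = x / sens * (A.T @ ratio)  # multiplicative EM update
    return x
```

In the paper's setting, the columns of A would be the patch-based basis vectors extracted from the MR image, and x their coefficients.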

  5. Bayesian image reconstruction for improving detection performance of muon tomography.

    PubMed

    Wang, Guobao; Schultz, Larry J; Qi, Jinyi

    2009-05-01

    Muon tomography is a novel technology that is being developed for detecting high-Z materials in vehicles or cargo containers. Maximum likelihood methods have been developed for reconstructing the scattering density image from muon measurements. However, the instability of maximum likelihood estimation often results in noisy images and low detectability of high-Z targets. In this paper, we propose using regularization to improve the image quality of muon tomography. We formulate the muon reconstruction problem in a Bayesian framework by introducing a prior distribution on scattering density images. An iterative shrinkage algorithm is derived to maximize the log posterior distribution. At each iteration, the algorithm obtains the maximum a posteriori update by shrinking an unregularized maximum likelihood update. Inverse quadratic shrinkage functions are derived for generalized Laplacian priors and inverse cubic shrinkage functions are derived for generalized Gaussian priors. Receiver operating characteristic studies using simulated data demonstrate that the Bayesian reconstruction can greatly improve the detection performance of muon tomography.

  6. DECONV-TOOL: An IDL based deconvolution software package

    NASA Technical Reports Server (NTRS)

    Varosi, F.; Landsman, W. B.

    1992-01-01

    There are a variety of algorithms for deconvolution of blurred images, each having its own criteria or statistic to be optimized in order to estimate the original image data. Using the Interactive Data Language (IDL), we have implemented the Maximum Likelihood, Maximum Entropy, Maximum Residual Likelihood, and sigma-CLEAN algorithms in a unified environment called DeConv_Tool. Most of the algorithms have as their goal the optimization of statistics such as standard deviation and mean of residuals. Shannon entropy, log-likelihood, and chi-square of the residual auto-correlation are computed by DeConv_Tool for the purpose of determining the performance and convergence of any particular method and comparisons between methods. DeConv_Tool allows interactive monitoring of the statistics and the deconvolved image during computation. The final results, and optionally, the intermediate results, are stored in a structure convenient for comparison between methods and review of the deconvolution computation. The routines comprising DeConv_Tool are available via anonymous FTP through the IDL Astronomy User's Library.

  7. Statistical modelling of thermal annealing of fission tracks in apatite

    NASA Astrophysics Data System (ADS)

    Laslett, G. M.; Galbraith, R. F.

    1996-12-01

    We develop an improved methodology for modelling the relationship between mean track length, temperature, and time in fission track annealing experiments. We consider "fanning Arrhenius" models, in which contours of constant mean length on an Arrhenius plot are straight lines meeting at a common point. Features of our approach are explicit use of subject matter knowledge, treating mean length as the response variable, modelling of the mean-variance relationship with two components of variance, improved modelling of the control sample, and using information from experiments in which no tracks are seen. This approach overcomes several weaknesses in previous models and provides a robust six parameter model that is widely applicable. Estimation is via direct maximum likelihood which can be implemented using a standard numerical optimisation package. Because the model is highly nonlinear, some reparameterisations are needed to achieve stable estimation and calculation of precisions. Experience suggests that precisions are more convincingly estimated from profile log-likelihood functions than from the information matrix. We apply our method to the B-5 and Sr fluorapatite data of Crowley et al. (1991) and obtain well-fitting models in both cases. For the B-5 fluorapatite, our model exhibits less fanning than that of Crowley et al. (1991), although fitted mean values above 12 μm are fairly similar. However, predictions can be different, particularly for heavy annealing at geological time scales, where our model is less retentive. In addition, the refined error structure of our model results in tighter prediction errors, and has components of error that are easier to verify or modify. For the Sr fluorapatite, our fitted model for mean lengths does not differ greatly from that of Crowley et al. (1991), but our error structure is quite different.

  8. Log polar image sensor in CMOS technology

    NASA Astrophysics Data System (ADS)

    Scheffer, Danny; Dierickx, Bart; Pardo, Fernando; Vlummens, Jan; Meynants, Guy; Hermans, Lou

    1996-08-01

We report on the design, design issues, fabrication and performance of a log-polar CMOS image sensor. The sensor is developed for use in a videophone system for deaf and hearing impaired people, who are not capable of communicating through a 'normal' telephone. The system allows 15 detailed images per second to be transmitted over existing telephone lines. This framerate is sufficient for conversations by means of sign language or lip reading. The pixel array of the sensor consists of 76 concentric circles with (up to) 128 pixels per circle, in total 8013 pixels. The interior pixels have a pitch of 14 micrometers, up to 250 micrometers at the border. The 8013-pixel image is mapped (log-polar transformation) in a X-Y addressable 76 by 128 array.
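The log-polar transformation the sensor implements geometrically can be sketched in software. A minimal version, where the inner-ring radius r0 is a hypothetical parameter (the chip realizes the mapping with its 76 rings of up to 128 pixels):

```python
import math

def log_polar(x, y, r0=1.0):
    """Map Cartesian (x, y) to log-polar coordinates (log(r / r0), theta).

    r0 is a hypothetical inner-ring radius; undefined at the origin,
    which in hardware corresponds to the blind spot inside the first ring.
    """
    r = math.hypot(x, y)
    return math.log(r / r0), math.atan2(y, x)
```

Equal ratios of radius map to equal steps in the first coordinate, which is what gives the sensor its high central resolution and coarse periphery.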

  9. Nonlinear phase noise tolerance for coherent optical systems using soft-decision-aided ML carrier phase estimation enhanced with constellation partitioning

    NASA Astrophysics Data System (ADS)

    Li, Yan; Wu, Mingwei; Du, Xinwei; Xu, Zhuoran; Gurusamy, Mohan; Yu, Changyuan; Kam, Pooi-Yuen

    2018-02-01

    A novel soft-decision-aided maximum likelihood (SDA-ML) carrier phase estimation method and its simplified version, the decision-aided and soft-decision-aided maximum likelihood (DA-SDA-ML) methods are tested in a nonlinear phase noise-dominant channel. The numerical performance results show that both the SDA-ML and DA-SDA-ML methods outperform the conventional DA-ML in systems with constant-amplitude modulation formats. In addition, modified algorithms based on constellation partitioning are proposed. With partitioning, the modified SDA-ML and DA-SDA-ML are shown to be useful for compensating the nonlinear phase noise in multi-level modulation systems.

  10. Demonstration of an optoelectronic interconnect architecture for a parallel modified signed-digit adder and subtracter

    NASA Astrophysics Data System (ADS)

    Sun, Degui; Wang, Na-Xin; He, Li-Ming; Weng, Zhao-Heng; Wang, Daheng; Chen, Ray T.

    1996-06-01

A space-position-logic-encoding scheme is proposed and demonstrated. This encoding scheme not only makes the best use of the convenience of binary logic operation, but is also suitable for the trinary property of modified signed-digit (MSD) numbers. Based on the space-position-logic-encoding scheme, a fully parallel modified signed-digit adder and subtractor is built using optoelectronic switch technologies in conjunction with fiber-multistage 3D optoelectronic interconnects. Thus an effective combination of a parallel algorithm and a parallel architecture is implemented. In addition, the performance of the optoelectronic switches used in this system is experimentally studied and verified. Both the 3-bit experimental model and the experimental results of a parallel addition and a parallel subtraction are provided and discussed. Finally, the speed ratio between the MSD adder and binary adders is discussed and the advantage of the MSD in operating speed is demonstrated.
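MSD numbers use digits from {-1, 0, 1}, which is what permits carry-free parallel addition. A minimal software sketch of one valid signed-digit encoding (the non-adjacent form) and its evaluation; the optical space-position-logic encoding and the parallel adder itself are not modeled here:

```python
def to_msd(n):
    """Encode integer n as signed digits in {-1, 0, 1}, least significant
    digit first, using the non-adjacent form (one of many MSD encodings)."""
    digits = []
    while n != 0:
        if n % 2:
            d = 2 - (n % 4)      # +1 or -1, chosen so the next digit is 0
        else:
            d = 0
        digits.append(d)
        n = (n - d) // 2
    return digits or [0]

def msd_value(digits):
    """Evaluate a signed-digit string (least significant digit first)."""
    return sum(d * (1 << i) for i, d in enumerate(digits))
```

The redundancy of the representation (several digit strings denote the same integer) is what lets an MSD adder limit carry propagation to a fixed number of stages regardless of word length.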

  11. 14 CFR 21.289 - Major repairs, rebuilding and alteration.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... TRANSPORTATION AIRCRAFT CERTIFICATION PROCEDURES FOR PRODUCTS AND PARTS Delegation Option Authorization Procedures § 21.289 Major repairs, rebuilding and alteration. For types covered by a delegation option... any employee to execute and sign FAA Form 337 and make required log book entries if that employee— (1...

  12. 42 CFR 424.36 - Signature requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... beneficiary's legal guardian. (2) A relative or other person who receives social security or other... any of the following: (i) The signed patient care/trip report; (ii) The facility or hospital registration/admission sheet; (iii) The patient medical record; (iv) The facility or hospital log; or (v) Other...

  13. 75 FR 3462 - Claims of Confidentiality of Certain Chemical Identities Submitted under Section 8(e) of the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-21

    ... identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an EPA/DC badge that must be...

  14. 75 FR 32754 - Certain New Chemicals; Receipt and Status Information

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-09

    ... are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will...; ultra hydroxyethyl ester, violet curable metal reaction products coatings with dicyclopentadiene, 5...

  15. Hip Implant Modified To Increase Probability Of Retention

    NASA Technical Reports Server (NTRS)

    Canabal, Francisco, III

    1995-01-01

    Modification in design of hip implant proposed to increase likelihood of retention of implant in femur after hip-repair surgery. Decreases likelihood of patient distress and expense associated with repetition of surgery after failed implant procedure. Intended to provide more favorable flow of cement used to bind implant in proximal extreme end of femur, reducing structural flaws causing early failure of implant/femur joint.

  16. Wildlife Warning Signs: Public Assessment of Components, Placement and Designs to Optimise Driver Response

    PubMed Central

    Bond, Amy R. F.; Jones, Darryl N.

    2013-01-01

Simple Summary Wildlife warning signs are aimed at reducing wildlife–vehicle collisions but there is little evidence that they are effective. Improving these sign designs to increase driver response may reduce wildlife–vehicle collisions. We examined drivers’ responses to different wildlife warning sign designs through a public survey. The presence of some sign components and sign position was assessed. Drivers’ responses to eight graphically displayed signs and animal- and vehicle-activated signs were ranked and participants indicated the sign to which they were most likely to respond. Three signs ranked highly. Animal- and vehicle-activated signs were also ranked highly by participants. More research into optimising wildlife warning sign designs is needed. Abstract Wildlife warning signs are the most commonly used and widespread form of road impact mitigation, aimed at reducing the incidence of wildlife–vehicle collisions. Evidence of the effectiveness of currently used signs is rare and often indicates minimal change in driver behaviour. Improving the design of these signs to increase the likelihood of appropriate driver response has the potential to reduce the incidence of wildlife–vehicle collisions. This study aimed to examine and assess the opinions of drivers on wildlife warning sign designs through a public opinion survey. Three currently used sign designs and five alternative sign designs were compared in the survey. A total of 134 drivers were surveyed. The presence of temporal specifications and an updated count of road-killed animals on wildlife warning signs was assessed, as well as the position of the sign. Drivers’ responses to the eight signs were scaled separately at three speed limits and participants indicated the sign to which they were most likely to respond. Three signs consistently ranked high. The messages conveyed by these signs and their prominent features were explored. Animal-activated and vehicle speed-activated signs were ranked very highly by participants. Extensive field trials of various sign designs are needed to further this research into optimising wildlife warning sign designs. PMID:26479756

  17. Impact of in-woods product merchandizing on profitable logging opportunities in southern upland hardwood forests

    Treesearch

    Dennis M. May; Chris B. LeDoux; John B. Tansey; Richard Widmann

    1994-01-01

    Procedures developed to assess available timber supplies from upland hardwood forest statistics reported by the U.S. Department of Agriculture, Forest Service, Forest Inventory and Analysis (FIA) units, were modified to demonstrate the impact of three in-woods product-merchandizing options on profitable logging opportunities in upland hardwood forests in 14 Southern...

  18. Environmental and dietary risk factors for infantile atopic eczema among a Slovak birth cohort.

    PubMed

    Dunlop, Anne L; Reichrtova, Eva; Palcovicova, Luba; Ciznar, Peter; Adamcakova-Dodd, Andrea; Smith, S J; McNabb, Scott J N

    2006-03-01

    Infantile atopic eczema (AE) is a risk marker for future asthma. This study assesses the contribution of modifiable exposures to infantile AE. If modifiable exposures contribute substantially to infantile AE, its prevention might be a sensible approach to asthma prevention. Pregnant women (n = 1978) were systematically recruited from maternity hospitals of the Slovak Republic; their birthed cohort of 1990 children were prospectively followed for 1 yr. Children's exposures to selected environmental and dietary factors were assessed via maternal questionnaires administered at delivery and 1 yr of age. A child was considered to have AE, based on physical examination (SCORAD index >2) or mother's report of a previous physician diagnosis. Multivariate logistic regression was used to calculate adjusted odds ratios and percent total regression scores (TRS) for each variable. At 1 yr of age 1326 (67%) of the children remained in the cohort and 207 (15.6%) developed AE. Various modifiable environmental and dietary exposures increased the likelihood of AE (ownership of cats; consumption of infant formula, eggs, and fish) while others decreased the likelihood of AE (ownership of livestock; exclusive breast feeding for > or =4 months). Overall, modifiable exposures contributed less to the TRS than did non-modifiable exposures (38% vs. 62%, respectively). The modifiable exposure category that contributed most to the TRS was infant feeding practices (27.5% TRS). Modifiable exposures -- especially those related to infant feeding practices -- significantly contribute to infantile AE, although modifiable factors contribute less overall than do non-modifiable exposures.

  19. Mental Health Recovery in the Patient-Centered Medical Home

    PubMed Central

    Aarons, Gregory A.; O’Connell, Maria; Davidson, Larry; Groessl, Erik J.

    2015-01-01

    Objectives. We examined the impact of transitioning clients from a mental health clinic to a patient-centered medical home (PCMH) on mental health recovery. Methods. We drew data from a large US County Behavioral Health Services administrative data set. We used propensity score analysis and multilevel modeling to assess the impact of the PCMH on mental health recovery by comparing PCMH participants (n = 215) to clients receiving service as usual (SAU; n = 22 394) from 2011 to 2013 in San Diego County, California. We repeatedly assessed mental health recovery over time (days since baseline assessment range = 0–1639; mean = 186) with the Illness Management and Recovery (IMR) scale and Recovery Markers Questionnaire. Results. For total IMR (log-likelihood ratio χ2[1] = 4696.97; P < .001) and IMR Factor 2 Management scores (log-likelihood ratio χ2[1] = 7.9; P = .005), increases in mental health recovery over time were greater for PCMH than SAU participants. Increases on all other measures over time were similar for PCMH and SAU participants. Conclusions. Greater increases in mental health recovery over time can be expected when patients with severe mental illness are provided treatment through the PCMH. Evaluative efforts should be taken to inform more widespread adoption of the PCMH. PMID:26180945
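The log-likelihood ratio χ² statistics reported in this record follow the generic likelihood-ratio test recipe: twice the gain in log-likelihood, referred to a chi-square distribution. A sketch for one degree of freedom (generic LR test, not the study's multilevel model), using the identity that the chi-square(1) survival function is erfc(sqrt(x/2)):

```python
import math

def lr_test_1df(loglik_null, loglik_alt):
    """Likelihood-ratio test with 1 degree of freedom.

    Statistic: 2 * (ll_alt - ll_null), compared against chi-square(1),
    whose survival function is erfc(sqrt(x / 2)).
    Returns (statistic, p-value).
    """
    stat = 2.0 * (loglik_alt - loglik_null)
    p = math.erfc(math.sqrt(stat / 2.0)) if stat > 0 else 1.0
    return stat, p
```

For example, a statistic of 3.84 recovers the familiar p ≈ 0.05 boundary, and the very large χ² values in the abstract correspond to p-values far below .001.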

  20. Likelihood analysis of spatial capture-recapture models for stratified or class structured populations

    USGS Publications Warehouse

    Royle, J. Andrew; Sutherland, Christopher S.; Fuller, Angela K.; Sun, Catherine C.

    2015-01-01

    We develop a likelihood analysis framework for fitting spatial capture-recapture (SCR) models to data collected on class structured or stratified populations. Our interest is motivated by the necessity of accommodating the problem of missing observations of individual class membership. This is particularly problematic in SCR data arising from DNA analysis of scat, hair or other material, which frequently yields individual identity but fails to identify the sex. Moreover, this can represent a large fraction of the data and, given the typically small sample sizes of many capture-recapture studies based on DNA information, utilization of the data with missing sex information is necessary. We develop the class structured likelihood for the case of missing covariate values, and then we address the scaling of the likelihood so that models with and without class structured parameters can be formally compared regardless of missing values. We apply our class structured model to black bear data collected in New York in which sex could be determined for only 62 of 169 uniquely identified individuals. The models containing sex-specificity of both the intercept of the SCR encounter probability model and the distance coefficient, and including a behavioral response are strongly favored by log-likelihood. Estimated population sex ratio is strongly influenced by sex structure in model parameters illustrating the importance of rigorous modeling of sex differences in capture-recapture models.

  1. 76 FR 38169 - Toxic Substances Control Act Chemical Testing; Receipt of Test Data

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-29

... are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will... animal tissue, metal-cleaning compounds, hydraulic compression fluids, stripping agent (textiles...

  2. 76 FR 58498 - Certain New Chemicals; Receipt and Status Information

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-21

    ... are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will.../2011 10/9/2011 GE Water & Process (S) Heavy metal (G) Sodium polyethylenimine Technologies. precipitant...

  3. 78 FR 17656 - Certain New Chemicals; Receipt and Status Information

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-22

    ... show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an... (S) Automotive metal recovery. If you are interested in information that is not included in these...

  4. 77 FR 74006 - Polychlorinated Biphenyls (PCBs); Recycling Plastics From Shredder Residue

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-12

    ... show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an... from metals recycling facilities (referred to by ISRI as automobile shredder residue (ASR) aggregate...

  5. 78 FR 67142 - HHCB (1,3,4,6,7,8-Hexahydro-4,6,6,7,8,8,-hexamethylcyclopenta[γ]-2-benzopyran) TSCA Risk...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-08

... show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an...

  6. On non-parametric maximum likelihood estimation of the bivariate survivor function.

    PubMed

    Prentice, R L

The likelihood function for the bivariate survivor function F, under independent censorship, is maximized to obtain a non-parametric maximum likelihood estimator F̂. F̂ may or may not be unique depending on the configuration of singly- and doubly-censored pairs. The likelihood function can be maximized by placing all mass on the grid formed by the uncensored failure times, or half lines beyond the failure time grid, or in the upper right quadrant beyond the grid. By accumulating the mass along lines (or regions) where the likelihood is flat, one obtains a partially maximized likelihood as a function of parameters that can be uniquely estimated. The score equations corresponding to these point mass parameters are derived, using a Lagrange multiplier technique to ensure unit total mass, and a modified Newton procedure is used to calculate the parameter estimates in some limited simulation studies. Some considerations for the further development of non-parametric bivariate survivor function estimators are briefly described.

  7. Tests for detecting overdispersion in models with measurement error in covariates.

    PubMed

    Yang, Yingsi; Wong, Man Yu

    2015-11-30

    Measurement error in covariates can affect the accuracy in count data modeling and analysis. In overdispersion identification, the true mean-variance relationship can be obscured under the influence of measurement error in covariates. In this paper, we propose three tests for detecting overdispersion when covariates are measured with error: a modified score test and two score tests based on the proposed approximate likelihood and quasi-likelihood, respectively. The proposed approximate likelihood is derived under the classical measurement error model, and the resulting approximate maximum likelihood estimator is shown to have superior efficiency. Simulation results also show that the score test based on approximate likelihood outperforms the test based on quasi-likelihood and other alternatives in terms of empirical power. By analyzing a real dataset containing the health-related quality-of-life measurements of a particular group of patients, we demonstrate the importance of the proposed methods by showing that the analyses with and without measurement error correction yield significantly different results. Copyright © 2015 John Wiley & Sons, Ltd.
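For background, the classic score statistic for Poisson overdispersion without measurement error can be written in one common form (as in Dean and Lawless); the paper's modified tests extend this idea to error-prone covariates. A sketch, with the caveat that the exact statistic there differs:

```python
import math

def overdispersion_score(y, mu):
    """Score statistic for Poisson overdispersion (no measurement error):
    T = sum((y_i - mu_i)^2 - y_i) / sqrt(2 * sum(mu_i^2)),
    approximately standard normal under the Poisson (equidispersion) null.
    One common textbook form; the paper's modified score test differs.
    """
    num = sum((yi - mi) ** 2 - yi for yi, mi in zip(y, mu))
    den = math.sqrt(2.0 * sum(mi ** 2 for mi in mu))
    return num / den
```

Large positive values indicate variance exceeding the mean; data consistent with the Poisson mean-variance relationship give values near zero.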

  8. Likelihood ratio meta-analysis: New motivation and approach for an old method.

    PubMed

    Dormuth, Colin R; Filion, Kristian B; Platt, Robert W

    2016-03-01

A 95% confidence interval (CI) in an updated meta-analysis may not have the expected 95% coverage. If a meta-analysis is simply updated with additional data, then the resulting 95% CI will be wrong because it will not have accounted for the fact that the earlier meta-analysis failed or succeeded in excluding the null. This situation can be avoided by using the likelihood ratio (LR) as a measure of evidence that does not depend on type-1 error. We show how an LR-based approach, first advanced by Goodman, can be used in a meta-analysis to pool data from separate studies to quantitatively assess where the total evidence points. The method works by estimating the log-likelihood ratio (LogLR) function from each study. Those functions are then summed to obtain a combined function, which is then used to retrieve the total effect estimate, and a corresponding 'intrinsic' confidence interval. Using as illustrations the CAPRIE trial of clopidogrel versus aspirin in the prevention of ischemic events, and our own meta-analysis of higher potency statins and the risk of acute kidney injury, we show that the LR-based method yields the same point estimate as the traditional analysis, but with an intrinsic confidence interval that is appropriately wider than the traditional 95% CI. The LR-based method can be used to conduct both fixed effect and random effects meta-analyses, it can be applied to old and new meta-analyses alike, and results can be presented in a format that is familiar to a meta-analytic audience. Copyright © 2016 Elsevier Inc. All rights reserved.
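Under a normal approximation, each study's log-likelihood for the effect θ is -(θ - θ̂_i)² / (2·se_i²), so summing LogLR functions reduces to a weighted quadratic. A sketch of the pooling step, where the 1/8-likelihood support interval is an assumed stand-in for the paper's 'intrinsic' interval, not the authors' exact construction:

```python
import math

def pooled_loglr(estimates, ses):
    """Pool per-study effect estimates by summing normal log-likelihoods.

    estimates: per-study effect estimates theta_hat_i (e.g. log hazard ratios)
    ses: their standard errors.
    Returns the pooled estimate and a 1/8-likelihood support interval
    (hypothetical interval choice, shown for illustration).
    """
    weights = [1.0 / se ** 2 for se in ses]
    theta = sum(w * t for w, t in zip(weights, estimates)) / sum(weights)
    # the summed log-likelihood is quadratic in theta, so the set where
    # the LogLR drops by log(8) is an interval centered at theta
    half_width = math.sqrt(2.0 * math.log(8.0) / sum(weights))
    return theta, (theta - half_width, theta + half_width)
```

As the abstract notes, the point estimate coincides with the usual inverse-variance-weighted result; only the interval differs from a conventional 95% CI.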

  9. Sign legibility for modified messages : final report.

    DOT National Transportation Integrated Search

    1987-01-01

    This study was conducted to investigate ways of increasing the legibility of signs with high background brightness. Research was limited to silver, yellow, and orange encapsulated lens sheeting materials, and modifications were made within the standa...

  10. A concise evidence-based physical examination for diagnosis of acromioclavicular joint pathology: a systematic review.

    PubMed

    Krill, Michael K; Rosas, Samuel; Kwon, KiHyun; Dakkak, Andrew; Nwachukwu, Benedict U; McCormick, Frank

    2018-02-01

    The clinical examination of the shoulder joint is an undervalued diagnostic tool for evaluating acromioclavicular (AC) joint pathology. Applying evidence-based clinical tests enables providers to make an accurate diagnosis and minimize costly imaging procedures and potential delays in care. The purpose of this study was to create a decision tree analysis enabling simple and accurate diagnosis of AC joint pathology. A systematic review of the Medline, Ovid and Cochrane Review databases was performed to identify level one and two diagnostic studies evaluating clinical tests for AC joint pathology. Individual test characteristics were combined in series and in parallel to improve sensitivities and specificities. A secondary analysis utilized subjective pre-test probabilities to create a clinical decision tree algorithm with post-test probabilities. The optimal special test combination to screen and confirm AC joint pathology combined Paxinos sign and O'Brien's Test, with a specificity of 95.8% when performed in series, whereas Paxinos sign and Hawkins-Kennedy Test demonstrated a sensitivity of 93.7% when performed in parallel. Paxinos sign and O'Brien's Test demonstrated the greatest positive likelihood ratio (2.71), whereas Paxinos sign and Hawkins-Kennedy Test reported the lowest negative likelihood ratio (0.35). No combination of special tests performed in series or in parallel creates more than a small impact on post-test probabilities to screen or confirm AC joint pathology. Paxinos sign and O'Brien's Test is the only special test combination that has a small and sometimes important impact when used both in series and in parallel. Physical examination testing is not beneficial for diagnosis of AC joint pathology when pretest probability is unequivocal. In these instances, it is of benefit to proceed with procedural tests to evaluate AC joint pathology. Ultrasound-guided corticosteroid injections are diagnostic and therapeutic. An ultrasound-guided AC joint corticosteroid injection may be an appropriate new standard for treatment and surgical decision-making. II - Systematic Review.
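
    Combining two tests in series (positive only if both are positive) or in parallel (positive if either is positive) follows simple probability rules when the tests are assumed conditionally independent; a sketch (the numeric values in any call are illustrative, not the review's pooled figures):

```python
def combine_series(se1, sp1, se2, sp2):
    """Both tests must be positive: sensitivity falls, specificity rises."""
    return se1 * se2, 1 - (1 - sp1) * (1 - sp2)

def combine_parallel(se1, sp1, se2, sp2):
    """Either positive test counts: sensitivity rises, specificity falls."""
    return 1 - (1 - se1) * (1 - se2), sp1 * sp2

def likelihood_ratios(se, sp):
    """Positive and negative likelihood ratios from sensitivity/specificity."""
    return se / (1 - sp), (1 - se) / sp
```

    In practice the results of shoulder special tests are correlated, which is one reason the review's empirical combined likelihood ratios are far more modest than the independence assumption above would predict.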

  11. The design and implementation of web mining in web sites security

    NASA Astrophysics Data System (ADS)

    Li, Jian; Zhang, Guo-Yin; Gu, Guo-Chang; Li, Jian-Li

    2003-06-01

    Backdoors and information leaks in Web servers can be detected by applying Web mining techniques to abnormal Web log and Web application log data, enhancing server security and avoiding the damage of illegal access. First, a system for discovering the patterns of information leakage in CGI scripts from Web log data is proposed. Second, those patterns are provided to system administrators so that they can modify their code and enhance Web site security. The following aspects are described: one is to combine the Web application log with the Web log to extract more information, so that Web data mining can discover information that a firewall or intrusion detection system cannot find. Another is an operation module for the Web site that enhances its security. In the cluster server session, a density-based clustering technique is used to reduce resource cost and obtain better efficiency.

  12. A Bootstrap Generalization of Modified Parallel Analysis for IRT Dimensionality Assessment

    ERIC Educational Resources Information Center

    Finch, Holmes; Monahan, Patrick

    2008-01-01

    This article introduces a bootstrap generalization to the Modified Parallel Analysis (MPA) method of test dimensionality assessment using factor analysis. This methodology, based on the use of Marginal Maximum Likelihood nonlinear factor analysis, provides for the calculation of a test statistic based on a parametric bootstrap using the MPA…

  13. MAIL LOG, program summary and specifications

    NASA Technical Reports Server (NTRS)

    Harris, D. K.

    1979-01-01

    The summary and specifications to obtain the software package, MAIL LOG, developed for the Scout Project Automatic Data System, SPADS are provided. The MAIL LOG program has four modes of operation: (1) input - putting new records into the data base; (2) revise - changing or modifying existing records in the data base; (3) search - finding special records existing in the data base; and (4) archive - store or put away existing records in the data base. The output includes special printouts of records in the data base and results from the input and search modes.

  14. A Gene Signature to Determine Metastatic Behavior in Thymomas

    PubMed Central

    Gökmen-Polar, Yesim; Wilkinson, Jeff; Maetzold, Derek; Stone, John F.; Oelschlager, Kristen M.; Vladislav, Ioan Tudor; Shirar, Kristen L.; Kesler, Kenneth A.; Loehrer, Patrick J.; Badve, Sunil

    2013-01-01

    Purpose Thymoma represents one of the rarest of all malignancies. Stage and completeness of resection have been used to ascertain postoperative therapeutic strategies albeit with limited prognostic accuracy. A molecular classifier would be useful to improve the assessment of metastatic behaviour and optimize patient management. Methods qRT-PCR assay for 23 genes (19 test and four reference genes) was performed on multi-institutional archival primary thymomas (n = 36). Gene expression levels were used to compute a signature, classifying tumors into classes 1 and 2, corresponding to low or high likelihood for metastases. The signature was validated in an independent multi-institutional cohort of patients (n = 75). Results A nine-gene signature that can predict metastatic behavior of thymomas was developed and validated. Using radial basis machine modeling in the training set, 5-year and 10-year metastasis-free survival rates were 77% and 26% for predicted low (class 1) and high (class 2) risk of metastasis (P = 0.0047, log-rank), respectively. For the validation set, 5-year metastasis-free survival rates were 97% and 30% for predicted low- and high-risk patients (P = 0.0004, log-rank), respectively. The 5-year metastasis-free survival rates for the validation set were 49% and 41% for Masaoka stages I/II and III/IV (P = 0.0537, log-rank), respectively. In univariate and multivariate Cox models evaluating common prognostic factors for thymoma metastasis, the nine-gene signature was the only independent indicator of metastases (P = 0.036). Conclusion A nine-gene signature was established and validated which predicts the likelihood of metastasis more accurately than traditional staging. This further underscores the biologic determinants of the clinical course of thymoma and may improve patient management. PMID:23894276

  15. Soil respiration and carbon responses to logging debris and competing vegetation

    Treesearch

    Robert A. Slesak; Stephen H. Schoenholtz; Timothy B. Harrington

    2010-01-01

    Management practices following forest harvesting that modify organic matter (OM) inputs and influence changes in the soil environment have the potential to alter soil C pools, but there is still much uncertainty regarding how these practices influence soil C flux. We examined the influence of varying amounts of logging-debris retention (0, 40, and 80% coverage) and...

  16. Variation in logging debris cover influences competitor abundance, resource availability, and early growth of planted Douglas-fir

    Treesearch

    Timothy B. Harrington; Robert A. Slesak; Stephen H. Schoenholtz

    2013-01-01

    Logging debris remaining after timber harvest can modify the microclimate and growing conditions for forest regeneration. Debris also can influence tree seedlings indirectly through its effects on development of competing vegetation, although the mechanisms are poorly understood. At two sites in Washington and Oregon (USA) that differed in availability of soil water...

  17. On a modification method of Lefschetz thimbles

    NASA Astrophysics Data System (ADS)

    Tsutsui, Shoichiro; Doi, Takahiro M.

    2018-03-01

    QCD at finite density is not yet well understood, since standard Monte Carlo simulation suffers from the sign problem. In order to overcome the sign problem, the method of Lefschetz thimbles has been explored. Basically, the original sign problem becomes less severe in the complexified theory because the imaginary part of the action is constant on each thimble. However, global phase factors assigned to each thimble still remain. Their interference is not negligible when a large number of thimbles contribute to the partition function, and this can also lead to a sign problem. In this study, we propose a method to resolve this problem by modifying the structure of the Lefschetz thimbles so that only a single thimble is relevant to the partition function. It can be shown that observables measured in the original and modified theories are connected by a simple identity. We exemplify that our method works well in a toy model.

  18. Family therapy with deaf persons: the systemic utilization of an interpreter.

    PubMed

    Harvey, M A

    1984-06-01

    This paper discusses the theory and practice of providing family therapy to families in which there are hearing parents and at least one Deaf child, particularly regarding the optimal utilization of an interpreter. The therapist must be knowledgeable about the psychosocial effects of deafness and the cultural aspects of deafness, and should preferably be able to use American Sign Language and Signed English. The therapeutic benefit of utilizing an interpreter extends far beyond simply facilitating communication between family members whose primary language is either spoken English or Sign Language. The presence of an interpreter helps the therapist to modify family rules that deny the implications of deafness and prohibit the use of Sign Language, to modify the balance of power in the family, and to encourage participants to exhibit the ego defense mechanisms of projection and transference. The family therapist can utilize these subtle yet profound influences to therapeutic advantage.

  19. Efficacy of the World Health Organization-recommended handwashing technique and a modified washing technique to remove Clostridium difficile from hands.

    PubMed

    Deschênes, Philippe; Chano, Frédéric; Dionne, Léa-Laurence; Pittet, Didier; Longtin, Yves

    2017-08-01

    The efficacy of the World Health Organization (WHO)-recommended handwashing technique against Clostridium difficile is uncertain, and whether it could be improved remains unknown. Also, the benefit of using a structured technique instead of an unstructured technique remains unclear. This study was a prospective comparison of 3 techniques (unstructured, WHO, and a novel technique dubbed WHO shortened repeated [WHO-SR]) to remove C difficile. Ten participants were enrolled and performed each technique. Hands were contaminated with 3 × 10⁶ colony forming units (CFU) of a nontoxigenic strain containing 90% spores. Efficacy was assessed using the whole-hand method. The relative efficacy of each technique and of a structured (either WHO or WHO-SR) vs an unstructured technique were assessed by Mann-Whitney U test and Wilcoxon signed-rank test. The median effectiveness of the unstructured, WHO, and WHO-SR techniques in log₁₀ CFU reduction was 1.30 (interquartile range [IQR], 1.27-1.43), 1.71 (IQR, 1.34-1.91), and 1.70 (IQR, 1.54-2.42), respectively. The WHO-SR technique was significantly more efficacious than the unstructured technique (P = .01). Washing hands with a structured technique was more effective than washing with an unstructured technique (median, 1.70 vs 1.30 log₁₀ CFU reduction, respectively; P = .007). A structured washing technique is more effective than an unstructured technique against C difficile. Copyright © 2017 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
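
    A minimal sketch of the per-participant outcome and the rank statistic behind the Mann-Whitney comparison (function names are ours; any data in the usage below are illustrative, not the study's measurements):

```python
import math

def log10_reduction(cfu_before, cfu_after):
    """Log10 colony-forming-unit reduction achieved by one wash."""
    return math.log10(cfu_before) - math.log10(cfu_after)

def mann_whitney_u(x, y):
    """Mann-Whitney U statistic: count of pairs (x_i, y_j) with
    x_i > y_j, with ties counting one half."""
    return sum(1.0 if xi > yj else 0.5 if xi == yj else 0.0
               for xi in x for yj in y)
```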

  20. The spot sign and tranexamic acid on preventing ICH growth--AUStralasia Trial (STOP-AUST): protocol of a phase II randomized, placebo-controlled, double-blind, multicenter trial.

    PubMed

    Meretoja, Atte; Churilov, Leonid; Campbell, Bruce C V; Aviv, Richard I; Yassi, Nawaf; Barras, Christen; Mitchell, Peter; Yan, Bernard; Nandurkar, Harshal; Bladin, Christopher; Wijeratne, Tissa; Spratt, Neil J; Jannes, Jim; Sturm, Jonathan; Rupasinghe, Jayantha; Zavala, Jorge; Lee, Andrew; Kleinig, Timothy; Markus, Romesh; Delcourt, Candice; Mahant, Neil; Parsons, Mark W; Levi, Christopher; Anderson, Craig S; Donnan, Geoffrey A; Davis, Stephen M

    2014-06-01

    No evidence-based acute therapies exist for intracerebral hemorrhage. Intracerebral hemorrhage growth is an important determinant of patient outcome. Tranexamic acid is known to reduce hemorrhage in other conditions. The study aims to test the hypothesis that intracerebral hemorrhage patients selected with computed tomography angiography contrast extravasation 'spot sign' will have lower rates of hematoma growth when treated with intravenous tranexamic acid within 4.5 hours of stroke onset compared with placebo. The Spot sign and Tranexamic acid On Preventing ICH growth--AUStralasia Trial is a multicenter, prospective, 1:1 randomized, double-blind, placebo-controlled, investigator-initiated, academic Phase II trial. Intracerebral hemorrhage patients fulfilling clinical criteria (e.g. Glasgow Coma Scale >7, intracerebral hemorrhage volume <70 ml, no identified secondary cause of intracerebral hemorrhage, no thrombotic events within the previous 12 months, no planned surgery) and demonstrating contrast extravasation on computed tomography angiography will receive either intravenous tranexamic acid (1 g 10-minute bolus followed by a 1 g infusion over eight hours) or placebo. A second computed tomography will be performed at 24 ± 3 hours to evaluate intracerebral hemorrhage growth, and patients will be followed up for three months. The primary outcome measure is presence of intracerebral hemorrhage growth by 24 ± 3 hours, defined as either >33% or >6 ml increase from baseline, and will be adjusted for baseline intracerebral hemorrhage volume. Secondary outcome measures include growth as a continuous measure, thromboembolic events, and the three-month modified Rankin Scale score. This is the first trial to evaluate the efficacy of tranexamic acid in intracerebral hemorrhage patients selected based on an imaging biomarker of high likelihood of hematoma growth. The trial is registered as NCT01702636. © 2013 The Authors. International Journal of Stroke © 2013 World Stroke Organization.
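
    The primary outcome is a simple rule that transcribes directly to code (volumes in mL; the function name is ours):

```python
def significant_growth(baseline_ml, followup_ml):
    """STOP-AUST primary outcome: ICH growth at 24 +/- 3 hours, defined
    as an increase of either >33% or >6 mL over the baseline volume."""
    delta = followup_ml - baseline_ml
    return delta > 6.0 or delta > 0.33 * baseline_ml
```

    For example, a 10 mL hematoma growing to 14 mL qualifies (the 4 mL increase exceeds 33% of baseline), while a 30 mL hematoma growing to 35 mL does not (5 mL is neither >6 mL nor >33%).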

  1. A FORTRAN program for multivariate survival analysis on the personal computer.

    PubMed

    Mulder, P G

    1988-01-01

    In this paper a FORTRAN program is presented for multivariate survival or life table regression analysis in a competing-risks situation. The relevant failure rate (for example, a particular disease or mortality rate) is modelled as a log-linear function of a vector of (possibly time-dependent) explanatory variables. The explanatory variables may also include the variable time itself, which is useful for parameterizing piecewise exponential time-to-failure distributions in a Gompertz-like or Weibull-like way as a more efficient alternative to Cox's proportional hazards model. Maximum likelihood estimates of the coefficients of the log-linear relationship are obtained by the iterative Newton-Raphson method. The program runs on a personal computer under DOS; running time is quite acceptable, even for large samples.
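
    The original FORTRAN source is not reproduced here; a NumPy sketch of the same idea, fitting a log-linear (exponential) failure-rate model by Newton-Raphson under a Poisson-likelihood formulation with complete follow-up, might look like:

```python
import numpy as np

def fit_loglinear_hazard(X, time, event, iters=25):
    """ML fit of lambda_i = exp(X_i @ beta) for exponential survival:
    log-likelihood l(beta) = sum(d_i * X_i@beta - t_i * exp(X_i@beta)),
    maximized by Newton-Raphson. A sketch of the approach only, not
    the original program (which also handles time-dependent covariates)."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(iters):
        mu = time * np.exp(X @ beta)         # expected events per subject
        grad = X.T @ (event - mu)            # score vector
        hess = X.T @ (X * mu[:, None])       # observed information
        beta = beta + np.linalg.solve(hess, grad)
    return beta
```

    With an intercept-only model the estimate reduces to the familiar total-events-over-total-exposure rate, which makes a convenient sanity check.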

  2. A Study of Dim Object Detection for the Space Surveillance Telescope

    DTIC Science & Technology

    2013-03-21

    ENG-13-M-32 Abstract Current methods of dim object detection for space surveillance make use of a Gaussian log-likelihood-ratio-test-based...quantitatively comparing the efficacy of two methods for dim object detection, termed in this paper the point detector and the correlator, both of which rely... applications. It is used in national defense for detecting satellites. It is used to detect space debris, which threatens both civilian and

  3. Binary Detection using Multi-Hypothesis Log-Likelihood, Image Processing

    DTIC Science & Technology

    2014-03-27

    geosynchronous orbit and other scenarios important to the USAF. 2 1.3 Research objectives The question posed in this thesis is how well, if at all, can a...is important to compare them to another modern technique. The third objective is to compare results from another image detection method, specifically...Although adaptive optics is an important technique in moving closer to diffraction limited imaging, it is not currently a practical solution for all

  4. "Sign-on/off" sensing interface design and fabrication for propyl gallate recognition and sensitive detection.

    PubMed

    Dai, Yunlong; Li, Xueyan; Fan, Limei; Lu, Xiaojing; Kan, Xianwen

    2016-12-15

    A new strategy based on sign-on and sign-off was proposed for propyl gallate (PG) determination by an electrochemical sensor. The successively modified poly(thionine) (PTH) and molecularly imprinted polymer (MIP) showed an obvious electrocatalysis and a good recognition toward PG, respectively. Furthermore, the rebound PG molecules in the imprinted cavities not only were oxidized but also blocked the electron transmission channels for the PTH redox. Thus, a sign-on from the PG current and a sign-off from the PTH current were combined as a dual sign for PG detection. Meanwhile, the modified MIP endowed the sensor with recognition capacity. The electrochemical experimental results demonstrated that the prepared sensor possessed good selectivity and high sensitivity. A linear range from 5.0×10⁻⁸ to 1.0×10⁻⁴ mol/L for PG detection was obtained, with a limit of detection of 2.4×10⁻⁸ mol/L. The sensor has also been applied to analyze PG in real samples with satisfactory results. The simple, low-cost, and effective strategy reported here can be further used to prepare electrochemical sensors for the selective recognition and sensitive detection of other compounds. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. 77 FR 24613 - Significant New Use Rules on Certain Chemical Substances

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-25

    ... detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject.... vii. Explain your views as clearly as possible, avoiding the use of profanity or personal threats... highly persistent in the environment and that it may be bioavailable based on data on related substances...

  6. 75 FR 32760 - Certain New Chemicals; Receipt and Status Information

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-09

    ... (202) 566-0280. Docket visitors are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject...) Aromatic catalyst synthesis bisphosphite P-10-0364 04/30/10 07/28/10 CBI (G) Soluble metal (G) Bisphospite...

  7. 77 FR 52325 - Certain New Chemicals; Receipt and Status Information

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-29

    ... show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an... dust, Mfg., Inc. component to automotive achieve desired metal recovery. zinc content. P-12-0483 08/01...

  8. 78 FR 11871 - Certain New Chemicals; Receipt and Status Information

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-20

    ... are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will... polyol polyester P-13-0152 12/7/2012 3/6/2013 CBI (G) Contained use (G) Metal, substituted...

  9. 75 FR 32751 - Certain New Chemicals; Receipt and Status Information

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-09

    ... (202) 566-0280. Docket visitors are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject... prepolymer for heat prepolymer curing metal assembly P-10-0245 02/19/10 05/19/10 Alberdingk Boley, (S...

  10. 75 FR 57770 - Certain New Chemicals; Receipt and Status Information

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-22

    ... identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an EPA/DC badge that must be...- different substrates oil fatty acid, like plastics, alkyl diacid and metals, wood, alkyldiamines packaging...

  11. 78 FR 70938 - Draft Guidelines; Product Environmental Performance Standards and Ecolabels for Voluntary Use in...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-27

    ...-0280. Docket visitors are required to show photographic identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to... existing mandates and government standards and ecolabels. [For illustration purposes, ``product category x...

  12. Vitreoretinal Complications and Outcomes in 92 Eyes Undergoing Surgery for Modified Osteo-Odonto-Keratoprosthesis: A 10-Year Review.

    PubMed

    Rishi, Pukhraj; Rishi, Ekta; Agarwal, Vishvesh; Nair, Sridevi; Iyer, Geetha; Srinivasan, Bhaskar; Agarwal, Shweta

    2018-06-01

    To analyze vitreoretinal (VR) complications and treatment outcomes in eyes undergoing modified osteo-odonto-keratoprosthesis (OOKP) surgery. Retrospective case series. All patients who underwent modified OOKP (mOOKP) surgery at a tertiary eye-care center from March 2003 to February 2013 were included. Medical records were reviewed for relevant medical history, best-corrected visual acuity (BCVA), slit-lamp examination, ultrasound scan, oral examination findings, and VR complications. The main outcome measure was the BCVA at the last visit. The optimal anatomic outcome was an attached retina with normal intraocular pressure at the last visit. A total of 92 eyes of 90 patients were included. Indications for OOKP included Stevens-Johnson syndrome (n = 53), chemical injury (n = 36), and ocular cicatricial pemphigoid (n = 3). A total of 41 eyes of 39 patients developed VR complications, including vitritis (n = 21), retinal detachment (RD) (n = 12; primary RD = 5), retroprosthetic membrane (RPM) (n = 10; primary RPM = 2), endophthalmitis (n = 8), vitreous hemorrhage (VH) (n = 5; primary VH = 1), serous choroidal detachment (n = 5), hemorrhagic choroidal detachment (n = 2), and leak-related hypotony (n = 1). The mean interval from mOOKP surgery to occurrence of VR complication(s) was 43.8 months (median, 41.9 months; range, 0.2-95.5 months). After treatment of a VR complication, visual improvement was seen in 17 eyes (42%) (mean improvement = 1.2 logarithm of the minimum angle of resolution [logMAR]; median, 0.8 logMAR; range, 0.1-2.5 logMAR), visual decline in 7 eyes (14%) (mean decline in BCVA = 0.6 logMAR; median, 0.4 logMAR; range, 0.3-1.8 logMAR), and no change in BCVA in 17 eyes (42%). However, BCVA ≥6/60 was retained in 19 eyes and ≥6/18 in 9 eyes after final VR treatment. Vitreoretinal complications constitute a significant cause of visual morbidity in eyes undergoing mOOKP surgery and pose a challenging situation to manage. 
However, appropriate and timely intervention can achieve encouraging results. Copyright © 2018 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.

  13. Robust Multi-Frame Adaptive Optics Image Restoration Algorithm Using Maximum Likelihood Estimation with Poisson Statistics.

    PubMed

    Li, Dongming; Sun, Changming; Yang, Jinhua; Liu, Huan; Peng, Jiaqi; Zhang, Lijuan

    2017-04-06

    An adaptive optics (AO) system provides real-time compensation for atmospheric turbulence. However, an AO image is usually of poor contrast because of the nature of the imaging process, meaning that the image contains information coming from both out-of-focus and in-focus planes of the object, which also brings about a loss in quality. In this paper, we present a robust multi-frame adaptive optics image restoration algorithm via maximum likelihood estimation. Our proposed algorithm uses a maximum likelihood method with image regularization as the basic principle, and constructs the joint log likelihood function for multi-frame AO images based on a Poisson distribution model. To begin with, a frame selection method based on image variance is applied to the observed multi-frame AO images to select images with better quality to improve the convergence of a blind deconvolution algorithm. Then, by combining the imaging conditions and the AO system properties, a point spread function estimation model is built. Finally, we develop our iterative solutions for AO image restoration addressing the joint deconvolution issue. We conduct a number of experiments to evaluate the performances of our proposed algorithm. Experimental results show that our algorithm produces accurate AO image restoration results and outperforms the current state-of-the-art blind deconvolution methods.
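
    The paper's multi-frame, regularized algorithm is not given in the abstract; as background, the classical Richardson-Lucy iteration, the multiplicative fixed-point update for single-frame Poisson maximum-likelihood deconvolution on which such methods build, can be sketched in one dimension:

```python
import numpy as np

def richardson_lucy_1d(observed, psf, iters=50):
    """Classical Richardson-Lucy iteration in 1-D: the multiplicative
    fixed-point update for Poisson maximum-likelihood deconvolution.
    A background sketch, not the paper's multi-frame algorithm."""
    observed = np.asarray(observed, float)
    psf = np.asarray(psf, float)
    psf = psf / psf.sum()                    # normalize the blur kernel
    est = np.full(observed.shape, observed.mean())   # flat positive start
    for _ in range(iters):
        conv = np.convolve(est, psf, mode='same')    # forward blur
        ratio = observed / np.maximum(conv, 1e-12)   # data/model ratio
        est = est * np.convolve(ratio, psf[::-1], mode='same')  # adjoint
    return est
```

    With a delta-function PSF the iteration recovers the observed signal exactly after one pass, which makes a convenient correctness check.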

  14. The Equivalence of Information-Theoretic and Likelihood-Based Methods for Neural Dimensionality Reduction

    PubMed Central

    Williamson, Ross S.; Sahani, Maneesh; Pillow, Jonathan W.

    2015-01-01

    Stimulus dimensionality-reduction methods in neuroscience seek to identify a low-dimensional space of stimulus features that affect a neuron’s probability of spiking. One popular method, known as maximally informative dimensions (MID), uses an information-theoretic quantity known as “single-spike information” to identify this space. Here we examine MID from a model-based perspective. We show that MID is a maximum-likelihood estimator for the parameters of a linear-nonlinear-Poisson (LNP) model, and that the empirical single-spike information corresponds to the normalized log-likelihood under a Poisson model. This equivalence implies that MID does not necessarily find maximally informative stimulus dimensions when spiking is not well described as Poisson. We provide several examples to illustrate this shortcoming, and derive a lower bound on the information lost when spiking is Bernoulli in discrete time bins. To overcome this limitation, we introduce model-based dimensionality reduction methods for neurons with non-Poisson firing statistics, and show that they can be framed equivalently in likelihood-based or information-theoretic terms. Finally, we show how to overcome practical limitations on the number of stimulus dimensions that MID can estimate by constraining the form of the non-parametric nonlinearity in an LNP model. We illustrate these methods with simulations and data from primate visual cortex. PMID:25831448

  15. Robust Multi-Frame Adaptive Optics Image Restoration Algorithm Using Maximum Likelihood Estimation with Poisson Statistics

    PubMed Central

    Li, Dongming; Sun, Changming; Yang, Jinhua; Liu, Huan; Peng, Jiaqi; Zhang, Lijuan

    2017-01-01

    An adaptive optics (AO) system provides real-time compensation for atmospheric turbulence. However, an AO image is usually of poor contrast because of the nature of the imaging process, meaning that the image contains information coming from both out-of-focus and in-focus planes of the object, which also brings about a loss in quality. In this paper, we present a robust multi-frame adaptive optics image restoration algorithm via maximum likelihood estimation. Our proposed algorithm uses a maximum likelihood method with image regularization as the basic principle, and constructs the joint log likelihood function for multi-frame AO images based on a Poisson distribution model. To begin with, a frame selection method based on image variance is applied to the observed multi-frame AO images to select images with better quality to improve the convergence of a blind deconvolution algorithm. Then, by combining the imaging conditions and the AO system properties, a point spread function estimation model is built. Finally, we develop our iterative solutions for AO image restoration addressing the joint deconvolution issue. We conduct a number of experiments to evaluate the performances of our proposed algorithm. Experimental results show that our algorithm produces accurate AO image restoration results and outperforms the current state-of-the-art blind deconvolution methods. PMID:28383503

  16. THE ABSENCE OF RADIO EMISSION FROM THE GLOBULAR CLUSTER G1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller-Jones, J. C. A.; Wrobel, J. M.; Sivakoff, G. R.

    2012-08-10

    The detections of both X-ray and radio emission from the cluster G1 in M31 have provided strong support for existing dynamical evidence for an intermediate-mass black hole (IMBH) of mass (1.8 ± 0.5) × 10⁴ M_Sun at the cluster center. However, given the relatively low significance and astrometric accuracy of the radio detection, and the non-simultaneity of the X-ray and radio measurements, this identification required further confirmation. Here we present deep, high angular resolution, strictly simultaneous X-ray and radio observations of G1. While the X-ray emission (L_X = 1.74 (+0.53/−0.44) × 10³⁶ (d/750 kpc)² erg s⁻¹ in the 0.5-10 keV band) remained fully consistent with previous observations, we detected no radio emission from the cluster center down to a 3σ upper limit of 4.7 μJy beam⁻¹. Our favored explanation for the previous radio detection is flaring activity from a black hole low-mass X-ray binary (LMXB). We performed a new regression of the 'Fundamental Plane' of black hole activity, valid for determining black hole mass from radio and X-ray observations of sub-Eddington black holes, finding log M_BH = (1.638 ± 0.070) log L_R − (1.136 ± 0.077) log L_X − (6.863 ± 0.790), with an empirically determined uncertainty of 0.44 dex. This constrains the mass of the X-ray source in G1, if a black hole, to be <9.7 × 10³ M_Sun at 95% confidence, suggesting that it is a persistent LMXB. This annuls what was previously the most convincing evidence from radiation for an IMBH in the Local Group, though the evidence for an IMBH in G1 from velocity dispersion measurements remains unaffected by these results.
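
    The quoted regression transcribes to a one-line function (inputs are log₁₀ radio and X-ray luminosities in erg s⁻¹, output is log₁₀ black hole mass in solar masses; the 0.44 dex empirical scatter applies to the result):

```python
def fp_log_mass(log_lr, log_lx):
    """'Fundamental Plane' regression quoted in the abstract:
    log M_BH = 1.638*log L_R - 1.136*log L_X - 6.863,
    with an empirical scatter of 0.44 dex on the output."""
    return 1.638 * log_lr - 1.136 * log_lx - 6.863
```

    For instance, log L_R = 30 and log L_X = 36 give log M_BH = 1.381, i.e. roughly 24 M_Sun before the 0.44 dex scatter is considered.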

  17. Prediction of Nonalcoholic Fatty Liver Disease Via a Novel Panel of Serum Adipokines

    PubMed Central

    Jamali, Raika; Arj, Abbas; Razavizade, Mohsen; Aarabi, Mohammad Hossein

    2016-01-01

    Abstract Considering the limitations of liver biopsy for diagnosis of nonalcoholic fatty liver disease (NAFLD), panels of biomarkers have been proposed. The aims of this study were to establish models based on serum adipokines for discriminating NAFLD from healthy individuals and nonalcoholic steatohepatitis (NASH) from simple steatosis. This case-control study was conducted in patients with persistently elevated serum aminotransferase levels and fatty liver on ultrasound. Individuals with evidence of alcohol consumption, hepatotoxic medication, viral hepatitis, or known liver disease were excluded. Liver biopsy was performed in the remaining patients to distinguish NAFLD/NASH. Histologic findings were interpreted using the “nonalcoholic fatty liver activity score.” The control group consisted of healthy volunteers with normal physical examination, liver function tests, and liver ultrasound. Binary logistic regression analysis was applied to ascertain the effects of independent variables on the likelihood that participants have NAFLD/NASH. Decreased serum adiponectin and elevated serum visfatin, IL-6, and TNF-α were associated with an increased likelihood of exhibiting NAFLD. The NAFLD discriminant score was developed as follows: [(−0.298 × adiponectin) + (0.022 × TNF-α) + (1.021 × Log visfatin) + (0.709 × Log IL-6) + 1.154]. With the NAFLD discriminant score, 86.4% of the original grouped cases were correctly classified. A discriminant score threshold value of −0.29 yielded a sensitivity and specificity of 91% and 83%, respectively, for discriminating NAFLD from healthy controls. Decreased serum adiponectin and elevated serum visfatin, IL-8, and TNF-α were correlated with an increased probability of NASH. The NASH discriminant score was proposed as follows: [(−0.091 × adiponectin) + (0.044 × TNF-α) + (1.017 × Log visfatin) + (0.028 × Log IL-8) − 1.787]. In the NASH model, 84% of the original cases were correctly classified.
A discriminant score threshold value of −0.22 yielded a sensitivity and specificity of 90% and 66%, respectively, for separating NASH from simple steatosis. New discriminant scores were introduced for differentiating NAFLD/NASH patients with high accuracy. If verified by future studies, application of the suggested models for screening of NAFLD/NASH seems reasonable. PMID:26844476
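
Read literally, the NAFLD score is a linear combination of the four adipokine measurements compared against a fixed cutoff. A minimal sketch of the rule (assuming "Log" denotes the base-10 logarithm, which the abstract does not state, and with purely illustrative input values):

```python
import math

def nafld_discriminant_score(adiponectin, tnf_a, visfatin, il6):
    # Coefficients as reported in the abstract; base-10 log is an assumption.
    return (-0.298 * adiponectin + 0.022 * tnf_a
            + 1.021 * math.log10(visfatin) + 0.709 * math.log10(il6) + 1.154)

def classify(score, threshold=-0.29):
    # Scores above the reported cutoff of -0.29 favor NAFLD over healthy.
    return "NAFLD" if score > threshold else "healthy"
```

The NASH score has the same form with its own coefficients (−0.091, 0.044, 1.017, 0.028, intercept −1.787) and a cutoff of −0.22.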

  18. Systems identification using a modified Newton-Raphson method: A FORTRAN program

    NASA Technical Reports Server (NTRS)

    Taylor, L. W., Jr.; Iliff, K. W.

    1972-01-01

    A FORTRAN program is offered which computes a maximum likelihood estimate of the parameters of any linear, constant coefficient, state space model. For the case considered, the maximum likelihood estimate can be identical to that which minimizes simultaneously the weighted mean square difference between the computed and measured response of a system and the weighted square of the difference between the estimated and a priori parameter values. A modified Newton-Raphson or quasilinearization method is used to perform the minimization, which typically requires several iterations. A starting technique is used which ensures convergence for any initial values of the unknown parameters. The program and its operation are described in sufficient detail to enable the user to apply the program to his particular problem with a minimum of difficulty.
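
The cost described here, a weighted output-error term plus a weighted penalty on departure from the a priori parameter values, can be sketched for the scalar case as follows (a hedged illustration of the idea, not the FORTRAN program itself; `f`, `df`, and all inputs are hypothetical):

```python
def newton_raphson_map(f, df, y, w, theta_prior, v, theta, iters=20):
    """Minimize J(theta) = w*(y - f(theta))**2 + v*(theta - theta_prior)**2
    by modified Newton-Raphson steps (Gauss-Newton Hessian approximation)."""
    for _ in range(iters):
        r = y - f(theta)                       # output error
        grad = -2.0 * w * r * df(theta) + 2.0 * v * (theta - theta_prior)
        hess = 2.0 * w * df(theta) ** 2 + 2.0 * v  # drops second-derivative term
        theta -= grad / hess
    return theta
```

With v = 0 this reduces to weighted least squares; v > 0 pulls the estimate toward the a priori value, mirroring the two terms of the cost in the abstract.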

  19. Estimation of longitudinal stability and control derivatives for an icing research aircraft from flight data

    NASA Technical Reports Server (NTRS)

    Batterson, James G.; Omara, Thomas M.

    1989-01-01

    The results of applying a modified stepwise regression algorithm and a maximum likelihood algorithm to flight data from a twin-engine commuter-class icing research aircraft are presented. The results are in the form of body-axis stability and control derivatives related to the short-period, longitudinal motion of the aircraft. Data were analyzed for the baseline (uniced) airplane and for the airplane with an artificial glaze ice shape attached to the leading edge of the horizontal tail. The results are discussed with regard to the accuracy of the derivative estimates and the differences between the derivative values found for the baseline and the iced airplane. Additional comparisons were made between the maximum likelihood results and the modified stepwise regression results, and causes for any discrepancies are postulated.

  20. CT Angiography Spot Sign, Hematoma Expansion, and Outcome in Primary Pontine Intracerebral Hemorrhage.

    PubMed

    Morotti, Andrea; Jessel, Michael J; Brouwers, H Bart; Falcone, Guido J; Schwab, Kristin; Ayres, Alison M; Vashkevich, Anastasia; Anderson, Christopher D; Viswanathan, Anand; Greenberg, Steven M; Gurol, M Edip; Romero, Javier M; Rosand, Jonathan; Goldstein, Joshua N

    2016-08-01

    The computed tomography angiography (CTA) spot sign is a validated predictor of hematoma expansion and poor outcome in supratentorial intracerebral hemorrhage (ICH), but patients with brainstem ICH have typically been excluded from the analyses. We investigated the frequency of the spot sign and its relationship with hematoma expansion and outcome in patients with primary pontine hemorrhage (PPH). We performed a retrospective analysis of PPH cases obtained from a prospectively collected cohort of consecutive ICH patients who underwent CTA. CTA first-pass readings for spot sign presence were analyzed by two trained readers. Baseline and follow-up hematoma volumes on non-contrast CT scans were assessed by semi-automated computer-assisted volumetric analysis. Sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), positive and negative likelihood ratio, and accuracy of the spot sign for prediction of in-hospital mortality were calculated. Forty-nine subjects met the inclusion criteria, of whom 11 (22.4%) showed a spot sign. In-hospital mortality was higher in spot sign-positive versus spot sign-negative subjects (90.9% vs 47.4%, p = 0.020). The spot sign showed excellent specificity (95%) and PPV (91%) in predicting in-hospital mortality. Absolute hematoma growth, defined as parenchymal and intraventricular hematoma expansion of any amount, was significantly higher in spot sign-positive versus spot sign-negative subjects (13.72 ± 20.93 vs 3.76 ± 8.55 mL, p = 0.045). As with supratentorial ICH, the CTA spot sign is a common finding and is associated with higher risk of hematoma expansion and mortality in PPH. This marker may assist clinicians in prognostic stratification.
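
The reported percentages are consistent with a 2×2 table of roughly 10 deaths among the 11 spot-sign-positive subjects and 18 among the 38 spot-sign-negative subjects (a reconstruction for illustration, not counts stated in the abstract). The accuracy measures listed can all be computed from such a table:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic-accuracy measures for a binary predictor (here:
    spot sign present) against a binary outcome (in-hospital mortality)."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "lr_positive": sens / (1.0 - spec),
        "lr_negative": (1.0 - sens) / spec,
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }
```

With tp=10, fp=1, fn=18, tn=20 (totaling the 49 subjects), this gives a specificity near 95% and a PPV near 91%, matching the reported values.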

  1. Estimating residual fault hitting rates by recapture sampling

    NASA Technical Reports Server (NTRS)

    Lee, Larry; Gupta, Rajan

    1988-01-01

    For the recapture debugging design introduced by Nayak (1988) the problem of estimating the hitting rates of the faults remaining in the system is considered. In the context of a conditional likelihood, moment estimators are derived and are shown to be asymptotically normal and fully efficient. Fixed sample properties of the moment estimators are compared, through simulation, with those of the conditional maximum likelihood estimators. Properties of the conditional model are investigated such as the asymptotic distribution of linear functions of the fault hitting frequencies and a representation of the full data vector in terms of a sequence of independent random vectors. It is assumed that the residual hitting rates follow a log linear rate model and that the testing process is truncated when the gaps between the detection of new errors exceed a fixed amount of time.

  2. General Multidecision Theory: Hypothesis Testing and Changepoint Detection with Applications to Homeland Security

    DTIC Science & Technology

    2016-01-19

    ...complete convergence of the LLR to a finite and positive number which can be regarded as the Kullback–Leibler information number. Fourth, we developed a...limiting Kullback–Leibler information numbers. We additionally show that if the local log-likelihood ratios also have independent increments, both the G

  3. Using phrases and document metadata to improve topic modeling of clinical reports.

    PubMed

    Speier, William; Ong, Michael K; Arnold, Corey W

    2016-06-01

    Probabilistic topic models provide an unsupervised method for analyzing unstructured text and have the potential to be integrated into clinical automatic summarization systems. Clinical documents are accompanied by metadata from a patient's medical history and frequently contain multiword concepts that can be valuable for accurately interpreting the included text. While existing methods have attempted to address these problems individually, we present a unified model for free-text clinical documents that integrates contextual patient- and document-level data and discovers multi-word concepts. In the proposed model, phrases are represented by chained n-grams, and a Dirichlet hyperparameter is weighted by both document-level and patient-level context. This method and three other latent Dirichlet allocation models were fit to a large collection of clinical reports. Examples of resulting topics demonstrate the results of the new model, and the quality of the representations is evaluated using empirical log likelihood. The proposed model was able to create informative prior probabilities based on patient and document information, and captured phrases that represented various clinical concepts. The representation using the proposed model had a significantly higher empirical log likelihood than the compared methods. Integrating document metadata and capturing phrases in clinical text greatly improves the topic representation of clinical documents. The resulting clinically informative topics may effectively serve as the basis for an automatic summarization system for clinical reports. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. Analyzing repeated measures semi-continuous data, with application to an alcohol dependence study.

    PubMed

    Liu, Lei; Strawderman, Robert L; Johnson, Bankole A; O'Quigley, John M

    2016-02-01

    Two-part random effects models (Olsen and Schafer [1]; Tooze et al. [2]) have been applied to repeated measures of semi-continuous data, characterized by a mixture of a substantial proportion of zero values and a skewed distribution of positive values. In the original formulation of this model, the natural logarithm of the positive values is assumed to follow a normal distribution with a constant variance parameter. In this article, we review and consider three extensions of this model, allowing the positive values to follow (a) a generalized gamma distribution, (b) a log-skew-normal distribution, and (c) a normal distribution after the Box-Cox transformation. We allow for the possibility of heteroscedasticity. Maximum likelihood estimation is shown to be conveniently implemented in SAS Proc NLMIXED. The performance of the methods is compared through applications to daily drinking records in a secondary data analysis from a randomized controlled trial of topiramate for alcohol dependence treatment. We find that all three models provide a significantly better fit than the log-normal model, and there exists strong evidence for heteroscedasticity. We also compare the three models by the likelihood ratio tests for non-nested hypotheses (Vuong [3]). The results suggest that the generalized gamma distribution provides the best fit, though no statistically significant differences are found in pairwise model comparisons. © The Author(s) 2012.
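
The log-normal variant of the two-part model combines a logistic component for the zeros with a density for the positive values. A per-observation log-likelihood sketch (an illustration of the model structure, not the SAS Proc NLMIXED code):

```python
import math

def two_part_loglik(y, p_zero, mu, sigma):
    """Log-likelihood contribution of one semi-continuous observation y:
    a point mass at zero with probability p_zero; for y > 0, log(y) is
    Normal(mu, sigma**2), i.e. the log-normal special case of the model."""
    if y == 0:
        return math.log(p_zero)
    z = (math.log(y) - mu) / sigma
    log_density = -math.log(y * sigma * math.sqrt(2.0 * math.pi)) - 0.5 * z * z
    return math.log(1.0 - p_zero) + log_density
```

The extensions reviewed in the article swap the log-normal density for a generalized gamma, log-skew-normal, or Box-Cox-normal density, and let sigma depend on covariates to capture heteroscedasticity.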

  5. Assessment of subjective intraocular forward scattering and quality of vision after posterior chamber phakic intraocular lens with a central hole (Hole ICL) implantation.

    PubMed

    Iijima, Ayaka; Shimizu, Kimiya; Yamagishi, Mayumi; Kobashi, Hidenaga; Igarashi, Akihito; Kamiya, Kazutaka

    2016-12-01

    To evaluate the subjective intraocular forward scattering and quality of vision after posterior chamber phakic intraocular lens with a central hole (Hole ICL, STAAR Surgical) implantation. We prospectively examined 29 eyes of 29 consecutive patients (15 men and 14 women; ages, 37.2 ± 8.8 years) undergoing Hole ICL implantation. We assessed the values of the logarithmic straylight value [log(s)] using a straylight meter (C-Quant™, Oculus) preoperatively and 3 months postoperatively. The patients completed a questionnaire detailing symptoms on a quantitative grading scale (National Eye Institute Refractive Error Quality of Life Instrument-42; NEI RQL-42) 3 months postoperatively. We compared the preoperative and postoperative values of the log(s) and evaluated the correlation of these values with patient subjective symptoms. The mean log(s) was not significantly changed, from 1.07 ± 0.20 preoperatively, to 1.06 ± 0.17 postoperatively (Wilcoxon signed-rank test, p = 0.641). There was a significant correlation between the preoperative and postoperative log(s) (Spearman's correlation coefficient r = 0.695, p < 0.001). The postoperative log(s) was significantly associated with the scores of glare in the questionnaire (Spearman's correlation coefficient r = -0.575, p = 0.017). According to our experience, Hole ICL implantation does not induce a significant additional change in the subjective intraocular forward scattering. The symptom of glare after Hole ICL implantation was significantly correlated with the postoperative intraocular forward scattering in relation to the preoperative one. © 2016 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.

  6. Multiple Cognitive Control Effects of Error Likelihood and Conflict

    PubMed Central

    Brown, Joshua W.

    2010-01-01

    Recent work on cognitive control has suggested a variety of performance monitoring functions of the anterior cingulate cortex, such as errors, conflict, error likelihood, and others. Given the variety of monitoring effects, a corresponding variety of control effects on behavior might be expected. This paper explores whether conflict and error likelihood produce distinct cognitive control effects on behavior, as measured by response time. A change signal task (Brown & Braver, 2005) was modified to include conditions of likely errors due to tardy as well as premature responses, in conditions with and without conflict. The results discriminate between competing hypotheses of independent vs. interacting conflict and error likelihood control effects. Specifically, the results suggest that the likelihood of premature vs. tardy response errors can lead to multiple distinct control effects, which are independent of cognitive control effects driven by response conflict. As a whole, the results point to the existence of multiple distinct cognitive control mechanisms and challenge existing models of cognitive control that incorporate only a single control signal. PMID:19030873

  7. Proton adsorption onto alumina: extension of multisite complexation (MUSIC) theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nagashima, K.; Blum, F.D.

    1999-09-01

    The adsorption isotherm of protons onto a commercial γ-alumina sample was determined in aqueous nitric acid with sodium nitrate as a background electrolyte. Three discrete regions could be discerned in the log-log plots of the proton isotherm determined over the solution pH range 5 to 2. The multisite complexation (MUSIC) model was modified to analyze the simultaneous adsorption of protons onto various kinds of surface species.

  8. Evaluation of the artificial membrane permeability of drugs by digital simulation.

    PubMed

    Nakamura, Mayumi; Osakai, Toshiyuki

    2016-08-25

    A digital simulation method has been developed for evaluating the membrane permeability of drugs in the parallel artificial membrane permeation assay (PAMPA). The simulation results have shown that the permeability coefficient (log Ppampa) of drugs increases linearly with their distribution coefficient (log KD,M) into the lipid membrane, i.e., with the hydrophobicity of the drug molecules. However, log Ppampa shows signs of leveling off for highly hydrophobic drugs. Such a dependence of log Ppampa is in harmony with the reported experimental data, and has been well explained in terms of the change in the rate-determining step from diffusion in the membrane to diffusion in the unstirred water layer (UWL) on both sides of the membrane. Additionally, the effects of several factors, including lag time, diffusion coefficient, pH, and pKa, on the permeability coefficient have been well simulated. It has thus been suggested that the proposed method should be promising for in silico evaluation of the membrane permeability of drugs. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. Prediction of distal residue participation in enzyme catalysis

    PubMed Central

    Brodkin, Heather R; DeLateur, Nicholas A; Somarowthu, Srinivas; Mills, Caitlyn L; Novak, Walter R; Beuning, Penny J; Ringe, Dagmar; Ondrechen, Mary Jo

    2015-01-01

    A scoring method for the prediction of catalytically important residues in enzyme structures is presented and used to examine the participation of distal residues in enzyme catalysis. Scores are based on the Partial Order Optimum Likelihood (POOL) machine learning method, using computed electrostatic properties, surface geometric features, and information obtained from the phylogenetic tree as input features. Predictions of distal residue participation in catalysis are compared with experimental kinetics data from the literature on variants of the featured enzymes; some additional kinetics measurements are reported for variants of Pseudomonas putida nitrile hydratase (ppNH) and for Escherichia coli alkaline phosphatase (AP). The multilayer active sites of P. putida nitrile hydratase and of human phosphoglucose isomerase are predicted by the POOL log ZP scores, as is the single-layer active site of P. putida ketosteroid isomerase. The log ZP score cutoff utilized here results in over-prediction of distal residue involvement in E. coli alkaline phosphatase. While fewer experimental data points are available for P. putida mandelate racemase and for human carbonic anhydrase II, the POOL log ZP scores properly predict the previously reported participation of distal residues. PMID:25627867

  10. 76 FR 11243 - Solicitation of Input From Stakeholders To Inform the National Framework for Electronics Stewardship

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-01

    ... a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X- ray... equipment from solid waste landfills in the United States. EPA does, however, control how cathode ray tube... cell phone and computers/laptops or recover valuable resources, such as precious metals, plastics or...

  11. Summer Bookaneers: Sign on with Captain Book. 1990 Florida Summer Library Program.

    ERIC Educational Resources Information Center

    Fiore, Carole D., Comp.; Fine, Jana R., Comp.

    Designed for use by children's librarians in organizing and conducting a summer reading program for children 5 through 12 years of age, this "log book" contains suggestions for activities related to a seafaring theme together with lists of selected materials relevant to the particular activities. Samples of a press release and several…

  12. A Walk to the Well.

    ERIC Educational Resources Information Center

    Weir, Phil

    1994-01-01

    During a walk, an outdoor education teacher reflects on the status of outdoor education in Ottawa (Canada) and importance of maintaining a close relationship with nature. He looks for signs of an old log home site, observes a hawk's flight, discovers remains of a plastic bag in an owl pellet, and realizes that everyone is working on survival. (LP)

  13. A joint frailty-copula model between tumour progression and death for meta-analysis.

    PubMed

    Emura, Takeshi; Nakatochi, Masahiro; Murotani, Kenta; Rondeau, Virginie

    2017-12-01

    Dependent censoring often arises in biomedical studies when time to tumour progression (e.g., relapse of cancer) is censored by an informative terminal event (e.g., death). For meta-analysis combining existing studies, a joint survival model between tumour progression and death has been considered under semicompeting risks, which induces dependence through the study-specific frailty. Our paper here utilizes copulas to generalize the joint frailty model by introducing an additional source of dependence arising from intra-subject association between tumour progression and death. The practical value of the new model is particularly evident for meta-analyses in which only a few covariates are consistently measured across studies and hence there exists residual dependence. The covariate effects are formulated through the Cox proportional hazards model, and the baseline hazards are nonparametrically modeled on a spline basis. The estimator is then obtained by maximizing a penalized log-likelihood function. We also show that the present methodologies are easily modified for competing risks or recurrent event data, and are generalized to accommodate left-truncation. Simulations are performed to examine the performance of the proposed estimator. The method is applied to a meta-analysis for assessing a recently suggested biomarker CXCL12 for survival in ovarian cancer patients. We implement our proposed methods in the R package joint.Cox.

  14. Potential of the octanol-water partition coefficient (logP) to predict the dermal penetration behaviour of amphiphilic compounds in aqueous solutions.

    PubMed

    Korinth, Gintautas; Wellner, Tanja; Schaller, Karl Heinz; Drexler, Hans

    2012-11-23

    Aqueous amphiphilic compounds may exhibit enhanced skin penetration compared with neat compounds. Conventional models do not predict this percutaneous penetration behaviour. We investigated the potential of the octanol-water partition coefficient (logP) to predict dermal fluxes for eight compounds applied neat and as 50% aqueous solutions in diffusion cell experiments using human skin. Data for seven other compounds were accessed from the literature. In total, seven glycol ethers, three alcohols, two glycols, and three other chemicals were considered. Of these 15 compounds, 10 penetrated faster through the skin as aqueous solutions than as neat compounds. The other five compounds exhibited larger fluxes as neat applications. For 13 of the 15 compounds, a consistent relationship was identified between the percutaneous penetration behaviour and the logP. Compared with the neat applications, positive logP values were associated with larger fluxes for eight of the diluted compounds, and negative logP values were associated with smaller fluxes for five of the diluted compounds. Our study demonstrates that decreases or enhancements in dermal penetration upon aqueous dilution can be predicted for many compounds from the sign of logP (i.e., positive or negative). This approach may be suitable as a first approximation in risk assessments of dermal exposure. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
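
The proposed first approximation uses only the sign of logP, so the decision rule can be stated directly (a sketch; the compound names and logP values below are hypothetical, not from the study):

```python
def dilution_effect(log_p):
    """Sign-of-logP rule from the study: for a 50% aqueous dilution,
    positive logP predicts a larger dermal flux than the neat compound,
    and negative logP predicts a smaller flux."""
    return "larger" if log_p > 0 else "smaller"

# Hypothetical screening of candidates by octanol-water logP.
candidates = {"compound A": 0.8, "compound B": -0.9}
predictions = {name: dilution_effect(lp) for name, lp in candidates.items()}
```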

  15. The NetLogger Toolkit V2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gunter, Dan; Lee, Jason; Stoufer, Martin

    2003-03-28

    The NetLogger Toolkit is designed to monitor, under actual operating conditions, the behavior of all the elements of the application-to-application communication path in order to determine exactly where time is spent within a complex system. Using NetLogger, distributed application components are modified to produce timestamped logs of "interesting" events at all the critical points of the distributed system. Events from each component are correlated, which allows one to characterize the performance of all aspects of the system and network in detail. The NetLogger Toolkit itself consists of four components: an API and library of functions to simplify the generation of application-level event logs, a set of tools for collecting and sorting log files, an event archive system, and a tool for visualization and analysis of the log files. In order to instrument an application to produce event logs, the application developer inserts calls to the NetLogger API at all the critical points in the code, then links the application with the NetLogger library. All the tools in the NetLogger Toolkit share a common log format, and assume the existence of accurate and synchronized system clocks. NetLogger messages can be logged using an easy-to-read text-based format based on the IETF-proposed ULM format, or a binary format that can still be used through the same API but that is several times faster and smaller, with performance comparable to or better than binary message formats such as MPI, XDR, SDDF-Binary, and PBIO. The NetLogger binary format is both highly efficient and self-describing, and is thus optimized for the dynamic message construction and parsing of application instrumentation. NetLogger includes an "activation" API that allows NetLogger logging to be turned on, off, or modified by changing an external file. This is useful for activating logging in daemons/services (e.g., the GridFTP server).
The NetLogger reliability API provides the ability to specify backup logging locations and to periodically try to reconnect broken TCP pipes. A typical use for this is to store data on local disk while the network is down. An event archiver can log one or more incoming NetLogger streams to a local disk file (netlogd) or to a mySQL database (netarchd). We have found exploratory, visual analysis of the log event data to be the most useful means of determining the causes of performance anomalies. The NetLogger visualization tool, nlv, has been developed to provide a flexible and interactive graphical representation of system-level and application-level events.

  16. Social support modifies association between forward bending of the trunk and low-back pain: Cross-sectional field study of blue-collar workers.

    PubMed

    Villumsen, Morten; Holtermann, Andreas; Samani, Afshin; Madeleine, Pascal; Jørgensen, Marie Birk

    2016-03-01

    This study aimed to investigate the association between forward bending of the trunk and low-back pain intensity (LBPi) among blue-collar workers in Denmark, as well as whether the level of social support modifies the association. In total, 457 workers were included in the study. Forward bending of ≥ 30° was computed from accelerometer recordings over several consecutive days during work, categorized into long (highest tertile) and short-moderate (remaining tertiles) duration. LBPi was measured on a 0-10 scale and categorized into low (≤ 5) and high (> 5) pain. Self-reported social support was categorized into low, moderate, and high levels. Multi-adjusted logistic regressions estimated the association between forward bending and LBPi and the effect modification by social support. Forward bending and LBPi were not significantly associated, but the association was modified by social support. Workers with low social support and long duration of forward bending had higher likelihood of high LBPi [odds ratio (OR) 2.97, 95% confidence interval (95% CI) 1.11-7.95] compared to workers with high social support and long duration of forward bending. Among workers with low social support, workers with long duration of forward bending had higher likelihood of high LBPi (OR 3.28, 95% CI 0.99-10.90) compared to workers with short-moderate duration of forward bending. Among workers with high social support, workers with long duration of forward bending had reduced likelihood of high LBPi (OR 0.39, 95% CI 0.16-0.95) compared to workers with short-moderate duration of forward bending. Social support modifies the association between objectively measured forward bending and LBPi among blue-collar workers.
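
The odds ratios with 95% confidence intervals reported here are the usual exponentiated logistic-regression quantities. A small illustration (the standard error below is back-calculated from the reported interval; it is not given in the abstract):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Odds ratio and Wald 95% CI from a logistic-regression coefficient."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Reported: OR 2.97 (95% CI 1.11-7.95); se recovered from the CI width.
beta = math.log(2.97)
se = (math.log(7.95) - math.log(1.11)) / (2 * 1.96)
```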

  17. Spoilage and safety characteristics of ground beef packaged in traditional and modified atmosphere packages.

    PubMed

    Brooks, J C; Alvarado, M; Stephens, T P; Kellermeier, J D; Tittor, A W; Miller, M F; Brashears, M M

    2008-02-01

    Two separate studies, one with pathogen-inoculated product and one with noninoculated product, were conducted to determine the safety and spoilage characteristics of modified atmosphere packaging (MAP) and traditional packaging of ground beef patties. Ground beef patties were allotted to five packaging treatments: (i) control (foam tray with film overwrap; traditional), (ii) high-oxygen MAP (80% O2, 20% CO2), (iii) high-oxygen MAP with added rosemary extract, (iv) low-oxygen carbon monoxide MAP (0.4% CO, 30% CO2, 69.6% N2), and (v) low-oxygen carbon monoxide MAP with added rosemary extract. Beef patties were evaluated for changes over time (0, 1, 3, 5, 7, 14, and 21 days) during lighted display. Results indicated that the low-oxygen carbon monoxide gas flush had a stabilizing effect on meat color after the formation of carboxymyoglobin and was effective for preventing the development of surface discoloration. Consumers indicated that beef patties packaged in atmospheres containing carbon monoxide were more likely to smell fresh at 7, 14, and 21 days of display, but the majority would probably not consume these products after 14 days of display because of their odor. MAP suppressed the growth of psychrophilic aerobic bacteria when compared with control packages. Generally, control packages had significantly higher total aerobic bacteria and Lactobacillus counts than did modified atmosphere packages. In the inoculated ground beef (approximately 10(5) CFU/g) in MAP, Escherichia coli O157 populations ranged from 4.51 to 4.73 log CFU/g with no differences among the various packages, but the total E. coli O157:H7 in the ground beef in the control packages was significantly higher at 5.61 log CFU/g after 21 days of storage.
On days 14 and 21, the total Salmonella in the ground beef in control packages was at 5.29 and 5.27 log CFU/g, respectively, which was significantly higher than counts in the modified atmosphere packages (3.99 to 4.31 log CFU/g on day 14 and 3.76 to 4.02 log CFU/g on day 21). Data from these studies indicate that MAP suppresses pathogen growth compared with controls and that spoilage characteristics developed in MAP packages.

  18. Acoustic waveform logging--Advances in theory and application

    USGS Publications Warehouse

    Paillet, F.L.; Cheng, C.H.; Pennington, W.D.

    1992-01-01

    Full-waveform acoustic logging has made significant advances in both theory and application in recent years, and these advances have greatly increased the capability of log analysts to measure the physical properties of formations. Advances in theory provide the analytical tools required to understand the properties of measured seismic waves, and to relate those properties to such quantities as shear and compressional velocity and attenuation, and primary and fracture porosity and permeability of potential reservoir rocks. The theory demonstrates that all parts of recorded waveforms are related to various modes of propagation, even in the case of dipole and quadrupole source logging. However, the theory also indicates that these mode properties can be used to design velocity and attenuation picking schemes, and shows how source frequency spectra can be selected to optimize results in specific applications. Synthetic microseismogram computations are an effective tool in waveform interpretation theory; they demonstrate how shear arrival picks and mode attenuation can be used to compute shear velocity and intrinsic attenuation, and formation permeability for monopole, dipole and quadrupole sources. Array processing of multi-receiver data offers the opportunity to apply even more sophisticated analysis techniques. Synthetic microseismogram data is used to illustrate the application of the maximum-likelihood method, semblance cross-correlation, and Prony's method analysis techniques to determine seismic velocities and attenuations. The interpretation of acoustic waveform logs is illustrated by reviews of various practical applications, including synthetic seismogram generation, lithology determination, estimation of geomechanical properties in situ, permeability estimation, and design of hydraulic fracture operations.
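
Of the array-processing techniques mentioned, semblance is the simplest: the ratio of stacked energy to total energy across receivers at a trial alignment. A minimal sketch (the traces are hypothetical lists of samples, assumed already time-shifted by the trial slowness):

```python
def semblance(traces):
    """Semblance coefficient over an array of aligned traces:
    1.0 for perfectly coherent traces, near 0 for incoherent ones."""
    n = len(traces)
    stacked_energy = sum(sum(col) ** 2 for col in zip(*traces))
    total_energy = n * sum(x * x for trace in traces for x in trace)
    return stacked_energy / total_energy if total_energy else 0.0
```

Sweeping this over candidate slownesses and picking the maximum yields the velocity estimates that the abstract compares against maximum-likelihood and Prony's-method picks.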

  19. Predicting clicks of PubMed articles.

    PubMed

    Mao, Yuqing; Lu, Zhiyong

    2013-01-01

    Predicting the popularity or access usage of an article has the potential to improve the quality of PubMed searches. We can model the click trend of each article as its access changes over time by mining the PubMed query logs, which contain the previous access history for all articles. In this article, we examine the access patterns produced by PubMed users in two years (July 2009 to July 2011). We explore the time series of accesses for each article in the query logs, model the trends with regression approaches, and subsequently use the models for prediction. We show that the click trends of PubMed articles are best fitted with a log-normal regression model. This model allows the number of accesses an article receives and the time since it first becomes available in PubMed to be related via quadratic and logistic functions, with the model parameters to be estimated via maximum likelihood. Our experiments predicting the number of accesses for an article based on its past usage demonstrate that the mean absolute error and mean absolute percentage error of our model are 4.0% and 8.1% lower than the power-law regression model, respectively. The log-normal distribution is also shown to perform significantly better than a previous prediction method based on a human memory theory in cognitive science. This work warrants further investigation on the utility of such a log-normal regression approach towards improving information access in PubMed.
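
At the core of a log-normal fit, the ML estimates are simply the mean and (1/n) standard deviation of the log-values; the paper's full model additionally lets the log-normal parameters vary with article age via quadratic and logistic functions. A sketch of the closed-form core (with hypothetical access counts, not the paper's data):

```python
import math

def lognormal_mle(samples):
    """Maximum-likelihood estimates (mu, sigma) for a log-normal model:
    mu and sigma are the mean and ML (divide-by-n) std of log(samples)."""
    logs = [math.log(x) for x in samples]
    n = len(logs)
    mu = sum(logs) / n
    sigma = math.sqrt(sum((v - mu) ** 2 for v in logs) / n)
    return mu, sigma
```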

  20. Predicting clicks of PubMed articles

    PubMed Central

    Mao, Yuqing; Lu, Zhiyong

    2013-01-01

    Predicting the popularity or access usage of an article has the potential to improve the quality of PubMed searches. We can model the click trend of each article as its access changes over time by mining the PubMed query logs, which contain the previous access history for all articles. In this article, we examine the access patterns produced by PubMed users in two years (July 2009 to July 2011). We explore the time series of accesses for each article in the query logs, model the trends with regression approaches, and subsequently use the models for prediction. We show that the click trends of PubMed articles are best fitted with a log-normal regression model. This model allows the number of accesses an article receives and the time since it first becomes available in PubMed to be related via quadratic and logistic functions, with the model parameters to be estimated via maximum likelihood. Our experiments predicting the number of accesses for an article based on its past usage demonstrate that the mean absolute error and mean absolute percentage error of our model are 4.0% and 8.1% lower than the power-law regression model, respectively. The log-normal distribution is also shown to perform significantly better than a previous prediction method based on a human memory theory in cognitive science. This work warrants further investigation on the utility of such a log-normal regression approach towards improving information access in PubMed. PMID:24551386

  1. Testing multiple statistical hypotheses resulted in spurious associations: a study of astrological signs and health.

    PubMed

    Austin, Peter C; Mamdani, Muhammad M; Juurlink, David N; Hux, Janet E

    2006-09-01

    To illustrate how multiple hypothesis testing can produce associations with no clinical plausibility. We conducted a study of all 10,674,945 residents of Ontario aged between 18 and 100 years in 2000. Residents were randomly assigned to equally sized derivation and validation cohorts and classified according to their astrological sign. Using the derivation cohort, we searched through 223 of the most common diagnoses for hospitalization until we identified two for which subjects born under one astrological sign had a significantly higher probability of hospitalization compared to subjects born under the remaining signs combined (P<0.05). We tested these 24 associations in the independent validation cohort. Residents born under Leo had a higher probability of gastrointestinal hemorrhage (P=0.0447), while Sagittarians had a higher probability of humerus fracture (P=0.0123) compared to all other signs combined. After adjusting the significance level to account for multiple comparisons, none of the identified associations remained significant in either the derivation or validation cohort. Our analyses illustrate how the testing of multiple, non-prespecified hypotheses increases the likelihood of detecting implausible associations. Our findings have important implications for the analysis and interpretation of clinical studies.
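
    The multiple-comparison adjustment mentioned in the record can be sketched with a simple Bonferroni rule (a minimal illustration; the study's exact adjustment method is not stated in the abstract):

```python
def bonferroni_significant(p_values, alpha=0.05):
    """Compare each p-value against alpha divided by the number of
    tests performed; only those below the stricter per-test
    threshold count as significant."""
    threshold = alpha / len(p_values)
    return [p <= threshold for p in p_values]
```

    With 24 tested associations the per-test threshold becomes 0.05/24 ≈ 0.0021, so validation p-values of 0.0447 and 0.0123 no longer reach significance, consistent with the record's conclusion.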

  2. Intrastromal femtosecond laser surgical compensation of presbyopia with six intrastromal ring cuts: 3-year results.

    PubMed

    Khoramnia, Ramin; Fitting, Anna; Rabsilber, Tanja M; Thomas, Bettina C; Auffarth, Gerd U; Holzer, Mike P

    2015-02-01

    To assess over a 36-month period functional results of the modified INTRACOR femtosecond laser-based intrastromal procedure to treat presbyopia. 20 eyes of 20 presbyopic patients with mild hyperopia were included. The INTRACOR procedure with a modified pattern (six concentric intrastromal ring cuts) was performed using the FEMTEC femtosecond laser (Bausch+Lomb/Technolas Perfect Vision, Munich, Germany). Patients were also randomly divided into three subgroups to compare the effect of three different small inner ring diameters (1.8/2.0/2.2 mm (Groups A/B/C)). Follow-up examinations were performed at 1, 3, 6, 12, 24 and 36 months, and included near and distance visual acuity tests, slit-lamp examinations and corneal topography. Median uncorrected near visual acuity (UNVA) increased from 0.7/0.7/0.7 logMAR (Groups A/B/C) to -0.1/0.1/0.1 logMAR 36 months after surgery. Uncorrected distance visual acuity changed slightly from 0.1/0.2/0.1 logMAR to 0.2/0.3/0.1 logMAR. Losses of two lines of binocular corrected distance visual acuity (CDVA) were noted in 0/25/0% of eyes. Median spherical equivalent changed from 0.75/0.75/0.75 dioptres to -0.19/0.13/-0.19 dioptres. Overall patient satisfaction with the procedure was 80%. INTRACOR with a modified pattern improved UNVA in all patients over a 36-month follow-up period. The possibility of reduced CDVA underlines the need for careful patient selection. NCT00928122. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  3. Mail LOG: Program operating instructions

    NASA Technical Reports Server (NTRS)

    Harris, D. K.

    1979-01-01

    The operating instructions for the software package, MAIL LOG, developed for the Scout Project Automatic Data System, SPADS, are provided. The program is written in FORTRAN for the PRIME 300 computer system. The MAIL LOG program has the following four modes of operation: (1) INPUT - putting new records into the data base; (2) REVISE - changing or modifying existing records in the data base; (3) SEARCH - finding special records existing in the data base; and (4) ARCHIVE - storing or putting away existing records in the data base. The output includes special printouts of records in the data base and results from the INPUT and SEARCH modes. The MAIL LOG data base consists of three main subfiles: Incoming and outgoing mail correspondence; Design Information Releases and Reports; and Drawings and Engineering orders.

  4. Modified Maximum Likelihood Estimation Method for Completely Separated and Quasi-Completely Separated Data for a Dose-Response Model

    DTIC Science & Technology

    2015-08-01

    [Garbled report front matter. Recoverable details: ECBC-TN-068; Kyong H. Park and Steven J. Lagan; Research and Technology Directorate; August 2015; approved for public release. Visible reference fragments cite McCullagh, P.; Nelder, J.A. Generalized Linear Models, 2nd ed.; Chapman and Hall: London, 1989; and Johnston, J. Econometric Methods, 3rd ed.; McGraw-Hill.]

  5. Effects of selective logging on bat communities in the southeastern Amazon.

    PubMed

    Peters, Sandra L; Malcolm, Jay R; Zimmerman, Barbara L

    2006-10-01

    Although extensive areas of tropical forest are selectively logged each year, the responses of bat communities to this form of disturbance have rarely been examined. Our objectives were to (1) compare bat abundance, species composition, and feeding guild structure between unlogged and low-intensity selectively logged (1-4 logged stems/ha) sampling grids in the southeastern Amazon and (2) examine correlations between logging-induced changes in bat communities and forest structure. We captured bats in understory and canopy mist nets set in five 1-ha study grids in both logged and unlogged forest. We captured 996 individuals, representing 5 families, 32 genera, and 49 species. Abundances of nectarivorous and frugivorous taxa (Glossophaginae, Lonchophyllinae, Stenodermatinae, and Carolliinae) were higher at logged sites, where canopy openness and understory foliage density were greatest. In contrast, insectivorous and omnivorous species (Emballonuridae, Mormoopidae, Phyllostominae, and Vespertilionidae) were more abundant in unlogged sites, where canopy foliage density and variability in the understory stratum were greatest. Multivariate analyses indicated that understory bat species composition differed strongly between logged and unlogged sites but provided little evidence of logging effects for the canopy fauna. Different responses among feeding guilds and taxonomic groups appeared to be related to foraging and echolocation strategies and to changes in canopy cover and understory foliage densities. Our results suggest that even low-intensity logging modifies habitat structure, leading to changes in bat species composition.

  6. Greenery in the university environment: Students’ preferences and perceived restoration likelihood

    PubMed Central

    2018-01-01

    A large body of evidence shows that interaction with greenery can be beneficial for human stress reduction, emotional states, and improved cognitive function. It can, therefore, be expected that university students might benefit from greenery in the university environment. Before investing in real-life interventions in a university environment, it is necessary to first explore students’ perceptions of greenery in the university environment. This study examined (1) preference for university indoor and outdoor spaces with and without greenery; (2) perceived restoration likelihood of university outdoor spaces with and without greenery; and (3) whether preference and perceived restoration likelihood ratings were modified by demographic characteristics or connectedness to nature in Dutch university students (N = 722). Digital photographic stimuli represented four university spaces (lecture hall, classroom, study area, university outdoor space). For each of the three indoor spaces there were four or five stimuli conditions: (1) the standard design; (2) the standard design with a colorful poster; (3) the standard design with a nature poster; (4) the standard design with a green wall; and (5) the standard design with a green wall plus interior plants. The university outdoor space included: (1) the standard design; (2) the standard design with seating; (3) the standard design with colorful artifacts; (4) the standard design with green elements; and (5) the standard design with extensive greenery. Multi-level analyses showed that students gave higher preference ratings to the indoor spaces with a nature poster, a green wall, or a green wall plus interior plants than to the standard designs and the designs with the colorful posters. Students also rated preference and perceived restoration likelihood of the outdoor spaces that included greenery higher than those without. Preference and perceived restoration likelihood were not modified by demographic characteristics, but students with strong connectedness to nature rated preference and perceived restoration likelihood overall higher than students with weak connectedness to nature. The findings suggest that students would appreciate the integration of greenery in the university environment. PMID:29447184

  7. Greenery in the university environment: Students' preferences and perceived restoration likelihood.

    PubMed

    van den Bogerd, Nicole; Dijkstra, S Coosje; Seidell, Jacob C; Maas, Jolanda

    2018-01-01

    A large body of evidence shows that interaction with greenery can be beneficial for human stress reduction, emotional states, and improved cognitive function. It can, therefore, be expected that university students might benefit from greenery in the university environment. Before investing in real-life interventions in a university environment, it is necessary to first explore students' perceptions of greenery in the university environment. This study examined (1) preference for university indoor and outdoor spaces with and without greenery; (2) perceived restoration likelihood of university outdoor spaces with and without greenery; and (3) whether preference and perceived restoration likelihood ratings were modified by demographic characteristics or connectedness to nature in Dutch university students (N = 722). Digital photographic stimuli represented four university spaces (lecture hall, classroom, study area, university outdoor space). For each of the three indoor spaces there were four or five stimuli conditions: (1) the standard design; (2) the standard design with a colorful poster; (3) the standard design with a nature poster; (4) the standard design with a green wall; and (5) the standard design with a green wall plus interior plants. The university outdoor space included: (1) the standard design; (2) the standard design with seating; (3) the standard design with colorful artifacts; (4) the standard design with green elements; and (5) the standard design with extensive greenery. Multi-level analyses showed that students gave higher preference ratings to the indoor spaces with a nature poster, a green wall, or a green wall plus interior plants than to the standard designs and the designs with the colorful posters. Students also rated preference and perceived restoration likelihood of the outdoor spaces that included greenery higher than those without. Preference and perceived restoration likelihood were not modified by demographic characteristics, but students with strong connectedness to nature rated preference and perceived restoration likelihood overall higher than students with weak connectedness to nature. The findings suggest that students would appreciate the integration of greenery in the university environment.

  8. Adhesion and removal kinetics of Bacillus cereus biofilms on Ni-PTFE modified stainless steel.

    PubMed

    Huang, Kang; McLandsborough, Lynne A; Goddard, Julie M

    2016-01-01

    Biofilm control remains a challenge to food safety. A well-studied non-fouling coating involves codeposition of polytetrafluoroethylene (PTFE) during electroless plating. This coating has been reported to reduce foulant build-up during pasteurization, but opportunities remain in demonstrating its efficacy in inhibiting biofilm formation. Herein, the initial adhesion, biofilm formation, and removal kinetics of Bacillus cereus on Ni-PTFE-modified stainless steel (SS) are characterized. Coatings lowered the surface energy of SS and reduced biofilm formation by > 2 log CFU cm⁻². Characterization of the kinetics of biofilm removal during cleaning demonstrated improved cleanability on the Ni-PTFE coated steel. There was no evidence of biofilm after cleaning by either solution on the Ni-PTFE coated steel, whereas more than 3 log and 1 log CFU cm⁻² of bacteria remained on the native steel after cleaning with water and an alkaline cleaner, respectively. This work demonstrates the potential application of Ni-PTFE non-fouling coatings on SS to improve food safety by reducing biofilm formation and improving the cleaning efficiency of food processing equipment.

  9. A scaling transformation for classifier output based on likelihood ratio: Applications to a CAD workstation for diagnosis of breast cancer

    PubMed Central

    Horsch, Karla; Pesce, Lorenzo L.; Giger, Maryellen L.; Metz, Charles E.; Jiang, Yulei

    2012-01-01

    Purpose: The authors developed scaling methods that monotonically transform the output of one classifier to the “scale” of another. Such transformations affect the distribution of classifier output while leaving the ROC curve unchanged. In particular, they investigated transformations between radiologists and computer classifiers, with the goal of addressing the problem of comparing and interpreting case-specific values of output from two classifiers. Methods: Using both simulated and radiologists’ rating data of breast imaging cases, the authors investigated a likelihood-ratio-scaling transformation, based on “matching” classifier likelihood ratios. For comparison, three other scaling transformations were investigated that were based on matching classifier true positive fraction, false positive fraction, or cumulative distribution function, respectively. The authors explored modifying the computer output to reflect the scale of the radiologist, as well as modifying the radiologist’s ratings to reflect the scale of the computer. They also evaluated how dataset size affects the transformations. Results: When ROC curves of two classifiers differed substantially, the four transformations were found to be quite different. The likelihood-ratio scaling transformation was found to vary widely from radiologist to radiologist. Similar results were found for the other transformations. Our simulations explored the effect of database sizes on the accuracy of the estimation of our scaling transformations. Conclusions: The likelihood-ratio-scaling transformation that the authors have developed and evaluated was shown to be capable of transforming computer and radiologist outputs to a common scale reliably, thereby allowing the comparison of the computer and radiologist outputs on the basis of a clinically relevant statistic. PMID:22559651

  10. The l_z(p)* Person-Fit Statistic in an Unfolding Model Context.

    PubMed

    Tendeiro, Jorge N

    2017-01-01

    Although person-fit analysis has a long-standing tradition within item response theory, it has been applied in combination with dominance response models almost exclusively. In this article, a popular log likelihood-based parametric person-fit statistic under the framework of the generalized graded unfolding model is used. Results from a simulation study indicate that the person-fit statistic performed relatively well in detecting midpoint response style patterns and not so well in detecting extreme response style patterns.
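
    The statistic used in the record is defined under the generalized graded unfolding model; as a minimal illustration of the underlying idea, the sketch below implements the classical standardized log-likelihood person-fit statistic l_z for a dichotomous 2PL model. The unfolding-model version replaces the item response function; all names here are illustrative.

```python
import math

def two_pl(theta, a, b):
    """2PL item response function: probability of a correct response."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def lz(responses, theta, disc, diff):
    """Standardized log-likelihood person-fit statistic for a
    dichotomous IRT model. Large negative values signal response
    patterns that are unlikely under the fitted model."""
    loglik = expected = variance = 0.0
    for u, a, b in zip(responses, disc, diff):
        p = two_pl(theta, a, b)
        q = 1.0 - p
        loglik += u * math.log(p) + (1 - u) * math.log(q)
        expected += p * math.log(p) + q * math.log(q)
        variance += p * q * math.log(p / q) ** 2
    return (loglik - expected) / math.sqrt(variance)
```

    A Guttman-consistent pattern (easy items correct, hard items wrong) scores higher than its reversal, which is the kind of aberrance, such as extreme or midpoint response styles, that person-fit analysis targets.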

  11. Selenium- or vitamin E-related gene variants, interaction with supplementation, and risk of high-grade prostate cancer in SELECT

    PubMed Central

    Chan, June M.; Darke, Amy K.; Penney, Kathryn L.; Tangen, Catherine M.; Goodman, Phyllis J.; Lee, Gwo-Shu Mary; Sun, Tong; Peisch, Sam; Tinianow, Alex M.; Rae, James M.; Klein, Eric A.; Thompson, Ian M.

    2016-01-01

    Background Epidemiological studies and secondary analyses of randomized trials supported the hypothesis that selenium and vitamin E lower prostate cancer risk. However, the Selenium and Vitamin E Cancer Prevention Trial (SELECT) showed no benefit of either supplement. Genetic variants involved in selenium or vitamin E metabolism or transport may underlie the complex associations of selenium and vitamin E. Methods We undertook a case-cohort study of SELECT participants randomized to placebo, selenium or vitamin E. The subcohort included 1,434 men; our primary outcome was high-grade prostate cancer (N=278 cases, Gleason 7 or higher cancer). We used weighted Cox regression to examine the association between SNPs and high-grade prostate cancer risk. To assess effect modification, we created interaction terms between randomization arm and genotype and calculated log likelihood statistics. Results We noted statistically significant (p<0.05) interactions between selenium assignment, SNPs in CAT, SOD2, PRDX6, SOD3, and TXNRD2 and high-grade prostate cancer risk. Statistically significant SNPs that modified the association of vitamin E assignment and high-grade prostate cancer included SEC14L2, SOD1, and TTPA. In the placebo arm, several SNPs, hypothesized to interact with supplement assignment and risk of high-grade prostate cancer, were also directly associated with outcome. Conclusion Variants in selenium and vitamin E metabolism/transport genes may influence risk of overall and high-grade prostate cancer, and may modify an individual man’s response to vitamin E or selenium supplementation with regards to these risks. Impact The effect of selenium or vitamin E supplementation on high-grade prostate cancer risk may vary by genotype. PMID:27197287
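
    The interaction tests described in this record rest on comparing log likelihoods of nested models. A minimal sketch for a single added interaction coefficient (one degree of freedom) follows; the study's weighted Cox models are not reproduced here, and the log-likelihood values in the example are illustrative.

```python
import math

def lrt_pvalue_1df(loglik_full, loglik_reduced):
    """Likelihood-ratio test for one extra parameter (e.g. a single
    genotype-by-treatment interaction term): the deviance
    2*(ll_full - ll_reduced) is referred to chi-square with 1 df."""
    deviance = 2.0 * (loglik_full - loglik_reduced)
    # Survival function of chi-square(1): P(X > x) = erfc(sqrt(x/2)).
    p_value = math.erfc(math.sqrt(deviance / 2.0))
    return deviance, p_value
```

    The deviance 3.841 recovers the familiar p = 0.05 cutoff for one degree of freedom; interactions with more parameters would need the chi-square tail for the corresponding df.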

  12. 47 CFR 73.3550 - Requests for new or modified call sign assignments.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... RADIO SERVICES RADIO BROADCAST SERVICES Rules Applicable to All Broadcast Stations § 73.3550 Requests... for radio and television broadcast stations shall be made via the FCC's on-line call sign reservation and authorization system accessible through the Internet's World Wide Web by specifying http://www.fcc...

  13. 47 CFR 73.3550 - Requests for new or modified call sign assignments.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... RADIO SERVICES RADIO BROADCAST SERVICES Rules Applicable to All Broadcast Stations § 73.3550 Requests... for radio and television broadcast stations shall be made via the FCC's on-line call sign reservation and authorization system accessible through the Internet's World Wide Web by specifying http://www.fcc...

  14. 47 CFR 73.3550 - Requests for new or modified call sign assignments.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... RADIO SERVICES RADIO BROADCAST SERVICES Rules Applicable to All Broadcast Stations § 73.3550 Requests... for radio and television broadcast stations shall be made via the FCC's on-line call sign reservation and authorization system accessible through the Internet's World Wide Web by specifying http://www.fcc...

  15. 47 CFR 73.3550 - Requests for new or modified call sign assignments.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... RADIO SERVICES RADIO BROADCAST SERVICES Rules Applicable to All Broadcast Stations § 73.3550 Requests... for radio and television broadcast stations shall be made via the FCC's on-line call sign reservation and authorization system accessible through the Internet's World Wide Web by specifying http://www.fcc...

  16. Safety performance testing of a modified Oregon multidirectional slip-base sign support : FOIL test numbers 98F002 and 98F004

    DOT National Transportation Integrated Search

    1997-07-01

    This project evaluated the effectiveness of symbol traffic signs for young, middle-aged and elderly drivers. Daytime legibility distance and comprehension of 85 symbols in the Manual on Uniform Traffic Control Devices (MUTCD) were measured. Legibilit...

  17. The Shetland Islands scrapie monitoring and control programme: analysis of the clinical data collected from 772 scrapie suspects 1985-1997.

    PubMed

    Cockcroft, P D; Clark, A M

    2006-02-01

    There were 574 scrapie positive suspects (histopathological scrapie lesions present) and 198 scrapie negative suspects (histopathological scrapie lesions absent). The greatest number of scrapie cases was recorded in sheep of 2, 3 and 4 years of age, which represented 17%, 36% and 23% of the scrapie positive suspects, respectively. The sign sensitivities and specificities for the ten recorded signs were, respectively: pruritus (62%, 42%), ataxia (23%, 74%), hyperaesthesia (32%, 74%), wool loss (25%, 73%), fleece discolouration (29%, 85%), bruxism (23%, 69%), nibbling reflex (17%, 58%), head rubbing (47%, 78%), poll rubbing (25%, 83%). These single signs had poor discriminatory values with likelihood ratios close to one (range 0.89-1.21); combinations of the five signs pruritus, wool loss, ataxia, hyperaesthesia and emaciation were more discriminatory (range 0.30-4.3). This study covered a time period when bovine spongiform encephalopathy (BSE) might have been introduced into the sheep population on the Shetland Islands via contaminated feed. No temporal changes could be detected in the age structure of the affected animals.
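
    The likelihood ratios quoted above follow directly from each sign's sensitivity and specificity; a minimal sketch of the generic formulas (not code from the study):

```python
def likelihood_ratios(sensitivity, specificity):
    """Positive and negative likelihood ratios of a clinical sign.
    LR+ = sens / (1 - spec); LR- = (1 - sens) / spec."""
    lr_pos = sensitivity / (1.0 - specificity)
    lr_neg = (1.0 - sensitivity) / specificity
    return lr_pos, lr_neg
```

    For pruritus (sensitivity 62%, specificity 42%) this gives LR+ ≈ 1.07, inside the 0.89-1.21 range the record reports and close to the uninformative value of 1, which is why single signs discriminate poorly.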

  18. All-optical negabinary adders using Mach-Zehnder interferometer

    NASA Astrophysics Data System (ADS)

    Cherri, A. K.

    2011-02-01

    In contrast to optoelectronic designs, all-optical adders are proposed in which all-optical signals represent both the input numbers and the control signals. The adders use the negabinary modified signed-digit number representation (an extension of the negabinary number system) to represent the input digits. The ultra-fast operation of the designed circuits derives from the ultra-fast all-optical switching of the semiconductor optical amplifier-based Mach-Zehnder interferometer (SOA-MZI). Furthermore, a two-bit-per-digit binary encoding scheme is employed to represent the trinary values of the negabinary modified signed digits.

  19. Modified signed-digit trinary addition using synthetic wavelet filter

    NASA Astrophysics Data System (ADS)

    Iftekharuddin, K. M.; Razzaque, M. A.

    2000-09-01

    The modified signed-digit (MSD) number system has been a topic of interest because it allows parallel, carry-free addition of two numbers in digital optical computing. In this paper, a harmonic wavelet joint transform (HWJT)-based correlation technique is introduced for the optical implementation of an MSD trinary adder. Carry-propagation-free addition of MSD trinary numerals is demonstrated using a synthetic HWJT correlator model. It is also shown that the proposed synthetic-wavelet-filter-based correlator performs well in logic processing. Simulation results are presented to validate the performance of the proposed technique.

  20. Occurrence of potentially pathogenic Vibrio in oysters (Crassostrea gigas) and waters from bivalve mollusk cultivations in the South Bay of Santa Catarina.

    PubMed

    Ramos, Roberta Juliano; Miotto, Letícia Adélia; Miotto, Marília; Silveira Junior, Nelson; Cirolini, Andréia; Silva, Helen Silvestre da; Rodrigues, Dália dos Prazeres; Vieira, Cleide Rosana Werneck

    2014-01-01

    This research aimed to identify and quantify potentially pathogenic Vibrio from different cultivations of bivalve shellfish in the State of Santa Catarina, Brazil, and water regions in the South Bay, as well as correlate the incidence of these microorganisms with the physicochemical parameters of marine waters. Between October 2008 and March 2009, 60 oyster and seawater samples were collected from six regions of bivalve mollusk cultivation, and these samples were submitted for Vibrio counts. Twenty-nine (48.3%) oyster samples were revealed to be contaminated with one or more Vibrio species. The Vibrio parahaemolyticus and Vibrio vulnificus counts in the samples ranged from < 0.5 log10 Most Probable Number (MPN) g⁻¹ to 2.3 log10 MPN g⁻¹ oyster and from < 0.5 log10 MPN g⁻¹ to 2.1 log10 MPN g⁻¹ oyster, respectively. Of the 60 seawater samples analyzed, 44 (73.3%) showed signs of contamination with one or more Vibrio species. The counts of V. parahaemolyticus and V. vulnificus in the samples ranged from < 0.3 log10 MPN·(100 mL)⁻¹ to 1.7 log10 MPN·(100 mL)⁻¹ seawater and from < 0.3 log10 MPN·(100 mL)⁻¹ to 2.0 log10 MPN·(100 mL)⁻¹ seawater, respectively. A positive correlation between V. vulnificus counts and the seawater temperature as well as a negative correlation between the V. parahaemolyticus counts and salinity were observed. The results suggest the need to implement strategies to prevent Vibrio diseases from being transmitted by the consumption of contaminated bivalve shellfish.

  1. User Accounts | High-Performance Computing | NREL

    Science.gov Websites

    Information for NREL high-performance computing users: how to request an HPC user account (via a DocuSign request form), user account policies, account passwords, and first-time login help.

  2. 77 FR 5091 - Certain New Chemicals; Receipt and Status Information

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-01

    ... identification, pass through a metal detector, and sign the EPA visitor log. All visitor bags are processed through an X-ray machine and subject to search. Visitors will be provided an EPA/DC badge that must be... clear coatings for wood, plastic and metal. P-12-0055 11/17/2011 02/14/2012 CBI (G) Foam stabilizer and...

  3. USA PATRIOT Improvement and Reauthorization Act of 2005 (H.R. 3199): A Brief Look

    DTIC Science & Technology

    2005-12-09

    sales of ephedrine, pseudoephedrine and phenylpropanolamine (EPP) products, 21 U.S.C. 830(d); (b) requires that the products be available only "behind the...counter" and that purchasers of products containing more than 60 mg of pseudoephedrine present a photo Id and sign a log book, 21 U.S.C. 830(e); (c

  4. Reality check: Shedding new light on the restoration needs of mixed-conifer forests

    Treesearch

    Marie Oliver; Thomas Spies; Andrew. Merschel

    2014-01-01

    Until recently, scientific understanding of the history and ecology of the Pacific Northwest's mixed-conifer forests east of the Cascade Range was minimal. As a result, forest managers have had limited ability to restore the health of publicly owned forests that show signs of acute stress caused by insects, disease, grazing, logging, and wildfire. A...

  5. Likelihood ratios for glaucoma diagnosis using spectral-domain optical coherence tomography.

    PubMed

    Lisboa, Renato; Mansouri, Kaweh; Zangwill, Linda M; Weinreb, Robert N; Medeiros, Felipe A

    2013-11-01

    To present a methodology for calculating likelihood ratios for glaucoma diagnosis for continuous retinal nerve fiber layer (RNFL) thickness measurements from spectral-domain optical coherence tomography (spectral-domain OCT). Observational cohort study. A total of 262 eyes of 187 patients with glaucoma and 190 eyes of 100 control subjects were included in the study. Subjects were recruited from the Diagnostic Innovations Glaucoma Study. Eyes with preperimetric and perimetric glaucomatous damage were included in the glaucoma group. The control group was composed of healthy eyes with normal visual fields from subjects recruited from the general population. All eyes underwent RNFL imaging with Spectralis spectral-domain OCT. Likelihood ratios for glaucoma diagnosis were estimated for specific global RNFL thickness measurements using a methodology based on estimating the tangents to the receiver operating characteristic (ROC) curve. Likelihood ratios could be determined for continuous values of average RNFL thickness. Average RNFL thickness values lower than 86 μm were associated with positive likelihood ratios (ie, likelihood ratios greater than 1), whereas RNFL thickness values higher than 86 μm were associated with negative likelihood ratios (ie, likelihood ratios smaller than 1). A modified Fagan nomogram was provided to assist calculation of posttest probability of disease from the calculated likelihood ratios and pretest probability of disease. The methodology allowed calculation of likelihood ratios for specific RNFL thickness values. By avoiding arbitrary categorization of test results, it potentially allows for an improved integration of test results into diagnostic clinical decision making. Copyright © 2013. Published by Elsevier Inc.
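
    The Fagan-nomogram step mentioned above is just Bayes' rule on the odds scale; a minimal sketch follows (the pretest probability and likelihood-ratio values in the example are illustrative, not from the study):

```python
def posttest_probability(pretest_prob, likelihood_ratio):
    """What the Fagan nomogram computes graphically:
    posttest odds = pretest odds * likelihood ratio,
    then convert the odds back to a probability."""
    odds = pretest_prob / (1.0 - pretest_prob) * likelihood_ratio
    return odds / (1.0 + odds)
```

    For instance, a 10% pretest probability combined with a positive likelihood ratio of 5 yields a posttest probability of about 36%, while a likelihood ratio of 1 (the value the record associates with an RNFL thickness of 86 μm) leaves the pretest probability unchanged.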

  6. Clinical signs suggestive of pharyngeal dysphagia in preschool children with cerebral palsy.

    PubMed

    Benfer, Katherine A; Weir, Kelly A; Bell, Kristie L; Ware, Robert S; Davies, Peter S W; Boyd, Roslyn N

    2015-03-01

    This study aimed to determine the discriminative validity, reproducibility, and prevalence of clinical signs suggestive of pharyngeal dysphagia according to gross motor function in children with cerebral palsy (CP). It was a cross-sectional population-based study of 130 children diagnosed with CP at 18-36 months (mean=27.4, 81 males) and 40 children with typical development (TD, mean=26.2, 18 males). Sixteen signs suggestive of pharyngeal phase impairment were directly observed in a videoed mealtime by a speech pathologist, and reported by parents on a questionnaire. Gross motor function was classified using the Gross Motor Function Classification System. The study found that 67.7% of children had clinical signs, and this increased with poorer gross motor function (OR=1.7, p<0.01). Parents reported clinical signs in 46.2% of children, with 60% agreement with direct clinical mealtime assessment (kappa=0.2, p<0.01). The most common signs on direct assessment were coughing (44.7%), multiple swallows (25.2%), gurgly voice (20.3%), wet breathing (18.7%) and gagging (11.4%). 37.5% of children with TD had clinical signs, mostly observed on fluids. Dysphagia cut-points were modified to exclude a single cough on fluids, with a modified prevalence estimate proposed as 50.8%. Clinical signs suggestive of pharyngeal dysphagia are common in children with CP, even those with ambulatory CP. Parent-report on 16 specific signs remains a feasible screening method. While coughing was consistently identified by clinicians, it may not reflect children's regular performance, and was not sufficiently discriminative in children aged 18-36 months. Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.

  7. Evaluation of modified work zone traffic control devices at business accesses

    DOT National Transportation Integrated Search

    2001-01-01

    Modified work zone traffic control devices at business accesses were evaluated on two Oregon Department of Transportation (ODOT) projects in 1999 and 2000. On one project, blue "Temporary Business Access" signs were used at business accesses d...

  8. Generalized look-ahead number conversion from signed digit to complement representation with optical logic operations

    NASA Astrophysics Data System (ADS)

    Qian, Feng; Li, Guoqiang

    2001-12-01

    In this paper a generalized look-ahead logic algorithm for number conversion from signed-digit to its complement representation is developed. By properly encoding the signed digits, all the operations are performed by binary logic, and unified logical expressions can be obtained for conversion from modified-signed-digit (MSD) to 2's complement, trinary signed-digit (TSD) to 3's complement, and quaternary signed-digit (QSD) to 4's complement. For optical implementation, a parallel logical array module using electron-trapping device is employed, which is suitable for realizing complex logic functions in the form of sum-of-product. The proposed algorithm and architecture are compatible with a general-purpose optoelectronic computing system.
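
    For the binary (MSD) case, the arithmetic target of such a converter can be sketched in ordinary software. This toy function (the name and the width parameter are ours, and it converts directly rather than with the paper's look-ahead optical logic) shows what the conversion must produce:

```python
def msd_to_twos_complement(digits, width=8):
    """Convert a modified signed-digit (MSD) number to a two's-complement
    bit string. digits[i] in {-1, 0, 1} is the coefficient of 2**i."""
    value = sum(d * (1 << i) for i, d in enumerate(digits))
    # Masking to the word width yields the two's-complement encoding.
    return format(value & ((1 << width) - 1), "0{}b".format(width))

# 3 encoded redundantly as 4 - 1, digits listed least-significant first:
print(msd_to_twos_complement([-1, 0, 1], width=4))  # -> 0011
# -1 encoded as 1 - 2:
print(msd_to_twos_complement([1, -1], width=4))     # -> 1111
```

    The redundancy of signed-digit encodings (several digit vectors for one value) is what makes carry-free arithmetic possible, and is also why a conversion step back to a conventional complement representation is needed at the end.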

  9. Cardiorespiratory dynamics measured from continuous ECG monitoring improves detection of deterioration in acute care patients: A retrospective cohort study

    PubMed Central

    Clark, Matthew T.; Calland, James Forrest; Enfield, Kyle B.; Voss, John D.; Lake, Douglas E.; Moorman, J. Randall

    2017-01-01

    Background Charted vital signs and laboratory results represent intermittent samples of a patient’s dynamic physiologic state and have been used to calculate early warning scores to identify patients at risk of clinical deterioration. We hypothesized that the addition of cardiorespiratory dynamics measured from continuous electrocardiography (ECG) monitoring to intermittently sampled data improves the predictive validity of models trained to detect clinical deterioration prior to intensive care unit (ICU) transfer or unanticipated death. Methods and findings We analyzed 63 patient-years of ECG data from 8,105 acute care patient admissions at a tertiary care academic medical center. We developed models to predict deterioration resulting in ICU transfer or unanticipated death within the next 24 hours using either vital signs, laboratory results, or cardiorespiratory dynamics from continuous ECG monitoring and also evaluated models using all available data sources. We calculated the predictive validity (C-statistic), the net reclassification improvement, and the probability of achieving the difference in likelihood ratio χ2 for the additional degrees of freedom. The primary outcome occurred 755 times in 586 admissions (7%). We analyzed 395 clinical deteriorations with continuous ECG data in the 24 hours prior to an event. Using only continuous ECG measures resulted in a C-statistic of 0.65, similar to models using only laboratory results and vital signs (0.63 and 0.69 respectively). Addition of continuous ECG measures to models using conventional measurements improved the C-statistic by 0.01 and 0.07; a model integrating all data sources had a C-statistic of 0.73 with categorical net reclassification improvement of 0.09 for a change of 1 decile in risk. The difference in likelihood ratio χ2 between integrated models with and without cardiorespiratory dynamics was 2158 (p value: <0.001). 
Conclusions Cardiorespiratory dynamics from continuous ECG monitoring detect clinical deterioration in acute care patients and improve performance of conventional models that use only laboratory results and vital signs. PMID:28771487
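
    The C-statistic reported above is the probability that a randomly chosen patient who deteriorated receives a higher risk score than a randomly chosen patient who did not. A minimal sketch of that concordance calculation (the scores are toy values, not study data):

```python
def c_statistic(event_scores, nonevent_scores):
    """Concordance: fraction of (event, non-event) pairs in which the event
    case gets the higher risk score; tied scores count as half a win."""
    pairs = [(e, n) for e in event_scores for n in nonevent_scores]
    wins = sum(1.0 if e > n else 0.5 if e == n else 0.0 for e, n in pairs)
    return wins / len(pairs)

print(c_statistic([0.9, 0.8], [0.1, 0.8]))  # -> 0.875
```

    A value of 0.5 corresponds to a model no better than chance, which puts the reported improvement from 0.63-0.69 to 0.73 in context.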

  10. Cardiorespiratory dynamics measured from continuous ECG monitoring improves detection of deterioration in acute care patients: A retrospective cohort study.

    PubMed

    Moss, Travis J; Clark, Matthew T; Calland, James Forrest; Enfield, Kyle B; Voss, John D; Lake, Douglas E; Moorman, J Randall

    2017-01-01

    Charted vital signs and laboratory results represent intermittent samples of a patient's dynamic physiologic state and have been used to calculate early warning scores to identify patients at risk of clinical deterioration. We hypothesized that the addition of cardiorespiratory dynamics measured from continuous electrocardiography (ECG) monitoring to intermittently sampled data improves the predictive validity of models trained to detect clinical deterioration prior to intensive care unit (ICU) transfer or unanticipated death. We analyzed 63 patient-years of ECG data from 8,105 acute care patient admissions at a tertiary care academic medical center. We developed models to predict deterioration resulting in ICU transfer or unanticipated death within the next 24 hours using either vital signs, laboratory results, or cardiorespiratory dynamics from continuous ECG monitoring and also evaluated models using all available data sources. We calculated the predictive validity (C-statistic), the net reclassification improvement, and the probability of achieving the difference in likelihood ratio χ2 for the additional degrees of freedom. The primary outcome occurred 755 times in 586 admissions (7%). We analyzed 395 clinical deteriorations with continuous ECG data in the 24 hours prior to an event. Using only continuous ECG measures resulted in a C-statistic of 0.65, similar to models using only laboratory results and vital signs (0.63 and 0.69 respectively). Addition of continuous ECG measures to models using conventional measurements improved the C-statistic by 0.01 and 0.07; a model integrating all data sources had a C-statistic of 0.73 with categorical net reclassification improvement of 0.09 for a change of 1 decile in risk. The difference in likelihood ratio χ2 between integrated models with and without cardiorespiratory dynamics was 2158 (p value: <0.001). 
Cardiorespiratory dynamics from continuous ECG monitoring detect clinical deterioration in acute care patients and improve performance of conventional models that use only laboratory results and vital signs.

  11. Joint penalized-likelihood reconstruction of time-activity curves and regions-of-interest from projection data in brain PET

    NASA Astrophysics Data System (ADS)

    Krestyannikov, E.; Tohka, J.; Ruotsalainen, U.

    2008-06-01

    This paper presents a novel statistical approach for joint estimation of regions-of-interest (ROIs) and the corresponding time-activity curves (TACs) from dynamic positron emission tomography (PET) brain projection data. It is based on optimizing the joint objective function that consists of a data log-likelihood term and two penalty terms reflecting the available a priori information about the human brain anatomy. The developed local optimization strategy iteratively updates both the ROI and TAC parameters and is guaranteed to monotonically increase the objective function. The quantitative evaluation of the algorithm is performed with numerically and Monte Carlo-simulated dynamic PET brain data of the 11C-Raclopride and 18F-FDG tracers. The results demonstrate that the method outperforms the existing sequential ROI quantification approaches in terms of accuracy, and can noticeably reduce the errors in TACs arising due to the finite spatial resolution and ROI delineation.

  12. Testing the causality of Hawkes processes with time reversal

    NASA Astrophysics Data System (ADS)

    Cordi, Marcus; Challet, Damien; Muni Toke, Ioane

    2018-03-01

    We show that univariate and symmetric multivariate Hawkes processes are only weakly causal: the true log-likelihoods of real and reversed event time vectors are almost equal, thus parameter estimation via maximum likelihood only weakly depends on the direction of the arrow of time. In ideal (synthetic) conditions, tests of goodness of parametric fit unambiguously reject backward event times, which implies that inferring kernels from time-symmetric quantities, such as the autocovariance of the event rate, only rarely produces statistically significant fits. Finally, we find that fitting financial data with many-parameter kernels may yield significant fits for both arrows of time for the same event time vector, sometimes favouring the backward time direction. This goes to show that a significant fit of Hawkes processes to real data with flexible kernels does not imply a definite arrow of time unless one tests it.
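
    The comparison behind this test can be sketched for a univariate Hawkes process with an exponential kernel alpha*beta*exp(-beta*t): evaluate the exact log-likelihood of the event times and of their time-reversed counterpart under the same parameters. The parameter names and the choice of exponential kernel here are ours, for illustration:

```python
import math

def hawkes_loglik(times, T, mu, alpha, beta):
    """Exact log-likelihood of a univariate Hawkes process on [0, T] with
    baseline mu and kernel alpha*beta*exp(-beta*t) (alpha = branching ratio)."""
    ll, decayed, prev = 0.0, 0.0, None
    compensator = mu * T  # integral of the baseline intensity
    for t in times:
        if prev is not None:
            # Recursive update of the exponentially decayed sum over past events.
            decayed = math.exp(-beta * (t - prev)) * (1.0 + decayed)
        ll += math.log(mu + alpha * beta * decayed)
        # Each event's excitation integrates to alpha * (1 - exp(-beta*(T - t))).
        compensator += alpha * (1.0 - math.exp(-beta * (T - t)))
        prev = t
    return ll - compensator

def reversed_loglik(times, T, mu, alpha, beta):
    """Log-likelihood of the time-reversed event vector on the same window."""
    return hawkes_loglik([T - t for t in reversed(times)], T, mu, alpha, beta)
```

    Weak causality in the authors' sense means that, on real data, these two functions return nearly equal values at the maximum-likelihood parameters.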

  13. Hyperspectral image reconstruction for x-ray fluorescence tomography

    DOE PAGES

    Gürsoy, Doğa; Biçer, Tekin; Lanzirotti, Antonio; ...

    2015-01-01

    A penalized maximum-likelihood estimation is proposed to perform hyperspectral (spatio-spectral) image reconstruction for X-ray fluorescence tomography. The approach minimizes a Poisson-based negative log-likelihood of the observed photon counts, and uses a penalty term that has the effect of encouraging local continuity of model parameter estimates in both spatial and spectral dimensions simultaneously. The performance of the reconstruction method is demonstrated with experimental data acquired from a seed of Arabidopsis thaliana collected at the 13-ID-E microprobe beamline at the Advanced Photon Source. The resulting element distribution estimates with the proposed approach show significantly better reconstruction quality than the conventional analytical inversion approaches, and allow for a high data compression factor which can reduce data acquisition times remarkably. In particular, this technique provides the capability to tomographically reconstruct full energy dispersive spectra without compromising reconstruction artifacts that impact the interpretation of results.
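
    The shape of such an objective can be sketched in one dimension: a Poisson negative log-likelihood for the observed counts plus a quadratic penalty that pulls neighbouring parameter estimates together. This is a toy with a spatial-only penalty; the paper's penalty acts on spatial and spectral neighbours simultaneously, and the names are ours:

```python
import math

def penalized_nll(theta, counts, lam):
    """Poisson negative log-likelihood (additive constants dropped) for mean
    parameters theta, plus lam times the squared differences of neighbours."""
    nll = sum(t - c * math.log(t) for c, t in zip(counts, theta))
    penalty = lam * sum((a - b) ** 2 for a, b in zip(theta, theta[1:]))
    return nll + penalty

# With theta matching unit counts exactly and the penalty switched off,
# each term contributes 1 - 1*log(1) = 1:
print(penalized_nll([1.0, 1.0], [1, 1], lam=0.0))  # -> 2.0
```

    Increasing lam trades data fidelity for smoothness, which is what allows the reconstruction to tolerate the heavy data compression mentioned in the abstract.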

  14. Gaussian statistics of the cosmic microwave background: Correlation of temperature extrema in the COBE DMR two-year sky maps

    NASA Technical Reports Server (NTRS)

    Kogut, A.; Banday, A. J.; Bennett, C. L.; Hinshaw, G.; Lubin, P. M.; Smoot, G. F.

    1995-01-01

    We use the two-point correlation function of the extrema points (peaks and valleys) in the Cosmic Background Explorer (COBE) Differential Microwave Radiometers (DMR) 2 year sky maps as a test for non-Gaussian temperature distribution in the cosmic microwave background anisotropy. A maximum-likelihood analysis compares the DMR data to n = 1 toy models whose random-phase spherical harmonic components a(sub lm) are drawn from either Gaussian, chi-square, or log-normal parent populations. The likelihood of the 53 GHz (A+B)/2 data is greatest for the exact Gaussian model. There is less than 10% chance that the non-Gaussian models tested describe the DMR data, limited primarily by type II errors in the statistical inference. The extrema correlation function is a stronger test for this class of non-Gaussian models than topological statistics such as the genus.

  15. Narrative review: should teaching of the respiratory physical examination be restricted only to signs with proven reliability and validity?

    PubMed

    Benbassat, Jochanan; Baumal, Reuben

    2010-08-01

    To review the reported reliability (reproducibility, inter-examiner agreement) and validity (sensitivity, specificity and likelihood ratios) of respiratory physical examination (PE) signs, and suggest an approach to teaching these signs to medical students. Review of the literature. We searched Paper Chase between 1966 and June 2009 to identify and evaluate published studies on the diagnostic accuracy of respiratory PE signs. Most studies have reported low to fair reliability and sensitivity values. However, some studies have found high specificities for selected PE signs. None of the studies that we reviewed adhered to all of the STARD criteria for reporting diagnostic accuracy. Possible flaws in study designs may have led to underestimates of the observed diagnostic accuracy of respiratory PE signs. The reported poor reliabilities may have been due to differences in the PE skills of the participating examiners, while the sensitivities may have been confounded by variations in the severity of the diseases of the participating patients. IMPLICATION FOR PRACTICE AND MEDICAL EDUCATION: Pending the results of properly controlled studies, the reported poor reliability and sensitivity of most respiratory PE signs do not necessarily detract from their clinical utility. Therefore, we believe that a meticulously performed respiratory PE, which aims to explore a diagnostic hypothesis, as opposed to a PE that aims to detect a disease in an asymptomatic person, remains a cornerstone of clinical practice. We propose teaching the respiratory PE signs according to their importance, beginning with signs of life-threatening conditions and those that have been reported to have a high specificity, and ending with signs that are "nice to know," but are no longer employed because of the availability of more easily performed tests.

  16. Narrative Review: Should Teaching of the Respiratory Physical Examination Be Restricted Only to Signs with Proven Reliability and Validity?

    PubMed Central

    Benbassat, Jochanan; Baumal, Reuben

    2010-01-01

    OBJECTIVE To review the reported reliability (reproducibility, inter-examiner agreement) and validity (sensitivity, specificity and likelihood ratios) of respiratory physical examination (PE) signs, and suggest an approach to teaching these signs to medical students. METHODS Review of the literature. We searched Paper Chase between 1966 and June 2009 to identify and evaluate published studies on the diagnostic accuracy of respiratory PE signs. RESULTS Most studies have reported low to fair reliability and sensitivity values. However, some studies have found high specificities for selected PE signs. None of the studies that we reviewed adhered to all of the STARD criteria for reporting diagnostic accuracy. CONCLUSIONS Possible flaws in study designs may have led to underestimates of the observed diagnostic accuracy of respiratory PE signs. The reported poor reliabilities may have been due to differences in the PE skills of the participating examiners, while the sensitivities may have been confounded by variations in the severity of the diseases of the participating patients. IMPLICATION FOR PRACTICE AND MEDICAL EDUCATION Pending the results of properly controlled studies, the reported poor reliability and sensitivity of most respiratory PE signs do not necessarily detract from their clinical utility. Therefore, we believe that a meticulously performed respiratory PE, which aims to explore a diagnostic hypothesis, as opposed to a PE that aims to detect a disease in an asymptomatic person, remains a cornerstone of clinical practice. We propose teaching the respiratory PE signs according to their importance, beginning with signs of life-threatening conditions and those that have been reported to have a high specificity, and ending with signs that are "nice to know," but are no longer employed because of the availability of more easily performed tests. PMID:20349154

  17. Causality constraints in conformal field theory

    DOE PAGES

    Hartman, Thomas; Jain, Sachin; Kundu, Sandipan

    2016-05-17

    Causality places nontrivial constraints on QFT in Lorentzian signature, for example fixing the signs of certain terms in the low energy Lagrangian. In d dimensional conformal field theory, we show how such constraints are encoded in crossing symmetry of Euclidean correlators, and derive analogous constraints directly from the conformal bootstrap (analytically). The bootstrap setup is a Lorentzian four-point function corresponding to propagation through a shockwave. Crossing symmetry fixes the signs of certain log terms that appear in the conformal block expansion, which constrains the interactions of low-lying operators. As an application, we use the bootstrap to rederive the well known sign constraint on the (∂φ)⁴ coupling in effective field theory, from a dual CFT. We also find constraints on theories with higher spin conserved currents. Our analysis is restricted to scalar correlators, but we argue that similar methods should also impose nontrivial constraints on the interactions of spinning operators.

  18. Using information technology to improve the management of chronic disease.

    PubMed

    Celler, Branko G; Lovell, Nigel H; Basilakis, Jim

    2003-09-01

    Information and communications technology (ICT) is increasingly being used in management of chronic illness to facilitate shared services (virtual health networks and electronic health records), knowledge management (care rules and protocols, scheduling, information directories), as well as consumer-based health education and evidence-based clinical protocols. Common applications of ICT include home monitoring of vital signs for patients with chronic disease, as well as replacing home visits by nurses in person with telemedicine videophone consultations. A patient-managed Home Telecare System with integrated clinical signs monitoring, automated scheduling and medication reminders, as well as access to health education and daily logs, is presented as an example of ICT use for chronic disease self-management. A clinical case study demonstrates how early identification of adverse trends in clinical signs recorded in the home can either avoid hospital readmission or reduce the length of hospital stay.

  19. Logging affects fledgling sex ratios and baseline corticosterone in a forest songbird.

    PubMed

    Leshyk, Rhiannon; Nol, Erica; Burke, Dawn M; Burness, Gary

    2012-01-01

    Silviculture (logging) creates a disturbance to forested environments. The degree to which forests are modified depends on the logging prescription and forest stand characteristics. In this study we compared the effects of two methods of group-selection ("moderate" and "heavy") silviculture (GSS) and undisturbed reference stands on stress and offspring sex ratios of a forest interior species, the Ovenbird (Seiurus aurocapilla), in Algonquin Provincial Park, Canada. Blood samples were taken from nestlings for corticosterone and molecular sexing. We found that logging creates a disturbance that is stressful for nestling Ovenbirds, as illustrated by elevated baseline corticosterone in cut sites. Ovenbirds nesting in undisturbed reference forest produce fewer male offspring per brood (proportion male = 30%), while logging with progressively greater forest disturbance shifted the offspring sex ratio towards males (proportion male: moderate = 50%, heavy = 70%). If Ovenbirds in undisturbed forests usually produce female-biased broods, then the production of males as a result of logging may disrupt population viability. We recommend a broad examination of nestling sex ratios in response to anthropogenic disturbance to determine the generality of our findings.

  20. Medical Monitoring During Firefighter Incident Scene Rehabilitation.

    PubMed

    Barr, David A; Haigh, Craig A; Haller, Jeannie M; Smith, Denise L

    2016-01-01

    The objective of this study was to retrospectively investigate aspects of medical monitoring, including medical complaints, vital signs at entry, and vital sign recovery, in firefighters during rehabilitation following operational firefighting duties. Incident scene rehabilitation logs obtained over a 5-year span that included 53 incidents, approximately 40 fire departments, and more than 530 firefighters were reviewed. Only 13 of 694 cases involved a firefighter reporting a medical complaint. In most cases, vital signs were similar between firefighters who registered a complaint and those who did not. On average, heart rate was 104 ± 23 beats·min⁻¹, systolic blood pressure was 132 ± 17 mmHg, diastolic blood pressure was 81 ± 12 mmHg, and respiratory rate was 19 ± 3 breaths·min⁻¹ upon entry into rehabilitation. At least two measurements of heart rate, systolic blood pressure, diastolic blood pressure, and respiratory rate were obtained for 365, 383, 376, and 160 cases, respectively. Heart rate, systolic and diastolic blood pressures, and respiratory rate decreased significantly (p < 0.001) during rehabilitation. Initial vital signs and changes in vital signs during recovery were highly variable. Data from this study indicated that most firefighters recovered from the physiological stress of firefighting without any medical complaint or symptoms. Furthermore, vital signs were within fire service suggested guidelines for release within 10 or 20 minutes of rehabilitation. The data suggested that vital signs of firefighters with medical symptoms were not significantly different from vital signs of firefighters who had an unremarkable recovery.

  1. Strengthening your ties to referring physicians through RIS/PACS integration.

    PubMed

    Worthy, Susan; Rounds, Karla C; Soloway, Connie B

    2003-01-01

    Many imaging centers are turning to technology solutions to increase referring physician satisfaction, implementing such enhancements as automated report distribution, picture archiving and communications systems (PACS), radiology information systems (RIS), and web-based results access. However, without seamless integration, these technology investments don't address the challenge at its core: convenient and reliable two-way communication and interaction with referring physicians. In an integrated RIS/PACS solution, patient tracking in the RIS and PACS study status are logged and available to users. The time of the patient's registration at the imaging center, the exam start and completion time, the patient's departure time from the imaging center, and results status are all tracked and logged. An integrated RIS/PACS solution provides additional support to the radiologist, a critical factor that can improve the turnaround time of results to referring physicians. The RIS/PACS enhances the interpretation by providing the patient's history, which gives the radiologist additional insight and decreases the likelihood of missing a diagnostic element. In a tightly integrated RIS/PACS solution, results information is more complete. Physicians can view reports with associated images selected by the radiologist. They will also have full order information and complete imaging history including prior reports and images. Referring physicians can access and view images and exam notes at the same time that the radiologist is interpreting the exam. Without the benefit of an integrated RIS/PACS system, the referring physician would have to wait for the signed transcription to be released. In a seamlessly integrated solution, film-tracking modules within the RIS are fused with digital imaging workflow in the PACS. 
Users can see at a glance if a historical exam is available on film and benefit when a complete study history--both film-based and digital--is presented with the current case. It is up to the imaging center to market the benefits of reduced errors, reduced turnaround times, and a higher level of service to the referring physician community, and to encourage them to take advantage of the convenience it provides. The savvy imaging center will also regard the integrated RIS/PACS as a valuable marketing tool for use in attracting radiologists.

  2. Impact of logging on aboveground biomass stocks in lowland rain forest, Papua New Guinea.

    PubMed

    Bryan, Jane; Shearman, Phil; Ash, Julian; Kirkpatrick, J B

    2010-12-01

    Greenhouse-gas emissions resulting from logging are poorly quantified across the tropics. There is a need for robust measurement of rain forest biomass and the impacts of logging from which carbon losses can be reliably estimated at regional and global scales. We used a modified Bitterlich plotless technique to measure aboveground live biomass at six unlogged and six logged rain forest areas (coupes) across two approximately 3000-ha regions at the Makapa concession in lowland Papua New Guinea. "Reduced-impact logging" is practiced at Makapa. We found the mean unlogged aboveground biomass in the two regions to be 192.96 +/- 4.44 Mg/ha and 252.92 +/- 7.00 Mg/ha (mean +/- SE), which was reduced by logging to 146.92 +/- 4.58 Mg/ha and 158.84 +/- 4.16, respectively. Killed biomass was not a fixed proportion, but varied with unlogged biomass, with 24% killed in the lower-biomass region, and 37% in the higher-biomass region. Across the two regions logging resulted in a mean aboveground carbon loss of 35 +/- 2.8 Mg/ha. The plotless technique proved efficient at estimating mean aboveground biomass and logging damage. We conclude that substantial bias is likely to occur within biomass estimates derived from single unreplicated plots.

  3. Bayesian Computation for Log-Gaussian Cox Processes: A Comparative Analysis of Methods

    PubMed Central

    Teng, Ming; Nathoo, Farouk S.; Johnson, Timothy D.

    2017-01-01

    The Log-Gaussian Cox Process is a commonly used model for the analysis of spatial point pattern data. Fitting this model is difficult because of its doubly-stochastic property, i.e., it is a hierarchical combination of a Poisson process at the first level and a Gaussian process at the second level. Various methods have been proposed to estimate such a process, including traditional likelihood-based approaches as well as Bayesian methods. We focus here on Bayesian methods and several approaches that have been considered for model fitting within this framework, including Hamiltonian Monte Carlo, the integrated nested Laplace approximation, and variational Bayes. We consider these approaches and make comparisons with respect to statistical and computational efficiency. These comparisons are made through several simulation studies as well as through two applications, the first examining ecological data and the second involving neuroimaging data. PMID:29200537

  4. Log-Linear Models for Gene Association

    PubMed Central

    Hu, Jianhua; Joshi, Adarsh; Johnson, Valen E.

    2009-01-01

    We describe a class of log-linear models for the detection of interactions in high-dimensional genomic data. This class of models leads to a Bayesian model selection algorithm that can be applied to data that have been reduced to contingency tables using ranks of observations within subjects, and discretization of these ranks within gene/network components. Many normalization issues associated with the analysis of genomic data are thereby avoided. A prior density based on Ewens’ sampling distribution is used to restrict the number of interacting components assigned high posterior probability, and the calculation of posterior model probabilities is expedited by approximations based on the likelihood ratio statistic. Simulation studies are used to evaluate the efficiency of the resulting algorithm for known interaction structures. Finally, the algorithm is validated in a microarray study for which it was possible to obtain biological confirmation of detected interactions. PMID:19655032

  5. Improvement of human cell line activation test (h-CLAT) using short-time exposure methods for prevention of false-negative results.

    PubMed

    Narita, Kazuto; Ishii, Yuuki; Vo, Phuc Thi Hong; Nakagawa, Fumiko; Ogata, Shinichi; Yamashita, Kunihiko; Kojima, Hajime; Itagaki, Hiroshi

    2018-01-01

    Recently, animal testing has been affected by increasing ethical, social, and political concerns regarding animal welfare. Several in vitro safety tests for evaluating skin sensitization, such as the human cell line activation test (h-CLAT), have been proposed. However, similar to other tests, the h-CLAT has produced false-negative results, including in tests for acid anhydride and water-insoluble chemicals. In a previous study, we demonstrated that the cause of false-negative results from phthalic anhydride was hydrolysis by an aqueous vehicle, with IL-8 release from THP-1 cells, and that short-time exposure to liquid paraffin (LP) dispersion medium could reduce false-negative results from acid anhydrides. In the present study, we modified the h-CLAT by applying this exposure method. We found that the modified h-CLAT is a promising method for reducing false-negative results obtained from acid anhydrides and chemicals with octanol-water partition coefficients (logKow) greater than 3.5. Based on the outcomes from the present study, a combination of the original and the modified h-CLAT is suggested for reducing false-negative results. Notably, the combination method provided a sensitivity of 95% (overall chemicals) or 93% (chemicals with logKow > 2.0), and an accuracy of 88% (overall chemicals) or 81% (chemicals with logKow > 2.0). We found that the combined method is a promising evaluation scheme for reducing false-negative results seen in existing in vitro skin-sensitization tests. In the future, we expect a combination of the original and modified h-CLAT to be applied in a newly developed in vitro test for evaluating skin sensitization.

  6. How Did Illiterates Fare as Literacy Became Almost Universal? Evidence from Nineteenth and Early Twentieth Century Liverpool

    ERIC Educational Resources Information Center

    Mitch, David

    2003-01-01

    A sample of marriage registers from the parish of Liverpool St. Nicholas Church in England between 1839-1927 is used to examine changing characteristics of grooms who signed with a mark over this period. The proportion of illiterate grooms in the parish fell from about a third to under 5%. Age at marriage and likelihood of being a widower rose…

  7. Outdoor Hazards & Preventive Measures: West Nile Virus: A Clinical Commentary for the Camp Health Care Community; Poison Ivy: A Primer for Prevention; Lyme Disease Prevention and Control.

    ERIC Educational Resources Information Center

    Reynolds, Ellen; Bauer, Holly; Ratner-Connolly, Heidi

    2003-01-01

    Transmitted by mosquitos, West Nile virus may cause serious illness, but the actual likelihood of infection is low. Prevention, implications, and recommendations for camps are discussed. Poison ivy identification, treatment, and complications are presented; a prevention quiz is included. Signs and symptoms of Lyme disease are described, as are…

  8. A Bootstrap Metropolis-Hastings Algorithm for Bayesian Analysis of Big Data.

    PubMed

    Liang, Faming; Kim, Jinsu; Song, Qifan

    2016-01-01

    Markov chain Monte Carlo (MCMC) methods have proven to be a very powerful tool for analyzing data of complex structures. However, their computer-intensive nature, which typically requires a large number of iterations and a complete scan of the full dataset for each iteration, precludes their use for big data analysis. In this paper, we propose the so-called bootstrap Metropolis-Hastings (BMH) algorithm, which provides a general framework for taming powerful MCMC methods for big data analysis; that is, to replace the full data log-likelihood by a Monte Carlo average of the log-likelihoods that are calculated in parallel from multiple bootstrap samples. The BMH algorithm possesses an embarrassingly parallel structure and avoids repeated scans of the full dataset in iterations, and is thus feasible for big data problems. Compared to the popular divide-and-combine method, BMH can be generally more efficient as it can asymptotically integrate the whole data information into a single simulation run. The BMH algorithm is very flexible. Like the Metropolis-Hastings algorithm, it can serve as a basic building block for developing advanced MCMC algorithms that are feasible for big data problems. This is illustrated in the paper by the tempering BMH algorithm, which can be viewed as a combination of parallel tempering and the BMH algorithm. BMH can also be used for model selection and optimization by combining it with reversible jump MCMC and simulated annealing, respectively.
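
    The core substitution BMH makes can be sketched for a toy unit-variance Gaussian mean model: the Metropolis-Hastings acceptance ratio uses the average of log-likelihoods over pre-drawn bootstrap resamples in place of the full-data log-likelihood. The model, proposal scale, and flat prior below are our illustrative choices, not the paper's examples:

```python
import math
import random

def loglik(theta, data):
    """Unit-variance Gaussian log-likelihood of the mean, constants dropped."""
    return -0.5 * sum((x - theta) ** 2 for x in data)

def avg_bootstrap_loglik(theta, bootstrap_samples):
    """The BMH surrogate: a Monte Carlo average of log-likelihoods computed
    on bootstrap resamples (one per parallel worker in the real algorithm)."""
    return sum(loglik(theta, b) for b in bootstrap_samples) / len(bootstrap_samples)

def bmh_step(theta, bootstrap_samples, step, rng):
    """One Metropolis-Hastings update with a symmetric Gaussian random-walk
    proposal and a flat prior, accepting on the surrogate log-likelihood."""
    proposal = theta + rng.gauss(0.0, step)
    log_ratio = (avg_bootstrap_loglik(proposal, bootstrap_samples)
                 - avg_bootstrap_loglik(theta, bootstrap_samples))
    return proposal if math.log(rng.random()) < log_ratio else theta
```

    Drawing the bootstrap samples once up front and farming the per-sample log-likelihood evaluations out to workers is what gives the algorithm its embarrassingly parallel structure.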

  9. Estimating mortality risk in preoperative patients using immunologic, nutritional, and acute-phase response variables.

    PubMed Central

    Christou, N V; Tellado-Rodriguez, J; Chartrand, L; Giannas, B; Kapadia, B; Meakins, J; Rode, H; Gordon, J

    1989-01-01

    We measured the delayed type hypersensitivity (DTH) skin test response, along with additional variables of host immunocompetence, in 245 preoperative patients to determine which variables are associated with septic-related deaths following operation. Of the 14 deaths (5.7%), 12 were related to sepsis and in 2 sepsis was contributory. The DTH response (p less than 0.00001), age (p less than 0.0002), serum albumin (p less than 0.003), hemoglobin (p less than 0.02), and total hemolytic complement (p less than 0.03) were significantly different between those who died and those who lived. By logistic regression analysis, only the DTH skin test response (log likelihood = 41.7, improvement X2 = 6.24, p less than 0.012) and the serum albumin (log likelihood = 44.8, improvement X2 = 17.7, p less than 0.001) were significantly and independently associated with the deaths. The resultant probability-of-mortality equation was tested in a separate validation group of 519 patients (mortality = 5%) and yielded a good predictive capability as assessed by (1) X2 = 0.08 between observed and expected deaths, NS; (2) Goodman-Kruskal G statistic = 0.67; and (3) Receiver-Operating-Characteristic (ROC) curve analysis with an area under the ROC curve, Az = 0.79 +/- 0.05. We conclude that a reduced immune response (DTH skin test anergy) plus a nutritional deficit and/or acute-phase response change are both associated with increased septic-related deaths in elective surgical patients. PMID:2472781
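    The probability-of-mortality equation from a logistic regression has a standard form; the sketch below shows that form with the study's two retained predictors. The coefficient values are hypothetical placeholders chosen only to illustrate the direction of the reported effects (DTH anergy raises risk, higher albumin lowers it), not the study's fitted parameters.

```python
import math

def mortality_probability(dth_anergic, albumin_g_per_l,
                          b0=-1.0, b_dth=1.5, b_alb=-0.08):
    """Logistic form p = 1 / (1 + exp(-logit)); all coefficients here
    are illustrative placeholders, not the published estimates."""
    logit = b0 + b_dth * dth_anergic + b_alb * albumin_g_per_l
    return 1.0 / (1.0 + math.exp(-logit))

# Anergic patient with low albumin vs. reactive patient with normal albumin
high_risk = mortality_probability(1, 25)
low_risk = mortality_probability(0, 42)
print(high_risk > low_risk)
```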

  10. A Bootstrap Metropolis–Hastings Algorithm for Bayesian Analysis of Big Data

    PubMed Central

    Kim, Jinsu; Song, Qifan

    2016-01-01

    Markov chain Monte Carlo (MCMC) methods have proven to be a very powerful tool for analyzing data of complex structure. However, their computer-intensive nature, which typically requires a large number of iterations and a complete scan of the full dataset for each iteration, precludes their use for big data analysis. In this paper, we propose the so-called bootstrap Metropolis-Hastings (BMH) algorithm, which provides a general framework for taming powerful MCMC methods for big data analysis: it replaces the full-data log-likelihood by a Monte Carlo average of the log-likelihoods calculated in parallel from multiple bootstrap samples. The BMH algorithm possesses an embarrassingly parallel structure and avoids repeated scans of the full dataset across iterations, and is thus feasible for big data problems. Compared to the popular divide-and-combine method, BMH is generally more efficient, as it can asymptotically integrate the whole data information into a single simulation run. The BMH algorithm is very flexible. Like the Metropolis-Hastings algorithm, it can serve as a basic building block for developing advanced MCMC algorithms that are feasible for big data problems. This is illustrated in the paper by the tempering BMH algorithm, which can be viewed as a combination of parallel tempering and the BMH algorithm. BMH can also be used for model selection and optimization by combining it with reversible jump MCMC and simulated annealing, respectively. PMID:29033469

  11. Stone tool production and utilization by bonobo-chimpanzees (Pan paniscus).

    PubMed

    Roffman, Itai; Savage-Rumbaugh, Sue; Rubert-Pugh, Elizabeth; Ronen, Avraham; Nevo, Eviatar

    2012-09-04

    Using direct percussion, language-competent bonobo-chimpanzees Kanzi and Pan-Banisha produced a significantly wider variety of flint tool types than hitherto reported, and used them task-specifically to break wooden logs or to dig underground for food retrieval. For log breaking, small flakes were rotated drill-like or used as scrapers, whereas thick cortical flakes were used as axes or wedges, leaving consistent wear patterns along the glued slits, the weakest areas of the log. For digging underground, a variety of modified stone tools, as well as unmodified flint nodules, were used as shovels. Such tool production and utilization competencies reported here in Pan indicate that present-day Pan exhibits Homo-like technological competencies.

  12. The effect of coniine on presynaptic nicotinic receptors.

    PubMed

    Erkent, Ulkem; Iskit, Alper B; Onur, Rustu; Ilhan, Mustafa

    2016-01-01

    Toxicity of coniine, an alkaloid of Conium maculatum (poison hemlock), is manifested by characteristic nicotinic clinical signs, including excitement, depression, hypermetria, seizures, and opisthotonos, via postsynaptic nicotinic receptors. There is limited knowledge in the literature about the role of presynaptic nicotinic receptors in the pharmacological and toxicological effects of coniine. The present study was undertaken to evaluate the possible role of presynaptic nicotinic receptors in the pharmacological and toxicological effects of coniine. For this purpose, the rat anococcygeus muscle and guinea-pig atria were used in vitro. Nicotine (100 μM) elicited a biphasic response composed of a relaxation followed by contraction through the activation of nitrergic and noradrenergic nerve terminals in the phenylephrine-contracted rat anococcygeus muscle. Coniine inhibited both the nitrergic and noradrenergic responses in the muscle (-logIC(50) = 3.79 ± 0.11 and -logIC(50) = 4.57 ± 0.12 M, respectively). The effect of coniine on nicotinic receptor-mediated noradrenergic transmission was also evaluated in the guinea-pig atrium (-logIC(50) = 4.47 ± 0.12 M) and did not differ from the -logIC(50) value obtained in the rat anococcygeus muscle. This study demonstrated that coniine exerts inhibitory effects on nicotinic receptor-mediated nitrergic and noradrenergic transmitter responses.

  13. Historic Mining and Agriculture as Indicators of Occurrence and Abundance of Widespread Invasive Plant Species

    PubMed Central

    Calinger, Kellen; Calhoon, Elisabeth; Chang, Hsiao-chi; Whitacre, James; Wenzel, John; Comita, Liza; Queenborough, Simon

    2015-01-01

    Anthropogenic disturbances often change ecological communities and provide opportunities for non-native species invasion. Understanding the impacts of disturbances on species invasion is therefore crucial for invasive species management. We used generalized linear mixed effects models to explore the influence of land-use history and distance to roads on the occurrence and abundance of two invasive plant species (Rosa multiflora and Berberis thunbergii) in a 900-ha deciduous forest in the eastern U.S.A., the Powdermill Nature Reserve. Although much of the reserve has been continuously forested since at least 1939, aerial photos revealed a variety of land-uses since then including agriculture, mining, logging, and development. By 2008, both R. multiflora and B. thunbergii were widespread throughout the reserve (occurring in 24% and 13% of 4417 10-m diameter regularly-placed vegetation plots, respectively) with occurrence and abundance of each varying significantly with land-use history. Rosa multiflora was more likely to occur in historically farmed, mined, logged or developed plots than in plots that remained forested (log odds of 1.8 to 3.0); Berberis thunbergii was more likely to occur in plots with agricultural, mining, or logging history than in plots without disturbance (log odds of 1.4 to 2.1). Mining, logging, and agriculture increased the probability that R. multiflora had >10% cover while only past agriculture was related to cover of B. thunbergii. Proximity to roads was positively correlated with the occurrence of R. multiflora (a 0.26 increase in the log odds for every 1-m closer) but not B. thunbergii, and roads had no impact on the abundance of either species. Our results indicated that a wide variety of disturbances may aid the introduction of invasive species into new habitats, while high-impact disturbances such as agriculture and mining increase the likelihood of high abundance post-introduction. PMID:26046534
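    The effect sizes above are reported on the log-odds scale; exponentiating converts them to odds ratios. A small sketch using the values quoted in the abstract:

```python
import math

def log_odds_to_odds_ratio(log_odds):
    """An increase of `log_odds` on the logit scale multiplies
    the odds of occurrence by exp(log_odds)."""
    return math.exp(log_odds)

# Reported log odds of 1.8 to 3.0 for R. multiflora in disturbed plots:
print(round(log_odds_to_odds_ratio(1.8), 1))  # 6.0 — about 6x the odds
print(round(log_odds_to_odds_ratio(3.0), 1))  # 20.1 — about 20x the odds
```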

  14. Identification and Management of Pump Thrombus in the HeartWare Left Ventricular Assist Device System: A Novel Approach Using Log File Analysis.

    PubMed

    Jorde, Ulrich P; Aaronson, Keith D; Najjar, Samer S; Pagani, Francis D; Hayward, Christopher; Zimpfer, Daniel; Schlöglhofer, Thomas; Pham, Duc T; Goldstein, Daniel J; Leadley, Katrin; Chow, Ming-Jay; Brown, Michael C; Uriel, Nir

    2015-11-01

    The study sought to characterize patterns in the HeartWare (HeartWare Inc., Framingham, Massachusetts) ventricular assist device (HVAD) log files associated with successful medical treatment of device thrombosis. Device thrombosis is a serious adverse event for mechanical circulatory support devices and is often preceded by increased power consumption. Log files of the pump power are easily accessible on the bedside monitor of HVAD patients and may allow early diagnosis of device thrombosis. Furthermore, analysis of the log files may be able to predict the success rate of thrombolysis or the need for pump exchange. The log files of 15 ADVANCE trial patients (algorithm derivation cohort) with 16 pump thrombus events treated with tissue plasminogen activator (tPA) were assessed for changes in the absolute and rate of increase in power consumption. Successful thrombolysis was defined as a clinical resolution of pump thrombus including normalization of power consumption and improvement in biochemical markers of hemolysis. Significant differences in log file patterns between successful and unsuccessful thrombolysis treatments were verified in 43 patients with 53 pump thrombus events implanted outside of clinical trials (validation cohort). The overall success rate of tPA therapy was 57%. Successful treatments had significantly lower measures of percent of expected power (130.9% vs. 196.1%, p = 0.016) and rate of increase in power (0.61 vs. 2.87, p < 0.0001). Medical therapy was successful in 77.7% of the algorithm development cohort and 81.3% of the validation cohort when the rate of power increase and percent of expected power values were <1.25% and 200%, respectively. Log file parameters can potentially predict the likelihood of successful tPA treatments and if validated prospectively, could substantially alter the approach to thrombus management. Copyright © 2015 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.

  15. Measures of model performance based on the log accuracy ratio

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morley, Steven Karl; Brito, Thiago Vasconcelos; Welling, Daniel T.

    Quantitative assessment of modeling and forecasting of continuous quantities uses a variety of approaches. We review existing literature describing metrics for forecast accuracy and bias, concentrating on those based on relative errors and percentage errors. Of these accuracy metrics, the mean absolute percentage error (MAPE) is one of the most common across many fields and has been widely applied in recent space science literature, and we highlight the benefits and drawbacks of MAPE and proposed alternatives. We then introduce the log accuracy ratio, and derive from it two metrics: the median symmetric accuracy and the symmetric signed percentage bias. Robust methods for estimating the spread of a multiplicative linear model using the log accuracy ratio are also presented. The developed metrics are shown to be easy to interpret, robust, and to mitigate the key drawbacks of their more widely-used counterparts based on relative errors and percentage errors. Their use is illustrated with radiation belt electron flux modeling examples.
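    The two metrics derived from the log accuracy ratio Q = ln(prediction/observation) — the median symmetric accuracy (MSA) and the symmetric signed percentage bias (SSPB) — can be computed directly from their definitions, as in this sketch:

```python
import numpy as np

def median_symmetric_accuracy(y_obs, y_pred):
    """MSA = 100 * (exp(median(|ln(pred/obs)|)) - 1); symmetric under
    swapping over- and under-prediction by the same factor."""
    q = np.log(np.asarray(y_pred, float) / np.asarray(y_obs, float))
    return 100.0 * (np.exp(np.median(np.abs(q))) - 1.0)

def symmetric_signed_percentage_bias(y_obs, y_pred):
    """SSPB = 100 * sign(M) * (exp(|M|) - 1), with M = median(ln(pred/obs))."""
    q = np.log(np.asarray(y_pred, float) / np.asarray(y_obs, float))
    m = np.median(q)
    return 100.0 * np.sign(m) * (np.exp(np.abs(m)) - 1.0)

obs = [1.0, 2.0, 4.0]
pred = [2.0, 4.0, 8.0]  # every prediction is double the observation
print(median_symmetric_accuracy(obs, pred))         # 100.0 (a factor-of-2 error)
print(symmetric_signed_percentage_bias(obs, pred))  # +100.0 (overprediction)
```

    Because both metrics are built from the median of a log ratio, a single wildly wrong prediction shifts them far less than it shifts MAPE.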

  16. Measures of model performance based on the log accuracy ratio

    DOE PAGES

    Morley, Steven Karl; Brito, Thiago Vasconcelos; Welling, Daniel T.

    2018-01-03

    Quantitative assessment of modeling and forecasting of continuous quantities uses a variety of approaches. We review existing literature describing metrics for forecast accuracy and bias, concentrating on those based on relative errors and percentage errors. Of these accuracy metrics, the mean absolute percentage error (MAPE) is one of the most common across many fields and has been widely applied in recent space science literature, and we highlight the benefits and drawbacks of MAPE and proposed alternatives. We then introduce the log accuracy ratio, and derive from it two metrics: the median symmetric accuracy and the symmetric signed percentage bias. Robust methods for estimating the spread of a multiplicative linear model using the log accuracy ratio are also presented. The developed metrics are shown to be easy to interpret, robust, and to mitigate the key drawbacks of their more widely-used counterparts based on relative errors and percentage errors. Their use is illustrated with radiation belt electron flux modeling examples.

  17. Pavement crack detection combining non-negative feature with fast LoG in complex scene

    NASA Astrophysics Data System (ADS)

    Wang, Wanli; Zhang, Xiuhua; Hong, Hanyu

    2015-12-01

    Pavement crack detection is affected by many sources of interference in realistic situations, such as shadows, road signs, oil stains, and salt-and-pepper noise. Due to these unfavorable factors, existing crack detection methods have difficulty distinguishing cracks from the background correctly. How to extract crack information effectively is the key problem for a road crack detection system. To solve this problem, a novel method for pavement crack detection based on combining a non-negative feature with a fast LoG operator is proposed. The two key novelties and benefits of this new approach are that it 1) uses image pixel gray-value compensation to acquire a uniform image, and 2) combines the non-negative feature with the fast LoG to extract crack information. The image preprocessing results demonstrate that the method homogenizes the crack image more accurately than existing methods. A large number of experimental results demonstrate that the proposed approach can detect crack regions more accurately than traditional methods.

  18. Code conversion from signed-digit to complement representation based on look-ahead optical logic operations

    NASA Astrophysics Data System (ADS)

    Li, Guoqiang; Qian, Feng

    2001-11-01

    We present, for the first time to our knowledge, a generalized look-ahead logic algorithm for number conversion from signed-digit to complement representation. By properly encoding the signed digits, all the operations are performed by binary logic, and unified logical expressions can be obtained for conversion from modified-signed-digit (MSD) to 2's complement, trinary signed-digit (TSD) to 3's complement, and quaternary signed-digit (QSD) to 4's complement. For optical implementation, a parallel logical array module using an electron-trapping device is employed, and experimental results are shown. This optical module is suitable for implementing complex logic functions in sum-of-products form. The algorithm and architecture are compatible with a general-purpose optoelectronic computing system.
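    For reference, the MSD-to-2's-complement conversion that the paper implements optically can be stated arithmetically. The sketch below computes the converted bit pattern directly; it is a plain arithmetic check, not the paper's look-ahead binary-logic array.

```python
def msd_to_twos_complement(digits, width=8):
    """Convert a modified-signed-digit number (digits in {-1, 0, 1},
    most significant first) to a `width`-bit two's-complement string.
    Arithmetic sketch only — not the optical look-ahead scheme."""
    value = 0
    for d in digits:
        assert d in (-1, 0, 1), "MSD digits must be -1, 0, or 1"
        value = 2 * value + d
    # Masking to `width` bits yields the two's-complement bit pattern.
    return format(value & ((1 << width) - 1), f"0{width}b")

# [1, 0, -1] in MSD is 4 - 1 = 3; [-1, 0, 1] is -4 + 1 = -3
print(msd_to_twos_complement([1, 0, -1]))   # 00000011
print(msd_to_twos_complement([-1, 0, 1]))   # 11111101
```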

  19. Survival times in dogs with presumptive intracranial gliomas treated with oral lomustine: A comparative retrospective study (2008-2017).

    PubMed

    Moirano, S J; Dewey, C W; Wright, K Z; Cohen, P W

    2018-05-24

    Intracranial gliomas are a common malignancy in dogs, and are associated with a poor prognosis due to their aggressive nature and a lack of clinically effective treatments. The efficacies of various treatment modalities for canine brain tumours have been previously described, though little data exist on the use of cytotoxic chemotherapy. A comparative retrospective study, including 40 cases from 5 northeastern US veterinary hospitals, from 2008 to 2017, was conducted. Variables analysed in this study with relation to overall survival and prognostic significance included: age, sex, clinical signs, clinical sign duration, tumour location and treatment protocol used. Dogs with presumptive intracranial gliomas treated with lomustine chemotherapy lived longer (median, 138 days) than those treated exclusively with symptomatic care (median, 35 days; P = .0026 log-rank, 0.0138 Wilcoxon). Additionally, a duration of clinical signs ≥16 days prior to diagnosis (median, 109 days) was associated with a longer survival than a duration <16 days prior (median, 25 days; P = .0100 log-rank, 0.0322 Wilcoxon). Lomustine-associated side effects included neutropenia in 46% of dogs, anaemia in 15% and thrombocytopenia in 15%. Potential renal and hepatotoxicity based on increased BUN and/or creatinine and ALT values were reported in 15% and 50% of dogs, respectively. This study provides evidence that lomustine therapy may be effective in prolonging survival in dogs with intracranial gliomas and should be considered as a potential treatment option. Although lomustine-related toxicities are fairly common, they are rarely life threatening and often do not result in discontinuation of therapy. © 2018 John Wiley & Sons Ltd.

  20. Foxfire 4: Fiddle Making, Springhouses, Horse Trading, Sassafras Tea, Berry Buckets, Gardening, and Further Affairs of Plain Living.

    ERIC Educational Resources Information Center

    Wigginton, Eliot, Ed.

    Planting by the signs of the moon, well digging, hewing logs, wood carving, knife making, bird trapping, and horsetrading are but a few of the aspects of Appalachian culture explored in "Foxfire 4." Like its predecessors, the volume was compiled by high school students at Rabun Gap-Nacoochee School. Information on the cultural heritage…

  1. Detection of Gauss-Markov Random Fields with Nearest-Neighbor Dependency

    DTIC Science & Technology

    2010-01-01

    sgn(Y)C log n, o.w., (45b) where sgn is the sign function and C > 0 is a constant. Consider the functionals H′2, φ′2 obtained by replacing Yn with Zn in H2… Gaussian signal processing, and has held visiting faculty positions at INP, Toulouse. He is currently with the US Army Research Laboratory where his work

  2. Stan : A Probabilistic Programming Language

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carpenter, Bob; Gelman, Andrew; Hoffman, Matthew D.

    Stan is a probabilistic programming language for specifying statistical models. A Stan program imperatively defines a log probability function over parameters conditioned on specified data and constants. As of version 2.14.0, Stan provides full Bayesian inference for continuous-variable models through Markov chain Monte Carlo methods such as the No-U-Turn sampler, an adaptive form of Hamiltonian Monte Carlo sampling. Penalized maximum likelihood estimates are calculated using optimization methods such as the limited memory Broyden-Fletcher-Goldfarb-Shanno algorithm. Stan is also a platform for computing log densities and their gradients and Hessians, which can be used in alternative algorithms such as variational Bayes, expectationmore » propagation, and marginal inference using approximate integration. To this end, Stan is set up so that the densities, gradients, and Hessians, along with intermediate quantities of the algorithm such as acceptance probabilities, are easily accessible. Stan can also be called from the command line using the cmdstan package, through R using the rstan package, and through Python using the pystan package. All three interfaces support sampling and optimization-based inference with diagnostics and posterior analysis. rstan and pystan also provide access to log probabilities, gradients, Hessians, parameter transforms, and specialized plotting.« less
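    A Stan program's central object is a log probability function of the parameters given data. The sketch below illustrates, in plain Python rather than Stan itself, the kind of log density and analytic gradient that samplers like Hamiltonian Monte Carlo and optimizers consume; the model (a normal mean with flat prior and known scale) is a hypothetical stand-in, not Stan's API.

```python
import math

def log_prob(mu, data, sigma=1.0):
    """Unnormalized log posterior for a normal-mean model with a flat
    prior — the kind of function a Stan program imperatively defines."""
    return -0.5 * sum((y - mu) ** 2 for y in data) / sigma ** 2

def grad_log_prob(mu, data, sigma=1.0):
    """Analytic gradient with respect to mu, as needed by
    Hamiltonian Monte Carlo and gradient-based optimization."""
    return sum(y - mu for y in data) / sigma ** 2

data = [1.8, 2.2, 2.0]
# The gradient vanishes at the sample mean, which here is also the
# (penalized) maximum likelihood estimate.
print(abs(grad_log_prob(2.0, data)) < 1e-9)
```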

  3. Prediction of distal residue participation in enzyme catalysis.

    PubMed

    Brodkin, Heather R; DeLateur, Nicholas A; Somarowthu, Srinivas; Mills, Caitlyn L; Novak, Walter R; Beuning, Penny J; Ringe, Dagmar; Ondrechen, Mary Jo

    2015-05-01

    A scoring method for the prediction of catalytically important residues in enzyme structures is presented and used to examine the participation of distal residues in enzyme catalysis. Scores are based on the Partial Order Optimum Likelihood (POOL) machine learning method, using computed electrostatic properties, surface geometric features, and information obtained from the phylogenetic tree as input features. Predictions of distal residue participation in catalysis are compared with experimental kinetics data from the literature on variants of the featured enzymes; some additional kinetics measurements are reported for variants of Pseudomonas putida nitrile hydratase (ppNH) and for Escherichia coli alkaline phosphatase (AP). The multilayer active sites of P. putida nitrile hydratase and of human phosphoglucose isomerase are predicted by the POOL log ZP scores, as is the single-layer active site of P. putida ketosteroid isomerase. The log ZP score cutoff utilized here results in over-prediction of distal residue involvement in E. coli alkaline phosphatase. While fewer experimental data points are available for P. putida mandelate racemase and for human carbonic anhydrase II, the POOL log ZP scores properly predict the previously reported participation of distal residues. © 2015 The Authors. Protein Science published by Wiley Periodicals, Inc. on behalf of The Protein Society.

  4. Log-gamma linear-mixed effects models for multiple outcomes with application to a longitudinal glaucoma study

    PubMed Central

    Zhang, Peng; Luo, Dandan; Li, Pengfei; Sharpsten, Lucie; Medeiros, Felipe A.

    2015-01-01

    Glaucoma is a progressive disease due to damage in the optic nerve with associated functional losses. Although the relationship between structural and functional progression in glaucoma is well established, there is disagreement on how this association evolves over time. In addressing this issue, we propose a new class of non-Gaussian linear-mixed models to estimate the correlations among subject-specific effects in multivariate longitudinal studies with a skewed distribution of random effects, to be used in a study of glaucoma. This class provides an efficient estimation of subject-specific effects by modeling the skewed random effects through the log-gamma distribution. It also provides more reliable estimates of the correlations between the random effects. To validate the log-gamma assumption against the usual normality assumption of the random effects, we propose a lack-of-fit test using the profile likelihood function of the shape parameter. We apply this method to data from a prospective observation study, the Diagnostic Innovations in Glaucoma Study, to present a statistically significant association between structural and functional change rates that leads to a better understanding of the progression of glaucoma over time. PMID:26075565

  5. Stan : A Probabilistic Programming Language

    DOE PAGES

    Carpenter, Bob; Gelman, Andrew; Hoffman, Matthew D.; ...

    2017-01-01

    Stan is a probabilistic programming language for specifying statistical models. A Stan program imperatively defines a log probability function over parameters conditioned on specified data and constants. As of version 2.14.0, Stan provides full Bayesian inference for continuous-variable models through Markov chain Monte Carlo methods such as the No-U-Turn sampler, an adaptive form of Hamiltonian Monte Carlo sampling. Penalized maximum likelihood estimates are calculated using optimization methods such as the limited memory Broyden-Fletcher-Goldfarb-Shanno algorithm. Stan is also a platform for computing log densities and their gradients and Hessians, which can be used in alternative algorithms such as variational Bayes, expectation propagation, and marginal inference using approximate integration. To this end, Stan is set up so that the densities, gradients, and Hessians, along with intermediate quantities of the algorithm such as acceptance probabilities, are easily accessible. Stan can also be called from the command line using the cmdstan package, through R using the rstan package, and through Python using the pystan package. All three interfaces support sampling and optimization-based inference with diagnostics and posterior analysis. rstan and pystan also provide access to log probabilities, gradients, Hessians, parameter transforms, and specialized plotting.

  6. Bedside clinical signs associated with impending death in patients with advanced cancer: Preliminary findings of a prospective longitudinal cohort study

    PubMed Central

    Hui, David; dos Santos, Renata; Chisholm, Gary; Bansal, Swati; Crovador, Camila Souza; Bruera, Eduardo

    2014-01-01

    Background We recently reported 5 highly specific physical signs associated with death within 3 days among cancer patients that may aid the diagnosis of impending death. In this study, we examined the frequency and onset of an additional 52 bedside physical signs and their diagnostic performance for impending death. Methods We enrolled 357 consecutive patients with advanced cancer admitted to acute palliative care units at two tertiary care cancer centers. We systematically documented 52 physical signs every 12 hours from admission to death or discharge. We examined the frequency and median time of onset of each sign from death backwards, and calculated their likelihood ratios (LRs) associated with death in 3 days. Results 203/357 (57%) patients died at the end of the admission. We identified 8 physical signs that were highly diagnostic of impending death. These signs occurred in 5-78% of patients in the last 3 days of life, had a late onset, and had a high specificity (>95%) and high positive LR for death within 3 days, including non-reactive pupils (positive LR 16.7, 95% confidence interval 14.9-18.6), decreased response to verbal stimuli (8.3, 7.7-9), decreased response to visual stimuli (6.7, 6.3-7.1), inability to close eyelids (13.6, 11.7-15.5), drooping of nasolabial fold (8.3, 7.7-8.9), hyperextension of neck (7.3, 6.7-8), grunting of vocal cords (11.8, 10.3-13.4), and upper gastrointestinal bleed (10.3, 9.5-11.1). Conclusion We identified 8 highly specific physical signs associated with death within 3 days in cancer patients. These signs may inform the diagnosis of impending death. PMID:25676895
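    A positive likelihood ratio updates a pretest probability through the odds form of Bayes' theorem. The sketch below applies the abstract's LR of 16.7 for non-reactive pupils to an assumed pretest probability of 30% (the 30% figure is a hypothetical illustration, not from the study).

```python
def posttest_probability(pretest_prob, positive_lr):
    """Bayes in odds form: post-odds = pre-odds * LR,
    then convert the odds back to a probability."""
    pre_odds = pretest_prob / (1.0 - pretest_prob)
    post_odds = pre_odds * positive_lr
    return post_odds / (1.0 + post_odds)

# Non-reactive pupils (positive LR 16.7) on a 30% pretest probability:
p = posttest_probability(0.30, 16.7)
print(round(p, 2))  # 0.88
```

    This is why a high-specificity sign with a large positive LR can turn a moderate clinical suspicion into near-certainty.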

  7. Accuracy and precision of the signs and symptoms of streptococcal pharyngitis in children: a systematic review.

    PubMed

    Shaikh, Nader; Swaminathan, Nithya; Hooper, Emma G

    2012-03-01

    To conduct a systematic review to determine whether clinical findings can be used to rule in or to rule out streptococcal pharyngitis in children. Two authors independently searched MEDLINE and EMBASE. We included articles if they contained data on the accuracy of symptoms or signs of streptococcal pharyngitis, individually or combined into prediction rules, in children 3-18 years of age. Thirty-eight articles with data on individual symptoms and signs and 15 articles with data on prediction rules met all inclusion criteria. In children with sore throat, the presence of a scarlatiniform rash (likelihood ratio [LR], 3.91; 95% CI, 2.00-7.62), palatal petechiae (LR, 2.69; CI, 1.92-3.77), pharyngeal exudates (LR, 1.85; CI, 1.58-2.16), vomiting (LR, 1.79; CI, 1.58-2.16), and tender cervical nodes (LR, 1.72; CI, 1.54-1.93) were moderately useful in identifying those with streptococcal pharyngitis. Nevertheless, no individual symptoms or signs were effective in ruling in or ruling out streptococcal pharyngitis. Symptoms and signs, either individually or combined into prediction rules, cannot be used to definitively diagnose or rule out streptococcal pharyngitis. Copyright © 2012 Mosby, Inc. All rights reserved.

  8. The rational clinical examination. Does this infant have pneumonia?

    PubMed

    Margolis, P; Gadomski, A

    1998-01-28

    Acute lower respiratory tract illness is common among children seen in primary care. We reviewed the accuracy and precision of the clinical examination in detecting pneumonia in children. Although most cases are viral, it is important to identify bacterial pneumonia to provide appropriate therapy. Studies were identified by searching MEDLINE from 1982 to 1995, reviewing reference lists, reviewing a published compendium of studies of the clinical examination, and consulting experts. Each study was reviewed by 2 observers and graded for methodologic quality. Observer agreement is good for most signs on the clinical examination. There is better agreement about signs that can be observed (eg, use of accessory muscles, color, attentiveness; kappa, 0.48-0.66) than signs that require auscultation of the chest (eg, adventitious sounds; kappa, 0.3). Measurements of the respiratory rate are enhanced by counting for 60 seconds. The best individual finding for ruling out pneumonia is the absence of tachypnea. Chest indrawing, and other signs of increased work of breathing, increases the likelihood of pneumonia. If all clinical signs (respiratory rate, auscultation, and work of breathing) are negative, the chest x-ray findings are unlikely to be positive. Studies are needed to assess the value of clinical findings when they are used together.

  9. HIV-1 RNA May Decline More Slowly in Semen than in Blood following Initiation of Efavirenz-Based Antiretroviral Therapy

    PubMed Central

    Graham, Susan M.; Holte, Sarah E.; Dragavon, Joan A.; Ramko, Kelly M.; Mandaliya, Kishor N.; McClelland, R. Scott; Peshu, Norbert M.; Sanders, Eduard J.; Krieger, John N.; Coombs, Robert W.

    2012-01-01

    Objectives Antiretroviral therapy (ART) decreases HIV-1 RNA levels in semen and reduces sexual transmission from HIV-1-infected men. Our objective was to study the time course and magnitude of seminal HIV-1 RNA decay after initiation of efavirenz-based ART among 13 antiretroviral-naïve Kenyan men. Methods HIV-1 RNA was quantified (lower limit of detection, 120 copies/mL) in blood and semen at baseline and over the first month of ART. Median log10 HIV-1 RNA was compared at each time-point using Wilcoxon Signed Rank tests. Perelson’s two-phase viral decay model and nonlinear random effects were used to compare decay rates in blood and semen. Results Median baseline HIV-1 RNA was 4.40 log10 copies/mL in blood (range, 3.20–5.08 log10 copies/mL) and 3.69 log10 copies/mL in semen (range, <2.08–4.90 log10 copies/mL). The median reduction in HIV-1 RNA by day 28 was 1.90 log10 copies/mL in blood (range, 0.56–2.68 log10 copies/mL) and 1.36 log10 copies/mL in semen (range, 0–2.66 log10 copies/mL). ART led to a decrease from baseline by day 7 in blood and day 14 in semen (p = 0.005 and p = 0.006, respectively). The initial modeled decay rate was slower in semen than in blood (p = 0.06). There was no difference in second-phase decay rates between blood and semen. Conclusions Efavirenz-based ART reduced HIV-1 RNA levels more slowly in semen than in blood. Although this difference was of borderline significance in this small study, our observations suggest that there is suboptimal suppression of seminal HIV-1 RNA for some men in the early weeks of treatment. PMID:22912795
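    Perelson's two-phase model referenced above represents viral load as the sum of a fast and a slow exponential. A minimal sketch with illustrative parameters (not fitted to the study's data) chosen so the day-28 decline roughly matches the ~1.9 log10 copies/mL reported for blood:

```python
import math

def two_phase_viral_load(t, v0, f, d1, d2):
    """Perelson-style two-phase decay: fraction f decays at fast rate d1,
    the remainder at slow rate d2. All parameter values used below are
    illustrative placeholders, not estimates from the study."""
    return v0 * (f * math.exp(-d1 * t) + (1.0 - f) * math.exp(-d2 * t))

# Hypothetical blood-compartment run: 4.4 log10 copies/mL at baseline
v0 = 10 ** 4.4
log10_day28 = math.log10(two_phase_viral_load(28, v0, f=0.98, d1=0.5, d2=0.02))
print(round(4.4 - log10_day28, 1))  # 1.9 log10 decline by day 28
```

    Slower seminal decay would correspond to smaller d1 and/or d2 in the semen compartment, as the nonlinear random-effects comparison in the study suggests.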

  10. Measurement of the k(T) distribution of particles in jets produced in pp collisions at sqrt(s)=1.96 TeV.

    PubMed

    Aaltonen, T; Adelman, J; Akimoto, T; Alvarez González, B; Amerio, S; Amidei, D; Anastassov, A; Annovi, A; Antos, J; Apollinari, G; Apresyan, A; Arisawa, T; Artikov, A; Ashmanskas, W; Attal, A; Aurisano, A; Azfar, F; Azzurri, P; Badgett, W; Barbaro-Galtieri, A; Barnes, V E; Barnett, B A; Bartsch, V; Bauer, G; Beauchemin, P-H; Bedeschi, F; Beecher, D; Behari, S; Bellettini, G; Bellinger, J; Benjamin, D; Beretvas, A; Beringer, J; Bhatti, A; Binkley, M; Bisello, D; Bizjak, I; Blair, R E; Blocker, C; Blumenfeld, B; Bocci, A; Bodek, A; Boisvert, V; Bolla, G; Bortoletto, D; Boudreau, J; Boveia, A; Brau, B; Bridgeman, A; Brigliadori, L; Bromberg, C; Brubaker, E; Budagov, J; Budd, H S; Budd, S; Burke, S; Burkett, K; Busetto, G; Bussey, P; Buzatu, A; Byrum, K L; Cabrera, S; Calancha, C; Campanelli, M; Campbell, M; Canelli, F; Canepa, A; Carls, B; Carlsmith, D; Carosi, R; Carrillo, S; Carron, S; Casal, B; Casarsa, M; Castro, A; Catastini, P; Cauz, D; Cavaliere, V; Cavalli-Sforza, M; Cerri, A; Cerrito, L; Chang, S H; Chen, Y C; Chertok, M; Chiarelli, G; Chlachidze, G; Chlebana, F; Cho, K; Chokheli, D; Chou, J P; Choudalakis, G; Chuang, S H; Chung, K; Chung, W H; Chung, Y S; Chwalek, T; Ciobanu, C I; Ciocci, M A; Clark, A; Clark, D; Compostella, G; Convery, M E; Conway, J; Cordelli, M; Cortiana, G; Cox, C A; Cox, D J; Crescioli, F; Cuenca Almenar, C; Cuevas, J; Culbertson, R; Cully, J C; Dagenhart, D; Datta, M; Davies, T; de Barbaro, P; De Cecco, S; Deisher, A; De Lorenzo, G; Dell'orso, M; Deluca, C; Demortier, L; Deng, J; Deninno, M; Derwent, P F; di Giovanni, G P; Dionisi, C; Di Ruzza, B; Dittmann, J R; D'Onofrio, M; Donati, S; Dong, P; Donini, J; Dorigo, T; Dube, S; Efron, J; Elagin, A; Erbacher, R; Errede, D; Errede, S; Eusebi, R; Fang, H C; Farrington, S; Fedorko, W T; Feild, R G; Feindt, M; Fernandez, J P; Ferrazza, C; Field, R; Flanagan, G; Forrest, R; Frank, M J; Franklin, M; Freeman, J C; Furic, I; Gallinaro, M; Galyardt, J; Garberson, F; Garcia, J E; Garfinkel, A 
F; Genser, K; Gerberich, H; Gerdes, D; Gessler, A; Giagu, S; Giakoumopoulou, V; Giannetti, P; Gibson, K; Gimmell, J L; Ginsburg, C M; Giokaris, N; Giordani, M; Giromini, P; Giunta, M; Giurgiu, G; Glagolev, V; Glenzinski, D; Gold, M; Goldschmidt, N; Golossanov, A; Gomez, G; Gomez-Ceballos, G; Goncharov, M; González, O; Gorelov, I; Goshaw, A T; Goulianos, K; Gresele, A; Grinstein, S; Grosso-Pilcher, C; Grundler, U; Guimaraes da Costa, J; Gunay-Unalan, Z; Haber, C; Hahn, K; Hahn, S R; Halkiadakis, E; Han, B-Y; Han, J Y; Happacher, F; Hara, K; Hare, D; Hare, M; Harper, S; Harr, R F; Harris, R M; Hartz, M; Hatakeyama, K; Hays, C; Heck, M; Heijboer, A; Heinrich, J; Henderson, C; Herndon, M; Heuser, J; Hewamanage, S; Hidas, D; Hill, C S; Hirschbuehl, D; Hocker, A; Hou, S; Houlden, M; Hsu, S-C; Huffman, B T; Hughes, R E; Husemann, U; Hussein, M; Husemann, U; Huston, J; Incandela, J; Introzzi, G; Iori, M; Ivanov, A; James, E; Jayatilaka, B; Jeon, E J; Jha, M K; Jindariani, S; Johnson, W; Jones, M; Joo, K K; Jun, S Y; Jung, J E; Junk, T R; Kamon, T; Kar, D; Karchin, P E; Kato, Y; Kephart, R; Keung, J; Khotilovich, V; Kilminster, B; Kim, D H; Kim, H S; Kim, H W; Kim, J E; Kim, M J; Kim, S B; Kim, S H; Kim, Y K; Kimura, N; Kirsch, L; Klimenko, S; Knuteson, B; Ko, B R; Kondo, K; Kong, D J; Konigsberg, J; Korytov, A; Kotwal, A V; Kreps, M; Kroll, J; Krop, D; Krumnack, N; Kruse, M; Krutelyov, V; Kubo, T; Kuhr, T; Kulkarni, N P; Kurata, M; Kwang, S; Laasanen, A T; Lami, S; Lammel, S; Lancaster, M; Lander, R L; Lannon, K; Lath, A; Latino, G; Lazzizzera, I; Lecompte, T; Lee, E; Lee, H S; Lee, S W; Leone, S; Lewis, J D; Lin, C-S; Linacre, J; Lindgren, M; Lipeles, E; Lister, A; Litvintsev, D O; Liu, C; Liu, T; Lockyer, N S; Loginov, A; Loreti, M; Lovas, L; Lucchesi, D; Luci, C; Lueck, J; Lujan, P; Lukens, P; Lungu, G; Lyons, L; Lys, J; Lysak, R; Macqueen, D; Madrak, R; Maeshima, K; Makhoul, K; Maki, T; Maksimovic, P; Malde, S; Malik, S; Manca, G; Manousakis-Katsikakis, A; Margaroli, 
F; Marino, C; Marino, C P; Martin, A; Martin, V; Martínez, M; Martínez-Ballarín, R; Maruyama, T; Mastrandrea, P; Masubuchi, T; Mathis, M; Mattson, M E; Mazzanti, P; McFarland, K S; McIntyre, P; McNulty, R; Mehta, A; Mehtala, P; Menzione, A; Merkel, P; Mesropian, C; Miao, T; Miladinovic, N; Miller, R; Mills, C; Milnik, M; Mitra, A; Mitselmakher, G; Miyake, H; Moggi, N; Moon, C S; Moore, R; Morello, M J; Morlok, J; Movilla Fernandez, P; Mülmenstädt, J; Mukherjee, A; Muller, Th; Mumford, R; Murat, P; Mussini, M; Nachtman, J; Nagai, Y; Nagano, A; Naganoma, J; Nakamura, K; Nakano, I; Napier, A; Necula, V; Nett, J; Neu, C; Neubauer, M S; Neubauer, S; Nielsen, J; Nodulman, L; Norman, M; Norniella, O; Nurse, E; Oakes, L; Oh, S H; Oh, Y D; Oksuzian, I; Okusawa, T; Orava, R; Pagan Griso, S; Palencia, E; Papadimitriou, V; Papaikonomou, A; Paramonov, A A; Parks, B; Pashapour, S; Patrick, J; Pauletta, G; Paulini, M; Paus, C; Peiffer, T; Pellett, D E; Penzo, A; Phillips, T J; Piacentino, G; Pianori, E; Pinera, L; Pitts, K; Plager, C; Pondrom, L; Poukhov, O; Pounder, N; Prakoshyn, F; Pronko, A; Proudfoot, J; Ptohos, F; Pueschel, E; Punzi, G; Pursley, J; Rademacker, J; Rahaman, A; Ramakrishnan, V; Ranjan, N; Redondo, I; Renton, P; Renz, M; Rescigno, M; Richter, S; Rimondi, F; Ristori, L; Robson, A; Rodrigo, T; Rodriguez, T; Rogers, E; Rolli, S; Roser, R; Rossi, M; Rossin, R; Roy, P; Ruiz, A; Russ, J; Rusu, V; Safonov, A; Sakumoto, W K; Saltó, O; Santi, L; Sarkar, S; Sartori, L; Sato, K; Savoy-Navarro, A; Schlabach, P; Schmidt, A; Schmidt, E E; Schmidt, M A; Schmidt, M P; Schmitt, M; Schwarz, T; Scodellaro, L; Scribano, A; Scuri, F; Sedov, A; Seidel, S; Seiya, Y; Semenov, A; Sexton-Kennedy, L; Sforza, F; Sfyrla, A; Shalhout, S Z; Shears, T; Shepard, P F; Shimojima, M; Shiraishi, S; Shochet, M; Shon, Y; Shreyber, I; Sidoti, A; Sinervo, P; Sisakyan, A; Slaughter, A J; Slaunwhite, J; Sliwa, K; Smith, J R; Snider, F D; Snihur, R; Soha, A; Somalwar, S; Sorin, V; Spalding, J; Spreitzer, 
T; Squillacioti, P; Stanitzki, M; St Denis, R; Stelzer, B; Stelzer-Chilton, O; Stentz, D; Strologas, J; Strycker, G L; Stuart, D; Suh, J S; Sukhanov, A; Suslov, I; Suzuki, T; Taffard, A; Takashima, R; Takeuchi, Y; Tanaka, R; Tecchio, M; Teng, P K; Terashi, K; Thom, J; Thompson, A S; Thompson, G A; Thomson, E; Tipton, P; Ttito-Guzmán, P; Tkaczyk, S; Toback, D; Tokar, S; Tollefson, K; Tomura, T; Tonelli, D; Torre, S; Torretta, D; Totaro, P; Tourneur, S; Trovato, M; Tsai, S-Y; Tu, Y; Turini, N; Ukegawa, F; Vallecorsa, S; van Remortel, N; Varganov, A; Vataga, E; Vázquez, F; Velev, G; Vellidis, C; Veszpremi, V; Vidal, M; Vidal, R; Vila, I; Vilar, R; Vine, T; Vogel, M; Volobouev, I; Volpi, G; Wagner, P; Wagner, R G; Wagner, R L; Wagner, W; Wagner-Kuhr, J; Wakisaka, T; Wallny, R; Wang, S M; Warburton, A; Waters, D; Weinberger, M; Weinelt, J; Wester, W C; Whitehouse, B; Whiteson, D; Wicklund, A B; Wicklund, E; Wilbur, S; Williams, G; Williams, H H; Wilson, P; Winer, B L; Wittich, P; Wolbers, S; Wolfe, C; Wright, T; Wu, X; Würthwein, F; Wynne, S M; Xie, S; Yagil, A; Yamamoto, K; Yamaoka, J; Yang, U K; Yang, Y C; Yao, W M; Yeh, G P; Yoh, J; Yorita, K; Yoshida, T; Yu, G B; Yu, I; Yu, S S; Yun, J C; Zanello, L; Zanetti, A; Zhang, X; Zheng, Y; Zucchelli, S

    2009-06-12

    We present a measurement of the transverse momentum with respect to the jet axis (k(t)) of particles in jets produced in pp collisions at sqrt(s)=1.96 TeV. Results are obtained for charged particles in a cone of 0.5 radians around the jet axis in events with dijet invariant masses between 66 and 737 GeV/c(2). The experimental data are compared to theoretical predictions obtained for fragmentation partons within the framework of resummed perturbative QCD using the modified leading log and next-to-modified leading log approximations. The comparison shows that trends in data are successfully described by the theoretical predictions, indicating that the perturbative QCD stage of jet fragmentation is dominant in shaping basic jet characteristics.

  11. Qualitative and quantitative characteristics of near-infrared autofluorescence in diabetic macular edema.

    PubMed

    Yoshitake, Shin; Murakami, Tomoaki; Horii, Takahiro; Uji, Akihito; Ogino, Ken; Unoki, Noriyuki; Nishijima, Kazuaki; Yoshimura, Nagahisa

    2014-05-01

    To study the characteristics of near-infrared autofluorescence (NIR-AF) imaging and its association with spectral-domain optical coherence tomography (SD-OCT) findings and logarithm of the minimal angle of resolution (logMAR) visual acuity (VA) in diabetic macular edema (DME). Retrospective, observational, cross-sectional study. One hundred twenty-one consecutive eyes of 87 patients with center-involved DME for whom NIR-AF and SD-OCT images of sufficient quality were obtained. The NIR-AF images were acquired using Heidelberg Retina Angiograph 2 (Heidelberg Engineering, Heidelberg, Germany), and sectional retinal images were obtained using Spectralis OCT (Heidelberg Engineering). The presence of a mosaic pattern and cystoid signs were determined qualitatively. We quantified the average fluorescence intensity in the central 1-mm subfield. The characteristics of the NIR-AF images were compared with the OCT findings and logMAR VA. Qualitative and quantitative characteristics of the NIR-AF images and their association with SD-OCT findings and logMAR VA. Fifty-seven eyes with a mosaic pattern in the NIR-AF macular images had worse logMAR VA (0.355±0.239 vs. 0.212±0.235; P = 0.001), a thicker central subfield (CSF) (530±143 μm vs. 438±105 μm; P <0.001), and disrupted external limiting membrane (ELM; P <0.001) compared with 64 eyes without these findings. Forty-one eyes with a cystoid sign in the NIR-AF images had worse logMAR VA (0.393±0.233 vs. 0.221±0.234; P <0.001) and a thicker CSF (557±155 μm vs. 443±100 μm; P <0.001) than those without them; there were no significant differences in the ELM status. The relative fluorescence intensity in the central subfield in the NIR-AF images was correlated negatively with the CSF thickness and logMAR VA (R = 0.492, P <0.001 and R = 0.377, P <0.001, respectively). Eyes with foveal serous retinal detachment had lower levels of relative fluorescence intensity than those without it (0.751±0.191 vs. 
0.877±0.154; P = 0.007); there was no association with the presence of foveal cystoid spaces, disrupted ELM, or hyperreflective foci in the outer retinal layers. Novel qualitative and quantitative NIR-AF characteristics in the macula indicated the clinical relevance and suggested the pathogenesis in DME. Copyright © 2014 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.

  12. Ancestry as a potential modifier of gene expression in breast tumors from Colombian women

    PubMed Central

    Serrano-Gómez, Silvia J.; Sanabria-Salas, María Carolina; Garay, Jone; Baddoo, Melody C.; Hernández-Suarez, Gustavo; Mejía, Juan Carlos; García, Oscar; Miele, Lucio

    2017-01-01

    Background Hispanic/Latino populations are a genetically admixed and heterogeneous group, with variable fractions of European, Indigenous American and African ancestries. The molecular profile of breast cancer has been widely described in non-Hispanic Whites but equivalent knowledge is lacking in Hispanic/Latinas. We have previously reported that the most prevalent breast cancer intrinsic subtype in Colombian women was Luminal B as defined by St. Gallen 2013 criteria. In this study we explored ancestry-associated differences in molecular profiles of Luminal B tumors among these highly admixed women. Methods We performed whole-transcriptome RNA-seq analysis in 42 Luminal tumors (21 Luminal A and 21 Luminal B) from Colombian women. Genetic ancestry was estimated from a panel of 80 ancestry-informative markers (AIM). We categorized patients according to Luminal subtype and to the proportion of European and Indigenous American ancestry and performed differential expression analysis comparing Luminal B against Luminal A tumors according to the assigned ancestry groups. Results We found 5 genes potentially modulated by genetic ancestry: ERBB2 (log2FC = 2.367, padj<0.01), GRB7 (log2FC = 2.327, padj<0.01), GSDMB (log2FC = 1.723, padj<0.01), MIEN1 (log2FC = 2.195, padj<0.01) and ONECUT2 (log2FC = 2.204, padj<0.01). In the replication set we found a statistically significant association between ERBB2 expression and Indigenous American ancestry (p = 0.02, B = 3.11). This association was not biased by the distribution of HER2+ tumors among the groups analyzed. Conclusions Our results suggest that genetic ancestry in Hispanic/Latina women might modify ERBB2 gene expression in Luminal tumors. Further analyses are needed to confirm these findings and explore their prognostic value. PMID:28832682
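    The log2 fold changes (log2FC) quoted above are base-2 logarithms of expression ratios between groups. A minimal sketch with hypothetical expression values (not the study's data; real RNA-seq pipelines use moderated estimators rather than this naive ratio):

```python
import math

def log2_fold_change(mean_case, mean_control, pseudocount=1.0):
    """Log2 ratio of group means; the pseudocount guards against zeros."""
    return math.log2((mean_case + pseudocount) / (mean_control + pseudocount))

# Hypothetical normalized expression values for one gene
luminal_b = [220.0, 180.0, 240.0]
luminal_a = [40.0, 55.0, 48.0]

fc = log2_fold_change(sum(luminal_b) / len(luminal_b),
                      sum(luminal_a) / len(luminal_a))
# fc is about 2.1, i.e. roughly a 4-fold higher mean in the first group
```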

  13. The influence of lunar phases and zodiac sign 'Leo' on perioperative complications and outcome in elective spine surgery.

    PubMed

    Joswig, Holger; Stienen, Martin N; Hock, Carolin; Hildebrandt, Gerhard; Surbeck, Werner

    2016-06-01

    Many people believe that the moon has an influence on daily life, and some even request elective surgery dates depending on the moon calendar. The aim of this study was to assess the influence of 'unfavorable' lunar or zodiac constellations on perioperative complications and outcome in elective surgery for degenerative disc disease. Retrospective database analysis including 924 patients. Using uni- and multivariate logistic regression, the likelihood for intraoperative complications and re-do surgeries as well as the clinical outcomes at 4 weeks was analyzed for surgeries performed during the waxing moon, full moon, and dates when the moon passed through the zodiac sign 'Leo.' In multivariate analysis, patients operated on during the waxing moon were 1.54 times as likely as patients who were operated on during the waning moon to suffer from an intraoperative complication (OR 1.54, 95 % CI 1.07-2.21, p = 0.019). In contrast, there was a trend toward fewer re-do surgeries for surgery during the waxing moon (OR 0.51, 95 % CI 0.23-1.16, p = 0.109), while the 4-week responder status was similar (OR 0.73, 95 % CI 0.47-1.14, p = 0.169). A full moon and the zodiac sign Leo did not increase the likelihood for complications, re-do surgeries or unfavorable outcomes. We found no influence of 'unfavorable' lunar or zodiac constellations on the 4-week responder status or the revision rate that would justify a moon calendar-based selection approach to elective spine surgery dates. However, the fact that patients undergoing surgery during the waxing moon were more likely to suffer from an intraoperative complication is a surprising curiosity and defies our ability to find a rational explanation.
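    For reference, the odds ratios and Wald confidence intervals reported by such logistic regressions follow directly from the fitted coefficient and its standard error. A sketch with a coefficient chosen to reproduce the waxing-moon figure above (the coefficient and standard error are illustrative assumptions, not the study's actual output):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with a Wald 95% confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical coefficient for 'operated during the waxing moon'
or_, lo, hi = odds_ratio_ci(beta=0.432, se=0.185)
# or_ is about 1.54 with CI roughly (1.07, 2.21)
```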

  14. Predicting drug penetration across the blood-brain barrier: comparison of micellar liquid chromatography and immobilized artificial membrane liquid chromatography.

    PubMed

    De Vrieze, Mike; Lynen, Frédéric; Chen, Kai; Szucs, Roman; Sandra, Pat

    2013-07-01

    Several in vitro methods have been tested for their ability to predict drug penetration across the blood-brain barrier (BBB) into the central nervous system (CNS). In this article, the performance of a variety of micellar liquid chromatographic (MLC) methods and immobilized artificial membrane (IAM) liquid chromatographic approaches were compared for a set of 45 solutes. MLC measurements were performed on a C18 column with sodium dodecyl sulfate (SDS), polyoxyethylene (23) lauryl ether (Brij35), or sodium deoxycholate (SDC) as surfactant in the micellar mobile phase. IAM liquid chromatography measurements were performed with Dulbecco's phosphate-buffered saline (DPBS) and methanol as organic modifier in the mobile phase. The corresponding retention and computed descriptor data for each solute were used for construction of models to predict transport across the blood-brain barrier (log BB). All data were correlated with experimental log BB values and the relative performance of the models was studied. SDS-based models proved most suitable for prediction of log BB values, followed closely by a simplified IAM method, in which it could be observed that extrapolation of retention data to 0% modifier in the mobile phase was unnecessary.
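    Models of this kind ultimately regress experimental log BB values on chromatographic retention (plus computed descriptors). A one-predictor least-squares sketch with hypothetical retention factors (the numbers are illustrative, not the paper's data set of 45 solutes):

```python
def fit_line(x, y):
    """Ordinary least squares for y = a + b*x (single predictor)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# Hypothetical MLC retention factors (log k) vs experimental log BB
log_k  = [0.10, 0.45, 0.80, 1.20, 1.60]
log_bb = [-0.90, -0.40, 0.05, 0.50, 1.00]

a, b = fit_line(log_k, log_bb)
pred = a + b * 1.0  # predicted log BB for a new solute with log k = 1.0
```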

  15. Unified framework to evaluate panmixia and migration direction among multiple sampling locations.

    PubMed

    Beerli, Peter; Palczewski, Michal

    2010-05-01

    For many biological investigations, groups of individuals are genetically sampled from several geographic locations. These sampling locations often do not reflect the genetic population structure. We describe a framework using marginal likelihoods to compare and order structured population models, such as testing whether the sampling locations belong to the same randomly mating population or comparing unidirectional and multidirectional gene flow models. In the context of inferences employing Markov chain Monte Carlo methods, the accuracy of the marginal likelihoods depends heavily on the approximation method used to calculate the marginal likelihood. Two methods, modified thermodynamic integration and a stabilized harmonic mean estimator, are compared. With finite Markov chain Monte Carlo run lengths, the harmonic mean estimator may not be consistent. Thermodynamic integration, in contrast, delivers considerably better estimates of the marginal likelihood. The choice of prior distributions does not influence the order and choice of the better models when the marginal likelihood is estimated using thermodynamic integration, whereas with the harmonic mean estimator the influence of the prior is pronounced and the order of the models changes. The approximation of marginal likelihood using thermodynamic integration in MIGRATE allows the evaluation of complex population genetic models, not only of whether sampling locations belong to a single panmictic population, but also of competing complex structured population models.
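    The two estimators compared above can be sketched in a few lines. The harmonic mean uses posterior log-likelihood samples (and is numerically fragile, as the abstract notes), while thermodynamic integration applies a quadrature rule to the expected log-likelihood along the power-posterior path from prior (beta = 0) to posterior (beta = 1). The inputs below are made-up chain summaries, not MIGRATE output:

```python
import math

def log_harmonic_mean(loglikes):
    """Harmonic-mean estimate of the log marginal likelihood from
    posterior log-likelihood samples: log m = log n - logsumexp(-ll)."""
    n = len(loglikes)
    m = max(-l for l in loglikes)
    lse = m + math.log(sum(math.exp(-l - m) for l in loglikes))
    return math.log(n) - lse

def log_ml_thermodynamic(betas, mean_loglikes):
    """Thermodynamic (path-sampling) integration: trapezoidal rule over
    E_beta[log L] along the power-posterior path beta in [0, 1]."""
    return sum((b1 - b0) * (y0 + y1) / 2
               for b0, b1, y0, y1 in zip(betas, betas[1:],
                                         mean_loglikes, mean_loglikes[1:]))

# Hypothetical chain output
post_loglikes = [-102.3, -101.8, -102.9, -101.5, -102.0]
betas = [0.0, 0.1, 0.3, 0.6, 1.0]
mean_ll = [-140.0, -120.0, -110.0, -105.0, -102.0]

hm = log_harmonic_mean(post_loglikes)
ti = log_ml_thermodynamic(betas, mean_ll)
```

    Note how the harmonic mean is dominated by the smallest log-likelihood in the sample, which is one intuition for its instability over finite runs.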

  16. 78 FR 53755 - Information Collection(s) Being Submitted for Review and Approval to the Office of Management and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-30

    ... transmission to ascertain the call sign transmitted. However, this gives a new group of licensee stations (PLMRs) an option regarding the method of transmission of required call sign information; it modifies the... providing the Commission sufficient information to decode the transmission--unless they choose the digital...

  17. Walking training and cortisol to DHEA-S ratio in postmenopause: An intervention study.

    PubMed

    Di Blasio, Andrea; Izzicupo, Pascal; Di Baldassarre, Angela; Gallina, Sabina; Bucci, Ines; Giuliani, Cesidio; Di Santo, Serena; Di Iorio, Angelo; Ripari, Patrizio; Napolitano, Giorgio

    2018-04-01

    The literature indicates that the plasma cortisol-to-dehydroepiandrosterone-sulfate (DHEA-S) ratio is a marker of health status after menopause, when a decline in both estrogen and DHEA-S and an increase in cortisol occur. An increase in the cortisol-to-DHEA-S ratio has been positively correlated with metabolic syndrome, all-cause mortality, cancer, and other diseases. The aim of this study was to investigate the effects of a walking program on the plasma cortisol-to-DHEA-S ratio in postmenopausal women. Fifty-one postmenopausal women participated in a 13-week supervised walking program, in the metropolitan area of Pescara (Italy), from June to September 2013. Participants were evaluated in April-May and September-October of the same year. The linear mixed model showed that the variation of the log10 cortisol-to-log10 DHEA-S ratio was associated with the volume of exercise (p = .03). Participants having lower adherence to the walking program did not have a significantly modified log10 cortisol or log10 DHEA-S, while those having the highest adherence had a significant reduction in log10 cortisol (p = .016) and a nearly significant increase in log10 DHEA-S (p = .084). Walking training appeared to reduce the plasma log10 cortisol-to-log10 DHEA-S ratio, although a minimum level of training was necessary to achieve this significant reduction.

  18. A blind search for a common signal in gravitational wave detectors

    NASA Astrophysics Data System (ADS)

    Liu, Hao; Creswell, James; von Hausegger, Sebastian; Jackson, Andrew D.; Naselsky, Pavel

    2018-02-01

    We propose a blind, template-free method for the extraction of a common signal between the Hanford and Livingston detectors and apply it especially to the GW150914 event. We construct a log-likelihood method that maximizes the cross-correlation between each detector and the common signal and minimizes the cross-correlation between the residuals. The reliability of this method is tested using simulations with an injected common signal. Finally, our method is used to assess the quality of theoretical gravitational wave templates for GW150914.
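    The core operation, maximizing the cross-correlation between the two detector streams, can be sketched as a discrete lag scan (a toy version; the published method works on whitened strain data with a log-likelihood built from these correlations):

```python
def best_lag(x, y, max_lag):
    """Lag k maximizing r(k) = sum_i x[i] * y[i + k]; if y is a delayed
    copy of x, the best k equals the delay."""
    def xcorr(k):
        return sum(x[i] * y[i + k] for i in range(len(x))
                   if 0 <= i + k < len(y))
    return max(range(-max_lag, max_lag + 1), key=xcorr)

# Toy 'common signal' seen by two detectors with a 3-sample delay
common = [0.0, 0.2, 1.0, 0.4, -0.3, -1.0, -0.2, 0.1, 0.0, 0.0]
hanford = common
livingston = [0.0] * 3 + common[:-3]  # delayed copy

lag = best_lag(hanford, livingston, max_lag=5)  # recovers the 3-sample delay
```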

  19. Analysis of the Hessian for Inverse Scattering Problems. Part 3. Inverse Medium Scattering of Electromagnetic Waves in Three Dimensions

    DTIC Science & Technology

    2012-08-01

    small data noise and model error, the discrete Hessian can be approximated by a low-rank matrix. This in turn enables fast solution of an appropriately...probability distribution is given by the inverse of the Hessian of the negative log likelihood function. For Gaussian data noise and model error, this

  20. Impact of selective logging on inbreeding and gene dispersal in an Amazonian tree population of Carapa guianensis Aubl.

    PubMed

    Cloutier, D; Kanashiro, M; Ciampi, A Y; Schoen, D J

    2007-02-01

    Selective logging may impact patterns of genetic diversity within populations of harvested forest tree species by increasing distances separating conspecific trees, and modifying physical and biotic features of the forest habitat. We measured levels of gene diversity, inbreeding, pollen dispersal and spatial genetic structure (SGS) of an Amazonian insect-pollinated Carapa guianensis population before and after commercial selective logging. Similar levels of gene diversity and allelic richness were found before and after logging in both the adult and the seed generations. Pre- and post-harvest outcrossing rates were high, and not significantly different from one another. We found no significant levels of biparental inbreeding either before or after logging. Low levels of pollen pool differentiation were found, and the pre- vs. post-harvest difference was not significant. Pollen dispersal distance estimates averaged between 75 m and 265 m before logging, and between 76 m and 268 m after logging, depending on the value of tree density and the dispersal model used. There were weak and similar levels of differentiation of allele frequencies in the adults and in the pollen pool, before and after logging occurred, as well as weak and similar pre- and post-harvest levels of SGS among adult trees. The large neighbourhood sizes estimated suggest high historical levels of gene flow. Overall our results indicate that there is no clear short-term genetic impact of selective logging on this population of C. guianensis.

  1. Selective logging in tropical forests decreases the robustness of liana-tree interaction networks to the loss of host tree species.

    PubMed

    Magrach, Ainhoa; Senior, Rebecca A; Rogers, Andrew; Nurdin, Deddy; Benedick, Suzan; Laurance, William F; Santamaria, Luis; Edwards, David P

    2016-03-16

    Selective logging is one of the major drivers of tropical forest degradation, causing important shifts in species composition. Whether such changes modify interactions between species and the networks in which they are embedded remain fundamental questions to assess the 'health' and ecosystem functionality of logged forests. We focus on interactions between lianas and their tree hosts within primary and selectively logged forests in the biodiversity hotspot of Malaysian Borneo. We found that lianas were more abundant, had higher species richness, and different species compositions in logged than in primary forests. Logged forests showed heavier liana loads disparately affecting slow-growing tree species, which could exacerbate the loss of timber value and carbon storage already associated with logging. Moreover, simulation scenarios of host tree local species loss indicated that logging might decrease the robustness of liana-tree interaction networks if heavily infested trees (i.e. the most connected ones) were more likely to disappear. This effect is partially mitigated in the short term by the colonization of host trees by a greater diversity of liana species within logged forests, yet this might not compensate for the loss of preferred tree hosts in the long term. As a consequence, species interaction networks may show a lagged response to disturbance, which may trigger sudden collapses in species richness and ecosystem function in response to additional disturbances, representing a new type of 'extinction debt'. © 2016 The Author(s).
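    The robustness simulations described above amount to removing host trees in order of connectance and tracking how many liana species remain attached. A toy sketch (hypothetical four-tree network, not the study's data):

```python
def robustness_curve(network):
    """Remove host trees from most- to least-connected and report the
    fraction of liana species still attached after each removal."""
    hosts = sorted(network, key=lambda h: len(network[h]), reverse=True)
    total = len(set().union(*network.values()))
    remaining = {h: set(v) for h, v in network.items()}
    curve = []
    for h in hosts:
        del remaining[h]
        left = set().union(*remaining.values()) if remaining else set()
        curve.append(len(left) / total)
    return curve

# Toy liana-tree network: host tree -> liana species it carries
net = {"t1": {"a", "b", "c"}, "t2": {"a", "b"}, "t3": {"c"}, "t4": {"d"}}
curve = robustness_curve(net)  # [1.0, 0.5, 0.25, 0.0]
```

    A network is more robust when this curve stays high as hosts are removed; targeting the most connected (most heavily infested) trees first is the pessimistic scenario the abstract describes.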

  2. Identification of coal seam strata from geophysical logs of borehole using Adaptive Neuro-Fuzzy Inference System

    NASA Astrophysics Data System (ADS)

    Yegireddi, Satyanarayana; Uday Bhaskar, G.

    2009-01-01

    Parameters obtained through well-logging geophysical sensors such as SP, resistivity, gamma-gamma, neutron, natural gamma and acoustic help in the identification of strata and the estimation of the physical, electrical and acoustical properties of the subsurface lithology. Strong and conspicuous changes in some of the log parameters associated with a particular stratigraphic formation are a function of its composition and physical properties and help in classification. However, some substrata show moderate values in the respective log parameters, making it difficult to identify or assess the type of strata from the standard variability ranges of the log parameters and visual inspection alone. The complexity increases further as the number of sensors grows. An attempt is made to identify the type of stratigraphy from borehole geophysical log data using a combined approach of neural networks and fuzzy logic, known as the Adaptive Neuro-Fuzzy Inference System. A model is built on a few data sets (geophysical logs) of known stratigraphy in the coal areas of Kothagudem, Godavari basin, and the network model is then used as a test model to infer the lithology of a borehole from geophysical logs not used in simulation. The results are very encouraging, and the model is able to decipher even thin coal seams and other strata from borehole geophysical logs. The model can be further modified to assess the physical properties of the strata, if the corresponding ground truth is made available for simulation.

  3. Selective logging in tropical forests decreases the robustness of liana–tree interaction networks to the loss of host tree species

    PubMed Central

    Magrach, Ainhoa; Senior, Rebecca A.; Rogers, Andrew; Nurdin, Deddy; Benedick, Suzan; Laurance, William F.; Santamaria, Luis; Edwards, David P.

    2016-01-01

    Selective logging is one of the major drivers of tropical forest degradation, causing important shifts in species composition. Whether such changes modify interactions between species and the networks in which they are embedded remain fundamental questions to assess the ‘health’ and ecosystem functionality of logged forests. We focus on interactions between lianas and their tree hosts within primary and selectively logged forests in the biodiversity hotspot of Malaysian Borneo. We found that lianas were more abundant, had higher species richness, and different species compositions in logged than in primary forests. Logged forests showed heavier liana loads disparately affecting slow-growing tree species, which could exacerbate the loss of timber value and carbon storage already associated with logging. Moreover, simulation scenarios of host tree local species loss indicated that logging might decrease the robustness of liana–tree interaction networks if heavily infested trees (i.e. the most connected ones) were more likely to disappear. This effect is partially mitigated in the short term by the colonization of host trees by a greater diversity of liana species within logged forests, yet this might not compensate for the loss of preferred tree hosts in the long term. As a consequence, species interaction networks may show a lagged response to disturbance, which may trigger sudden collapses in species richness and ecosystem function in response to additional disturbances, representing a new type of ‘extinction debt’. PMID:26936241

  4. Modeling gene expression measurement error: a quasi-likelihood approach

    PubMed Central

    Strimmer, Korbinian

    2003-01-01

    Background Using suitable error models for gene expression measurements is essential in the statistical analysis of microarray data. However, the true probabilistic model underlying gene expression intensity readings is generally not known. Instead, in currently used approaches some simple parametric model is assumed (usually a transformed normal distribution) or the empirical distribution is estimated. However, both these strategies may not be optimal for gene expression data, as the non-parametric approach ignores known structural information whereas the fully parametric models run the risk of misspecification. A further related problem is the choice of a suitable scale for the model (e.g. observed vs. log-scale). Results Here a simple semi-parametric model for gene expression measurement error is presented. In this approach inference is based on an approximate likelihood function (the extended quasi-likelihood). Only partial knowledge about the unknown true distribution is required to construct this function. In the case of gene expression this information is available in the form of the postulated (e.g. quadratic) variance structure of the data. As the quasi-likelihood behaves (almost) like a proper likelihood, it allows for the estimation of calibration and variance parameters, and it is also straightforward to obtain corresponding approximate confidence intervals. Unlike most other frameworks, it also allows analysis on any preferred scale, i.e. both on the original linear scale as well as on a transformed scale. It can also be employed in regression approaches to model systematic (e.g. array or dye) effects. Conclusions The quasi-likelihood framework provides a simple and versatile approach to analyze gene expression data that does not make any strong distributional assumptions about the underlying error model. For several simulated as well as real data sets it provides a better fit to the data than competing models. 
In an example it also improved the power of tests to identify differential expression. PMID:12659637
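    The quasi-likelihood needs only a variance function, not a full distribution. A sketch of the quasi-deviance for a quadratic variance structure of the kind mentioned above, computed by numerical integration (the specific variance function V(t) = t + phi*t^2 and dispersion value are assumptions for illustration):

```python
def quasi_deviance(y, mu, phi=0.05, steps=1000):
    """Quasi-deviance 2 * integral_mu^y (y - t) / V(t) dt for the
    quadratic variance function V(t) = t + phi * t**2, evaluated with
    the trapezoidal rule. It is zero at y == mu and grows with misfit."""
    h = (y - mu) / steps
    if h == 0.0:
        return 0.0
    def integrand(t):
        return (y - t) / (t + phi * t * t)
    s = 0.5 * (integrand(mu) + integrand(y))
    s += sum(integrand(mu + i * h) for i in range(1, steps))
    return 2.0 * s * h

d_zero = quasi_deviance(y=100.0, mu=100.0)  # perfect fit -> 0
d_off  = quasi_deviance(y=100.0, mu=80.0)   # misfit -> positive penalty
```

    Minimizing this quantity over the model parameters plays the role of maximizing a log-likelihood, which is why calibration and variance parameters can be estimated without committing to a full error distribution.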

  5. Size distribution of submarine landslides along the U.S. Atlantic margin

    USGS Publications Warehouse

    Chaytor, J.D.; ten Brink, Uri S.; Solow, A.R.; Andrews, B.D.

    2009-01-01

    Assessment of the probability for destructive landslide-generated tsunamis depends on the knowledge of the number, size, and frequency of large submarine landslides. This paper investigates the size distribution of submarine landslides along the U.S. Atlantic continental slope and rise using the size of the landslide source regions (landslide failure scars). Landslide scars along the margin identified in a detailed bathymetric Digital Elevation Model (DEM) have areas that range between 0.89 km2 and 2410 km2 and volumes between 0.002 km3 and 179 km3. The area to volume relationship of these failure scars is almost linear (inverse power-law exponent close to 1), suggesting a fairly uniform failure thickness of a few 10s of meters in each event, with only rare, deep excavating landslides. The cumulative volume distribution of the failure scars is very well described by a log-normal distribution rather than by an inverse power-law, the most commonly used distribution for both subaerial and submarine landslides. A log-normal distribution centered on a volume of 0.86 km3 may indicate that landslides preferentially mobilize a moderate amount of material (on the order of 1 km3), rather than large landslides or very small ones. Alternatively, the log-normal distribution may reflect an inverse power law distribution modified by a size-dependent probability of observing landslide scars in the bathymetry data. If the latter is the case, an inverse power-law distribution with an exponent of 1.3 ± 0.3, modified by a size-dependent conditional probability of identifying more failure scars with increasing landslide size, fits the observed size distribution. This exponent value is similar to the predicted exponent of 1.2 ± 0.3 for subaerial landslides in unconsolidated material. 
Both the log-normal and modified inverse power-law distributions of the observed failure scar volumes suggest that large landslides, which have the greatest potential to generate damaging tsunamis, occur infrequently along the margin. © 2008 Elsevier B.V.
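    Fitting a log-normal to such volume data reduces to taking the mean and standard deviation of the log-volumes. A sketch with hypothetical scar volumes (the paper's actual catalog is not reproduced here):

```python
import math

def lognormal_fit(volumes):
    """Fit a log-normal by the mean and standard deviation of ln(volume);
    the median of the fitted distribution is exp(mu_log)."""
    logs = [math.log(v) for v in volumes]
    n = len(logs)
    mu = sum(logs) / n
    sigma = math.sqrt(sum((l - mu) ** 2 for l in logs) / (n - 1))
    return mu, sigma, math.exp(mu)

# Hypothetical landslide scar volumes in km^3
vols = [0.02, 0.1, 0.4, 0.9, 1.1, 2.5, 8.0, 40.0]
mu, sigma, median = lognormal_fit(vols)
```

    Comparing such a fit against an inverse power law (a straight line on a log-log survival plot) is the kind of model comparison the abstract describes.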

  6. Reduction of Salmonella on chicken breast fillets stored under aerobic or modified atmosphere packaging by the application of lytic bacteriophage preparation SalmoFreshTM.

    PubMed

    Sukumaran, Anuraj T; Nannapaneni, Rama; Kiess, Aaron; Sharma, Chander Shekhar

    2016-03-01

    The present study evaluated the efficacy of recently approved Salmonella lytic bacteriophage preparation (SalmoFresh™) in reducing Salmonella on chicken breast fillets, as a surface and dip application. The effectiveness of phage in combination with modified atmosphere packaging (MAP) and the ability of phage preparation in reducing Salmonella on chicken breast fillets at room temperature was also evaluated. Chicken breast fillets inoculated with a cocktail of Salmonella Typhimurium, S. Heidelberg, and S. Enteritidis were treated with bacteriophage (10(9) PFU/mL) as either a dip or surface treatment. The dip-treated samples were stored at 4°C aerobically and the surface-treated samples were stored under aerobic and MAP conditions (95% CO2/5% O2) at 4°C for 7 d. Immersion of Salmonella-inoculated chicken breast fillets in bacteriophage solution reduced Salmonella (P < 0.05) by 0.7 and 0.9 log CFU/g on d 0 and d 1 of storage, respectively. Surface treatment with phage significantly (P < 0.05) reduced Salmonella by 0.8, 0.8, and 1 log CFU/g on d 0, 1, and 7 of storage, respectively, under aerobic conditions. Higher reductions in Salmonella counts were achieved on chicken breast fillets when the samples were surface treated with phage and stored under MAP conditions. The Salmonella counts were reduced by 1.2, 1.1, and 1.2 log CFU/g on d 0, 1, and 7 of storage, respectively. Bacteriophage surface application on chicken breast fillets stored at room temperature reduced the Salmonella counts by 0.8, 0.9, and 0.4 log CFU/g after 0, 4, and 8 h, respectively, compared to the untreated positive control. These findings indicate that lytic phage preparation was effective in reducing Salmonella on chicken breast fillets stored under aerobic and modified atmosphere conditions. © 2015 Poultry Science Association Inc.

  7. Comparing Thermal Process Validation Methods for Salmonella Inactivation on Almond Kernels.

    PubMed

    Jeong, Sanghyup; Marks, Bradley P; James, Michael K

    2017-01-01

    Ongoing regulatory changes are increasing the need for reliable process validation methods for pathogen reduction processes involving low-moisture products; however, the reliability of various validation methods has not been evaluated. Therefore, the objective was to quantify accuracy and repeatability of four validation methods (two biologically based and two based on time-temperature models) for thermal pasteurization of almonds. Almond kernels were inoculated with Salmonella Enteritidis phage type 30 or Enterococcus faecium (NRRL B-2354) at ~10⁸ CFU/g, equilibrated to 0.24, 0.45, 0.58, or 0.78 water activity (a_w), and then heated in a pilot-scale, moist-air impingement oven (dry bulb 121, 149, or 177°C; dew point <33.0, 69.4, 81.6, or 90.6°C; v_air = 2.7 m/s) to a target lethality of ~4 log. Almond surface temperatures were measured in two ways, and those temperatures were used to calculate Salmonella inactivation using a traditional (D, z) model and a modified model accounting for process humidity. Among the process validation methods, both methods based on time-temperature models had better repeatability, with replication errors approximately half those of the surrogate (E. faecium). Additionally, the modified model yielded the lowest root mean squared error in predicting Salmonella inactivation (1.1 to 1.5 log CFU/g); in contrast, E. faecium yielded a root mean squared error of 1.2 to 1.6 log CFU/g, and the traditional model yielded an unacceptably high error (3.4 to 4.4 log CFU/g). Importantly, the surrogate and modified model both yielded lethality predictions that were statistically equivalent (α = 0.05) to actual Salmonella lethality. The results demonstrate the importance of methodology, a_w, and process humidity when validating thermal pasteurization processes for low-moisture foods, which should help processors select and interpret validation methods to ensure product safety.
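
    The traditional time-temperature approach referenced above is the classical first-order (D, z) calculation. The following is a minimal sketch of that calculation with illustrative parameter values, not the study's almond data, and without the humidity modification the authors propose:

```python
import numpy as np

# Classical (D, z) first-order inactivation: log10 D(T) = log10 D_ref - (T - T_ref)/z.
# The predicted log reduction over a time-temperature history T(t) is the
# integral of dt / D(T(t)). All parameter values here are illustrative.

def d_value(temp_c, d_ref=1.0, t_ref=80.0, z=10.0):
    """D-value (min) at temp_c, given a reference D_ref (min) at t_ref and z (deg C)."""
    return d_ref * 10.0 ** (-(temp_c - t_ref) / z)

def log_reduction(times_min, temps_c, **params):
    """Trapezoidal integration of 1/D(T(t)) over the recorded profile."""
    rates = 1.0 / d_value(np.asarray(temps_c, dtype=float), **params)
    return np.trapz(rates, np.asarray(times_min, dtype=float))

# 2 min at a constant 90 deg C with D_ref = 1 min at 80 deg C and z = 10 deg C:
# D(90) = 0.1 min, so the predicted reduction is 2 / 0.1 = 20 log10 cycles.
print(log_reduction([0.0, 2.0], [90.0, 90.0]))  # → 20.0
```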

  8. The Maximum Likelihood Solution for Inclination-only Data

    NASA Astrophysics Data System (ADS)

    Arason, P.; Levi, S.

    2006-12-01

    The arithmetic means of inclination-only data are known to introduce a shallowing bias. Several methods have been proposed to estimate unbiased means of the inclination along with measures of the precision. Most of the inclination-only methods were designed to maximize the likelihood function of the marginal Fisher distribution. However, the exact analytical form of the maximum likelihood function is fairly complicated, and all these methods require various assumptions and approximations that are inappropriate for many data sets. For some steep and dispersed data sets, the estimates provided by these methods are significantly displaced from the peak of the likelihood function to systematically shallower inclinations. The problem in locating the maximum of the likelihood function is partly due to difficulties in accurately evaluating the function for all values of interest. This is because some elements of the log-likelihood function increase exponentially as precision parameters increase, leading to numerical instabilities. In this study we succeeded in analytically cancelling exponential elements from the likelihood function, and we are now able to calculate its value for any location in the parameter space and for any inclination-only data set, with full accuracy. Furthermore, we can now calculate the partial derivatives of the likelihood function with the desired accuracy. Locating the maximum likelihood without the assumptions required by previous methods is now straightforward. The information to separate the mean inclination from the precision parameter will be lost for very steep and dispersed data sets. It is worth noting that the likelihood function always has a maximum value. However, for some dispersed and steep data sets with few samples, the likelihood function takes its highest value on the boundary of the parameter space, i.e. at inclinations of +/- 90 degrees, but with relatively well defined dispersion. Our simulations indicate that this occurs quite frequently for certain data sets, and relatively small perturbations in the data will drive the maxima to the boundary. We interpret this to indicate that, for such data sets, the information needed to separate the mean inclination and the precision parameter is permanently lost. To assess the reliability and accuracy of our method we generated a large number of random Fisher-distributed data sets and used seven methods to estimate the mean inclination and precision parameter. These comparisons are described by Levi and Arason at the 2006 AGU Fall meeting. The results of the various methods are very favourable to our new robust maximum likelihood method, which, on average, is the most reliable, and the mean inclination estimates are the least biased toward shallow values. Further information on our inclination-only analysis can be obtained from: http://www.vedur.is/~arason/paleomag
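
    The numerical instability described here, exponentially growing terms inside a log-likelihood, is commonly tamed by staying in the log domain and factoring out the largest exponent before summing. The sketch below illustrates that generic device (the log-sum-exp trick), not the authors' specific marginal-Fisher likelihood:

```python
import numpy as np

# A likelihood containing terms like exp(kappa * x) overflows double precision
# once kappa * x exceeds ~709. Subtracting the largest exponent before summing
# keeps the evaluation finite and accurate for any precision parameter kappa.

def log_sum_exp(log_terms):
    """Numerically stable log(sum(exp(log_terms)))."""
    m = np.max(log_terms)
    return m + np.log(np.sum(np.exp(log_terms - m)))

kappa = 2000.0                                 # far beyond naive overflow
log_terms = kappa * np.array([0.90, 0.99, 1.00])
print(np.isfinite(log_sum_exp(log_terms)))     # → True
```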

  9. Micellar versus hydro-organic mobile phases for retention-hydrophobicity relationship studies with ionizable diuretics and an anionic surfactant.

    PubMed

    Ruiz-Angel, M J; Carda-Broch, S; García-Alvarez-Coque, M C; Berthod, A

    2004-03-19

    Logarithms of the retention factors (log k) of a group of 14 ionizable diuretics were correlated with the molecular (log P(o/w)) and apparent (log P(app)) octanol-water partition coefficients. The compounds were chromatographed using aqueous-organic (reversed-phase liquid chromatography, RPLC) and micellar-organic mobile phases (micellar liquid chromatography, MLC) with the anionic surfactant sodium dodecyl sulfate (SDS), in the pH range 3-7, and a conventional octadecylsilane column. Acetonitrile was used as the organic modifier in both modes. The quality of the correlations obtained for log P(app) at varying degrees of ionization confirms that this correction is required in the aqueous-organic mixtures. The improvement in correlation is smaller with SDS micellar media because the acid-base equilibria are shifted towards higher pH values for acidic compounds. In micellar chromatography, an electrostatic interaction with charged solutes is added to the hydrophobic forces; consequently, different correlations should be established for neutral and acidic compounds, and for basic compounds. Correlations between log k and the isocratic descriptors log k(w), log k(wm) (extrapolated retention to pure water in the aqueous-organic and micellar-organic systems, respectively), and psi0 (extrapolated mobile phase composition giving a retention factor k = 1, or twice the dead time), and between these descriptors and log P(app) were also satisfactory, although poorer than those between log k and log P(app) due to the extrapolation. The study shows that, in the particular case of the ionizable diuretics studied, classical RPLC gives better results than MLC with SDS in the retention-hydrophobicity correlations.

  10. Sign rank versus Vapnik-Chervonenkis dimension

    NASA Astrophysics Data System (ADS)

    Alon, N.; Moran, Sh; Yehudayoff, A.

    2017-12-01

    This work studies the maximum possible sign rank of sign (N × N)-matrices with a given Vapnik-Chervonenkis dimension d. For d = 1, this maximum is three. For d = 2, this maximum is Θ̃(N^{1/2}). For d > 2, similar but slightly less accurate statements hold. The lower bounds improve on previous ones by Ben-David et al., and the upper bounds are novel. The lower bounds are obtained by probabilistic constructions, using a theorem of Warren in real algebraic topology. The upper bounds are obtained using a result of Welzl about spanning trees with low stabbing number, and using the moment curve. The upper bound technique is also used to: (i) provide estimates on the number of classes of a given Vapnik-Chervonenkis dimension, and the number of maximum classes of a given Vapnik-Chervonenkis dimension, answering a question of Frankl from 1989, and (ii) design an efficient algorithm that provides an O(N/log(N)) multiplicative approximation for the sign rank. We also observe a general connection between sign rank and spectral gaps which is based on Forster's argument. Consider the adjacency (N × N)-matrix of a Δ-regular graph with a second eigenvalue of absolute value λ and Δ ≤ N/2. We show that the sign rank of the signed version of this matrix is at least Δ/λ. We use this connection to prove the existence of a maximum class C ⊆ {±1}^N with Vapnik-Chervonenkis dimension 2 and sign rank Θ̃(N^{1/2}). This answers a question of Ben-David et al. regarding the sign rank of large Vapnik-Chervonenkis classes. We also describe limitations of this approach, in the spirit of the Alon-Boppana theorem. We further describe connections to communication complexity, geometry, learning theory, and combinatorics. Bibliography: 69 titles.
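
    The spectral bound sign rank ≥ Δ/λ can be checked numerically on a small concrete graph. The sketch below (our illustration, not taken from the paper) uses the Petersen graph, which is 3-regular on 10 vertices with second-largest eigenvalue magnitude 2, so the bound evaluates to 1.5:

```python
import numpy as np

# Petersen graph: outer 5-cycle (vertices 0-4), inner pentagram (5-9), spokes.
A = np.zeros((10, 10), dtype=float)
for i in range(5):
    for j in ((i + 1) % 5, i + 5):          # outer-cycle edge and spoke
        A[i, j] = A[j, i] = 1.0
    k = (i + 2) % 5 + 5                      # pentagram edge among inner vertices
    A[i + 5, k] = A[k, i + 5] = 1.0

eig = np.sort(np.abs(np.linalg.eigvalsh(A)))[::-1]
delta, lam = 3.0, eig[1]                     # eig[0] is the degree Δ = 3
print(round(delta / lam, 6))                 # → 1.5  (lower bound on sign rank)
```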

  11. Bisphosphonate-ciprofloxacin bound to Skelite is a prototype for enhancing experimental local antibiotic delivery to injured bone.

    PubMed

    Buxton, T B; Walsh, D S; Harvey, S B; McPherson, J C; Hartmann, J F; Plowman, K M

    2004-09-01

    The risk of osteomyelitis after open bone fracture may be reduced by locally applied antibiotics. ENC-41-HP (E41) comprises ciprofloxacin linked to a 'bone-seeking' bisphosphonate; loaded onto carrier Skelite calcium phosphate granules (E41-Skelite), it has favourable in vitro characteristics for application to wounded bone. This study assessed E41-Skelite in a rat model of acute tibial osteomyelitis. Mechanically induced tibial troughs were contaminated with approximately log10 4 colony forming units (c.f.u.) of Staphylococcus aureus (Cowan 1 strain) 'resistant' to E41 (minimum inhibitory concentration 8-16 microg/ml), lavaged and packed with Skelite alone, or with E41-Skelite slurry. Animals were killed at 24 h (n = 62), 72 h (n = 46) or 14 days (n = 12), and each tibia was assessed for S. aureus load (c.f.u./g tibia) and histological appearance (14 days only). At 24 and 72 h, the tibias of rats treated with E41-Skelite (n = 54) had a significantly lower mean (s.e.m.) load of S. aureus than animals that received Skelite alone (n = 54): log10 3.6(0.2) versus 6.4(0.1) c.f.u./g respectively at 24 h (P < 0.001, Mann-Whitney rank sum test) and log10 4.4(0.2) versus 6.6(0.1) c.f.u./g at 72 h (P < 0.001). At 14 days, E41-Skelite-treated tibias had fewer bacteria, no signs of osteomyelitis and histological signs of healing. E41-Skelite, a prototype granulated topical antibiotic delivery system, reduced the development of infection in experimental bone wounds. Copyright 2004 British Journal of Surgery Society Ltd.

  12. 105-KE Isolation Barrier Leak Rate Acceptance Test Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCracken, K.J.

    1995-06-14

    This Acceptance Test Report (ATR) contains the completed and signed Acceptance Test Procedure (ATP) for the 105-KE Isolation Barrier Leak Rate Test. The Test Engineer's log, the completed sections of the ATP in the Appendix for Repeat Testing (Appendix K), the approved WHC J-7s (Appendix H), the data logger files (Appendices T and U), and the post test calibration checks (Appendix V) are included.

  13. Integrated analysis of well logs and seismic data to estimate gas hydrate concentrations at Keathley Canyon, Gulf of Mexico

    USGS Publications Warehouse

    Lee, M.W.; Collett, T.S.

    2008-01-01

    Accurately detecting and quantifying gas hydrate or free gas in sediments from seismic data require downhole well-log data to calibrate the physical properties of the gas hydrate-/free gas-bearing sediments. As part of the Gulf of Mexico Joint Industry Program, a series of wells were either cored or drilled in the Gulf of Mexico to characterize the physical properties of gas hydrate-bearing sediments, to calibrate geophysical estimates, and to evaluate source and transport mechanisms for gas within the gas hydrates. Downhole acoustic logs were used sparingly in this study because of degraded log quality due to adverse wellbore conditions. However, reliable logging while drilling (LWD) electrical resistivity and porosity logs were obtained. To tie the well-log information to the available 3-D seismic data in this area, a velocity log was calculated from the available resistivity log at the Keathley Canyon 151-2 well, because the acoustic log or vertical seismic data acquired at the nearby Keathley Canyon 151-3 well were either of poor quality or had limited depth coverage. Based on the gas hydrate saturations estimated from the LWD resistivity log, the modified Biot-Gassmann theory was used to generate a synthetic acoustic log, and a synthetic seismogram was generated that agrees fairly well with a seismic profile crossing the well site. Based on the well-log information, a faintly defined bottom-simulating reflection (BSR) in this area is interpreted as a reflection representing gas hydrate-bearing sediments with about 15% saturation overlying partially gas-saturated sediments with 3% saturation. Gas hydrate saturations over 30-40% are estimated from the resistivity log in two distinct intervals at 220-230 and 264-300 m below the sea floor, but gas hydrate was not physically recovered in cores. It is speculated that the poor recovery of cores and gas hydrate morphology are responsible for the lack of physical gas hydrate recovery.

  14. Left Ventricular Dilatation Assessed on the Lateral Chest Radiograph: The Classic Hoffman and Rigler Sign Falls Short in a Modern-Day Population.

    PubMed

    Spaziano, Marco; Marquis-Gravel, Guillaume; Ramsay, Isabelle; Romanelli, Giovanni; Marchand, Émilie; Tournoux, François

    2016-03-01

    The classic Hoffman and Rigler (H&R) sign, originally described in 1965, suggests that left ventricular (LV) dilatation is present if the left ventricle extends more than 18 mm posterior to the inferior vena cava at a level 2 cm above their crossing on a lateral chest radiograph. This sign is still widely used by radiologists but has not been well evaluated against modern methods of noninvasive assessment. This study investigated the sensitivity and specificity of the H&R sign in a modern population. A sample of 145 patients with LV dilatation based on current echocardiographic criteria was matched for age and sex with 145 patients without LV dilatation. Patients were required to have undergone a lateral chest radiograph in the 3 months before or after undergoing echocardiography; the H&R sign and the cardiothoracic index were assessed on the radiograph independently by 2 blinded physicians. Using the threshold value of 18 mm, sensitivity, specificity, and positive and negative likelihood ratios of the H&R sign were 54.9%, 59.2%, 1.34, and 0.76, respectively (area under the curve [AUC], 0.58). In comparison, the cardiothoracic index provided better prediction of LV dilatation (sensitivity, 87.9%; specificity, 47.5%; AUC, 0.72). The H&R sign is a poor marker of LV enlargement when compared with echocardiography and should not be used as a radiologic index of LV enlargement. Copyright © 2016 Canadian Cardiovascular Society. Published by Elsevier Inc. All rights reserved.

  15. Clinical signs of impending death in cancer patients.

    PubMed

    Hui, David; dos Santos, Renata; Chisholm, Gary; Bansal, Swati; Silva, Thiago Buosi; Kilgore, Kelly; Crovador, Camila Souza; Yu, Xiaoying; Swartz, Michael D; Perez-Cruz, Pedro Emilio; Leite, Raphael de Almeida; Nascimento, Maria Salete de Angelis; Reddy, Suresh; Seriaco, Fabiola; Yennu, Sriram; Paiva, Carlos Eduardo; Dev, Rony; Hall, Stacy; Fajardo, Julieta; Bruera, Eduardo

    2014-06-01

    The physical signs of impending death have not been well characterized in cancer patients. A better understanding of these signs may improve the ability of clinicians to diagnose impending death. We examined the frequency and onset of 10 bedside physical signs and their diagnostic performance for impending death. We systematically documented 10 physical signs every 12 hours from admission to death or discharge in 357 consecutive patients with advanced cancer admitted to two acute palliative care units. We examined the frequency and median onset of each sign from death backward and calculated their likelihood ratios (LRs) associated with death within 3 days. In total, 203 of 357 patients (52 of 151 in the U.S., 151 of 206 in Brazil) died. Decreased level of consciousness, Palliative Performance Scale ≤20%, and dysphagia of liquids appeared at high frequency and >3 days before death and had low specificity (<90%) and positive LR (<5) for impending death. In contrast, apnea periods, Cheyne-Stokes breathing, death rattle, peripheral cyanosis, pulselessness of radial artery, respiration with mandibular movement, and decreased urine output occurred mostly in the last 3 days of life and at lower frequency. Five of these signs had high specificity (>95%) and positive LRs for death within 3 days, including pulselessness of radial artery (positive LR: 15.6; 95% confidence interval [CI]: 13.7-17.4), respiration with mandibular movement (positive LR: 10; 95% CI: 9.1-10.9), decreased urine output (positive LR: 15.2; 95% CI: 13.4-17.1), Cheyne-Stokes breathing (positive LR: 12.4; 95% CI: 10.8-13.9), and death rattle (positive LR: 9; 95% CI: 8.1-9.8). We identified highly specific physical signs associated with death within 3 days among cancer patients. ©AlphaMed Press.
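
    The positive likelihood ratios quoted above derive from sensitivity and specificity in the standard way. A minimal sketch with invented numbers (not the study's raw counts):

```python
# LR+ = sensitivity / (1 - specificity); LR- = (1 - sensitivity) / specificity.
# Illustrative values only: a sign with 30% sensitivity and 98% specificity
# is a strong rule-in finding despite being present in few patients.

def likelihood_ratios(sensitivity, specificity):
    lr_pos = sensitivity / (1.0 - specificity)
    lr_neg = (1.0 - sensitivity) / specificity
    return lr_pos, lr_neg

lr_pos, lr_neg = likelihood_ratios(0.30, 0.98)
print(round(lr_pos, 1), round(lr_neg, 2))  # → 15.0 0.71
```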

  16. Use of Proteomic and Hematology Biomarkers for Prediction of Hematopoietic Acute Radiation Syndrome Severity in Baboon Radiation Models.

    PubMed

    Blakely, William F; Bolduc, David L; Debad, Jeff; Sigal, George; Port, Matthias; Abend, Michael; Valente, Marco; Drouet, Michel; Hérodin, Francis

    2018-07-01

    Use of plasma proteomic and hematological biomarkers represents a promising approach to provide useful diagnostic information for assessment of the severity of hematopoietic acute radiation syndrome. Eighteen baboons were evaluated in a radiation model in which they underwent total-body and partial-body irradiation with ⁶⁰Co gamma rays at doses from 2.5 to 15 Gy and dose rates of 6.25 and 32 cGy/min. Hematopoietic acute radiation syndrome severity levels determined by an analysis of blood count changes measured up to 60 d after irradiation were used to gauge overall hematopoietic acute radiation syndrome severity classifications. A panel of protein biomarkers was measured on plasma samples collected at 0 to 28 d after exposure using electrochemiluminescence-detection technology. The database was split into two distinct groups (i.e., "calibration," n = 11; "validation," n = 7). The calibration database was used in an initial stepwise regression multivariate model-fitting approach followed by down selection of biomarkers for identification of subpanels of hematopoietic acute radiation syndrome-responsive biomarkers for three time windows (i.e., 0-2 d, 2-7 d, 7-28 d). Model 1 (0-2 d) includes log C-reactive protein (p < 0.0001), log interleukin-13 (p < 0.0054), and procalcitonin (p < 0.0316) biomarkers; model 2 (2-7 d) includes log CD27 (p < 0.0001), log FMS-related tyrosine kinase 3 ligand (p < 0.0001), log serum amyloid A (p < 0.0007), and log interleukin-6 (p < 0.0002); and model 3 (7-28 d) includes log CD27 (p < 0.0012), log serum amyloid A (p < 0.0002), log erythropoietin (p < 0.0001), and log CD177 (p < 0.0001). The predicted risk of radiation injury categorization values, representing the hematopoietic acute radiation syndrome severity outcome for the three models, produced least squares multiple regression fit confidences of R = 0.73, 0.82, and 0.75, respectively.
The resultant algorithms support the proof of concept that plasma proteomic biomarkers can supplement clinical signs and symptoms to assess hematopoietic acute radiation syndrome risk severity.
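
    The general shape of these models, a severity outcome regressed on log-transformed biomarker levels, can be sketched on synthetic data. All data, biomarker names, and coefficients below are illustrative stand-ins, not the study's fitted panels:

```python
import numpy as np

# Fit a severity outcome on log-transformed biomarker levels by least squares.
rng = np.random.default_rng(1)
n = 40
crp = rng.lognormal(mean=2.0, sigma=1.0, size=n)   # stand-in "C-reactive protein"
il6 = rng.lognormal(mean=0.5, sigma=1.0, size=n)   # stand-in "interleukin-6"
severity = 0.8 * np.log(crp) + 0.3 * np.log(il6) + rng.normal(0.0, 0.2, n)

X = np.column_stack([np.ones(n), np.log(crp), np.log(il6)])
coef, *_ = np.linalg.lstsq(X, severity, rcond=None)
r = np.corrcoef(X @ coef, severity)[0, 1]          # multiple-correlation R
print(round(r, 2))  # high R: the log-linear relationship is recovered
```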

  17. Concealment of sexual orientation.

    PubMed

    Sylva, David; Rieger, Gerulf; Linsenmeier, Joan A W; Bailey, J Michael

    2010-02-01

    Sex-atypical behaviors may be used to identify a person as homosexual. To shield themselves from prejudice, homosexual people may attempt to conceal these behaviors. It is not clear how effectively they can do so. In Study 1, we asked homosexual participants to conceal their sex-atypical behaviors while talking about the weather. Raters watched videos of the participants and judged the likelihood that each participant was homosexual. Homosexual participants were able to partially conceal signs of their orientation, but they remained distinguishable from heterosexual participants. In Study 2, we tested the ability to conceal signs of one's sexual orientation in a more demanding situation: a mock job interview. In this scenario, homosexual men were even less effective at concealing their orientation. Higher cognitive demands in this new situation may have interfered with their ability to conceal.

  18. Risk Associated with the Release of Wolbachia-Infected Aedes aegypti Mosquitoes into the Environment in an Effort to Control Dengue.

    PubMed

    Murray, Justine V; Jansen, Cassie C; De Barro, Paul

    2016-01-01

    In an effort to eliminate dengue, a successful technology was developed with the stable introduction of the obligate intracellular bacterium Wolbachia pipientis into the mosquito Aedes aegypti to reduce its ability to transmit dengue fever through life-shortening and viral-replication-inhibiting effects. An analysis of risk was required before considering release of the modified mosquito into the environment. Expert knowledge and a risk assessment framework were used to identify risk associated with the release of the modified mosquito. Individual and group expert elicitation was performed to identify potential hazards. A Bayesian network (BN) was developed to capture the relationship between hazards and the likelihood of events occurring. Risk was calculated from the expert likelihood estimates populating the BN and the consequence estimates elicited from experts. The risk model for "Don't Achieve Release" provided an estimated 46% likelihood that the release would not occur by a nominated time but generated an overall risk rating of very low. The ability to obtain compliance had the greatest influence on the likelihood of release occurring. The risk model for "Cause More Harm" provided a 12.5% likelihood that more harm would result from the release, but the overall risk was considered negligible. The efficacy of mosquito management had the most influence; the perception that the threat of dengue fever had been eliminated, resulting in less household mosquito control, was scored as the highest-ranked individual hazard (albeit low risk). The risk analysis was designed to incorporate the interacting complexity of hazards that may affect the release of the technology into the environment. The risk analysis was a small, but important, implementation phase in the success of this innovative research introducing a new technology to combat dengue transmission in the environment.

  19. Absence of Sublexical Representations in Late-Learning Signers? A Statistical Critique of Lieberman et al. (2015)

    ERIC Educational Resources Information Center

    Salverda, Anne Pier

    2016-01-01

    Lieberman, Borovsky, Hatrak, and Mayberry (2015) used a modified version of the visual-world paradigm to examine the real-time processing of signs in American Sign Language. They examined the activation of phonological and semantic competitors in native signers and late-learning signers and concluded that their results provide evidence that the…

  20. Effects of modified constraint-induced movement therapy on reach-to-grasp movements and functional performance after chronic stroke: a randomized controlled study.

    PubMed

    Lin, K-C; Wu, C-Y; Wei, T-H; Lee, C-Y; Liu, J-S

    2007-12-01

    To evaluate changes in (1) motor control characteristics of the hemiparetic hand during the performance of a functional reach-to-grasp task and (2) functional performance of daily activities in patients with stroke treated with modified constraint-induced movement therapy. Two-group randomized controlled trial with pretreatment and posttreatment measures. Rehabilitation clinics. Thirty-two chronic stroke patients (21 men, 11 women; mean age=57.9 years, range=43-81 years) 13-26 months (mean 16.3 months) after onset of a first-ever cerebrovascular accident. Thirty-two patients were randomized to receive modified constraint-induced movement therapy (restraint of the unaffected limb combined with intensive training of the affected limb) or traditional rehabilitation for three weeks. Kinematic analysis was used to assess motor control characteristics as patients reached to grasp a beverage can. Functional outcomes were evaluated using the Motor Activity Log and Functional Independence Measure. There were moderate and significant effects of modified constraint-induced movement therapy on some aspects of motor control of reach-to-grasp and on functional ability. The modified constraint-induced movement therapy group preplanned reaching and grasping (P=0.018) more efficiently and depended more on the feedforward control of reaching (P=0.046) than did the traditional rehabilitation group. The modified constraint-induced movement therapy group also showed significantly improved functional performance on the Motor Activity Log (P<0.0001) and the Functional Independence Measure (P=0.016). In addition to improving functional use of the affected arm and daily functioning, modified constraint-induced movement therapy improved motor control strategy during goal-directed reaching, a possible mechanism for the improved movement performance of stroke patients undergoing this therapy.

  1. Advanced prior modeling for 3D bright field electron tomography

    NASA Astrophysics Data System (ADS)

    Sreehari, Suhas; Venkatakrishnan, S. V.; Drummy, Lawrence F.; Simmons, Jeffrey P.; Bouman, Charles A.

    2015-03-01

    Many important imaging problems in material science involve reconstruction of images containing repetitive non-local structures. Model-based iterative reconstruction (MBIR) could in principle exploit such redundancies through the selection of a log prior probability term. However, in practice, determining such a log prior term that accounts for the similarity between distant structures in the image is quite challenging. Much progress has been made in the development of denoising algorithms like non-local means and BM3D, and these are known to successfully capture non-local redundancies in images. But the fact that these denoising operations are not explicitly formulated as cost functions makes it unclear as to how to incorporate them in the MBIR framework. In this paper, we formulate a solution to bright field electron tomography by augmenting the existing bright field MBIR method to incorporate any non-local denoising operator as a prior model. We accomplish this using a framework we call plug-and-play priors that decouples the log likelihood and the log prior probability terms in the MBIR cost function. We specifically use 3D non-local means (NLM) as the prior model in the plug-and-play framework, and showcase high quality tomographic reconstructions of a simulated aluminum spheres dataset, and two real datasets of aluminum spheres and ferritin structures. We observe that streak and smear artifacts are visibly suppressed, and that edges are preserved. Also, we report lower RMSE values compared to the conventional MBIR reconstruction using qGGMRF as the prior model.
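
    The plug-and-play decoupling can be illustrated with a toy one-dimensional ADMM loop in which an off-the-shelf Gaussian filter stands in for the denoising prior. The paper uses 3D non-local means inside a full tomographic forward model; everything below, including the identity forward model and parameter values, is an assumption-laden sketch of the splitting idea only:

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def pnp_reconstruct(y, sigma=2.0, beta=0.5, iters=30):
    """Toy plug-and-play ADMM with an identity forward model.
    x-step: proximal update for the data term ||x - y||^2;
    v-step: a denoiser replaces the (implicit) log-prior proximal operator."""
    x, v, u = y.copy(), y.copy(), np.zeros_like(y)
    for _ in range(iters):
        x = (y + beta * (v - u)) / (1.0 + beta)   # data-fidelity proximal step
        v = gaussian_filter1d(x + u, sigma)        # plug-in denoiser as the prior
        u += x - v                                 # dual (consensus) update
    return x

rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0 * np.pi, 200)
clean = np.sin(t)
noisy = clean + 0.3 * rng.standard_normal(t.size)
recon = pnp_reconstruct(noisy)
print(np.mean((recon - clean) ** 2) < np.mean((noisy - clean) ** 2))  # → True
```

Swapping `gaussian_filter1d` for any other denoiser changes only the v-step, which is the point of the plug-and-play framework.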

  2. A Comparison of Diarrheal Severity Scores in the MAL-ED Multisite Community-Based Cohort Study

    PubMed Central

    Lee, Gwenyth O.; Richard, Stephanie A.; Kang, Gagandeep; Houpt, Eric R.; Seidman, Jessica C.; Pendergast, Laura L.; Bhutta, Zulfiqar A.; Ahmed, Tahmeed; Mduma, Estomih R.; Lima, Aldo A.; Bessong, Pascal; Jennifer, Mats Steffi; Hossain, Md. Iqbal; Chandyo, Ram Krishna; Nyathi, Emanuel; Lima, Ila F.; Pascal, John; Soofi, Sajid; Ladaporn, Bodhidatta; Guerrant, Richard L.; Caulfield, Laura E.; Black, Robert E.; Kosek, Margaret N.

    2016-01-01

    ABSTRACT Objectives: There is a lack of consensus on how to measure diarrheal severity. Within the context of a multisite, prospective cohort study, we evaluated the performance of a modified Vesikari score (MAL-ED), 2 previously published scores (Clark and CODA [a diarrheal severity score (Community DiarrheA) published by Lee et al]), and a modified definition of moderate-to-severe diarrhea (MSD) based on dysentery and health care worker diagnosed dehydration. Methods: Scores were built using maternally reported symptoms or fieldworker-reported clinical signs obtained during the first 7 days of a diarrheal episode. The association between these and the risk of hospitalization were tested using receiver operating characteristic analysis. Severity scores were also related to illness etiology, and the likelihood of the episode subsequently becoming prolonged or persistent. Results: Of 10,159 episodes from 1681 children, 143 (4.0%) resulted in hospitalization. The area under the curve of each score as a predictor of hospitalization was 0.84 (95% confidence interval: 0.81, 0.87) (Clark), 0.85 (0.82, 0.88) (MAL-ED), and 0.87 (0.84, 0.89) (CODA). Severity was also associated with etiology and episode duration. Although families were more likely to seek care for severe diarrhea, approximately half of severe cases never reached the health system. Conclusions: Community-based diarrheal severity scores are predictive of relevant child health outcomes. Because they require no assumptions about health care access or utilization, they are useful in refining estimates of the burden of diarrheal disease, in estimating the effect of disease control interventions, and in triaging children for referral in low- and middle-income countries in which the rates of morbidity and mortality after diarrhea remain high. PMID:27347723
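
    The area under the ROC curve reported above is equivalent to the probability that a randomly chosen hospitalized episode receives a higher severity score than a randomly chosen non-hospitalized one (the rescaled Mann-Whitney statistic). A toy sketch with invented scores:

```python
# AUC via the rank (Mann-Whitney) formulation: the fraction of positive/negative
# pairs ranked correctly, with ties counted as half. Scores are illustrative.

def auc(scores_pos, scores_neg):
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in scores_pos
        for n in scores_neg
    )
    return wins / (len(scores_pos) * len(scores_neg))

# Severity scores for hospitalized vs. non-hospitalized episodes (made up):
print(round(auc([9, 7, 8], [3, 7, 2]), 3))  # → 0.944
```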

  3. Prospective Validation of Modified NEXUS Cervical Spine Injury Criteria in Low-risk Elderly Fall Patients

    PubMed Central

    Tran, John; Jeanmonod, Donald; Agresti, Darin; Hamden, Khalief; Jeanmonod, Rebecca K.

    2016-01-01

    Introduction The National Emergency X-radiography Utilization Study (NEXUS) criteria are used extensively in emergency departments to rule out C-spine injuries (CSI) in the general population. Although the NEXUS validation set included 2,943 elderly patients, multiple case reports and the Canadian C-Spine Rules question the validity of applying NEXUS to geriatric populations. The objective of this study was to validate a modified NEXUS criteria in a low-risk elderly fall population with two changes: a modified definition for distracting injury and the definition of normal mentation. Methods This is a prospective, observational cohort study of geriatric fall patients who presented to a Level I trauma center and were not triaged to the trauma bay. Providers enrolled non-intoxicated patients at baseline mental status with no lateralizing neurologic deficits. They recorded midline neck tenderness, signs of trauma, and presence of other distracting injury. Results We enrolled 800 patients. One patient fall event was excluded due to duplicate enrollment, and four were lost to follow-up, leaving 795 for analysis. Average age was 83.6 (range 65–101). The numbers in parentheses after the negative predictive value represent the confidence interval. There were 11 (1.4%) cervical spine injuries. One hundred seventeen patients had midline tenderness and seven of these had CSI; 366 patients had signs of trauma to the face/neck, and 10 of these patients had CSI. Using signs of trauma to the head/neck as the only distracting injury and baseline mental status as normal alertness, the modified NEXUS criteria were 100% sensitive (CI [67.9–100]) with a negative predictive value of 100 (98.7–100). Conclusion Our study suggests that the modified NEXUS criteria can be safely applied to low-risk elderly falls. PMID:27330655

  4. Prospective Validation of Modified NEXUS Cervical Spine Injury Criteria in Low-risk Elderly Fall Patients.

    PubMed

    Tran, John; Jeanmonod, Donald; Agresti, Darin; Hamden, Khalief; Jeanmonod, Rebecca K

    2016-05-01

    The National Emergency X-radiography Utilization Study (NEXUS) criteria are used extensively in emergency departments to rule out C-spine injuries (CSI) in the general population. Although the NEXUS validation set included 2,943 elderly patients, multiple case reports and the Canadian C-Spine Rules question the validity of applying NEXUS to geriatric populations. The objective of this study was to validate a modified NEXUS criteria in a low-risk elderly fall population with two changes: a modified definition for distracting injury and the definition of normal mentation. This is a prospective, observational cohort study of geriatric fall patients who presented to a Level I trauma center and were not triaged to the trauma bay. Providers enrolled non-intoxicated patients at baseline mental status with no lateralizing neurologic deficits. They recorded midline neck tenderness, signs of trauma, and presence of other distracting injury. We enrolled 800 patients. One patient fall event was excluded due to duplicate enrollment, and four were lost to follow up, leaving 795 for analysis. Average age was 83.6 (range 65-101). The numbers in parentheses after the negative predictive value represent the confidence interval. There were 11 (1.4%) cervical spine injuries. One hundred seventeen patients had midline tenderness and seven of these had CSI; 366 patients had signs of trauma to the face/neck, and 10 of these patients had CSI. Using signs of trauma to the head/neck as the only distracting injury and baseline mental status as normal alertness, the modified NEXUS criteria was 100% sensitive (CI [67.9-100]) with a negative predictive value of 100 (98.7-100). Our study suggests that a modified NEXUS criteria can be safely applied to low-risk elderly falls.

  5. Phenomenology of Schizophrenia and the Representativeness of Modern Diagnostic Criteria.

    PubMed

    Kendler, Kenneth S

    2016-10-01

    This article aims to determine the degree to which modern operationalized diagnostic criteria for schizophrenia reflect the main clinical features of the disorder as described historically by diagnostic experts. Amazon.com, the National Library of Medicine, and Forgottenbooks.com were searched for articles written or translated into English from 1900 to 1960. Clinical descriptions of schizophrenia or dementia praecox appearing in 16 textbooks or review articles published between 1899 and 1956 were reviewed and compared with the criteria for schizophrenia from 6 modern US operationalized diagnostic systems. Twenty prominent symptoms and signs were reported by 5 or more authors. A strong association was seen between the frequency with which the symptoms/signs were reported and the likelihood of their presence in modern diagnostic systems. Of these 20 symptoms/signs, 3 (thought disorder, delusions, and hallucinations) were included in all diagnostic systems and were among the 4 most frequently reported. Three symptoms/signs were added then kept in subsequent criteria: emotional blunting, changes in volition, and changes in social life. Three symptoms/signs were added but then dropped: bizarre delusions, passivity symptoms, and mood incongruity. Eleven symptoms/signs were never included in any diagnostic system. Compared with historical authors, modern criteria favored symptoms over signs. Odd movements and postures, noted by 16 of 18 historical authors, were absent from all modern criteria. DSM-5 criteria contain 6 of the 20 historically noted symptoms/signs. Although modern operationalized criteria for schizophrenia reflect symptoms and signs commonly reported by historical experts, many clinical features emphasized by these experts are absent from modern criteria. This is not necessarily problematic as diagnostic criteria are meant to index rather than thoroughly describe syndromes. 
However, the lack of correspondence in schizophrenia between historically important symptoms/signs and current diagnostic systems highlights the limitations of clinical evaluations and research studies that restrict the diagnostic assessments to current diagnostic criteria. We should not confuse our DSM diagnostic criteria with the disorders that they were designed to index.

  6. Multilevel correlates of household anthropometric typologies in Colombian mothers and their infants.

    PubMed

    Parra, D C; Gomez, L F; Iannotti, L; Haire-Joshu, D; Sebert Kuhlmann, A K; Brownson, R C

    2018-01-01

    The aim of this study was to establish the association of maternal, family, and contextual correlates of anthropometric typologies at the household level in Colombia using 2005 Demographic Health Survey (DHS/ENDS) data. Household-level information from mothers 18-49 years old and their children <5 years old was included. Stunting and overweight were assessed for each child. Mothers were classified according to their body mass index. Four anthropometric typologies at the household level were constructed: normal, underweight, overweight, and dual burden. Four three-level [households ( n  = 8598) nested within municipalities ( n  = 226), nested within states ( n  = 32)] hierarchical polytomous logistic models were developed. Household log-odds of belonging to one of the four anthropometric categories, holding 'normal' as the reference group, were obtained. This study found that anthropometric typologies were associated with maternal and family characteristics of maternal age, parity, maternal education, and wealth index. Higher municipal living conditions index was associated with a lower likelihood of underweight typology and a higher likelihood of overweight typology. Higher population density was associated with a lower likelihood of overweight typology. Distal and proximal determinants of the various anthropometric typologies at the household level should be taken into account when framing policies and designing interventions to reduce malnutrition in Colombia.

  7. An algorithm for computing moments-based flood quantile estimates when historical flood information is available

    USGS Publications Warehouse

    Cohn, T.A.; Lane, W.L.; Baier, W.G.

    1997-01-01

    This paper presents the expected moments algorithm (EMA), a simple and efficient method for incorporating historical and paleoflood information into flood frequency studies. EMA can utilize three types of at-site flood information: systematic stream gage record; information about the magnitude of historical floods; and knowledge of the number of years in the historical period when no large flood occurred. EMA employs an iterative procedure to compute method-of-moments parameter estimates. Initial parameter estimates are calculated from systematic stream gage data. These moments are then updated by including the measured historical peaks and the expected moments, given the previously estimated parameters, of the below-threshold floods from the historical period. The updated moments result in new parameter estimates, and the last two steps are repeated until the algorithm converges. Monte Carlo simulations compare EMA, Bulletin 17B's [United States Water Resources Council, 1982] historically weighted moments adjustment, and maximum likelihood estimators when fitting the three parameters of the log-Pearson type III distribution. These simulations demonstrate that EMA is more efficient than the Bulletin 17B method, and that it is nearly as efficient as maximum likelihood estimation (MLE). The experiments also suggest that EMA has two advantages over MLE when dealing with the log-Pearson type III distribution: It appears that EMA estimates always exist and that they are unique, although neither result has been proven. EMA can be used with binomial or interval-censored data and with any distributional family amenable to method-of-moments estimation.
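The iterative structure of EMA can be sketched on a toy problem. The sketch below substitutes a normal distribution for the log-Pearson type III (so the truncated moments have simple closed forms) and invents the data; it is meant only to show the update cycle of systematic data, measured historical peaks, and expected below-threshold moments, not the authors' implementation.

```python
import math
import random

def trunc_norm_moments(mu, sigma, T):
    """E[X] and E[X^2] for X ~ Normal(mu, sigma) truncated to X < T."""
    a = (T - mu) / sigma
    phi = math.exp(-a * a / 2) / math.sqrt(2 * math.pi)
    Phi = 0.5 * (1 + math.erf(a / math.sqrt(2)))
    lam = phi / Phi
    m1 = mu - sigma * lam
    m2 = mu * mu - 2 * mu * sigma * lam + sigma * sigma * (1 - a * lam)
    return m1, m2

def ema_normal(systematic, hist_peaks, hist_years, T, tol=1e-9):
    """Toy EMA: method-of-moments updates that mix observed data with the
    expected moments of the unobserved below-threshold historical years."""
    k = len(hist_peaks)
    n_below = hist_years - k
    mu = sum(systematic) / len(systematic)
    sigma = math.sqrt(sum((x - mu) ** 2 for x in systematic) / len(systematic))
    for _ in range(500):
        m1, m2 = trunc_norm_moments(mu, sigma, T)
        n_tot = len(systematic) + k + n_below
        s1 = sum(systematic) + sum(hist_peaks) + n_below * m1
        s2 = (sum(x * x for x in systematic)
              + sum(x * x for x in hist_peaks) + n_below * m2)
        mu_new, sigma_new = s1 / n_tot, math.sqrt(s2 / n_tot - (s1 / n_tot) ** 2)
        done = abs(mu_new - mu) < tol and abs(sigma_new - sigma) < tol
        mu, sigma = mu_new, sigma_new
        if done:
            break
    return mu, sigma

# invented data: 50 systematic years plus a 100-year historical period in
# which only peaks above the perception threshold T were recorded
random.seed(7)
T = 12.0
systematic = [random.gauss(10, 2) for _ in range(50)]
hist_peaks = [x for x in (random.gauss(10, 2) for _ in range(100)) if x > T]
mu_hat, sigma_hat = ema_normal(systematic, hist_peaks, 100, T)
```

The historical period enters only through its recorded peaks and through the expected moments of the censored below-threshold years, exactly the three information types the abstract lists.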

  8. Hospital pharmacists’ knowledge about and attitude toward HIV/AIDS and patients living with HIV/AIDS in Kedah, Malaysia

    PubMed Central

    Baig, Mirza Rafi

    2012-01-01

    Introduction The current study aims to explore the knowledge, attitude, and perception of hospital pharmacists towards HIV/AIDS and patients living with HIV/AIDS (PLWHA) in the state of Kedah, Malaysia. Material and methods This was a cross-sectional study conducted among the hospital pharmacists in three government hospitals in Kedah, using a self-administered 43-item questionnaire. Data analysis was done using non-parametric tests and multinomial regression. Results A total of 75 respondents participated in this study, resulting in a response rate of 60.8%. The majority were found to be well aware of the causes of HIV/AIDS. However, about 34 (45.3%) believed erroneously that HIV/AIDS cannot be transmitted through tattooing or body piercing. Nearly 25 (33.3%) of the respondents believed that preventing the use of intravenous drugs may not be effective in preventing HIV/AIDS and endorsed social isolation as a measure to prevent HIV/AIDS. The majority (66.6%) had negative attitudes and about 20% held extremely negative attitudes. Findings from regression modelling revealed that hospital (–2 log likelihood = 215.182, χ2 = 18.060, Df = 8, p = 0.021) and gender (–2 log likelihood = 213.643, χ2 = 16.521, Df = 8, p = 0.035) were more likely to affect the attitudes of respondents. Conclusions Overall, more than one third of the respondents were found to have negative attitudes towards PLWHA. Gender, job experience, and hospitals with more HIV/AIDS patient visits were the main factors affecting attitudes. PMID:24482660
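The p-values quoted for the regression models follow from comparing each χ2 statistic (a difference of –2 log likelihood values between nested models) to a chi-square distribution with Df = 8. For even degrees of freedom the chi-square tail probability has a closed form, so the reported figures can be checked directly; this sketch reproduces the reported p-values to within rounding.

```python
import math

def chi2_sf_even_df(x, df):
    """P(X > x) for a chi-square variable with even df:
    exp(-x/2) * sum_{i=0}^{df/2-1} (x/2)**i / i!."""
    assert df % 2 == 0 and df > 0
    h = x / 2.0
    term, total = 1.0, 0.0
    for i in range(df // 2):
        if i > 0:
            term *= h / i
        total += term
    return math.exp(-h) * total

# likelihood ratio statistics reported above, each with Df = 8
p_hospital = chi2_sf_even_df(18.060, 8)   # ~0.021
p_gender = chi2_sf_even_df(16.521, 8)     # ~0.036 (abstract: 0.035)
print(p_hospital, p_gender)
```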

  9. An algorithm for computing moments-based flood quantile estimates when historical flood information is available

    NASA Astrophysics Data System (ADS)

    Cohn, T. A.; Lane, W. L.; Baier, W. G.

    This paper presents the expected moments algorithm (EMA), a simple and efficient method for incorporating historical and paleoflood information into flood frequency studies. EMA can utilize three types of at-site flood information: systematic stream gage record; information about the magnitude of historical floods; and knowledge of the number of years in the historical period when no large flood occurred. EMA employs an iterative procedure to compute method-of-moments parameter estimates. Initial parameter estimates are calculated from systematic stream gage data. These moments are then updated by including the measured historical peaks and the expected moments, given the previously estimated parameters, of the below-threshold floods from the historical period. The updated moments result in new parameter estimates, and the last two steps are repeated until the algorithm converges. Monte Carlo simulations compare EMA, Bulletin 17B's [United States Water Resources Council, 1982] historically weighted moments adjustment, and maximum likelihood estimators when fitting the three parameters of the log-Pearson type III distribution. These simulations demonstrate that EMA is more efficient than the Bulletin 17B method, and that it is nearly as efficient as maximum likelihood estimation (MLE). The experiments also suggest that EMA has two advantages over MLE when dealing with the log-Pearson type III distribution: It appears that EMA estimates always exist and that they are unique, although neither result has been proven. EMA can be used with binomial or interval-censored data and with any distributional family amenable to method-of-moments estimation.

  10. Wetlands Research Program. Corps of Engineers Wetlands Delineation Manual. Appendix C. Sections 1 and 2. Region 4 - North Plains.

    DTIC Science & Technology

    1987-01-01

    wetlands, are as follows: Category, Symbol, Definition. OBLIGATE WETLAND PLANTS, OBL: Plants that occur almost always (estimated probability >99%) in... (estimated probability 1% to 33%) in nonwetlands. FACULTATIVE PLANTS, FAC: Plants with a similar likelihood (estimated probability 33% to 67%) of... Symbols appearing in the list under the indicator status column are as follows: +: A "+" sign following an indicator status denotes that the

  11. Quincke, de Musset, Duroziez, and Hill: some aortic regurgitations.

    PubMed

    Sapira, J D

    1981-04-01

    Four peripheral signs of aortic insufficiency are considered in terms of their original descriptions, their popularity, and their potential future clinical contribution. It is concluded that: (1) Quincke's capillary pulse sign is not useful. (2) de Musset's head bobbing sign is of undetermined but apparently low sensitivity and specificity. (3) Duroziez's femoral double intermittent murmur sign, as modified by Blumgart and Ernstene, is almost 100% specific for the diagnosis of aortic insufficiency. Since its sensitivity, when properly performed, is about 90%, especially in pure aortic insufficiency, it is highly recommended. (4) Hill's sign (a popliteal indirect systolic blood pressure which is 20 mm Hg greater than a simultaneously measured brachial indirect systolic blood pressure) though almost unknown, is useful in diagnosing all but the mild cases of aortic insufficiency, and is the only sign that may predict the degree of aortic insufficiency subsequently found angiographically.

  12. Objectively combining AR5 instrumental period and paleoclimate climate sensitivity evidence

    NASA Astrophysics Data System (ADS)

    Lewis, Nicholas; Grünwald, Peter

    2018-03-01

    Combining instrumental period evidence regarding equilibrium climate sensitivity with largely independent paleoclimate proxy evidence should enable a more constrained sensitivity estimate to be obtained. Previous, subjective Bayesian approaches involved selection of a prior probability distribution reflecting the investigators' beliefs about climate sensitivity. Here a recently developed approach employing two different statistical methods—objective Bayesian and frequentist likelihood-ratio—is used to combine instrumental period and paleoclimate evidence based on data presented and assessments made in the IPCC Fifth Assessment Report. Probabilistic estimates from each source of evidence are represented by posterior probability density functions (PDFs) of physically-appropriate form that can be uniquely factored into a likelihood function and a noninformative prior distribution. The three-parameter form is shown accurately to fit a wide range of estimated climate sensitivity PDFs. The likelihood functions relating to the probabilistic estimates from the two sources are multiplicatively combined and a prior is derived that is noninformative for inference from the combined evidence. A posterior PDF that incorporates the evidence from both sources is produced using a single-step approach, which avoids the order-dependency that would arise if Bayesian updating were used. Results are compared with an alternative approach using the frequentist signed root likelihood ratio method. Results from these two methods are effectively identical, and provide a 5-95% range for climate sensitivity of 1.1-4.05 K (median 1.87 K).
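The single-step combination can be illustrated numerically: represent each source's likelihood on a grid, multiply pointwise, and read quantiles off the normalized product. The sketch below uses made-up Gaussian likelihood curves and a flat prior purely for illustration; the paper instead derives a noninformative prior for the combined inference and fits a physically-appropriate three-parameter form.

```python
import math

def normal_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def combine_on_grid(lik1, lik2, lo=0.0, hi=10.0, n=20001):
    """Pointwise product of two likelihood curves, normalized on a grid
    (flat prior, for illustration only)."""
    xs = [lo + (hi - lo) * i / (n - 1) for i in range(n)]
    w = [lik1(x) * lik2(x) for x in xs]
    total = sum(w)
    post = [v / total for v in w]

    def quantile(q):
        c = 0.0
        for x, p in zip(xs, post):
            c += p
            if c >= q:
                return x
        return xs[-1]

    return quantile(0.05), quantile(0.5), quantile(0.95)

# hypothetical likelihood curves for the two evidence sources
def instrumental(s):
    return normal_pdf(s, 2.0, 0.5)

def paleo(s):
    return normal_pdf(s, 2.4, 0.8)

q05, med, q95 = combine_on_grid(instrumental, paleo)
print(f"combined median {med:.2f}, 5-95% range {q05:.2f}-{q95:.2f}")
```

As expected, the combined median lies between the two individual medians and the combined 5-95% range is narrower than that of either source alone.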

  13. Feature and Score Fusion Based Multiple Classifier Selection for Iris Recognition

    PubMed Central

    Islam, Md. Rabiul

    2014-01-01

    The aim of this work is to propose a new feature and score fusion based iris recognition approach where voting method on Multiple Classifier Selection technique has been applied. Four Discrete Hidden Markov Model classifiers output, that is, left iris based unimodal system, right iris based unimodal system, left-right iris feature fusion based multimodal system, and left-right iris likelihood ratio score fusion based multimodal system, is combined using voting method to achieve the final recognition result. CASIA-IrisV4 database has been used to measure the performance of the proposed system with various dimensions. Experimental results show the versatility of the proposed system of four different classifiers with various dimensions. Finally, recognition accuracy of the proposed system has been compared with existing N hamming distance score fusion approach proposed by Ma et al., log-likelihood ratio score fusion approach proposed by Schmid et al., and single level feature fusion approach proposed by Hollingsworth et al. PMID:25114676

  14. Spatial resolution properties of motion-compensated tomographic image reconstruction methods.

    PubMed

    Chun, Se Young; Fessler, Jeffrey A

    2012-07-01

    Many motion-compensated image reconstruction (MCIR) methods have been proposed to correct for subject motion in medical imaging. MCIR methods incorporate motion models to improve image quality by reducing motion artifacts and noise. This paper analyzes the spatial resolution properties of MCIR methods and shows that nonrigid local motion can lead to nonuniform and anisotropic spatial resolution for conventional quadratic regularizers. This undesirable property is akin to the known effects of interactions between heteroscedastic log-likelihoods (e.g., Poisson likelihood) and quadratic regularizers. This effect may lead to quantification errors in small or narrow structures (such as small lesions or rings) of reconstructed images. This paper proposes novel spatial regularization design methods for three different MCIR methods that account for known nonrigid motion. We develop MCIR regularization designs that provide approximately uniform and isotropic spatial resolution and that match a user-specified target spatial resolution. Two-dimensional PET simulations demonstrate the performance and benefits of the proposed spatial regularization design methods.

  15. Feature and score fusion based multiple classifier selection for iris recognition.

    PubMed

    Islam, Md Rabiul

    2014-01-01

    The aim of this work is to propose a new feature and score fusion based iris recognition approach where voting method on Multiple Classifier Selection technique has been applied. Four Discrete Hidden Markov Model classifiers output, that is, left iris based unimodal system, right iris based unimodal system, left-right iris feature fusion based multimodal system, and left-right iris likelihood ratio score fusion based multimodal system, is combined using voting method to achieve the final recognition result. CASIA-IrisV4 database has been used to measure the performance of the proposed system with various dimensions. Experimental results show the versatility of the proposed system of four different classifiers with various dimensions. Finally, recognition accuracy of the proposed system has been compared with existing N hamming distance score fusion approach proposed by Ma et al., log-likelihood ratio score fusion approach proposed by Schmid et al., and single level feature fusion approach proposed by Hollingsworth et al.
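The final decision rule described above is plain majority voting over the four classifier outputs; a minimal sketch, with hypothetical subject labels:

```python
from collections import Counter

def majority_vote(decisions):
    """Label chosen by the most classifiers; ties break toward the label
    that reached the top count first (Counter preserves insertion order)."""
    return Counter(decisions).most_common(1)[0][0]

# decisions from the four systems: left iris, right iris,
# feature-fusion, and score-fusion (hypothetical labels)
votes = ["subject_07", "subject_07", "subject_12", "subject_07"]
print(majority_vote(votes))  # prints subject_07
```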

  16. [Contribution of ultrasound signs for the prenatal diagnosis of posterior urethral valves: Experience of 3years at the maternity of the Bicêtre Hospital].

    PubMed

    Roy, S; Colmant, C; Cordier, A-G; Sénat, M-V

    2016-05-01

    Posterior urethral valves (PUV) are the most common cause of renal impairment in boys during early childhood. The aim of this study was to evaluate the value of ultrasound (US) criteria currently used to diagnose PUV. From 2009 to 2012, 31 patients were referred to the Bicêtre Hospital after detection of fetal bilateral hydronephrosis in a male fetus. The ultrasound criteria were bladder dilation, thick-walled bladder, urethral dilation ("keyhole sign"), and amniotic fluid volume. Patients were divided into two groups: suspected or not to have PUV. US diagnosis of PUV was made in 18 fetuses and confirmed in 14 newborns, one of them without prenatal diagnosis. Sensitivity and specificity of the US scan were 92.8% and 66.7%, respectively. The likelihood ratio (LHR) was 4.8 for a thick-walled bladder, 4.2 for oligohydramnios, 3.6 for the "keyhole sign", 2.4 for bladder dilation and 1.6 for ureteral dilation. The first four signs were combined in four fetuses, all of them with PUV. US scan is a very sensitive exam for the diagnosis of PUV but with a low specificity. A thick-walled bladder seems to have a better diagnostic performance than the "keyhole sign". Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  17. Remote sensing of multiple vital signs using a CMOS camera-equipped infrared thermography system and its clinical application in rapidly screening patients with suspected infectious diseases.

    PubMed

    Sun, Guanghao; Nakayama, Yosuke; Dagdanpurev, Sumiyakhand; Abe, Shigeto; Nishimura, Hidekazu; Kirimoto, Tetsuo; Matsui, Takemi

    2017-02-01

    Infrared thermography (IRT) is used to screen febrile passengers at international airports, but it suffers from low sensitivity. This study explored the application of a combined visible and thermal image processing approach that uses a CMOS camera equipped with IRT to remotely sense multiple vital signs and screen patients with suspected infectious diseases. An IRT system that produced visible and thermal images was used for image acquisition. The subjects' respiration rates were measured by monitoring temperature changes around the nasal areas on thermal images; facial skin temperatures were measured simultaneously. Facial blood circulation causes tiny color changes in visible facial images that enable the determination of the heart rate. A logistic regression discriminant function predicted the likelihood of infection within 10 s, based on the measured vital signs. Sixteen patients with an influenza-like illness and 22 control subjects participated in a clinical test at a clinic in Fukushima, Japan. The vital-sign-based IRT screening system had a sensitivity of 87.5% and a negative predictive value of 91.7%; these values are higher than those of conventional fever-based screening approaches. Multiple vital-sign-based screening efficiently detected patients with suspected infectious diseases. It offers a promising alternative to conventional fever-based screening. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.

  18. Mallampati test as a predictor of laryngoscopic view.

    PubMed

    Adamus, Milan; Fritscherova, Sarka; Hrabalek, Lumir; Gabrhelik, Tomas; Zapletalova, Jana; Janout, Vladimir

    2010-12-01

    To determine the accuracy of the modified Mallampati test for predicting difficult tracheal intubation. A cross-sectional, clinical, observational, non-blinded study. A quality analysis of anesthetic care. Operating theatres and department of anesthesiology in a university hospital. Following the local ethics committee approval and patients' informed consent to anesthesia, all adult patients (> 18 yrs) presenting for any type of non-emergency surgical procedures under general anesthesia requiring endotracheal intubation were enrolled. Prior to anesthesia, Samsoon and Young's modification of the Mallampati test (modified Mallampati test) was performed. Following induction, the anesthesiologist described the laryngoscopic view using the Cormack-Lehane scale. Classes 3 or 4 of the modified Mallampati test were considered a predictor of difficult intubation. Grades 3 or 4 of the Cormack-Lehane classification of the laryngoscopic view were defined as impaired glottic exposure. The sensitivity, specificity, positive and negative predictive value, relative risk, likelihood ratio and accuracy of the modified Mallampati test were calculated on 2x2 contingency tables. Of the total 1,518 patients enrolled, 48 had difficult intubation (3.2%). We failed to detect as many as 35.4% of patients in whom glottis exposure during direct laryngoscopy was inadequate (sensitivity 0.646). Compared to the original article by Mallampati, we found lower specificity (0.824 vs. 0.995), lower positive predictive value (0.107 vs. 0.933), higher negative predictive value (0.986 vs. 0.928), lower likelihood ratio (3.68 vs. 91.0) and accuracy (0.819 vs. 0.929). When used as a single examination, the modified Mallampati test is of limited value in predicting difficult intubation.
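The reported statistics are all functions of a single 2x2 contingency table. Reconstructing approximate cell counts from the reported rates (48 difficult intubations among 1,518 patients with sensitivity 0.646 and specificity 0.824 give roughly TP=31, FN=17, TN=1211, FP=259; these counts are an inference, not taken from the paper) recovers the abstract's figures to rounding:

```python
# cell counts reconstructed (approximately) from the reported rates
TP, FN, TN, FP = 31, 17, 1211, 259

sens = TP / (TP + FN)                   # 0.646
spec = TN / (TN + FP)                   # 0.824
ppv = TP / (TP + FP)                    # 0.107
npv = TN / (TN + FN)                    # 0.986
lr_pos = sens / (1 - spec)              # ~3.67 (abstract: 3.68)
acc = (TP + TN) / (TP + TN + FP + FN)   # ~0.818 (abstract: 0.819)

print(f"sens={sens:.3f} spec={spec:.3f} ppv={ppv:.3f} npv={npv:.3f} "
      f"LR+={lr_pos:.2f} acc={acc:.3f}")
```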

  19. Performance of the AOAC use-dilution method with targeted modifications: collaborative study.

    PubMed

    Tomasino, Stephen F; Parker, Albert E; Hamilton, Martin A; Hamilton, Gordon C

    2012-01-01

    The U.S. Environmental Protection Agency (EPA), in collaboration with an industry work group, spearheaded a collaborative study designed to further enhance the AOAC use-dilution method (UDM). Based on feedback from laboratories that routinely conduct the UDM, improvements to the test culture preparation steps were prioritized. A set of modifications, largely based on culturing the test microbes on agar as specified in the AOAC hard surface carrier test method, was evaluated in a five-laboratory trial. The modifications targeted the preparation of the Pseudomonas aeruginosa test culture due to the difficulty in separating the pellicle from the broth in the current UDM. The proposed modifications (i.e., the modified UDM) were compared to the current UDM methodology for P. aeruginosa and Staphylococcus aureus. Salmonella choleraesuis was not included in the study. The goal was to determine if the modifications reduced method variability. Three efficacy response variables were statistically analyzed: the number of positive carriers, the log reduction, and the pass/fail outcome. The scope of the collaborative study was limited to testing one liquid disinfectant (an EPA-registered quaternary ammonium product) at two levels of presumed product efficacies, high and low. Test conditions included use of 400 ppm hard water as the product diluent and a 5% organic soil load (horse serum) added to the inoculum. Unfortunately, the study failed to support the adoption of the major modification (use of an agar-based approach to grow the test cultures) based on an analysis of the method's variability. The repeatability and reproducibility standard deviations for the modified method were equal to or greater than those for the current method across the various test variables. 
However, the authors propose retaining the frozen stock preparation step of the modified method, and based on the statistical equivalency of the control log densities, support its adoption as a procedural change to the current UDM. The current UDM displayed acceptable responsiveness to changes in product efficacy; acceptable repeatability across multiple tests in each laboratory for the control counts and log reductions; and acceptable reproducibility across multiple laboratories for the control log density values and log reductions. Although the data do not support the adoption of all modifications, the UDM collaborative study data are valuable for assessing sources of method variability and a reassessment of the performance standard for the UDM.

  20. Property Attribution in Combined Concepts

    ERIC Educational Resources Information Center

    Spalding, Thomas L.; Gagné, Christina L.

    2015-01-01

    Recent research shows that the judged likelihood of properties of modified nouns ("baby ducks have webbed feet") is reduced relative to judgments for unmodified nouns ("ducks have webbed feet"). This modification effect has been taken as evidence both for and against the idea that combined concepts automatically inherit…

  1. The GMO-Nanotech (Dis)Analogy?

    ERIC Educational Resources Information Center

    Sandler, Ronald; Kay, W. D.

    2006-01-01

    The genetically-modified-organism (GMO) experience has been prominent in motivating science, industry, and regulatory communities to address the social and ethical dimensions of nanotechnology. However, there are some significant problems with the GMO-nanotech analogy. First, it overstates the likelihood of a GMO-like backlash against…

  2. Urban planning and traffic safety at night

    NASA Astrophysics Data System (ADS)

    Ispas, N.; Trusca, D.

    2016-08-01

    Urban planning, including traffic signs, serves vital functions, providing road users with regulatory, warning, and guidance information about the roadway and surrounding environment. A large number of signs exist, and even more guidelines govern how these signs should be designed, installed, and maintained in concordance with road-surface markings. Additional requirements apply to signs in night urban traffic, covering appearance (size, shape, colour), placement (height, lateral, and longitudinal), maintenance (visibility, position, damage), and sign lighting and retroreflectivity. At night, traffic sign visibility can interact with pedestrian visibility and diminish urban traffic safety. The main aim of this paper is the scientific determination of the visibility of a specific urban zone, evaluated under real night conditions in the case of a traffic accident in the Braşov city area. The night visibility study was made using PC-Rect version 4.2. Another goal of the paper was to modify some urban planning solutions in order to increase urban safety in Brașov.

  3. Efficient estimation of Pareto model: Some modified percentile estimators.

    PubMed

    Bhatti, Sajjad Haider; Hussain, Shahzad; Ahmad, Tanvir; Aslam, Muhammad; Aftab, Muhammad; Raza, Muhammad Ali

    2018-01-01

    The article proposes three modified percentile estimators for parameter estimation of the Pareto distribution. These modifications are based on median, geometric mean and expectation of empirical cumulative distribution function of first-order statistic. The proposed modified estimators are compared with traditional percentile estimators through a Monte Carlo simulation for different parameter combinations with varying sample sizes. Performance of different estimators is assessed in terms of total mean square error and total relative deviation. It is determined that modified percentile estimator based on expectation of empirical cumulative distribution function of first-order statistic provides efficient and precise parameter estimates compared to other estimators considered. The simulation results were further confirmed using two real life examples where maximum likelihood and moment estimators were also considered.
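A percentile estimator for the Pareto distribution inverts the quantile function: since the p-th quantile is x_m(1-p)^(-1/alpha), a sample quantile together with the sample minimum yields closed-form estimates. The sketch below implements the plain (unmodified) percentile estimator alongside the maximum likelihood estimator for comparison; the article's three modified estimators are not reproduced here, and the data are simulated.

```python
import math
import random

def pareto_sample(alpha, xm, n, rng):
    """Inverse-CDF sampling from F(x) = 1 - (xm / x)**alpha, x >= xm."""
    return [xm * (1 - rng.random()) ** (-1 / alpha) for _ in range(n)]

def percentile_estimate(xs, p=0.5):
    """Plain percentile estimator: scale from the first-order statistic,
    shape by inverting the quantile function at the p-th sample quantile."""
    xs = sorted(xs)
    xm_hat = xs[0]
    xq = xs[int(p * len(xs))]
    alpha_hat = -math.log(1 - p) / math.log(xq / xm_hat)
    return alpha_hat, xm_hat

def mle_estimate(xs):
    """Maximum likelihood: xm = min(x), alpha = n / sum(log(x / xm))."""
    xm_hat = min(xs)
    alpha_hat = len(xs) / sum(math.log(x / xm_hat) for x in xs)
    return alpha_hat, xm_hat

rng = random.Random(42)
data = pareto_sample(alpha=3.0, xm=1.0, n=2000, rng=rng)
print(percentile_estimate(data), mle_estimate(data))
```

Both estimators recover the true shape parameter closely at this sample size; the article's comparison concerns which variants do so with the smallest total mean square error.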

  4. A modification of the fusion model for log polar coordinates

    NASA Technical Reports Server (NTRS)

    Griswold, N. C.; Weiman, Carl F. R.

    1990-01-01

    The fusion mechanism for application in stereo analysis of range restricted the depth of field and therefore required a shift variant mechanism in the peripheral area to find disparity. Misregistration was prevented by restricting the disparity detection range to a neighborhood spanned by the directional edge detection filters. This transformation was essentially accomplished by a nonuniform resampling of the original image in a horizontal direction. While this is easily implemented for digital processing, the approach does not (in the peripheral vision area) model the log-conformal mapping which is known to occur in the human mechanism. This paper therefore modifies the original fusion concept in the peripheral area to include the polar exponential grid-to-log conformal tesselation. Examples of the fusion process resulting in accurate disparity values are given.
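The log-conformal (log-polar) mapping referred to above takes an image point to (log r, theta) about a fixation center, which turns scalings and rotations of the periphery into simple translations. A minimal coordinate-mapping sketch:

```python
import math

def to_log_polar(x, y, cx=0.0, cy=0.0):
    """Map an image point to log-polar coordinates (log r, theta) about a
    fixation center (cx, cy); undefined at the center itself (r = 0)."""
    dx, dy = x - cx, y - cy
    return math.log(math.hypot(dx, dy)), math.atan2(dy, dx)

# a unit step along +x from the center maps to (log 1, 0) = (0, 0);
# doubling the radius only shifts the first coordinate by log 2
u, v = to_log_polar(1.0, 0.0)
print(u, v)  # prints 0.0 0.0
```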

  5. Zero-inflated Poisson model based likelihood ratio test for drug safety signal detection.

    PubMed

    Huang, Lan; Zheng, Dan; Zalkikar, Jyoti; Tiwari, Ram

    2017-02-01

    In recent decades, numerous methods have been developed for data mining of large drug safety databases, such as Food and Drug Administration's (FDA's) Adverse Event Reporting System, where data matrices are formed with drugs as columns and adverse events as rows. Often, a large number of cells in these data matrices have zero cell counts; some of them are "true zeros", indicating that the drug-adverse event pair cannot occur, while the remaining zero counts simply indicate that the drug-adverse event pair has not occurred yet or has not been reported yet. In this paper, a zero-inflated Poisson model based likelihood ratio test method is proposed to identify drug-adverse event pairs that have disproportionately high reporting rates, which are also called signals. The maximum likelihood estimates of the model parameters of the zero-inflated Poisson model based likelihood ratio test are obtained using the expectation-maximization algorithm. The zero-inflated Poisson model based likelihood ratio test is also modified to handle stratified analyses for binary and categorical covariates (e.g. gender and age) in the data. The proposed zero-inflated Poisson model based likelihood ratio test method is shown to asymptotically control the type I error and false discovery rate, and its finite sample performance for signal detection is evaluated through a simulation study. The simulation results show that the zero-inflated Poisson model based likelihood ratio test method performs similarly to the Poisson model based likelihood ratio test method when the estimated percentage of true zeros in the database is small. Both the zero-inflated Poisson model based likelihood ratio test and likelihood ratio test methods are applied to six selected drugs, from the 2006 to 2011 Adverse Event Reporting System database, with varying percentages of observed zero-count cells.
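The expectation-maximization fit of a zero-inflated Poisson model can be sketched in a few lines: the E-step assigns each observed zero a posterior probability of being a structural (true) zero, and the M-step re-estimates the mixing proportion and the Poisson mean. A self-contained toy version on simulated data, not the authors' implementation:

```python
import math
import random

def fit_zip_em(xs, iters=1000, tol=1e-10):
    """EM for a zero-inflated Poisson: with probability pi a count is a
    structural zero, otherwise it is Poisson(lam)."""
    n = len(xs)
    pi, lam = 0.5, max(sum(xs) / n, 1e-9)
    for _ in range(iters):
        p0 = math.exp(-lam)
        # E-step: posterior probability that each observed zero is structural
        z = [pi / (pi + (1 - pi) * p0) if x == 0 else 0.0 for x in xs]
        # M-step: update mixing weight and Poisson mean
        pi_new = sum(z) / n
        lam_new = sum(xs) / (n - sum(z))
        if abs(pi_new - pi) < tol and abs(lam_new - lam) < tol:
            pi, lam = pi_new, lam_new
            break
        pi, lam = pi_new, lam_new
    return pi, lam

def zip_sample(pi, lam, n, rng):
    """Simulate zero-inflated Poisson counts (Knuth's Poisson sampler)."""
    out = []
    for _ in range(n):
        if rng.random() < pi:
            out.append(0)
            continue
        L, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= L:
                break
            k += 1
        out.append(k)
    return out

rng = random.Random(1)
data = zip_sample(0.3, 2.0, 5000, rng)
pi_hat, lam_hat = fit_zip_em(data)
print(pi_hat, lam_hat)
```

The likelihood ratio statistic in the paper then compares the fitted model for a given drug-event pair against the null of no disproportionate reporting; only the parameter estimation step is sketched here.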

  6. Combining Radar and Optical Data for Forest Disturbance Studies

    NASA Technical Reports Server (NTRS)

    Ranson, K. Jon; Smith, David E. (Technical Monitor)

    2002-01-01

    Disturbance is an important factor in determining the carbon balance and succession of forests. Until the early 1990s, researchers focused on using optical or thermal sensors to detect and map forest disturbances from wildfires, logging, or insect outbreaks. As part of a NASA Siberian mapping project, a study evaluated the capability of three different radar sensors (ERS, JERS, and Radarsat) and an optical sensor (Landsat 7) to detect fire scars, logging, and insect damage in the boreal forest. This paper describes the data sets and techniques used to evaluate the use of remote sensing to detect disturbance in central Siberian forests. Images from each sensor, used individually and in combination, were assessed for their utility. Transformed Divergence analysis and maximum likelihood classification revealed that Landsat data were the single best data type for this purpose. However, the combined use of the three radar sensors and the optical sensor did improve discrimination of these disturbances.

  7. Blend sign predicts poor outcome in patients with intracerebral hemorrhage.

    PubMed

    Li, Qi; Yang, Wen-Song; Wang, Xing-Chen; Cao, Du; Zhu, Dan; Lv, Fa-Jin; Liu, Yang; Yuan, Liang; Zhang, Gang; Xiong, Xin; Li, Rui; Hu, Yun-Xin; Qin, Xin-Yue; Xie, Peng

    2017-01-01

    The blend sign has recently been described as a novel imaging marker that predicts hematoma expansion. The purpose of our study was to investigate the prognostic value of the CT blend sign in patients with intracerebral hemorrhage (ICH). Patients with ICH who underwent a baseline CT scan within 6 hours were included. The presence of the blend sign on admission nonenhanced CT was independently assessed by two readers. Functional outcome was assessed using the modified Rankin Scale (mRS) at 90 days. The blend sign was identified in 40 of 238 (16.8%) patients on the admission CT scan. The proportion of patients with a poor functional outcome was significantly higher in patients with the blend sign than in those without (75.0% versus 47.5%, p = 0.001). Multivariate logistic regression analysis demonstrated that age, intraventricular hemorrhage, admission GCS score, baseline hematoma volume, and presence of the blend sign on baseline CT independently predicted poor functional outcome at 90 days; the CT blend sign independently predicted poor outcome (odds ratio 3.61, 95% confidence interval 1.47-8.89; p = 0.005). Early identification of the blend sign is useful for prognostic stratification and may serve as a potential therapeutic target for prospective interventional studies.

  8. Multilevel mixed effects parametric survival models using adaptive Gauss-Hermite quadrature with application to recurrent events and individual participant data meta-analysis.

    PubMed

    Crowther, Michael J; Look, Maxime P; Riley, Richard D

    2014-09-28

    Multilevel mixed effects survival models are used in the analysis of clustered survival data, such as repeated events, multicenter clinical trials, and individual participant data (IPD) meta-analyses, to investigate heterogeneity in baseline risk and covariate effects. In this paper, we extend parametric frailty models including the exponential, Weibull and Gompertz proportional hazards (PH) models and the log logistic, log normal, and generalized gamma accelerated failure time models to allow any number of normally distributed random effects. Furthermore, we extend the flexible parametric survival model of Royston and Parmar, modeled on the log-cumulative hazard scale using restricted cubic splines, to include random effects while also allowing for non-PH (time-dependent effects). Maximum likelihood is used to estimate the models utilizing adaptive or nonadaptive Gauss-Hermite quadrature. The methods are evaluated through simulation studies representing clinically plausible scenarios of a multicenter trial and IPD meta-analysis, showing good performance of the estimation method. The flexible parametric mixed effects model is illustrated using a dataset of patients with kidney disease and repeated times to infection and an IPD meta-analysis of prognostic factor studies in patients with breast cancer. User-friendly Stata software is provided to implement the methods. Copyright © 2014 John Wiley & Sons, Ltd.
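    The core numerical device here, integrating a normally distributed random effect out of the likelihood by Gauss-Hermite quadrature, can be sketched as below. This is a minimal illustration assuming a plain exponential PH model with a single shared frailty on the log-hazard; the function names and data are hypothetical, and the actual models use adaptive quadrature and richer baselines:

```python
import numpy as np

def gh_expectation(g, sigma, n_nodes=20):
    """Approximate E[g(b)] for b ~ N(0, sigma^2) by Gauss-Hermite quadrature."""
    x, w = np.polynomial.hermite.hermgauss(n_nodes)
    # change of variables b = sqrt(2)*sigma*x maps the N(0, sigma^2)
    # integral onto the Gauss-Hermite weight exp(-x^2)
    return (w * g(np.sqrt(2.0) * sigma * x)).sum() / np.sqrt(np.pi)

def marginal_loglik(times, events, lam0, sigma, n_nodes=20):
    """Marginal log-likelihood of one cluster under an exponential PH model
    with a shared normal frailty b on the log-hazard: h(t|b) = lam0*exp(b)."""
    def cluster_lik(b):
        h = lam0 * np.exp(b)                 # hazard given the frailty
        ll = events * np.log(h) - h * times  # per-subject log-likelihood
        return np.exp(ll.sum())              # joint likelihood given b
    return np.log(gh_expectation(np.vectorize(cluster_lik), sigma, n_nodes))
```

In a full model, this marginal log-likelihood would be summed over clusters and maximized over (lam0, sigma) together with any covariate effects.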

  9. The Luminosity Function of Star Clusters in 20 Star-Forming Galaxies Based on Hubble Legacy Archive Photometry

    NASA Astrophysics Data System (ADS)

    Bowers, Ariel; Whitmore, B. C.; Chandar, R.; Larsen, S. S.

    2014-01-01

    Luminosity functions have been determined for star cluster populations in 20 nearby (4-30 Mpc), star-forming galaxies based on ACS source lists generated by the Hubble Legacy Archive (http://hla.stsci.edu). These cluster catalogs provide one of the largest sets of uniform, automatically generated cluster candidates available in the literature at present. Comparisons with other recently generated cluster catalogs demonstrate that the HLA-generated catalogs are of similar quality, but in general do not go as deep. A typical cluster luminosity function can be approximated by a power law, dN/dL ∝ L^α, with an average value for α of -2.37 and rms scatter of 0.18. A comparison of fitting results based on binned and unbinned methods shows good agreement, although there may be a systematic tendency for the unbinned (maximum-likelihood) method to give slightly more negative values of α for galaxies with steeper luminosity functions. Our uniform database results in a small scatter (0.5 magnitude) in the correlation between the magnitude of the brightest cluster (Mbrightest) and the log of the number of clusters brighter than MI = -9 (log N). We also examine the magnitude of the brightest cluster vs. log SFR for a sample including LIRGs and ULIRGs.
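    For a pure power law above a completeness limit, the unbinned maximum-likelihood slope mentioned above has a closed form. A small sketch (illustrative only, ignoring photometric errors and incompleteness; the sampling helper is added just to check the estimator):

```python
import math
import random

def powerlaw_mle(lums, l_min):
    """Unbinned maximum-likelihood slope for dN/dL ∝ L^alpha with L >= l_min
    (valid for alpha < -1, so the density is normalizable)."""
    n = len(lums)
    s = sum(math.log(l / l_min) for l in lums)
    return -1.0 - n / s

def sample_powerlaw(alpha, l_min, n, rng):
    """Inverse-CDF draws from the same truncated power law, for a quick check."""
    return [l_min * rng.random() ** (1.0 / (alpha + 1.0)) for _ in range(n)]
```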

  10. Pre-test probability of obstructive coronary stenosis in patients undergoing coronary CT angiography: Comparative performance of the Modified Diamond-Forrester algorithm versus methods incorporating cardiovascular risk factors.

    PubMed

    Ferreira, António Miguel; Marques, Hugo; Tralhão, António; Santos, Miguel Borges; Santos, Ana Rita; Cardoso, Gonçalo; Dores, Hélder; Carvalho, Maria Salomé; Madeira, Sérgio; Machado, Francisco Pereira; Cardim, Nuno; de Araújo Gonçalves, Pedro

    2016-11-01

    Current guidelines recommend the use of the Modified Diamond-Forrester (MDF) method to assess the pre-test likelihood of obstructive coronary artery disease (CAD). We aimed to compare the performance of the MDF method with two contemporary algorithms derived from multicenter trials that additionally incorporate cardiovascular risk factors: the calculator-based 'CAD Consortium 2' method, and the integer-based CONFIRM score. We assessed 1069 consecutive patients without known CAD undergoing coronary CT angiography (CCTA) for stable chest pain. Obstructive CAD was defined as the presence of coronary stenosis ≥50% on 64-slice dual-source CT. The three methods were assessed for calibration, discrimination, net reclassification, and changes in proposed downstream testing based upon calculated pre-test likelihoods. The observed prevalence of obstructive CAD was 13.8% (n=147). Overestimations of the likelihood of obstructive CAD were 140.1%, 9.8%, and 18.8%, respectively, for the MDF, CAD Consortium 2 and CONFIRM methods. The CAD Consortium 2 showed greater discriminative power than the MDF method, with a C-statistic of 0.73 vs. 0.70 (p<0.001), while the CONFIRM score did not (C-statistic 0.71, p=0.492). Reclassification of pre-test likelihood using the 'CAD Consortium 2' or CONFIRM scores resulted in a net reclassification improvement of 0.19 and 0.18, respectively, which would change the diagnostic strategy in approximately half of the patients. Newer risk factor-encompassing models allow for a more precise estimation of pre-test probabilities of obstructive CAD than the guideline-recommended MDF method. Adoption of these scores may improve disease prediction and change the diagnostic pathway in a significant proportion of patients. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  11. On the scaling of velocity and vorticity variances in turbulent channel flow

    NASA Astrophysics Data System (ADS)

    Leonard, A.

    2015-11-01

    The availability of new DNS-based statistics for turbulent channel flow (Lee & Moser, JFM 2015), along with previous results (e.g., Hoyas & Jiménez, Phys. Fluids 2006), has provided the opportunity for another look at the scaling laws for this flow. For example, data from the former (fig. 4(e)) for the streamwise velocity variance in the outer region clearly indicate a modified log law for that quantity at Reτ = 5200, i.e., ⟨u'u'⟩+ = C0 - C1 ln(y/δ) - C2 [ln(y/δ)]², where δ is the channel half height. We find that this result fits the data very well for 0.1 < y/δ < 0.8. The Reynolds number (5200) is apparently still too low to observe the much-discussed log law (the above with C2 = 0), which, presumably, would appear for roughly y/δ < 0.1, as it does in high-Reτ pipe flow (Hultmark et al., PRL 2012) with δ replaced by R. On the other hand, the above modified log law with the same values of C1 and C2 is a good fit to the pipe data at Reτ = 98 × 10³ for y/R > 0.12 (fig. 4 of Hultmark et al.).
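    Fitting the quadratic-in-ln(y/δ) form above is an ordinary linear least-squares problem in the coefficients (C0, C1, C2). A minimal sketch, with made-up coefficients rather than the Lee & Moser values:

```python
import numpy as np

def fit_modified_log_law(y_over_delta, uu_plus):
    """Least-squares fit of <u'u'>+ = C0 - C1*ln(y/d) - C2*[ln(y/d)]^2.
    Returns (C0, C1, C2) with the sign convention of the text."""
    L = np.log(y_over_delta)
    # design matrix: columns multiply C0, C1, C2 with their text signs
    A = np.column_stack([np.ones_like(L), -L, -L**2])
    (c0, c1, c2), *_ = np.linalg.lstsq(A, uu_plus, rcond=None)
    return c0, c1, c2
```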

  12. Robust generative asymmetric GMM for brain MR image segmentation.

    PubMed

    Ji, Zexuan; Xia, Yong; Zheng, Yuhui

    2017-11-01

    Accurate segmentation of brain tissues from magnetic resonance (MR) images based on unsupervised statistical models such as the Gaussian mixture model (GMM) has been widely studied during the last decades. However, most GMM-based segmentation methods suffer from limited accuracy due to the influences of noise and intensity inhomogeneity in brain MR images. To further improve the accuracy of brain MR image segmentation, this paper presents a Robust Generative Asymmetric GMM (RGAGMM) for simultaneous brain MR image segmentation and intensity inhomogeneity correction. First, we develop an asymmetric distribution to fit the data shapes, and thus construct a spatially constrained asymmetric model. Then, we incorporate two pseudo-likelihood quantities and bias field estimation into the model's log-likelihood, aiming to exploit the within-cluster and between-cluster neighboring priors and to alleviate the impact of intensity inhomogeneity, respectively. Finally, an expectation maximization algorithm is derived to iteratively maximize the approximation of the data log-likelihood function, simultaneously overcoming the intensity inhomogeneity and segmenting the brain MR images. To demonstrate the performance of the proposed algorithm, we first applied it to a synthetic brain MR image to show the intermediate illustrations and the estimated distribution. The next group of experiments was carried out on clinical 3T brain MR images, which contain quite serious intensity inhomogeneity and noise. We then quantitatively compare our algorithm to state-of-the-art segmentation approaches using the Dice coefficient (DC) on benchmark images obtained from IBSR and BrainWeb with different levels of noise and intensity inhomogeneity. The comparison results on various brain MR images demonstrate the superior performance of the proposed algorithm in dealing with noise and intensity inhomogeneity.
In this paper, the RGAGMM algorithm is proposed, which can simply and efficiently incorporate spatial constraints into an EM framework to simultaneously segment brain MR images and estimate the intensity inhomogeneity. The proposed algorithm is flexible enough to fit the data shapes, can simultaneously overcome the influence of noise and intensity inhomogeneity, and hence is capable of improving segmentation accuracy by over 5% compared with several state-of-the-art algorithms. Copyright © 2017 Elsevier B.V. All rights reserved.
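    The RGAGMM builds on the standard EM iteration for a Gaussian mixture, in which each E-step/M-step pair never decreases the data log-likelihood. A generic one-dimensional, two-component version of that baseline (not the asymmetric, spatially constrained model of the paper) can be sketched as:

```python
import math

def _pdf(x, m, v):
    """Univariate normal density."""
    return math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)

def gmm_em_step(xs, w, mu, var):
    """One EM iteration for a two-component 1-D Gaussian mixture
    (weights w, means mu, variances var); returns updated parameters."""
    # E-step: posterior responsibility of component 0 for each point
    r = []
    for x in xs:
        p0 = w[0] * _pdf(x, mu[0], var[0])
        p1 = w[1] * _pdf(x, mu[1], var[1])
        r.append(p0 / (p0 + p1))
    # M-step: responsibility-weighted updates
    n0 = sum(r)
    n1 = len(xs) - n0
    mu0 = sum(ri * x for ri, x in zip(r, xs)) / n0
    mu1 = sum((1 - ri) * x for ri, x in zip(r, xs)) / n1
    v0 = sum(ri * (x - mu0) ** 2 for ri, x in zip(r, xs)) / n0
    v1 = sum((1 - ri) * (x - mu1) ** 2 for ri, x in zip(r, xs)) / n1
    return ([n0 / len(xs), n1 / len(xs)], [mu0, mu1],
            [max(v0, 1e-6), max(v1, 1e-6)])  # floor keeps variances positive

def loglik(xs, w, mu, var):
    """Data log-likelihood under the current mixture parameters."""
    return sum(math.log(w[0] * _pdf(x, mu[0], var[0]) +
                        w[1] * _pdf(x, mu[1], var[1])) for x in xs)
```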

  13. Measuring and partitioning the high-order linkage disequilibrium by multiple order Markov chains.

    PubMed

    Kim, Yunjung; Feng, Sheng; Zeng, Zhao-Bang

    2008-05-01

    A map of the background levels of disequilibrium between nearby markers can be useful for association mapping studies. In order to assess the background levels of linkage disequilibrium (LD), multilocus LD measures are more advantageous than pairwise LD measures, because the combined analysis of pairwise LD measures is not adequate to detect simultaneous allele associations among multiple markers. Various multilocus LD measures based on haplotypes have been proposed. However, most of these measures provide a single index of association among multiple markers and do not reveal the complex patterns and different levels of LD structure. In this paper, we employ non-homogeneous, multiple-order Markov chain models as a statistical framework to measure and partition the LD among multiple markers into components due to different orders of marker associations. Using a sliding window of multiple markers on phased haplotype data, we compute the corresponding likelihoods for different Markov chain (MC) orders in each window. The log-likelihood difference between the lowest MC order model (MC0) and the highest MC order model in each window is used as a measure of the total LD, or the overall deviation from gametic equilibrium, for the window. Then, we partition the total LD into lower order disequilibria and estimate the effects from two-, three-, and higher order disequilibria. The relationship between different orders of LD and the log-likelihood difference between two different orders of MC models is explored. By applying our method to the phased haplotype data in the ENCODE regions of the HapMap project, we are able to identify high/low multilocus LD regions. Our results reveal that most of the LD in the HapMap data is attributed to the LD between adjacent pairs of markers across the whole region. LD between adjacent pairs of markers appears to be more significant in high multilocus LD regions than in low multilocus LD regions.
We also find that as the multilocus total LD increases, the effects of high-order LD tend to get weaker, owing to the lack of observed multilocus haplotypes. The overall estimates of first-, second-, third-, and fourth-order LD across the ENCODE regions are 64, 23, 9, and 3%, respectively.
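    The window statistic described above compares Markov chain likelihoods of different orders on phased haplotypes. A toy sketch for orders 0 and 1 (illustrative only; the paper's method extends this to higher orders and sliding windows):

```python
import math
from collections import Counter

def loglik_mc0(haps):
    """Log-likelihood under independence (order-0 Markov chain):
    each marker's allele is drawn from its marginal frequency."""
    n, L = len(haps), len(haps[0])
    ll = 0.0
    for j in range(L):
        freq = Counter(h[j] for h in haps)
        ll += sum(c * math.log(c / n) for c in freq.values())
    return ll

def loglik_mc1(haps):
    """Log-likelihood under a first-order (non-homogeneous) Markov chain:
    each allele depends on the allele at the previous marker."""
    n, L = len(haps), len(haps[0])
    freq0 = Counter(h[0] for h in haps)
    ll = sum(c * math.log(c / n) for c in freq0.values())
    for j in range(1, L):
        pair = Counter((h[j - 1], h[j]) for h in haps)
        prev = Counter(h[j - 1] for h in haps)
        # MLE transition probabilities are pair count / previous-allele count
        ll += sum(c * math.log(c / prev[a]) for (a, _), c in pair.items())
    return ll
```

The difference loglik_mc1(haps) - loglik_mc0(haps) is non-negative and vanishes when adjacent markers are in gametic equilibrium.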

  14. The XMM-Newton Wide-Field Survey in the COSMOS Field. II. X-Ray Data and the logN-logS Relations

    NASA Astrophysics Data System (ADS)

    Cappelluti, N.; Hasinger, G.; Brusa, M.; Comastri, A.; Zamorani, G.; Böhringer, H.; Brunner, H.; Civano, F.; Finoguenov, A.; Fiore, F.; Gilli, R.; Griffiths, R. E.; Mainieri, V.; Matute, I.; Miyaji, T.; Silverman, J.

    2007-09-01

    We present data analysis and X-ray source counts for the first season of XMM-Newton observations in the COSMOS field. The survey covers ~2 deg2 within the region of sky bounded by 09h57m30s

  15. Methods and apparatus to produce stick-slip motion of logging tool attached to a wireline drawn upward by a continuously rotating wireline drum

    DOEpatents

    Vail, III, William Banning; Momii, Steven Thomas

    1998-01-01

    Methods and apparatus are described to produce stick-slip motion of a logging tool within a cased well attached to a wireline that is drawn upward by a continuously rotating wireline drum. The stick-slip motion results in the periodic upward movement of the tool in the cased well, described in terms of a dwell time during which the tool is stationary, the move time during which the tool moves, and the stroke, which is the upward distance that the tool translates during the "slip" portion of the stick-slip motion. This method of measurement is used to log the well at different vertical positions of the tool. Therefore, any typical "station-to-station logging tool" may be modified to be a "continuous logging tool", where "continuous" means that the wireline drum continually rotates while the tool undergoes stick-slip motion downhole and measurements are performed during the dwell times when the tool is momentarily stationary. The stick-slip methods of operation and the related apparatus are particularly described in terms of making measurements of formation resistivity from within a cased well during the dwell times when the tool is momentarily stationary during the periodic stick-slip motion of the logging tool.

  16. Methods and apparatus to produce stick-slip motion of logging tool attached to a wireline drawn upward by a continuously rotating wireline drum

    DOEpatents

    Vail, W.B. III; Momii, S.T.

    1998-02-10

    Methods and apparatus are described to produce stick-slip motion of a logging tool within a cased well attached to a wireline that is drawn upward by a continuously rotating wireline drum. The stick-slip motion results in the periodic upward movement of the tool in the cased well, described in terms of a dwell time during which the tool is stationary, the move time during which the tool moves, and the stroke, which is the upward distance that the tool translates during the "slip" portion of the stick-slip motion. This method of measurement is used to log the well at different vertical positions of the tool. Therefore, any typical "station-to-station logging tool" may be modified to be a "continuous logging tool," where "continuous" means that the wireline drum continually rotates while the tool undergoes stick-slip motion downhole and measurements are performed during the dwell times when the tool is momentarily stationary. The stick-slip methods of operation and the related apparatus are particularly described in terms of making measurements of formation resistivity from within a cased well during the dwell times when the tool is momentarily stationary during the periodic stick-slip motion of the logging tool. 12 figs.

  17. Two in-vivo protocols for testing virucidal efficacy of handwashing and hand disinfection.

    PubMed

    Steinmann, J; Nehrkorn, R; Meyer, A; Becker, K

    1995-01-01

    The whole hands and fingerpads of seven volunteers were contaminated with poliovirus type 1 (Sabin strain) in order to evaluate the virucidal efficacy of different forms of handwashing and handrubbing with alcohols and alcohol-based disinfectants. In the whole-hand protocol, handwashing with unmedicated soap for 5 min and handrubbing with 80% ethanol yielded a log reduction factor (RF) of >2, whereas the log RF for 96.8% ethanol exceeded 3.2. With the fingerpad model, ethanol produced a greater log RF than iso- or n-propanol. Comparing five commercial hand disinfectants and a chlorine solution (1.0% chloramine T solution) for handrub, Desderman and Promanum, both based on ethanol, yielded log RFs of 2.47 and 2.26, respectively, after an application time of 60 s, similar to the 1.0% chloramine T solution (log RF of 2.28). Autosept, Mucasept, and Sterillium, based on n-propanol and/or isopropanol, were found to be significantly less effective (log RFs of 1.16, 1.06, and 1.52, respectively). A comparison of a modified whole-hand protocol and the fingerpad protocol with Promanum showed similar results for the two systems, suggesting that both models are suitable for testing the in-vivo efficacy of handwashing agents and hand disinfectants that are used without any water.

  18. Antimicrobial potential of flavoring ingredients against Bacillus cereus in a milk-based beverage.

    PubMed

    Pina-Pérez, Maria C; Rodrigo, Dolores; Martínez-López, Antonio

    2013-11-01

    Natural ingredients (cinnamon, cocoa, vanilla, and anise) were assessed for inhibition of Bacillus cereus vegetative cell growth in a mixed liquid whole egg and skim milk beverage (LWE-SM), under different conditions: ingredient concentration (1, 2.5, and 5% [wt/vol]) and incubation temperature (5, 10, and 22 °C). According to the results obtained, the ingredients significantly (p < 0.05) reduced bacterial growth when added to the LWE-SM beverage. B. cereus behavior was described mathematically for each substrate by means of a modified Gompertz equation, from which the kinetic parameters, lag time and maximum specific growth rate, were obtained. Cinnamon was the most bacteriostatic ingredient and cocoa the most bactericidal one when they were added at 5% (wt/vol) and beverages were incubated at 5 °C. The bactericidal effect of 5% (wt/vol) cocoa reduced final B. cereus counts (log Nf, in log10 CFU/mL) by 4.10 ± 0.21 log10 cycles at 5 °C.
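    The modified Gompertz equation used for such growth curves is commonly written in Zwietering's reparameterized form, in which the lag time and maximum specific growth rate appear directly as parameters. A sketch, with made-up parameter values rather than the fitted ones:

```python
import math

def gompertz_log_growth(t, a, mu_max, lag):
    """Zwietering's modified Gompertz curve for log10(N/N0):
    a      = asymptotic log increase,
    mu_max = maximum specific growth rate (slope at the inflection),
    lag    = lag time, in the same time units as t."""
    return a * math.exp(-math.exp(mu_max * math.e / a * (lag - t) + 1.0))
```

During the lag phase the curve stays near zero, then rises at rate mu_max and levels off at a.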

  19. Improvement of Polymyxin-Egg Yolk-Mannitol-Bromothymol Blue Agar for the Enumeration and Isolation of Bacillus cereus in Various Foods.

    PubMed

    Kang, Il-Byeong; Chon, Jung-Whan; Kim, Dong-Hyeon; Jeong, Dana; Kim, Hong-Seok; Kim, Hyunsook; Seo, Kun-Ho

    2017-03-01

    A modified polymyxin-egg yolk-mannitol-bromothymol blue agar (mPEMBA) was developed by supplementing polymyxin-egg yolk-mannitol-bromothymol blue agar (PEMBA) with trimethoprim to improve the selectivity for, and recoverability of, Bacillus cereus from naturally and artificially contaminated food samples. The number of B. cereus colonies on mPEMBA was significantly higher than on PEMBA, indicating better recoverability (P < 0.05) in red pepper powder (PEMBA 0.80 ± 0.22 log CFU/g versus mPEMBA 1.95 ± 0.17 log CFU/g) and soybean paste (PEMBA 2.19 ± 0.18 log CFU/g versus mPEMBA 3.09 ± 0.13 log CFU/g). In addition, mPEMBA provided better visual differentiation of B. cereus colonies than PEMBA, which is attributable to the reduced number of competing microflora. We conclude that the addition of trimethoprim to PEMBA could generate a synergistic effect to improve selectivity for B. cereus.

  20. Host range of the emerald ash borer (Agrilus planipennis Fairmaire) (Coleoptera: Buprestidae) in North America: results of multiple-choice field experiments.

    PubMed

    Anulewicz, Andrea C; McCullough, Deborah G; Cappaert, David L; Poland, Therese M

    2008-02-01

    Emerald ash borer (Agrilus planipennis Fairmaire) (Coleoptera: Buprestidae), an invasive phloem-feeding pest, was identified as the cause of widespread ash (Fraxinus) mortality in southeast Michigan and Windsor, Ontario, Canada, in 2002. A. planipennis reportedly colonizes other genera in its native range in Asia, including Ulmus L., Juglans L., and Pterocarya Kunth. Attacks on nonash species have not been observed in North America to date, but there is concern that other genera could be colonized. From 2003 to 2005, we assessed adult A. planipennis landing rates, oviposition, and larval development on North American ash species and congeners of its reported hosts in Asia in multiple-choice field studies conducted at several southeast Michigan sites. Nonash species evaluated included American elm (U. americana L.), hackberry (Celtis occidentalis L.), black walnut (J. nigra L.), shagbark hickory [Carya ovata (Mill.) K. Koch], and Japanese tree lilac (Syringa reticulata Bl.). In studies with freshly cut logs, adult beetles occasionally landed on nonash logs but generally laid fewer eggs than on ash logs. Larvae fed and developed normally on ash logs, which were often heavily infested. No larvae were able to survive, grow, or develop on any nonash logs, although failed first-instar galleries occurred on some walnut logs. High densities of larvae developed on live green ash and white ash nursery trees, but there was no evidence of larval survival or development on Japanese tree lilac and black walnut trees in the same plantation. We felled, debarked, and intensively examined >28 m² of phloem area on nine American elm trees growing in contact with or adjacent to heavily infested ash trees. We found no sign of A. planipennis feeding on any elm.

  1. The Relationship Between Hope and Adolescent Likelihood to Endorse Substance Use Behaviors in a Sample of Marginalized Youth.

    PubMed

    Brooks, Merrian J; Marshal, Michael P; McCauley, Heather L; Douaihy, Antoine; Miller, Elizabeth

    2016-11-09

    Hopefulness has been associated with increased treatment retention and reduced substance abuse among adults, and may be a promising modifiable factor to leverage in substance abuse treatment settings. Few studies have assessed the relationship between hopefulness and substance use in adolescents, particularly those with high-risk backgrounds. We explored whether high hope is associated with less likelihood of engaging in a variety of substance use behaviors in a sample of marginalized adolescents. Using logistic regression, we assessed results from a cross-sectional anonymous youth behavior survey (n = 256 youth, ages 14 to 19). We recruited from local youth-serving agencies (e.g., homeless shelters, group homes, short-term detention). The sample was almost 60% male and two-thirds African American. Unadjusted models showed that youth with higher hope had 50%-58% decreased odds (p < .05) of endorsing heavy episodic drinking, daily tobacco use, recent or lifetime marijuana use, and sex after using substances. Adjusted models showed 52% decreased odds of lifetime marijuana use with higher hope, and a trend toward less sex after substance use (AOR 0.481; p = 0.065). No other substance use behaviors remained significantly associated with higher hope scores in adjusted models. Hopefulness may contribute to decreased likelihood of substance use in adolescents. Focusing on hope may be one modifiable target in a comprehensive primary or secondary substance use prevention program.

  2. Safe semi-supervised learning based on weighted likelihood.

    PubMed

    Kawakita, Masanori; Takeuchi, Jun'ichi

    2014-05-01

    We are interested in developing a safe semi-supervised learning method that works in any situation. Semi-supervised learning postulates that n′ unlabeled data are available in addition to n labeled data. However, almost all previous semi-supervised methods require additional assumptions (beyond the availability of unlabeled data) to improve on supervised learning. If these assumptions are not met, the methods can perform worse than supervised learning. Sokolovska, Cappé, and Yvon (2008) proposed a semi-supervised method based on a weighted likelihood approach. They proved that this method asymptotically never performs worse than supervised learning (i.e., it is safe) without any assumption. Their method is attractive because it is easy to implement and potentially general. Moreover, it is deeply related to a certain statistical paradox. However, the method of Sokolovska et al. (2008) assumes a very limited situation: classification, discrete covariates, n′ → ∞, and a maximum likelihood estimator. In this paper, we extend their method by modifying the weight. We prove that our proposal is safe in a significantly wider range of situations as long as n ≤ n′. Further, we give a geometrical interpretation of the proof of safety through its relationship with the above-mentioned statistical paradox. Finally, we show that the above proposal is asymptotically safe even when n′

  3. SPOTting model parameters using a ready-made Python package

    NASA Astrophysics Data System (ADS)

    Houska, Tobias; Kraft, Philipp; Breuer, Lutz

    2015-04-01

    The selection and parameterization of reliable process descriptions in ecological modelling is subject to several uncertainties. The procedure depends strongly on various criteria, such as the algorithm used, the likelihood function selected, and the definition of the prior parameter distributions. A wide variety of tools have been developed in the past decades to optimize parameters. Some of these tools are closed source; as a result, the choice of a specific parameter estimation method sometimes depends more on its availability than on its performance. A toolbox with a large set of methods can support users in deciding on the most suitable method, and it makes it possible to test and compare different methods. We developed SPOT (Statistical Parameter Optimization Tool), an open source Python package containing a comprehensive set of modules to analyze and optimize parameters of (environmental) models. SPOT comes with a selected set of algorithms for parameter optimization and uncertainty analysis (Monte Carlo, MC; Latin Hypercube Sampling, LHS; Maximum Likelihood Estimation, MLE; Markov Chain Monte Carlo, MCMC; Shuffled Complex Evolution, SCE-UA; Differential Evolution Markov Chain, DE-MCZ), together with several likelihood functions (bias, (log-) Nash-Sutcliffe model efficiency, correlation coefficient, coefficient of determination, covariance, (decomposed, relative, root) mean squared error, mean absolute error, agreement index) and prior distributions (binomial, chi-square, Dirichlet, exponential, Laplace, (log-, multivariate-) normal, Pareto, Poisson, Cauchy, uniform, Weibull) to sample from. The model-independent structure makes it suitable for a wide range of applications. We apply all algorithms of the SPOT package in three case studies. First, we investigate the response of the Rosenbrock function, where the MLE algorithm shows its strengths. Second, we study the Griewank function, which has a challenging response surface for optimization methods; here, simple algorithms like MCMC struggle to find the global optimum of the function, while algorithms like SCE-UA and DE-MCZ show their strengths. Third, we apply an uncertainty analysis to a one-dimensional, physically based hydrological model built with the Catchment Modelling Framework (CMF). The model is driven by meteorological and groundwater data from a Free Air Carbon Enrichment (FACE) experiment in Linden (Hesse, Germany), and simulation results are evaluated against measured soil moisture data. We search for optimal parameter sets of the van Genuchten-Mualem function and find different, equally optimal solutions with some of the algorithms. The case studies reveal that the implemented SPOT methods work sufficiently well, and they show the benefit of having, in one platform-independent package, a number of parameter search methods, likelihood functions, and a priori parameter distributions.
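    As an example of the likelihood functions listed above, the Nash-Sutcliffe model efficiency reduces to a few lines. This is the generic textbook definition, not necessarily SPOT's exact implementation:

```python
def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe model efficiency: 1 - SSE / variance of observations.
    1.0 is a perfect fit; values <= 0 mean the model predicts no better
    than the observed mean."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    svar = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / svar
```

In a sampler such as those listed, this value (or its logarithm) would score each candidate parameter set against the observations.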

  4. Water treatment with exceptional virus inactivation using activated carbon modified with silver (Ag) and copper oxide (CuO) nanoparticles.

    PubMed

    Shimabuku, Quelen Letícia; Arakawa, Flávia Sayuri; Fernandes Silva, Marcela; Ferri Coldebella, Priscila; Ueda-Nakamura, Tânia; Fagundes-Klen, Márcia Regina; Bergamasco, Rosangela

    2017-08-01

    Continuous-flow experiments (450 mL min⁻¹) were performed in a household filter in order to investigate the removal and/or inactivation of T4 bacteriophage, using granular activated carbon (GAC) modified with silver and/or copper oxide nanoparticles at different concentrations. GAC and modified GAC were characterized by X-ray diffractometry, specific surface area, pore size and volume, average pore diameter, scanning electron microscopy, transmission electron microscopy, zeta potential, and atomic absorption spectroscopy. The antiviral activity of the produced porous media was evaluated by passing suspensions of T4 bacteriophage (~10⁵ PFU/mL) through the filters. The filtered water was analyzed for the presence of the bacteriophage and for the release of silver and copper oxide. The porous media containing silver and copper oxide nanoparticles showed high inactivation capacity, even reaching reductions higher than 3 log. GAC6 (GAC/Ag0.5%Cu1.0%) was effective in bacteriophage inactivation, reaching a 5.53 log reduction. The levels of silver and copper released into the filtered water were below the recommended limits for drinking water (100 ppb for silver and 1000 ppb for copper). From this study, it is possible to conclude that activated carbon modified with silver and copper oxide nanoparticles can be used as a filter medium for virus removal in the treatment of drinking water.

  5. Accuracy of clinical pallor in the diagnosis of anaemia in children: a meta-analysis.

    PubMed

    Chalco, Juan P; Huicho, Luis; Alamo, Carlos; Carreazo, Nilton Y; Bada, Carlos A

    2005-12-08

Anaemia is highly prevalent in children of developing countries. It is associated with impaired physical growth and mental development. Palmar pallor is recommended at primary level for diagnosing it, on the basis of few studies. The objective of the study was to systematically assess the accuracy of clinical signs in the diagnosis of anaemia in children. The design was a systematic review on the accuracy of clinical signs of anaemia in children. We performed an Internet search in various databases and additional reference tracking. Studies had to be on performance of clinical signs in the diagnosis of anaemia, using haemoglobin as the gold standard. We calculated pooled diagnostic likelihood ratios (LRs) and odds ratios (DORs) for each clinical sign at different haemoglobin thresholds. Eleven articles met the inclusion criteria. Most studies were performed in Africa, in children under five. Chi-square test for proportions and Cochran Q for DORs and for LRs showed heterogeneity. Type of observer and haemoglobin technique influenced the results. Pooling was done using the random effects model. Pooled DOR at haemoglobin <11 g/dL was 4.3 (95% CI 2.6-7.2) for palmar pallor, 3.7 (2.3-5.9) for conjunctival pallor, and 3.4 (1.8-6.3) for nailbed pallor. DORs and LRs were slightly better for nailbed pallor at all other haemoglobin thresholds. The accuracy did not vary substantially after excluding outliers. This meta-analysis did not document a highly accurate clinical sign of anaemia. In view of the poor performance of clinical signs, universal iron supplementation may be an adequate control strategy in high-prevalence areas. Further well-designed studies are needed in settings other than Africa. They should assess inter-observer variation, performance of combined clinical signs, phenotypic differences, and different degrees of anaemia.
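The pooled statistics quoted above are built from standard 2×2-table quantities; a minimal sketch (the counts below are made-up illustrative numbers, not data from the meta-analysis):

```python
def diagnostic_measures(tp, fp, fn, tn):
    """Sensitivity, specificity, likelihood ratios, and diagnostic odds
    ratio from a 2x2 table of clinical sign vs. gold standard."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    lr_pos = sens / (1 - spec)   # LR+: how much a positive sign raises the odds
    lr_neg = (1 - sens) / spec   # LR-: how much a negative sign lowers the odds
    dor = lr_pos / lr_neg        # equivalently (tp * tn) / (fp * fn)
    return {"sens": sens, "spec": spec, "LR+": lr_pos, "LR-": lr_neg, "DOR": dor}

# Hypothetical counts: 40 true positives, 20 false positives,
# 10 false negatives, 80 true negatives.
m = diagnostic_measures(40, 20, 10, 80)
```

Meta-analyses such as this one pool these per-study quantities (here, via a random effects model) rather than computing them from a single table.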

  6. Signs and symptoms of Group A versus Non-Group A strep throat: A meta-analysis.

    PubMed

    Thai, Thuy N; Dale, Ariella P; Ebell, Mark H

    2017-10-13

Both non-Group A streptococcal (non-GAS) pharyngitis and Group A streptococcal (GAS) pharyngitis are commonly found in patients with sore throat. It is not known whether they present with similar signs and symptoms compared with patients with non-streptococcal pharyngitis. MEDLINE was searched for prospective studies that reported throat culture for both GAS and non-GAS as a reference standard and reported at least one sign, symptom, or the Centor score. Summary estimates of sensitivity, specificity, likelihood ratios (LR+ and LR-), and diagnostic odds ratios (DOR) were calculated using a bivariate random effects model. Summary receiver operating characteristic (ROC) curves were created for key signs and symptoms. Eight studies met our inclusion criteria. Tonsillar exudate had the highest LR+ for both GAS and non-GAS pharyngitis (1.53 versus 1.71). The confidence intervals of sensitivity, LR+, LR-, and DOR for all signs, symptoms, and the Centor score overlapped between the two groups, with the relative difference between sensitivities within 15% for arthralgia or myalgia, fever, injected throat, tonsillar enlargement, and tonsillar exudate. Larger differences in sensitivities were observed for sore throat, cervical adenopathy, and lack of a cough, although the difference for lack of a cough was largely due to a single outlier. Signs and symptoms of patients with GAS and non-GAS pharyngitis are generally similar. No signs or symptoms clearly distinguish GAS from non-GAS infection. Further work is needed to determine whether Group C streptococcus is a pathogen that should be treated. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  7. Variations in Vital Signs in the Last Days of Life in Patients With Advanced Cancer

    PubMed Central

    Bruera, Sebastian; Chisholm, Gary; Dos Santos, Renata; Crovador, Camila; Bruera, Eduardo; Hui, David

    2014-01-01

    Context Few studies have examined variation in vital signs in the last days of life. Objectives We determined the variation of vital signs in the final two weeks of life in patients with advanced cancer and examined their association with impending death in three days. Methods In this prospective, longitudinal, observational study, we enrolled consecutive patients admitted to two acute palliative care units and documented their vital signs (heart rate, blood pressure, respiratory rate, oxygen saturation, and temperature) twice a day serially from admission to death or discharge. Results Of 357 patients, 203 (55%) died in hospital. Systolic blood pressure (P < 0.001), diastolic blood pressure (P < 0.001), and oxygen saturation (P < 0.001) decreased significantly in the final three days of life, and temperature increased slightly (P < 0.04). Heart rate (P = 0.22) and respiratory rate (P = 0.24) remained similar in the last three days. Impending death in three days was significantly associated with increased heart rate (odds ratio [OR] = 2; P = 0.01), decreased systolic blood pressure (OR = 2.5; P = 0.004), decreased diastolic blood pressure (OR = 2.3; P = 0.002), and decreased oxygen saturation (OR = 3.7; P = 0.003) from baseline readings on admission. These changes had high specificity (≥80%), low sensitivity (≤35%), and modest positive likelihood ratios (≤5) for impending death within three days. A large proportion of patients had normal vital signs in the last days of life. Conclusion Blood pressure and oxygen saturation decreased in the last days of life. Clinicians and families cannot rely on vital sign changes alone to rule in or rule out impending death. Our findings do not support routine vital signs monitoring of patients who are imminently dying. PMID:24731412

  8. Performance of Low-Density Parity-Check Coded Modulation

    NASA Technical Reports Server (NTRS)

    Hamkins, Jon

    2010-01-01

    This paper reports the simulated performance of each of the nine accumulate-repeat-4-jagged-accumulate (AR4JA) low-density parity-check (LDPC) codes [3] when used in conjunction with binary phase-shift-keying (BPSK), quadrature PSK (QPSK), 8-PSK, 16-ary amplitude PSK (16- APSK), and 32-APSK.We also report the performance under various mappings of bits to modulation symbols, 16-APSK and 32-APSK ring scalings, log-likelihood ratio (LLR) approximations, and decoder variations. One of the simple and well-performing LLR approximations can be expressed in a general equation that applies to all of the modulation types.
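For BPSK over an AWGN channel the exact bit LLR has a well-known closed form, and the max-log idea behind many practical LLR approximations reduces each bit metric to nearest-symbol distances. A sketch of both (not necessarily the specific approximation of the paper; `sigma2` is the per-dimension noise variance, and the bit 0 → +1 mapping is an assumed convention):

```python
def bpsk_llr(r, sigma2):
    """Exact bit LLR for BPSK over real AWGN.

    With bit 0 -> +1, bit 1 -> -1 and LLR = log P(r|b=0) / P(r|b=1),
    the ratio of Gaussian densities collapses to 2*r / sigma2.
    """
    return 2.0 * r / sigma2

def max_log_llr(r, sigma2, const0, const1):
    """Max-log LLR approximation for a general constellation.

    Replaces the log-sum-exp over the symbols whose label carries a 0
    (resp. 1) in the bit position of interest by the nearest symbol in
    each set; works for complex symbols via abs()**2.
    """
    d0 = min(abs(r - s) ** 2 for s in const0)
    d1 = min(abs(r - s) ** 2 for s in const1)
    return (d1 - d0) / (2.0 * sigma2)
```

For BPSK the max-log value coincides with the exact 2r/σ² because each bit label set contains a single symbol; for 16-APSK or 32-APSK, `const0`/`const1` would hold the (ring-scaled) symbols whose label has a 0 or 1 in the given bit position.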

  9. A new LDPC decoding scheme for PDM-8QAM BICM coherent optical communication system

    NASA Astrophysics Data System (ADS)

    Liu, Yi; Zhang, Wen-bo; Xi, Li-xia; Tang, Xian-feng; Zhang, Xiao-guang

    2015-11-01

A new log-likelihood ratio (LLR) message estimation method is proposed for the polarization-division multiplexing eight quadrature amplitude modulation (PDM-8QAM) bit-interleaved coded modulation (BICM) optical communication system. The formulation of the posterior probability is theoretically analyzed, and a way to reduce the pre-decoding bit error rate (BER) of the low-density parity-check (LDPC) decoder for PDM-8QAM constellations is presented. Simulation results show that the new scheme outperforms the traditional one, decreasing the post-decoding BER to 50% of that of the traditional algorithm.

  10. Analyzing Data for Systems Biology: Working at the Intersection of Thermodynamics and Data Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cannon, William R.; Baxter, Douglas J.

    2012-08-15

Many challenges in systems biology have to do with analyzing data within the framework of molecular phenomena and cellular pathways. How does this relate to the thermodynamics that we know govern the behavior of molecules? Making progress in relating data analysis to thermodynamics is essential in systems biology if we are to build predictive models that enable the field of synthetic biology. This report discusses work at the crossroads of thermodynamics and data analysis, and demonstrates that statistical mechanical free energy is a multinomial log likelihood. Applications to systems biology are presented.
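The multinomial log-likelihood the report identifies with free energy is straightforward to write down; a minimal sketch (the free-energy identification itself is the report's argument and is not reproduced here):

```python
import math

def multinomial_log_likelihood(counts, probs):
    """Multinomial log-likelihood log P(counts | probs).

    log L = log(n!) - sum_i log(n_i!) + sum_i n_i * log(p_i),
    computed with lgamma for numerical stability. In the
    statistical-mechanics analogy, the counts play the role of state
    occupation numbers, and -log L behaves (up to constants) like a
    free energy.
    """
    n = sum(counts)
    ll = math.lgamma(n + 1)
    for k, p in zip(counts, probs):
        ll += k * math.log(p) - math.lgamma(k + 1)
    return ll
```

For two categories this reduces to the binomial pmf, which gives an easy sanity check: `exp(multinomial_log_likelihood([3, 7], [0.3, 0.7]))` equals C(10,3)·0.3³·0.7⁷.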

  11. Automatic integration of data from dissimilar sensors

    NASA Astrophysics Data System (ADS)

    Citrin, W. I.; Proue, R. W.; Thomas, J. W.

The present investigation is concerned with the automatic integration of radar and electronic support measures (ESM) sensor data, and with the development of a method for the automatic integration of identification friend or foe (IFF) and radar sensor data. On the basis of the two projects considered, significant advances have been made in the area of sensor data integration. It is pointed out that the log likelihood approach to sensor data correlation is appropriate for both similar and dissimilar sensor data. Attention is given to the real-time integration of radar and ESM sensor data, and to a radar ESM correlation simulation program.

  12. Neyman Pearson detection of K-distributed random variables

    NASA Astrophysics Data System (ADS)

    Tucker, J. Derek; Azimi-Sadjadi, Mahmood R.

    2010-04-01

In this paper a new detection method for sonar imagery is developed in K-distributed background clutter. The equation for the log-likelihood is derived and compared to the corresponding counterparts derived under the Gaussian and Rayleigh assumptions. Test results of the proposed method on a data set of synthetic underwater sonar images are also presented. This database contains images with targets of different shapes inserted into backgrounds generated using a correlated K-distributed model. Results illustrating the effectiveness of the K-distributed detector are presented in terms of probability of detection, false alarm, and correct classification rates for various bottom clutter scenarios.

  13. Enduring stereoscopic motion aftereffects induced by prolonged adaptation.

    PubMed

    Bowd, C; Rose, D; Phinney, R E; Patterson, R

    1996-11-01

    This study investigated the effects of prolonged adaptation on the recovery of the stereoscopic motion aftereffect (adaptation induced by moving binocular disparity information). The adapting and test stimuli were stereoscopic grating patterns created from disparity, embedded in dynamic random-dot stereograms. Motion aftereffects induced by luminance stimuli were included in the study for comparison. Adaptation duration was either 1, 2, 4, 8, 16, 32 or 64 min and the duration of the ensuing aftereffect was the variable of interest. The results showed that aftereffect duration was proportional to the square root of adaptation duration for both stereoscopic and luminance stimuli; on log-log axes, the relation between aftereffect duration and adaptation duration was a power law with the slope near 0.5 in both cases. For both kinds of stimuli, there was no sign of adaptation saturation even at the longest adaptation duration.

  14. Short communication: A comparison of biofilm development on stainless steel and modified-surface plate heat exchangers during a 17-h milk pasteurization run.

    PubMed

    Jindal, Shivali; Anand, Sanjeev; Metzger, Lloyd; Amamcharla, Jayendra

    2018-04-01

Flow of milk through the plate heat exchanger (PHE) results in denaturation of proteins, resulting in fouling. This also accelerates bacterial adhesion on the PHE surface, eventually leading to the development of biofilms. During prolonged processing, these biofilms result in shedding of bacteria and cross-contaminate the milk being processed, thereby limiting the duration of production runs. Altering the surface properties of the PHE, such as surface energy and hydrophobicity, could be an effective approach to reduce biofouling. This study was conducted to compare the extent of biofouling on native stainless steel (SS) and modified-surface [Ni-P-polytetrafluoroethylene (PTFE)] PHE during the pasteurization of raw milk for an uninterrupted processing run of 17 h. For microbial studies, raw and pasteurized milk samples were aseptically collected from inlets and outlets of both PHE at various time intervals to examine shedding of bacteria into the milk. At the end of the run, 3M quick swabs (3M, St. Paul, MN) and ATP swabs (Charm Sciences Inc., Lawrence, MA) were used to sample plates from different sections of the pasteurizers (regeneration, heating, and cooling) for biofilm screening and to estimate the efficiency of cleaning in place, respectively. The data were analyzed by ANOVA, and means were compared. Modified PHE experienced lower mesophilic and thermophilic bacterial attachment and biofilm formation (average log 1.0 and 0.99 cfu/cm², respectively) in the regenerative section of the pasteurizer compared with SS PHE (average log 1.49 and 1.47, respectively). Similarly, higher relative light units were observed for SS PHE compared with the modified PHE, illustrating the presence of more organic matter on the surface of SS PHE at the end of the run. In addition, at h 17, milk collected from the outlet of SS PHE showed plate counts of 5.44 log cfu/cm², significantly higher than those for pasteurized milk collected from modified PHE (4.12 log cfu/cm²).
This provided further evidence in favor of the modified PHE achieving better microbial quality of pasteurized milk in long process runs. Moreover, because cleaning SS PHE involves an acid treatment step, whereas an alkali treatment step is sufficient for the modified-surface PHE, use of the latter is both cost and time effective, making it a better surface for thermal processing of milk and other fluid dairy products. Copyright © 2018 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  15. Exact one-sided confidence limits for the difference between two correlated proportions.

    PubMed

    Lloyd, Chris J; Moldovan, Max V

    2007-08-15

We construct exact and optimal one-sided upper and lower confidence bounds for the difference between two probabilities based on matched binary pairs, using the well-established optimality theory of Buehler. Starting with five different approximate lower and upper limits, we adjust them to have coverage probability exactly equal to the desired nominal level and then compare the resulting exact limits by their mean size. Exact limits based on the signed root likelihood ratio statistic are preferred and recommended for practical use.
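As one concrete instance of the recommended statistic, the signed root likelihood ratio for a single binomial proportion can be sketched as follows (a simplified one-sample illustration, not the matched-pairs construction of the paper):

```python
import math

def binom_loglik(y, n, p):
    """Binomial log-likelihood kernel; the binomial coefficient cancels
    in likelihood ratios, so it is omitted. Boundary estimates use the
    convention 0 * log(0) = 0."""
    ll = 0.0
    if y > 0:
        ll += y * math.log(p)
    if n - y > 0:
        ll += (n - y) * math.log(1 - p)
    return ll

def signed_root_lr(y, n, p0):
    """Signed root likelihood ratio statistic
    r(p0) = sign(p_hat - p0) * sqrt(2 * (l(p_hat) - l(p0))),
    approximately standard normal under p = p0."""
    p_hat = y / n
    lam = 2.0 * (binom_loglik(y, n, p_hat) - binom_loglik(y, n, p0))
    return math.copysign(math.sqrt(max(lam, 0.0)), p_hat - p0)
```

An approximate confidence limit inverts this statistic (find p0 with r(p0) equal to a normal quantile); the Buehler adjustment then tunes such an approximate limit so its coverage is exactly the nominal level.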

  16. Deaf students and their classroom communication: an evaluation of higher order categorical interactions among school and background characteristics.

    PubMed

    Allen, Thomas E; Anderson, Melissa L

    2010-01-01

This article investigated to what extent age, use of a cochlear implant, parental hearing status, and use of sign in the home determine language of instruction for profoundly deaf children. Categorical data from 8,325 profoundly deaf students from the 2008 Annual Survey of Deaf and Hard-of-Hearing Children and Youth were analyzed using the chi-squared automatic interaction detector (CHAID), a stepwise analytic procedure that allows the assessment of higher order interactions among categorical variables. Results indicated that all characteristics were significantly related to classroom communication modality. Although younger and older students demonstrated a different distribution of communication modality, for both younger and older students, cochlear implantation had the greatest effect on differentiating students into communication modalities, yielding greater gains in the speech-only category for implanted students. For all subgroups defined by age and implantation status, the use of sign at home further segregated the sample into communication modality subgroups, reducing the likelihood of speech only and increasing the placement of students into signing classroom settings. Implications for future research in the field of deaf education are discussed.

  17. Influence of logging on the effects of wildfire in Siberia

    NASA Astrophysics Data System (ADS)

    Kukavskaya, E. A.; Buryak, L. V.; Ivanova, G. A.; Conard, S. G.; Kalenskaya, O. P.; Zhila, S. V.; McRae, D. J.

    2013-12-01

The Russian boreal zone supports a huge terrestrial carbon pool. Moreover, it is a tremendous reservoir of wood products concentrated mainly in Siberia. The main natural disturbance in these forests is wildfire, which modifies the carbon budget and has potentially important climate feedbacks. In addition, both legal and illegal logging increase landscape complexity and affect burning conditions and fuel consumption. We investigated 100 individual sites with different histories of logging and fire on a total of 23 study areas in three different regions of Siberia to evaluate the impacts of fire and logging on fuel loads, carbon emissions, and tree regeneration in pine and larch forests. We found large variations of fire and logging effects among regions depending on growing conditions and type of logging activity. Logged areas in the Angara region had the highest surface and ground fuel loads (up to 135 t ha⁻¹), mainly due to logging debris. This resulted in high carbon emissions where fires occurred on logged sites (up to 41 tC ha⁻¹). The Shushenskoe/Minusinsk and Zabaikal regions are characterized by better slash removal and a smaller amount of carbon emitted to the atmosphere during fires. Illegal logging, which is widespread in the Zabaikal region, resulted in an increase in fire hazard and higher carbon emissions than legal logging. The highest fuel loads (on average 108 t ha⁻¹) and carbon emissions (18-28 tC ha⁻¹) in the Zabaikal region are on repeatedly burned unlogged sites where trees fell on the ground following the first fire event. Partial logging in the Shushenskoe/Minusinsk region has insufficient impact on stand density, tree mortality, and other forest conditions to substantially increase fire hazard or affect carbon stocks. Repeated fires on logged sites resulted in insufficient tree regeneration and transformation of forest to grasslands. 
We conclude that negative impacts of fire and logging on air quality, the carbon cycle, and ecosystem sustainability could be decreased by better slash removal in the Angara region, removal of trees killed by fire in the Zabaikal region, and tree planting after fires in drier conditions where natural regeneration is hampered by soil overheating and grass proliferation.

  18. Validation of cooking times and temperatures for thermal inactivation of Yersinia pestis strains KIM5 and CDC-A1122 in irradiated ground beef.

    PubMed

    Porto-Fett, Anna C S; Juneja, Vijay K; Tamplin, Mark L; Luchansky, John B

    2009-03-01

Irradiated ground beef samples (ca. 3-g portions with ca. 25% fat) inoculated with Yersinia pestis strain KIM5 (ca. 6.7 log CFU/g) were heated in a circulating water bath stabilized at 48.9, 50, 52.5, 55, 57.5, or 60°C (120, 122, 126.5, 131, 135.5, and 140°F, respectively). Average D-values were 192.17, 34.38, 17.11, 3.87, 1.32, and 0.56 min, respectively, with a corresponding z-value of 4.67°C (8.41°F). In related experiments, irradiated ground beef patties (ca. 95 g per patty with ca. 25% fat) were inoculated with Y. pestis strains KIM5 or CDC-A1122 (ca. 6.0 log CFU/g) and cooked on an open-flame gas grill or on a clam-shell type electric grill to internal target temperatures of 48.9, 60, and 71.1°C (120, 140, and 160°F, respectively). For patties cooked on the gas grill, strain KIM5 populations decreased from ca. 6.24 to 4.32, 3.51, and ≤0.7 log CFU/g at 48.9, 60, and 71.1°C, respectively, and strain CDC-A1122 populations decreased to 3.46 log CFU/g at 48.9°C and to ≤0.7 log CFU/g at both 60 and 71.1°C. For patties cooked on the clam-shell grill, strain KIM5 populations decreased from ca. 5.96 to 2.53 log CFU/g at 48.9°C and to ≤0.7 log CFU/g at 60 or 71.1°C, and strain CDC-A1122 populations decreased from ca. 5.98 to ≤0.7 log CFU/g at all three cooking temperatures. These data confirm that cooking ground beef on an open-flame gas grill or on a clam-shell type electric grill to the temperatures and times recommended by the U.S. Department of Agriculture and the U.S. Food and Drug Administration Food Code appreciably lessens the likelihood, severity, and/or magnitude of consumer illness if the ground beef were purposefully contaminated even with relatively high levels of Y. pestis.
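The reported D- and z-values are linked by the usual log-linear thermal death time model, log10 D = log10 D_ref − (T − T_ref)/z; a sketch using two of the reported (temperature, D-value) pairs (the choice of pair is ours, so the recovered z only roughly matches the paper's regression-based 4.67°C, which pools all six temperatures):

```python
import math

def z_value(t1, d1, t2, d2):
    """z-value: the temperature rise that cuts the D-value tenfold,
    estimated from two (temperature, D-value) measurements."""
    return (t2 - t1) / (math.log10(d1) - math.log10(d2))

def d_at(temp, t_ref, d_ref, z):
    """Predicted D-value at `temp` from a reference point and z."""
    return d_ref * 10 ** ((t_ref - temp) / z)

# Two of the reported pairs: (48.9 C, 192.17 min) and (60 C, 0.56 min).
z = z_value(48.9, 192.17, 60.0, 0.56)   # ~4.4 C from this pair alone
```

Pairwise estimates from different temperature pairs scatter around the regression value; a least-squares fit of log10 D against T over all six points is what yields the single quoted z.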

  19. Two-bit trinary full adder design based on restricted signed-digit numbers

    NASA Astrophysics Data System (ADS)

    Ahmed, J. U.; Awwal, A. A. S.; Karim, M. A.

    1994-08-01

    A 2-bit trinary full adder using a restricted set of a modified signed-digit trinary numeric system is designed. When cascaded together to design a multi-bit adder machine, the resulting system is able to operate at a speed independent of the size of the operands. An optical non-holographic content addressable memory based on binary coded arithmetic is considered for implementing the proposed adder.

  20. Representativity and univocity of traffic signs and their effect on trajectory movement in a driving simulation task: Warning signs.

    PubMed

    Vilchez, Jose Luis

    2017-07-04

The effect of traffic signs on the behavior of drivers is not completely understood. Knowing how humans process the meaning of signs (not just by learning but instinctively) will improve reaction time and decision making when traveling. The economic, social, and psychological consequences of car accidents are well known. This study explores which traffic signs are more cognitively ergonomic for participants and determines, at the same time, their effect on participants' movement trajectories in a driving simulation task. Results indicate that the signs least representative of their meaning produce a greater deviation from the center of the road than the most representative ones. This study encourages both an in-depth analysis of the effect of roadside signs on movement and the study of how this effect can be modified by the context in which these signs are presented (with the aim of moving the research closer to real contexts and analyzing data collected there). The goal is to achieve clarity of meaning and a lack of counterproductive effects on trajectory for representative signs (those that provoke fewer mistakes in the decision task).

  1. Effects of orbitofrontal cortex lesions on autoshaped lever pressing and reversal learning.

    PubMed

    Chang, Stephen E

    2014-10-15

    A cue associated with a rewarding event can trigger behavior towards the cue itself due to the cue acquiring incentive value through its pairing with the rewarding outcome (i.e., sign-tracking). For example, rats will approach, press, and attempt to "consume" a retractable lever conditioned stimulus (CS) that signals delivery of a food unconditioned stimulus (US). Attending to food-predictive CSs is important when seeking out food, and it is just as important to be able to modify one's behavior when the relationships between CSs and USs are changed. Using a discriminative autoshaping procedure with lever CSs, the present study investigated the effects of orbitofrontal cortex (OFC) lesions on sign-tracking and reversal learning. Insertion of one lever was followed by sucrose delivery upon retraction, and insertion of another lever was followed by nothing. After the acquisition phase, the contingencies between the levers and outcomes were reversed. Bilateral OFC lesions had no effect on the acquisition of sign-tracking. However, OFC-lesioned rats showed substantial deficits in acquiring sign-tracking compared to sham-lesioned rats once the stimulus-outcome contingencies were reversed. Over the course of reversal learning, OFC-lesioned rats were able to reach comparable levels of sign-tracking as sham-lesioned rats. These findings suggest that OFC is not necessary for the ability of a CS to acquire incentive value and provide more evidence that OFC is critical for modifying behavior appropriately following a change in stimulus-outcome contingencies. Copyright © 2014 Elsevier B.V. All rights reserved.

  2. Finite-difference modeling of the electroseismic logging in a fluid-saturated porous formation

    NASA Astrophysics Data System (ADS)

    Guan, Wei; Hu, Hengshan

    2008-05-01

In a fluid-saturated porous medium, an electromagnetic (EM) wavefield induces an acoustic wavefield due to the electrokinetic effect. A potential geophysical application of this effect is electroseismic (ES) logging, in which the converted acoustic wavefield is received in a fluid-filled borehole to evaluate the parameters of the porous formation around the borehole. In this paper, a finite-difference scheme is proposed to model the ES logging responses to a vertical low-frequency electric dipole along the borehole axis. The EM field excited by the electric dipole is first calculated separately by finite differences and is then treated as a distributed exciting source term in a set of extended Biot's equations for the converted acoustic wavefield in the formation. This set of equations is solved by a modified finite-difference time-domain (FDTD) algorithm that allows for the calculation of dynamic permeability, so that it is not restricted to low-frequency poroelastic wave problems. The perfectly matched layer (PML) technique without splitting the fields is applied to truncate the computational region. The simulated ES logging waveforms approximately agree with those obtained by the analytical method. The FDTD algorithm also applies to acoustic logging simulation in porous formations.

  3. Maximum Likelihood and Minimum Distance Applied to Univariate Mixture Distributions.

    ERIC Educational Resources Information Center

    Wang, Yuh-Yin Wu; Schafer, William D.

    This Monte-Carlo study compared modified Newton (NW), expectation-maximization algorithm (EM), and minimum Cramer-von Mises distance (MD), used to estimate parameters of univariate mixtures of two components. Data sets were fixed at size 160 and manipulated by mean separation, variance ratio, component proportion, and non-normality. Results…

  4. Estimating Likelihood of Fetal In Vivo Interactions Using In Vitro HTS Data (Teratology meeting)

    EPA Science Inventory

    Tox21/ToxCast efforts provide in vitro concentration-response data for thousands of compounds. Predicting whether chemical-biological interactions observed in vitro will occur in vivo is challenging. We hypothesize that using a modified model from the FDA guidance for drug intera...

  5. Strong bimodality in the host halo mass of central galaxies from galaxy-galaxy lensing

    NASA Astrophysics Data System (ADS)

    Mandelbaum, Rachel; Wang, Wenting; Zu, Ying; White, Simon; Henriques, Bruno; More, Surhud

    2016-04-01

    We use galaxy-galaxy lensing to study the dark matter haloes surrounding a sample of locally brightest galaxies (LBGs) selected from the Sloan Digital Sky Survey. We measure mean halo mass as a function of the stellar mass and colour of the central galaxy. Mock catalogues constructed from semi-analytic galaxy formation simulations demonstrate that most LBGs are the central objects of their haloes, greatly reducing interpretation uncertainties due to satellite contributions to the lensing signal. Over the full stellar mass range, 10.3 < log [M*/M⊙] < 11.6, we find that passive central galaxies have haloes that are at least twice as massive as those of star-forming objects of the same stellar mass. The significance of this effect exceeds 3σ for log [M*/M⊙] > 10.7. Tests using the mock catalogues and on the data themselves clarify the effects of LBG selection and show that it cannot artificially induce a systematic dependence of halo mass on LBG colour. The bimodality in halo mass at fixed stellar mass is reproduced by the astrophysical model underlying our mock catalogue, but the sign of the effect is inconsistent with recent, nearly parameter-free age-matching models. The sign and magnitude of the effect can, however, be reproduced by halo occupation distribution models with a simple (few-parameter) prescription for type dependence.

  6. Extension of Kaplan-Meier methods in observational studies with time-varying treatment.

    PubMed

    Xu, Stanley; Shetterly, Susan; Powers, David; Raebel, Marsha A; Tsai, Thomas T; Ho, P Michael; Magid, David

    2012-01-01

Inverse probability of treatment weighted Kaplan-Meier estimates have been developed to compare two treatments in the presence of confounders in observational studies. Recently, stabilized weights were developed to reduce the influence of extreme inverse probability of treatment weights in estimating treatment effects. The objective of this research was to use adjusted Kaplan-Meier estimates and modified log-rank and Wilcoxon tests to examine the effect of a treatment that varies over time in an observational study. We proposed stabilized-weight-adjusted Kaplan-Meier estimates and modified log-rank and Wilcoxon tests for treatments that are time-varying over the follow-up period. We applied these new methods in examining the effect of an anti-platelet agent, clopidogrel, on subsequent events, including bleeding, myocardial infarction, and death after a drug-eluting stent was implanted into a coronary artery. In this population, clopidogrel use may change over time based on a patient's behavior (e.g., nonadherence) and physicians' recommendations (e.g., end of duration of therapy). Consequently, clopidogrel use was treated as a time-varying variable. We demonstrate that 1) the sample sizes at three chosen time points are almost identical in the original and weighted datasets; and 2) the covariates between patients on and off clopidogrel were well balanced after stabilized weights were applied to the original samples. The stabilized-weight-adjusted Kaplan-Meier estimates and modified log-rank and Wilcoxon tests are useful in presenting and comparing survival functions for time-varying treatments in observational studies while adjusting for known confounders. Copyright © 2012 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
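The stabilized weights described above have a simple per-interval form, sw = P(A = a) / P(A = a | X), multiplied over follow-up intervals for a time-varying treatment; a sketch with hypothetical probabilities (a real analysis would estimate the denominator from a fitted propensity model, e.g. logistic regression on the confounders):

```python
def stabilized_weight(treated, p_marginal, p_conditional):
    """Stabilized inverse-probability-of-treatment weight for one
    subject in one interval.

    p_marginal:    P(A = 1)      -- numerator, marginal probability
    p_conditional: P(A = 1 | X)  -- denominator, from a propensity model
    Keeping the marginal probability in the numerator pulls the weights
    toward 1, damping the extreme values plain IPTW can produce.
    """
    if treated:
        return p_marginal / p_conditional
    return (1 - p_marginal) / (1 - p_conditional)

def cumulative_weight(history):
    """Product of per-interval stabilized weights for one subject;
    `history` is a list of (treated, p_marginal, p_conditional)."""
    w = 1.0
    for treated, pm, pc in history:
        w *= stabilized_weight(treated, pm, pc)
    return w
```

These cumulative weights are then passed to weighted Kaplan-Meier estimates and weighted log-rank/Wilcoxon tests in place of unit weights.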

  7. Can obviously intoxicated patrons still easily buy alcohol at on-premise establishments?

    PubMed Central

    Toomey, Traci L.; Lenk, Kathleen M.; Nederhoff, Dawn M.; Nelson, Toben F.; Ecklund, Alexandra M.; Horvath, Keith J.; Erickson, Darin J.

    2015-01-01

Background Excessive alcohol consumption at licensed alcohol establishments (i.e., bars and restaurants) has been directly linked to alcohol-related problems such as traffic crashes and violence. Historically, alcohol establishments have had a high likelihood of selling alcohol to obviously intoxicated patrons (also referred to as “overservice”) despite laws prohibiting these sales. Given the risks associated with overservice and the need for up-to-date data, it is critical that we monitor the likelihood of sales to obviously intoxicated patrons. Methods To assess the current likelihood of a licensed alcohol establishment selling alcohol to an obviously intoxicated patron, we conducted pseudo-intoxicated purchase attempts (i.e., actors attempted to purchase alcohol while acting out obvious signs of intoxication) at 340 establishments in one Midwestern metropolitan area. We also measured characteristics of the establishments, the pseudo-intoxicated patrons, the servers, the managers, and the neighborhoods to assess whether these characteristics were associated with the likelihood of sales to obviously intoxicated patrons. We assessed these associations with bivariate and multivariate regression models. Results Pseudo-intoxicated buyers were able to purchase alcohol at 82% of the establishments. In the fully adjusted multivariate regression model, only one of the characteristics we assessed was significantly associated with the likelihood of selling to intoxicated patrons: establishments owned by a corporate entity had 3.6 times greater odds of selling alcohol to a pseudo-intoxicated buyer compared with independently owned establishments. Discussion Given the risks associated with overservice of alcohol, more resources should be devoted first to identifying effective interventions for decreasing overservice of alcohol and then to educating practitioners who are working in their communities to address this public health problem. PMID:26891204

  8. An approach to derive some simple empirical equations to calibrate nuclear and acoustic well logging tools.

    PubMed

    Mohammad Al Alfy, Ibrahim

    2018-01-01

    A set of three pads was constructed from primary materials (sand, gravel and cement) to calibrate the gamma-gamma density tool. A simple equation was devised to convert the qualitative cps values to quantitative g/cc values. The neutron-neutron porosity tool measures qualitative cps porosity values; a direct equation was derived to calculate the porosity percentage from these cps values. The cement-bond log shows the quantity of cement surrounding the well pipes. Interpreting this log is a difficult process because of the many parameters involved, such as the drilled well diameter and the internal diameter, thickness, and type of the well pipes. An equation was developed to calculate the cement percentage at standard conditions; it can be modified according to varying conditions. Copyright © 2017 Elsevier Ltd. All rights reserved.
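    The cps-to-g/cc conversion described above can be illustrated with a least-squares fit to calibration-pad readings. This is a hypothetical sketch: the cps and density values below are invented, not taken from the paper, and the paper's actual equation may have a different form.

```python
import numpy as np

# Hypothetical calibration-pad readings: counts per second (cps) from the
# gamma-gamma tool against the known density (g/cc) of each of three pads.
cps = np.array([1200.0, 900.0, 650.0])
density = np.array([1.80, 2.20, 2.65])

# A least-squares straight line is the simplest empirical equation of the
# kind the paper describes: density = a * cps + b.
a, b = np.polyfit(cps, density, 1)

def cps_to_density(x):
    """Convert a qualitative cps reading to a quantitative g/cc value."""
    return a * x + b

print(cps_to_density(900.0))  # close to the middle pad's known 2.20 g/cc
```

    With only three pads the fit is exactly the kind of "simple empirical equation" the abstract mentions; more pads, or a nonlinear form, would be straightforward extensions.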

  9. A 3-year prospective study of neurological soft signs in first-episode schizophrenia.

    PubMed

    Chen, Eric Yu-Hai; Hui, Christy Lai-Ming; Chan, Raymond Chor-Kiu; Dunn, Eva Lai-Wah; Miao, May Yin-King; Yeung, Wai-Song; Wong, Chi-Keung; Chan, Wah-Fat; Tang, Wai-Nang

    2005-06-01

    Neurological soft signs are biological traits that underlie schizophrenia and are found to occur at higher levels in at-risk individuals. The expression of neurological soft signs may be modifiable during the onset of the first psychotic episode and the subsequent evolution of the illness and its treatment. This study investigates neurological soft signs in 138 patients with first-episode schizophrenia and tracks the expression of motor soft signs in the following 3 years. For the 93 patients who have completed the 3-year follow-up, we find that neurological soft signs are stable in the 3 years that follow the first psychotic episode, and that neurological soft signs are already elevated at the presentation of first-episode psychosis in medication-naive subjects. The level of neurological soft signs at clinical stabilization is lower for patients with a shorter duration of untreated psychosis. Although the quantity of neurological soft signs does not significantly change in the 3 years that follow the first episode, the relationship between neurological soft signs and negative symptoms does not become apparent until 1 year after the initial episode. A higher level of neurological soft signs is related to a lower educational level and an older age at onset, but the level of neurological soft signs does not predict the outcome in terms of relapse or occupational functioning.

  10. Likelihood Ratios for Glaucoma Diagnosis Using Spectral Domain Optical Coherence Tomography

    PubMed Central

    Lisboa, Renato; Mansouri, Kaweh; Zangwill, Linda M.; Weinreb, Robert N.; Medeiros, Felipe A.

    2014-01-01

    Purpose To present a methodology for calculating likelihood ratios for glaucoma diagnosis for continuous retinal nerve fiber layer (RNFL) thickness measurements from spectral domain optical coherence tomography (spectral-domain OCT). Design Observational cohort study. Methods 262 eyes of 187 patients with glaucoma and 190 eyes of 100 control subjects were included in the study. Subjects were recruited from the Diagnostic Innovations Glaucoma Study. Eyes with preperimetric and perimetric glaucomatous damage were included in the glaucoma group. The control group was composed of healthy eyes with normal visual fields from subjects recruited from the general population. All eyes underwent RNFL imaging with Spectralis spectral-domain OCT. Likelihood ratios for glaucoma diagnosis were estimated for specific global RNFL thickness measurements using a methodology based on estimating the tangents to the Receiver Operating Characteristic (ROC) curve. Results Likelihood ratios could be determined for continuous values of average RNFL thickness. Average RNFL thickness values lower than 86μm were associated with positive LRs, i.e., LRs greater than 1; whereas RNFL thickness values higher than 86μm were associated with negative LRs, i.e., LRs smaller than 1. A modified Fagan nomogram was provided to assist calculation of post-test probability of disease from the calculated likelihood ratios and pretest probability of disease. Conclusion The methodology allowed calculation of likelihood ratios for specific RNFL thickness values. By avoiding arbitrary categorization of test results, it potentially allows for an improved integration of test results into diagnostic clinical decision-making. PMID:23972303
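    The Fagan-nomogram step described above is simple odds arithmetic: post-test odds equal pretest odds times the likelihood ratio. A minimal sketch (the 30% pretest probability and LR of 4 are invented illustration values, not numbers from the study):

```python
def post_test_probability(pretest_p, lr):
    """Fagan-nomogram arithmetic: convert a pretest probability and a
    likelihood ratio into a post-test probability via odds."""
    pretest_odds = pretest_p / (1.0 - pretest_p)
    posttest_odds = pretest_odds * lr
    return posttest_odds / (1.0 + posttest_odds)

# A 30% pretest probability combined with a positive LR of 4:
print(round(post_test_probability(0.30, 4.0), 3))  # → 0.632

# An LR of 1 is uninformative and leaves the probability unchanged:
print(post_test_probability(0.30, 1.0))  # → 0.3
```

    This is why continuous LRs are more useful than a binary "abnormal/normal" call: each specific RNFL thickness value revises the clinician's prior by its own factor.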

  11. Teleseismic Lg of Semipalatinsk and Novaya Zemlya Nuclear Explosions Recorded by the GRF (Gräfenberg) Array: Comparison with Regional Lg (BRV) and their Potential for Accurate Yield Estimation

    NASA Astrophysics Data System (ADS)

    Schlittenhardt, J.

    A comparison of regional and teleseismic log rms (root-mean-square) Lg amplitude measurements has been made for 14 underground nuclear explosions from the East Kazakh test site recorded both by the BRV (Borovoye) station in Kazakhstan and the GRF (Gräfenberg) array in Germany. The log rms Lg amplitudes observed at the BRV regional station at a distance of 690 km and at the teleseismic GRF array at a distance exceeding 4700 km show very similar relative values (standard deviation 0.048 magnitude units) for underground explosions of different sizes at the Shagan River test site. This result, as well as the comparison of BRV rms Lg magnitudes (which were calculated from the log rms amplitudes using an appropriate calibration) with magnitude determinations for P waves of global seismic networks (standard deviation 0.054 magnitude units), points to a high precision in estimating the relative source sizes of explosions from Lg-based single-station data. Similar results were also obtained by other investigators (Patton, 1988; Ringdal et al., 1992) using Lg data from different stations at different distances. Additionally, GRF log rms Lg and P-coda amplitude measurements were made for a larger data set from Novaya Zemlya and East Kazakh explosions, which were supplemented with mb(Lg) amplitude measurements using a modified version of Nuttli's (1973, 1986a) method. From this test of the relative performance of the three different magnitude scales, it was found that the Lg- and P-coda-based magnitudes performed equally well, whereas the modified Nuttli mb(Lg) magnitudes show greater scatter when compared to the worldwide mb reference magnitudes. Whether this result indicates that the rms amplitude measurements are superior to the zero-to-peak amplitude measurement of a single cycle used for the modified Nuttli method, however, cannot be finally assessed, since the calculated mb(Lg) magnitudes are only preliminary until appropriate attenuation corrections are available for the specific path to GRF.

  12. Cytologic diagnosis: expression of probability by clinical pathologists.

    PubMed

    Christopher, Mary M; Hotz, Christine S

    2004-01-01

    Clinical pathologists use descriptive terms or modifiers to express the probability or likelihood of a cytologic diagnosis. Words are imprecise in meaning, however, and may be used and interpreted differently by pathologists and clinicians. The goals of this study were to 1) assess the frequency of use of 18 modifiers, 2) determine the probability of a positive diagnosis implied by the modifiers, 3) identify preferred modifiers for different levels of probability, 4) ascertain the importance of factors that affect expression of diagnostic certainty, and 5) evaluate differences based on gender, employment, and experience. We surveyed 202 clinical pathologists who were board-certified by the American College of Veterinary Pathologists (Clinical Pathology). Surveys were distributed in October 2001 and returned by e-mail, fax, or surface mail over a 2-month period. Results were analyzed by parametric and nonparametric tests. Survey response rate was 47.5% (n = 96) and primarily included clinical pathologists at veterinary schools (n = 58) and diagnostic laboratories (n = 31). Eleven of 18 terms were used "often" or "sometimes" by ≥50% of respondents. Broad variability was found in the probability assigned to each term, especially those with median values of 75 to 90%. Preferred modifiers for 7 numerical probabilities ranging from 0 to 100% included 68 unique terms; however, a set of 10 terms was used by ≥50% of respondents. Cellularity and quality of the sample, experience of the pathologist, and implications of the diagnosis were the most important factors affecting the expression of probability. Because of wide discrepancy in the implied likelihood of a diagnosis using words, defined terminology and controlled vocabulary may be useful in improving communication and the quality of data in cytology reporting.

  13. Evaluating topic model interpretability from a primary care physician perspective.

    PubMed

    Arnold, Corey W; Oh, Andrea; Chen, Shawn; Speier, William

    2016-02-01

    Probabilistic topic models provide an unsupervised method for analyzing unstructured text. These models discover semantically coherent combinations of words (topics) that could be integrated in a clinical automatic summarization system for primary care physicians performing chart review. However, the human interpretability of topics discovered from clinical reports is unknown. Our objective is to assess the coherence of topics and their ability to represent the contents of clinical reports from a primary care physician's point of view. Three latent Dirichlet allocation models (50 topics, 100 topics, and 150 topics) were fit to a large collection of clinical reports. Topics were manually evaluated by primary care physicians and graduate students. Wilcoxon Signed-Rank Tests for Paired Samples were used to evaluate differences between different topic models, while differences in performance between students and primary care physicians (PCPs) were tested using Mann-Whitney U tests for each of the tasks. While the 150-topic model produced the best log likelihood, participants were most accurate at identifying words that did not belong in topics learned by the 100-topic model, suggesting that 100 topics provides better relative granularity of discovered semantic themes for the data set used in this study. Models were comparable in their ability to represent the contents of documents. Primary care physicians significantly outperformed students in both tasks. This work establishes a baseline of interpretability for topic models trained with clinical reports, and provides insights on the appropriateness of using topic models for informatics applications. Our results indicate that PCPs find discovered topics more coherent and representative of clinical reports relative to students, warranting further research into their use for automatic summarization. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
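    The model-selection step described above (fitting LDA models of several sizes and comparing their log likelihoods) can be sketched as follows. This is a toy illustration: the corpus and topic counts are invented (2/3/4 topics rather than the study's 50/100/150), and scikit-learn's variational LDA is assumed rather than whatever implementation the authors used.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# A tiny hypothetical stand-in for the clinical-report corpus.
docs = [
    "chest pain shortness breath ecg",
    "cough fever chest xray pneumonia",
    "diabetes glucose insulin metformin",
    "glucose insulin diet diabetes",
    "ecg chest pain troponin",
    "fever cough pneumonia antibiotic",
] * 5  # repeated to give the estimator a little more data

X = CountVectorizer().fit_transform(docs)

# Fit models of different sizes and score each by approximate log likelihood.
scores = {}
for k in (2, 3, 4):
    lda = LatentDirichletAllocation(n_components=k, random_state=0).fit(X)
    scores[k] = lda.score(X)  # higher (less negative) is better

best_k = max(scores, key=scores.get)
print(best_k, {k: round(v, 1) for k, v in scores.items()})
```

    The study's point is precisely that the model with the best score here need not be the one humans find most interpretable, which is why the word-intrusion evaluation was run alongside the likelihood comparison.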

  14. Evaluating Topic Model Interpretability from a Primary Care Physician Perspective

    PubMed Central

    Arnold, Corey W.; Oh, Andrea; Chen, Shawn; Speier, William

    2015-01-01

    Background and Objective Probabilistic topic models provide an unsupervised method for analyzing unstructured text. These models discover semantically coherent combinations of words (topics) that could be integrated in a clinical automatic summarization system for primary care physicians performing chart review. However, the human interpretability of topics discovered from clinical reports is unknown. Our objective is to assess the coherence of topics and their ability to represent the contents of clinical reports from a primary care physician’s point of view. Methods Three latent Dirichlet allocation models (50 topics, 100 topics, and 150 topics) were fit to a large collection of clinical reports. Topics were manually evaluated by primary care physicians and graduate students. Wilcoxon Signed-Rank Tests for Paired Samples were used to evaluate differences between different topic models, while differences in performance between students and primary care physicians (PCPs) were tested using Mann-Whitney U tests for each of the tasks. Results While the 150-topic model produced the best log likelihood, participants were most accurate at identifying words that did not belong in topics learned by the 100-topic model, suggesting that 100 topics provides better relative granularity of discovered semantic themes for the data set used in this study. Models were comparable in their ability to represent the contents of documents. Primary care physicians significantly outperformed students in both tasks. Conclusion This work establishes a baseline of interpretability for topic models trained with clinical reports, and provides insights on the appropriateness of using topic models for informatics applications. Our results indicate that PCPs find discovered topics more coherent and representative of clinical reports relative to students, warranting further research into their use for automatic summarization. PMID:26614020

  15. Blend sign predicts poor outcome in patients with intracerebral hemorrhage

    PubMed Central

    Cao, Du; Zhu, Dan; Lv, Fa-Jin; Liu, Yang; Yuan, Liang; Zhang, Gang; Xiong, Xin; Li, Rui; Hu, Yun-Xin; Qin, Xin-Yue; Xie, Peng

    2017-01-01

    Introduction Blend sign has been recently described as a novel imaging marker that predicts hematoma expansion. The purpose of our study was to investigate the prognostic value of the CT blend sign in patients with ICH. Objectives and methods Patients with intracerebral hemorrhage who underwent baseline CT scan within 6 hours were included. The presence of blend sign on admission nonenhanced CT was independently assessed by two readers. The functional outcome was assessed by using the modified Rankin Scale (mRS) at 90 days. Results Blend sign was identified in 40 of 238 (16.8%) patients on the admission CT scan. The proportion of patients with a poor functional outcome was significantly higher in patients with blend sign than in those without blend sign (75.0% versus 47.5%, P = 0.001). The multivariate logistic regression analysis demonstrated that age, intraventricular hemorrhage, admission GCS score, baseline hematoma volume and presence of blend sign on baseline CT independently predict poor functional outcome at 90 days. The CT blend sign independently predicts poor outcome in patients with ICH (odds ratio 3.61, 95% confidence interval [1.47–8.89]; P = 0.005). Conclusions Early identification of blend sign is useful in prognostic stratification and may serve as a potential therapeutic target for prospective interventional studies. PMID:28829797

  16. Design Factors Affecting the Reaction Time for Identifying Toilet Signs: A Preliminary Study.

    PubMed

    Chen, Yi-Lang; Sie, Cai-Cin

    2016-04-01

    This study focused on the manner in which design factors affect the reaction time for identifying toilet signs. Taiwanese university students and staff members (50 men, 50 women; M age = 23.5 years, SD = 5.7) participated in the study. The 36 toilet signs were varied on three factors (six presenting styles, two figure-ground exchanges, and three colors), and reaction time data were collected from all participants as the signs were presented in an onscreen simulation. Participants were quickest when reading Chinese text, followed by graphics and English text. The findings also showed that men and women had different reaction times across various design combinations. These findings can serve as a reference for the practical design of toilet signs, since design factors can lead to comprehension difficulties, as reflected in reaction time measurements. © The Author(s) 2016.

  17. Studying parents and grandparents to assess genetic contributions to early-onset disease.

    PubMed

    Weinberg, Clarice R

    2003-02-01

    Suppose DNA is available from affected individuals, their parents, and their grandparents. Particularly for early-onset diseases, maternally mediated genetic effects can play a role, because the mother determines the prenatal environment. The proposed maximum-likelihood approach for the detection of apparent transmission distortion treats the triad consisting of the affected individual and his or her two parents as the outcome, conditioning on grandparental mating types. Under a null model in which the allele under study does not confer susceptibility, either through linkage or directly, and when there are no maternally mediated genetic effects, conditional probabilities for specific triads are easily derived. A log-linear model permits a likelihood-ratio test (LRT) and allows the estimation of relative penetrances. The proposed approach is robust against genetic population stratification. Missing-data methods permit the inclusion of incomplete families, even if the missing person is the affected grandchild, as is the case when an induced abortion has followed the detection of a malformation. When screening multiple markers, one can begin by genotyping only the grandparents and the affected grandchildren. LRTs based on conditioning on grandparental mating types (i.e., ignoring the parents) have asymptotic relative efficiencies that are typically >150% (per family), compared with tests based on parents. A test for asymmetry in the number of copies carried by maternal versus paternal grandparents yields an LRT specific to maternal effects. One can then genotype the parents for only the genes that passed the initial screen. Conditioning on both the grandparents' and the affected grandchild's genotypes, a third log-linear model captures the remaining information, in an independent LRT for maternal effects.
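    The likelihood-ratio test at the heart of this approach compares the maximized log likelihoods of nested models. A generic sketch of that comparison (the log-likelihood values below are invented, and this is the general chi-squared recipe rather than the paper's specific log-linear model):

```python
from scipy.stats import chi2

def likelihood_ratio_test(loglik_null, loglik_full, df):
    """Generic LRT: twice the log-likelihood gain of the richer model,
    referred to a chi-squared distribution with df extra parameters."""
    stat = 2.0 * (loglik_full - loglik_null)
    p_value = chi2.sf(stat, df)
    return stat, p_value

# Hypothetical fitted log likelihoods: a null model with no transmission
# distortion versus a full model with one extra relative-penetrance parameter.
stat, p = likelihood_ratio_test(-1204.7, -1201.2, df=1)
print(round(stat, 2), round(p, 4))
```

    In the paper's setting the two log likelihoods would come from fitting the log-linear model with and without the susceptibility (or maternal-effect) terms, conditioning on grandparental mating types.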

  18. Optical Communications Channel Combiner

    NASA Technical Reports Server (NTRS)

    Quirk, Kevin J.; Quirk, Kevin J.; Nguyen, Danh H.; Nguyen, Huy

    2012-01-01

    NASA has identified deep-space optical communications links as an integral part of a unified space communication network in order to provide data rates in excess of 100 Mb/s. The distances and limited power inherent in a deep-space optical downlink necessitate the use of photon-counting detectors and a power-efficient modulation such as pulse position modulation (PPM). For the output of each photodetector, whether from a separate telescope or a portion of the detection area, a communication receiver estimates a log-likelihood ratio for each PPM slot. To realize the full effective aperture of these receivers, their outputs must be combined prior to information decoding. A channel combiner was therefore developed that synchronizes the log-likelihood ratio (LLR) sequences of up to three receivers and combines them into a single LLR sequence for information decoding. The channel combiner has three channel inputs, each of which takes as input a sequence of four-bit LLRs for each PPM slot in a codeword via a XAUI 10 Gb/s quad optical fiber interface. The cross-correlations between the channels' LLR time series are calculated and used to synchronize the sequences prior to combining. The output of the channel combiner is a sequence of four-bit LLRs for each PPM slot in a codeword via a XAUI 10 Gb/s quad optical fiber interface. The unit is controlled through a 1 Gb/s Ethernet UDP/IP interface. A deep-space optical communication link has not yet been demonstrated. This ground-station channel combiner was developed to demonstrate this capability and is unique in its ability to process such a signal.
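    The synchronize-then-combine step can be sketched in a few lines: estimate each channel's relative delay from the peak of a circular cross-correlation, then sum the aligned LLRs slot by slot. This is an illustrative two-channel sketch with synthetic floating-point LLRs, not the hardware's four-bit XAUI pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical slot-LLR stream at receiver 1: a strong antipodal pattern
# plus Gaussian noise. Receiver 2 sees the same stream 3 slots later.
llr1 = np.repeat([2.0, -2.0], 128) + rng.normal(0.0, 1.0, 256)
llr2 = np.roll(llr1, 3) + rng.normal(0.0, 0.5, 256)

# Synchronization: locate the peak of the circular cross-correlation,
# computed in the frequency domain.
xcorr = np.fft.ifft(np.fft.fft(llr2) * np.conj(np.fft.fft(llr1))).real
est_delay = int(np.argmax(xcorr))

# Combining: undo the estimated delay and sum the LLRs slot by slot.
combined = llr1 + np.roll(llr2, -est_delay)
print(est_delay)  # → 3
```

    Summing LLRs is the optimal combining rule for independent observations of the same slot, which is why synchronization must happen first: summing misaligned sequences would destroy the signal rather than reinforce it.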

  19. Top pair production in the dilepton decay channel with a tau lepton

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Corbo, Matteo

    2012-09-19

    The top quark pair production and decay into leptons with at least one being a τ lepton is studied in the framework of the CDF experiment at the Tevatron proton-antiproton collider at Fermilab (USA). The selection requires an electron or a muon produced either by the τ lepton decay or by a W decay. The analysis uses the complete Run II data set, i.e. 9.0 fb⁻¹, selected by one trigger based on a low transverse momentum electron or muon plus one isolated charged track. The top quark pair production cross section at 1.96 TeV is measured at 8.2 ± 1.7 +1.2 −1.1 ± 0.5 pb, and the top branching ratio into τ lepton is measured at 0.120 ± 0.027 +0.022 −0.019 ± 0.007, with statistical, systematic and luminosity uncertainties. These are to date the most accurate results in this top decay channel and are in good agreement with the results obtained using other decay channels of the top at the Tevatron. The branching ratio is also measured separating the single-lepton from the two-lepton events with a log-likelihood method. This is the first time these two signatures are separately identified. With a fit to data along the log-likelihood variable an alternative measurement of the branching ratio is made: 0.098 ± 0.022 (stat.) ± 0.014 (syst.); it is in good agreement with the expectations of the Standard Model (with lepton universality) within the experimental uncertainties. The branching ratio is constrained to be less than 0.159 at 95% confidence level. This limit translates into a limit on the top branching ratio into a potential charged Higgs boson.

  20. Predicting Grade 3 Acute Diarrhea During Radiation Therapy for Rectal Cancer Using a Cutoff-Dose Logistic Regression Normal Tissue Complication Probability Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, John M., E-mail: jrobertson@beaumont.ed; Soehn, Matthias; Yan Di

    Purpose: Understanding the dose-volume relationship of small bowel irradiation and severe acute diarrhea may help reduce the incidence of this side effect during adjuvant treatment for rectal cancer. Methods and Materials: Consecutive patients treated curatively for rectal cancer were reviewed, and the maximum grade of acute diarrhea was determined. The small bowel was outlined on the treatment planning CT scan, and a dose-volume histogram was calculated for the initial pelvic treatment (45 Gy). Logistic regression models were fitted for varying cutoff-dose levels from 5 to 45 Gy in 5-Gy increments. The model with the highest log-likelihood was used to develop a cutoff-dose normal tissue complication probability (NTCP) model. Results: There were a total of 152 patients (48% preoperative, 47% postoperative, 5% other), predominantly treated prone (95%) with a three-field technique (94%) and a protracted venous infusion of 5-fluorouracil (78%). Acute Grade 3 diarrhea occurred in 21%. The largest log-likelihood was found for the cutoff-dose logistic regression model with 15 Gy as the cutoff dose, although the models for 20 Gy and 25 Gy had similar significance. According to this model, highly significant correlations (p < 0.001) between small bowel volumes receiving at least 15 Gy and toxicity exist in the considered patient population. Similar findings applied to both the preoperatively (p = 0.001) and postoperatively irradiated groups (p = 0.001). Conclusion: The incidence of Grade 3 diarrhea was significantly correlated with the volume of small bowel receiving at least 15 Gy using a cutoff-dose NTCP model.
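    The cutoff-dose selection described in the Methods (fit one logistic model per candidate cutoff dose, keep the one with the highest log likelihood) can be sketched as follows. The cohort below is simulated, with independent volumes per dose level rather than the nested DVH volumes of real patients, so only the selection logic, not the data, reflects the study:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Hypothetical cohort: per-patient small-bowel volume (cc) receiving at least
# each candidate cutoff dose, with a grade-3 diarrhea indicator generated so
# that the 15 Gy volume is the truly predictive covariate.
n = 152
vols = {dose: rng.uniform(0, 300, n) for dose in (5, 10, 15, 20, 25)}
p_tox = 1.0 / (1.0 + np.exp(-(0.02 * vols[15] - 3.0)))
tox = (rng.uniform(size=n) < p_tox).astype(int)

# Fit one logistic model per cutoff dose and score it by log likelihood.
loglik = {}
for dose, v in vols.items():
    x = v.reshape(-1, 1)
    model = LogisticRegression(C=1e6).fit(x, tox)  # large C ≈ unpenalized fit
    p = model.predict_proba(x)[:, 1]
    loglik[dose] = float(np.sum(tox * np.log(p) + (1 - tox) * np.log1p(-p)))

best = max(loglik, key=loglik.get)
print(best)  # the 15 Gy model should score highest
```

    The fitted logistic curve at the winning cutoff dose then serves directly as the NTCP model: it maps a patient's V15-style volume to a predicted probability of Grade 3 diarrhea.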
