Trzepacz, Paula T; Meagher, David J; Franco, José G
2016-05-01
Diagnostic classification systems do not incorporate phenomenological research findings about the three core symptom domains of delirium (Attentional/Cognitive, Circadian, Higher Level Thinking). We evaluated the classification performance of the novel Trzepacz, Meagher, and Franco research diagnostic criteria (TMF), which incorporate those domains, and of ICD-10, DSM-III-R, DSM-IV, and DSM-5. We performed a primary data analysis of 641 patients with mixed neuropsychiatric profiles. Delirium (n=429) and nondelirium (n=212) reference standard groups were identified using cluster analysis of symptoms assessed with the Delirium Rating Scale-Revised-98. Accuracy, sensitivity, specificity, positive and negative predictive values (PPV, NPV), and likelihood ratios (LR+, LR-) are reported. TMF criteria had high sensitivity and specificity (87.4% and 89.2%), more balanced than DSM-III-R (100% and 31.6%), DSM-IV (97.7% and 74.1%), DSM-5 (97.7% and 72.6%), and ICD-10 (66.2% and 100%). PPVs of DSM-III-R, DSM-IV, and DSM-5 were <90.0%, while PPVs for ICD-10 and TMF were >90%. ICD-10 had the lowest NPV (59.4%). TMF had the highest LR+ (8.06) and DSM-III-R the lowest LR- (0.0). Overall, values for DSM-IV and DSM-5 were similar, whereas those for ICD-10 and DSM-III-R were the inverse of each other. In the pre-existing cognitive impairment/dementia subsample (n=128), TMF retained the highest LR+, though its specificity (58.3%) became less well balanced with its sensitivity (87.9%), which still exceeded that of the DSM criteria. TMF research diagnostic criteria performed well, with more balanced sensitivity and specificity and the highest likelihood ratio for delirium identification. Reflecting the three core domains of delirium, TMF criteria may have advantages in biological research where delineation of this syndrome is important. Copyright © 2016. Published by Elsevier Inc.
Dugué, Audrey Emmanuelle; Pulido, Marina; Chabaud, Sylvie; Belin, Lisa; Gal, Jocelyn
2016-12-01
We describe how to estimate progression-free survival while dealing with interval-censored data in the setting of clinical trials in oncology. Three procedures with SAS and R statistical software are described: one allowing for a nonparametric maximum likelihood estimation of the survival curve using the EM-ICM (Expectation and Maximization-Iterative Convex Minorant) algorithm as described by Wellner and Zhan in 1997; a sensitivity analysis procedure in which the progression time is assigned (i) at the midpoint, (ii) at the upper limit (reflecting the standard analysis, in which the progression time is assigned at the first radiologic exam showing progressive disease), or (iii) at the lower limit of the censoring interval; and finally, two multiple imputation approaches, drawing progression times from either a uniform distribution or the nonparametric maximum likelihood estimate (NPMLE). Clin Cancer Res; 22(23); 5629-35. ©2016 American Association for Cancer Research (AACR).
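The three sensitivity-analysis assignments described above (midpoint, upper limit, and lower limit of the censoring interval) can be sketched in a few lines. This is a minimal Python illustration, not the SAS/R procedures the article describes; the interval values are hypothetical.

```python
import numpy as np

def impute_progression_times(intervals, rule="midpoint"):
    """Assign a single progression time inside each censoring interval (L, R].

    rule: 'midpoint', 'upper' (the standard analysis: progression assigned
    at the first exam showing progressive disease), or 'lower'.
    """
    out = []
    for lo, hi in intervals:
        if rule == "midpoint":
            out.append((lo + hi) / 2.0)
        elif rule == "upper":
            out.append(hi)
        elif rule == "lower":
            out.append(lo)
        else:
            raise ValueError(rule)
    return np.array(out)

# Hypothetical exam intervals (months): progression occurred in (L, R].
intervals = [(3.0, 6.0), (6.0, 9.0), (0.0, 3.0)]
mid = impute_progression_times(intervals, "midpoint")  # [4.5, 7.5, 1.5]
up = impute_progression_times(intervals, "upper")      # [6.0, 9.0, 3.0]
```

Running the downstream survival analysis under all three rules, and comparing the resulting curves, is what makes this a sensitivity analysis.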
A New Monte Carlo Method for Estimating Marginal Likelihoods.
Wang, Yu-Bo; Chen, Ming-Hui; Kuo, Lynn; Lewis, Paul O
2018-06-01
Evaluating the marginal likelihood in Bayesian analysis is essential for model selection. Estimators based on a single Markov chain Monte Carlo sample from the posterior distribution include the harmonic mean estimator and the inflated density ratio estimator. We propose a new class of Monte Carlo estimators based on this single Markov chain Monte Carlo sample. This class can be thought of as a generalization of the harmonic mean and inflated density ratio estimators using a partition weighted kernel (likelihood times prior). We show that our estimator is consistent and has better theoretical properties than the harmonic mean and inflated density ratio estimators. In addition, we provide guidelines on choosing optimal weights. Simulation studies were conducted to examine the empirical performance of the proposed estimator. We further demonstrate the desirable features of the proposed estimator with two real data sets: one is from a prostate cancer study using an ordinal probit regression model with latent variables; the other is for the power prior construction from two Eastern Cooperative Oncology Group phase III clinical trials using the cure rate survival model with similar objectives.
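As a toy illustration of the single-sample estimators discussed above, the sketch below applies the harmonic mean estimator to a conjugate normal model where the marginal likelihood is known in closed form. This is a minimal sketch with assumed toy data (y = 1), not the authors' partition weighted kernel estimator; the harmonic mean estimator is known to be unstable in general, which is part of the motivation for the proposed class.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy conjugate model: y ~ N(theta, 1), prior theta ~ N(0, 1), one datum y.
y = 1.0
# Exact marginal likelihood: marginally, y ~ N(0, 2).
true_ml = np.exp(-y**2 / 4) / np.sqrt(4 * np.pi)

# Draw from the exact posterior theta | y ~ N(y/2, 1/2) (standing in for
# a single MCMC sample from the posterior).
theta = rng.normal(y / 2, np.sqrt(0.5), size=200_000)

# Harmonic mean estimator: 1 / E_post[1 / L(theta)], L = likelihood.
lik = np.exp(-(y - theta) ** 2 / 2) / np.sqrt(2 * np.pi)
hm_ml = 1.0 / np.mean(1.0 / lik)
```

In this well-behaved toy case the estimate lands close to the analytic value; in higher dimensions the heavy tails of 1/L make the estimator unreliable, motivating weighted-kernel generalizations.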
Vlacich, Gregory; Samson, Pamela P; Perkins, Stephanie M; Roach, Michael C; Parikh, Parag J; Bradley, Jeffrey D; Lockhart, A Craig; Puri, Varun; Meyers, Bryan F; Kozower, Benjamin; Robinson, Cliff G
2017-12-01
For elderly patients with locally advanced esophageal cancer, therapeutic approaches and outcomes in a modern cohort are not well characterized. Patients ≥70 years old with clinical stage II and III esophageal cancer diagnosed between 1998 and 2012 were identified from the National Cancer Database and stratified based on treatment type. Variables associated with treatment utilization were evaluated using logistic regression, and survival was evaluated using Cox proportional hazards analysis. Propensity matching (1:1) was performed to help account for selection bias. A total of 21,593 patients were identified. Median and maximum ages were 77 and 90, respectively. Treatment included palliative therapy (24.3%), chemoradiation (37.1%), trimodality therapy (10.0%), esophagectomy alone (5.6%), or no therapy (12.9%). Age ≥80 (OR 0.73), female gender (OR 0.81), Charlson-Deyo comorbidity score ≥2 (OR 0.82), and high-volume centers (OR 0.83) were associated with a decreased likelihood of palliative therapy versus no treatment. Age ≥80 (OR 0.79) and clinical stage III (OR 0.33) were associated with a decreased likelihood, while adenocarcinoma histology (OR 1.33) and nonacademic cancer centers (OR 3.9) were associated with an increased likelihood, of esophagectomy alone compared to definitive chemoradiation. Age ≥80 (OR 0.15), female gender (OR 0.80), and non-Caucasian race (OR 0.63) were associated with a decreased likelihood, while adenocarcinoma histology (OR 2.10) and high-volume centers (OR 2.34) were associated with an increased likelihood, of trimodality therapy compared to definitive chemoradiation. Each treatment type demonstrated improved survival compared to no therapy, ranging from palliative treatment (HR 0.49) to trimodality therapy (HR 0.25), with significant differences between all groups. Any therapy, including palliative care, was associated with improved survival; however, subsets of elderly patients with locally advanced esophageal cancer are less likely to receive aggressive therapy.
Care should be taken to not unnecessarily deprive these individuals of treatment that may improve survival. © 2017 The Authors. Cancer Medicine published by John Wiley & Sons Ltd.
Using DNA fingerprints to infer familial relationships within NHANES III households
Katki, Hormuzd A.; Sanders, Christopher L.; Graubard, Barry I.; Bergen, Andrew W.
2009-01-01
Developing, targeting, and evaluating genomic strategies for population-based disease prevention require population-based data. In response to this urgent need, genotyping has been conducted within the Third National Health and Nutrition Examination Survey (NHANES III), the nationally representative household-interview health survey in the U.S. However, before these genetic analyses can occur, family relationships within households must be accurately ascertained. Unfortunately, reported family relationships within NHANES III households based on questionnaire data are incomplete and inconclusive with regard to actual biological relatedness of family members. We inferred family relationships within households using DNA fingerprints (Identifiler®) that contain the DNA loci used by law enforcement agencies for forensic identification of individuals. However, the performance of these loci for relationship inference is not well understood. We compared two competing statistical methods for relationship inference on pairs of household members: an exact likelihood ratio that relies on allele frequencies, and an Identical-By-State (IBS) likelihood ratio that only requires counting matching alleles. We modified these methods to account for genotyping errors and population substructure. The two methods usually agree on the rankings of the most likely relationships. However, the IBS method underestimates the likelihood ratio by not accounting for the informativeness of matching rare alleles. The likelihood ratio is sensitive to estimates of population substructure, and parent-child relationships are sensitive to the specified genotyping error rate. These loci were unable to distinguish second-degree relationships and cousins from unrelated pairs. The genetic data are also useful for verifying reported relationships and identifying data quality issues. An important by-product is the first explicitly nationally representative estimates of allele frequencies at these ubiquitous forensic loci. PMID:20664713
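To illustrate why matching a rare allele is informative (the effect the IBS method ignores), here is a minimal sketch of the classical single-locus likelihood ratio for one special case: a parent-child pair in which both genotypes are heterozygous and share exactly one allele. The function name and the allele frequencies are illustrative; real kinship software handles many more genotype configurations, plus genotyping error and substructure corrections.

```python
def parent_child_lr_shared_het(p_shared):
    """Single-locus LR for parent-child vs. unrelated, restricted to the
    case parent AB, child AC (both heterozygous, sharing allele A).

    P(child AC | parent AB) = (1/2) * p_C   (A transmitted, C from population)
    P(child AC | unrelated) = 2 * p_A * p_C
    LR = ((1/2) * p_C) / (2 * p_A * p_C) = 1 / (4 * p_A)
    """
    return 1.0 / (4.0 * p_shared)

# A common shared allele (freq 0.10) gives LR 2.5; a rare one (0.01) gives 25.
lr_common = parent_child_lr_shared_het(0.10)
lr_rare = parent_child_lr_shared_het(0.01)
```

The LR depends only on the frequency of the shared allele, so a match on a rare allele is far stronger evidence of relatedness; an IBS count treats both matches identically.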
A Likelihood Ratio Test Regarding Two Nested But Oblique Order Restricted Hypotheses.
1982-11-01
Report #90. American Mathematical Society 1979 subject classification: Primary 62F03; Secondary 62E15. Key words and phrases: Order... model. A likelihood ratio test for these two restrictions is studied. The investigation was stimulated partly by a problem encountered in psychiatric research: [Winokur et al., 1971] studied data on psychiatric illnesses afflicting
Algorithms of maximum likelihood data clustering with applications
NASA Astrophysics Data System (ADS)
Giada, Lorenzo; Marsili, Matteo
2002-12-01
We address the problem of data clustering by introducing an unsupervised, parameter-free approach based on the maximum likelihood principle. Starting from the observation that data sets belonging to the same cluster share common information, we construct an expression for the likelihood of any possible cluster structure. The likelihood in turn depends only on the Pearson correlation coefficients of the data. We discuss clustering algorithms that provide a fast and reliable approximation to maximum likelihood configurations. Compared to standard clustering methods, our approach has the advantages that (i) it is parameter free, (ii) the number of clusters need not be fixed in advance, and (iii) the interpretation of the results is transparent. In order to test our approach and compare it with standard clustering algorithms, we analyze two very different data sets: time series of financial market returns and gene expression data. We find that different maximization algorithms produce similar cluster structures, whereas the outcome of standard algorithms has a much wider variability.
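A minimal sketch of the cluster likelihood described above, under our reading of the Giada-Marsili expression: each cluster s with n_s > 1 objects and internal correlation sum c_s (the sum of pairwise Pearson coefficients within the cluster, diagonal included) contributes (1/2)[log(n_s/c_s) + (n_s - 1) log((n_s² - n_s)/(n_s² - c_s))]. Function and variable names are our own, and the maximization step (e.g. greedy merging or simulated annealing) is omitted.

```python
import numpy as np

def gm_log_likelihood(C, labels):
    """Cluster log-likelihood (our reading of the Giada-Marsili expression).

    C is the Pearson correlation matrix of the data; labels assigns each
    object to a cluster.  Singleton clusters contribute zero.
    """
    total = 0.0
    for s in set(labels):
        idx = [i for i, lab in enumerate(labels) if lab == s]
        n = len(idx)
        if n < 2:
            continue
        c = C[np.ix_(idx, idx)].sum()  # includes the diagonal, so c >= n
        if n < c < n * n:              # expression is valid in this range
            total += 0.5 * (np.log(n / c)
                            + (n - 1) * np.log((n * n - n) / (n * n - c)))
    return total

# Two assets with Pearson correlation 0.5: merging them into one cluster
# scores 0.5 * log(4/3) > 0, beating the all-singletons structure (score 0).
C = np.array([[1.0, 0.5], [0.5, 1.0]])
lc_merged = gm_log_likelihood(C, [0, 0])
lc_split = gm_log_likelihood(C, [0, 1])
```

Any algorithm that searches over label assignments to maximize this score is parameter free, and the number of clusters emerges from the maximization rather than being fixed in advance.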
NASA Astrophysics Data System (ADS)
Silberman, L.; Dekel, A.; Eldar, A.; Zehavi, I.
2001-08-01
We allow for nonlinear effects in the likelihood analysis of galaxy peculiar velocities and obtain ~35% lower values for the cosmological density parameter Ωm and for the amplitude of mass density fluctuations σ8Ωm^0.6. This result is obtained under the assumption that the power spectrum in the linear regime is of the flat ΛCDM model (h=0.65, n=1, COBE normalized) with only Ωm as a free parameter. Since the likelihood is driven by the nonlinear regime, we "break" the power spectrum at kb ~ 0.2 (h^-1 Mpc)^-1 and fit a power law at k > kb. This allows for independent matching of the nonlinear behavior and an unbiased fit in the linear regime. The analysis assumes Gaussian fluctuations and errors and a linear relation between velocity and density. Tests using mock catalogs that properly simulate nonlinear effects demonstrate that this procedure results in a reduced bias and a better fit. We find for the Mark III and SFI data Ωm=0.32±0.06 and 0.37±0.09, respectively, with σ8Ωm^0.6=0.49±0.06 and 0.63±0.08, in agreement with constraints from other data. The quoted 90% errors include distance errors and cosmic variance, for fixed values of the other parameters. The improvement in the likelihood due to the nonlinear correction is very significant for Mark III and moderately significant for SFI. When allowing deviations from ΛCDM, we find an indication for a wiggle in the power spectrum: an excess near k ~ 0.05 (h^-1 Mpc)^-1 and a deficiency at k ~ 0.1 (h^-1 Mpc)^-1, or a "cold flow." This may be related to the wiggle seen in the power spectrum from redshift surveys and the second peak in the cosmic microwave background (CMB) anisotropy. A χ² test applied to modes of a principal component analysis (PCA) shows that the nonlinear procedure improves the goodness of fit and reduces a spatial gradient that was of concern in the purely linear analysis. The PCA allows us to address spatial features of the data and to evaluate and fine-tune the theoretical and error models.
It demonstrates in particular that the models used are appropriate for the cosmological parameter estimation performed. We address the potential for optimal data compression using PCA.
Rodríguez-Escudero, Juan Pablo; López-Jiménez, Francisco; Trejo-Gutiérrez, Jorge F
2011-01-01
This article reviews different characteristics of validity in a clinical diagnostic test. In particular, we emphasize the likelihood ratio as an instrument that facilitates the use of epidemiologic concepts in clinical diagnosis.
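The likelihood ratio discussed above follows directly from sensitivity and specificity: LR+ = sensitivity/(1 - specificity) and LR- = (1 - sensitivity)/specificity, and the post-test probability is obtained by converting the pre-test probability to odds, multiplying by the LR, and converting back. A minimal sketch (all numbers hypothetical):

```python
def likelihood_ratios(sensitivity, specificity):
    """LR+ = sens / (1 - spec); LR- = (1 - sens) / spec."""
    return sensitivity / (1 - specificity), (1 - sensitivity) / specificity

def post_test_probability(pre_test_prob, lr):
    """Pre-test probability -> odds, multiply by the LR, back to probability."""
    post_odds = pre_test_prob / (1 - pre_test_prob) * lr
    return post_odds / (1 + post_odds)

# Hypothetical test: sensitivity 90%, specificity 80%.
lr_pos, lr_neg = likelihood_ratios(0.90, 0.80)   # 4.5 and 0.125
p_post = post_test_probability(0.25, lr_pos)     # 0.25 pre-test -> 0.60
```

This is why the LR is clinically convenient: the same pair of numbers applies at any pre-test probability, whereas predictive values change with prevalence.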
Ji, B; Jin, X-B
2017-08-01
We conducted this prospective comparative study to examine the hypothesis that varicocele is associated with hypogonadism and impaired erectile function, as reflected in International Index of Erectile Function-5 (IIEF-5) scores as well as nocturnal penile tumescence and rigidity (NPTR) parameters. From December 2014 to December 2015, a total of 130 males with varicocele complaining of infertility or scrotal discomfort and 130 age-matched healthy males chosen from volunteer healthy hospital staff as controls were recruited into this study. Serum testosterone (TT) levels, IIEF-5 scores, and NPTR parameters were evaluated and compared between varicocele and control subjects. Participants were further classified as hypogonadal based on a TT cut-off value of 300 ng/dL. A total of 45 of 130 patients were identified as hypogonadal, whereas no control subjects met this criterion. A multivariate logistic regression with likelihood ratio test revealed that TT levels as well as grade III and II varicocele were significant indicators of hypogonadism (chi-square of likelihood ratio = 12.40, df = 3, p < .01). Furthermore, TT levels and infertility duration were associated with IIEF-5 scores in a multivariate linear regression analysis (adjusted R² = 0.545). In conclusion, the correlation of grade III and II varicocele with an increased risk of hypogonadism was confirmed in this study, and an impaired erectile function correlated with TT levels and infertility duration was also observed. © 2016 Blackwell Verlag GmbH.
Onishi, Taku; Tsukamoto, Katsura; Matsumaru, Naoki; Waki, Takashi
2018-01-01
Efforts to promote the development of pediatric pharmacotherapy include regulatory frameworks and close collaboration between the US Food and Drug Administration and the European Medicines Agency. We characterized the current status of pediatric clinical trials conducted in the United States by the pharmaceutical industry, focusing on the involvement of European Union member countries, to clarify the industry perspective. Data on US pediatric clinical trials were obtained from ClinicalTrials.gov. Binary regression analysis was performed to identify what factors influence the likelihood of involvement of European Union countries. A total of 633 US pediatric clinical trials that met inclusion criteria were extracted and surveyed. Of these, 206 (32.5%) involved a European Union country site(s). The results of binary regression analysis indicated that industry attribution, phase, disease area, and age of pediatric participants influenced the likelihood of the involvement of European Union countries in US pediatric clinical trials. Relatively complicated or large pediatric clinical trials, such as phase II and III trials and those that included a broad age range of participants, had a significantly greater likelihood of the involvement of European Union countries (P < .05). Our results suggest that (1) the pharmaceutical industry utilizes regulatory frameworks in making business decisions regarding pediatric clinical trials, (2) disease area affects the involvement of European Union countries, and (3) trial feasibility is a primary concern for the pharmaceutical industry in pediatric drug development. Additional incentives for high marketability may further motivate the pharmaceutical industry to develop pediatric drugs.
NASA Astrophysics Data System (ADS)
Baluev, Roman V.
2013-08-01
We present PlanetPack, a new software tool that we developed to facilitate and standardize the advanced analysis of radial velocity (RV) data for the detection and characterization of exoplanets, as well as basic dynamical N-body simulations. PlanetPack is a command-line interpreter that can run either in an interactive mode or in a batch mode of automatic script interpretation. Its major abilities include: (i) advanced RV curve fitting with the proper maximum-likelihood treatment of unknown RV jitter; (ii) user-friendly multi-Keplerian as well as Newtonian N-body RV fits; (iii) use of more efficient maximum-likelihood periodograms that involve the full multi-planet fitting (sometimes called "residual" or "recursive" periodograms); (iv) easy calculation of parametric 2D likelihood function level contours, reflecting the asymptotic confidence regions; (v) user-friendly fitting under useful functional constraints; (vi) basic tasks of short- and long-term planetary dynamical simulation using a fast Everhart-type integrator based on Gauss-Legendre spacings; (vii) fitting the data with red noise (auto-correlated errors); (viii) various analytical and numerical methods for determining statistical significance. Further functionality may be added to PlanetPack in the future. During the development of this software, considerable effort was made to improve computational speed, especially for CPU-demanding tasks. PlanetPack was written in pure C++ (1998/2003 standard) and is expected to be compilable and usable on a wide range of platforms.
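Item (i), the maximum-likelihood treatment of unknown RV jitter, can be illustrated with a short sketch: the jitter s is added in quadrature to the quoted instrumental errors, the linear orbit parameters are profiled out by weighted least squares, and s is then estimated by minimizing the resulting negative log-likelihood. This is our own minimal Python illustration on simulated data, not PlanetPack code (PlanetPack is C++); the period, amplitudes, and jitter value are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated RVs: one circular orbit plus white jitter (all values hypothetical).
t = np.sort(rng.uniform(0, 100, 200))
sigma = np.full_like(t, 1.0)              # quoted instrumental errors
omega = 2 * np.pi / 7.3                   # known orbital frequency, rad/day
truth = 12.0 * np.cos(omega * t) + 5.0 * np.sin(omega * t) + 3.0
rv = truth + rng.normal(0, np.sqrt(sigma**2 + 2.0**2))  # true jitter s = 2

def neg_log_like(s):
    """Gaussian NLL with jitter s in quadrature; the linear parameters
    (two sinusoid amplitudes and an offset) are profiled out by weighted
    least squares for each trial value of s."""
    w = 1.0 / (sigma**2 + s**2)
    X = np.column_stack([np.cos(omega * t), np.sin(omega * t), np.ones_like(t)])
    beta, *_ = np.linalg.lstsq(X * np.sqrt(w)[:, None],
                               rv * np.sqrt(w), rcond=None)
    r = rv - X @ beta
    return 0.5 * np.sum(np.log(2 * np.pi / w) + w * r**2)

# Profile the likelihood over a jitter grid.
grid = np.linspace(0.0, 6.0, 121)
s_hat = grid[np.argmin([neg_log_like(s) for s in grid])]
```

Ignoring jitter (s = 0) both misweights the data and understates parameter uncertainties, which is why a proper maximum-likelihood treatment matters.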
Jindal, Shveta; Dada, Tanuj; Sreenivas, V; Gupta, Viney; Sihota, Ramanjit; Panda, Anita
2010-01-01
Purpose: To compare the diagnostic performance of the Heidelberg retinal tomograph (HRT) glaucoma probability score (GPS) with that of Moorfields regression analysis (MRA). Materials and Methods: The study included 50 eyes of normal subjects and 50 eyes of subjects with early-to-moderate primary open angle glaucoma. Images were obtained by using HRT version 3.0. Results: The agreement coefficient (weighted κ) for the overall MRA and GPS classification was 0.216 (95% CI: 0.119-0.315). The sensitivity and specificity were evaluated using the most specific (borderline results included as test negatives) and least specific criteria (borderline results included as test positives). The MRA sensitivity and specificity were 30.61 and 98% (most specific) and 57.14 and 98% (least specific). The GPS sensitivity and specificity were 81.63 and 73.47% (most specific) and 95.92 and 34.69% (least specific). The MRA gave a higher positive likelihood ratio (28.57 vs. 3.08) and the GPS gave a lower negative likelihood ratio (0.25 vs. 0.44). The sensitivity increased with increasing disc size for both MRA and GPS. Conclusions: There was poor agreement between the overall MRA and GPS classifications. GPS tended to have higher sensitivities, lower specificities, and lower likelihood ratios than the MRA. The disc size should be taken into consideration when interpreting the results of HRT, as both the GPS and MRA showed decreased sensitivity for smaller discs and the GPS showed decreased specificity for larger discs. PMID:20952832
40 CFR 300.415 - Removal action.
Code of Federal Regulations, 2014 CFR
2014-07-01
... food chain from hazardous substances or pollutants or contaminants; (ii) Actual or potential contamination of drinking water supplies or sensitive ecosystems; (iii) Hazardous substances or pollutants or...—where it will reduce the likelihood of spillage; leakage; exposure to humans, animals, or food chain; or...
O'Connell, Michael J.; Lavery, Ian; Yothers, Greg; Paik, Soonmyung; Clark-Langone, Kim M.; Lopatin, Margarita; Watson, Drew; Baehner, Frederick L.; Shak, Steven; Baker, Joffre; Cowens, J. Wayne; Wolmark, Norman
2010-01-01
Purpose These studies were conducted to determine the relationship between quantitative tumor gene expression and risk of cancer recurrence in patients with stage II or III colon cancer treated with surgery alone or surgery plus fluorouracil (FU) and leucovorin (LV) to develop multigene algorithms to quantify the risk of recurrence as well as the likelihood of differential treatment benefit of FU/LV adjuvant chemotherapy for individual patients. Patients and Methods We performed quantitative reverse transcription polymerase chain reaction (RT-qPCR) on RNA extracted from fixed, paraffin-embedded (FPE) tumor blocks from patients with stage II or III colon cancer who were treated with surgery alone (n = 270 from National Surgical Adjuvant Breast and Bowel Project [NSABP] C-01/C-02 and n = 765 from Cleveland Clinic [CC]) or surgery plus FU/LV (n = 308 from NSABP C-04 and n = 508 from NSABP C-06). Overall, 761 candidate genes were studied in C-01/C-02 and C-04, and a subset of 375 genes was studied in CC/C-06. Results A combined analysis of the four studies identified 48 genes significantly associated with risk of recurrence and 66 genes significantly associated with FU/LV benefit (with four genes in common). Seven recurrence-risk genes, six FU/LV-benefit genes, and five reference genes were selected, and algorithms were developed to identify groups of patients with low, intermediate, and high likelihood of recurrence and benefit from FU/LV. Conclusion RT-qPCR of FPE colon cancer tissue applied to four large independent populations has been used to develop multigene algorithms for estimating recurrence risk and benefit from FU/LV. These algorithms are being independently validated, and their clinical utility is being evaluated in the Quick and Simple and Reliable (QUASAR) study. PMID:20679606
NASA Astrophysics Data System (ADS)
Pellejero-Ibanez, Marcos; Chuang, Chia-Hsun; Rubiño-Martín, J. A.; Cuesta, Antonio J.; Wang, Yuting; Zhao, Gongbo; Ross, Ashley J.; Rodríguez-Torres, Sergio; Prada, Francisco; Slosar, Anže; Vazquez, Jose A.; Alam, Shadab; Beutler, Florian; Eisenstein, Daniel J.; Gil-Marín, Héctor; Grieb, Jan Niklas; Ho, Shirley; Kitaura, Francisco-Shu; Percival, Will J.; Rossi, Graziano; Salazar-Albornoz, Salvador; Samushia, Lado; Sánchez, Ariel G.; Satpathy, Siddharth; Seo, Hee-Jong; Tinker, Jeremy L.; Tojeiro, Rita; Vargas-Magaña, Mariana; Brownstein, Joel R.; Nichol, Robert C.; Olmstead, Matthew D.
2017-07-01
We develop a new computationally efficient methodology called double-probe analysis with the aim of minimizing informative priors (those coming from extra probes) in the estimation of cosmological parameters. Using our new methodology, we extract the dark energy model-independent cosmological constraints from the joint data sets of the Baryon Oscillation Spectroscopic Survey (BOSS) galaxy sample and Planck cosmic microwave background (CMB) measurements. We measure the mean values and covariance matrix of {R, la, Ωbh2, ns, log(As), Ωk, H(z), DA(z), f(z)σ8(z)}, which give an efficient summary of the Planck data and two-point statistics from the BOSS galaxy sample. The CMB shift parameters are R = √(Ωm H0²) r(z*) and la = π r(z*)/rs(z*), where z* is the redshift at the last scattering surface, and r(z*) and rs(z*) denote our comoving distance to z* and the sound horizon at z*, respectively; Ωb is the baryon fraction at z = 0. This approximate methodology guarantees that we will not need to put informative priors on the cosmological parameters that galaxy clustering is unable to constrain, i.e. Ωbh2 and ns. The main advantage is that the computational time required for extracting these parameters is decreased by a factor of 60 with respect to exact full-likelihood analyses. The results obtained show no tension with the flat Λ cold dark matter (ΛCDM) cosmological paradigm. By comparing with the full-likelihood exact analysis with fixed dark energy models, on one hand we demonstrate that the double-probe method provides robust cosmological parameter constraints that can be conveniently used to study dark energy models, and on the other hand we provide a reliable set of measurements assuming dark energy models to be used, for example, in distance estimations. We extend our study to measure the sum of the neutrino mass using different methodologies, including double-probe analysis (introduced in this study), full-likelihood analysis and single-probe analysis.
From full-likelihood analysis, we obtain Σmν < 0.12 (68 per cent), assuming ΛCDM and Σmν < 0.20 (68 per cent) assuming owCDM. We also find that there is degeneracy between observational systematics and neutrino masses, which suggests that one should take great care when estimating these parameters in the case of not having control over the systematics of a given sample.
Hopfer, Suellen; Tan, Xianming; Wylie, John L
2014-05-01
We assessed whether a meaningful set of latent risk profiles could be identified in an inner-city population through individual and network characteristics of substance use, sexual behaviors, and mental health status. Data came from 600 participants in Social Network Study III, conducted in 2009 in Winnipeg, Manitoba, Canada. We used latent class analysis (LCA) to identify risk profiles and, with covariates, to identify predictors of class. A 4-class model of risk profiles fit the data best: (1) solitary users reported polydrug use at the individual level, but low probabilities of substance use or concurrent sexual partners with network members; (2) social-all-substance users reported polydrug use at the individual and network levels; (3) social-noninjection drug users reported less likelihood of injection drug and solvent use; (4) low-risk users reported low probabilities across substances. Unstable housing, preadolescent substance use, age, and hepatitis C status predicted risk profiles. Incorporation of social network variables into LCA can distinguish important subgroups with varying patterns of risk behaviors that can lead to sexually transmitted and bloodborne infections.
Challenges in Species Tree Estimation Under the Multispecies Coalescent Model
Xu, Bo; Yang, Ziheng
2016-01-01
The multispecies coalescent (MSC) model has emerged as a powerful framework for inferring species phylogenies while accounting for ancestral polymorphism and gene tree-species tree conflict. A number of methods have been developed in the past few years to estimate the species tree under the MSC. The full likelihood methods (including maximum likelihood and Bayesian inference) average over the unknown gene trees and accommodate their uncertainties properly but involve intensive computation. The approximate or summary coalescent methods are computationally fast and are applicable to genomic datasets with thousands of loci, but do not make efficient use of the information in the multilocus data. Most of them take the two-step approach of reconstructing the gene trees for multiple loci by phylogenetic methods and then treating the estimated gene trees as observed data, without accounting for their uncertainties appropriately. In this article, we review the statistical nature of the species tree estimation problem under the MSC, and explore the conceptual issues and challenges of species tree estimation by focusing mainly on simple cases of three or four closely related species. We use mathematical analysis and computer simulation to demonstrate that large differences in statistical performance may exist between the two classes of methods. We illustrate that several counterintuitive behaviors may occur with the summary methods, but that they are due to inefficient use of information in the data and vanish when the data are analyzed using full-likelihood methods. These include (i) unidentifiability of parameters in the model, (ii) inconsistency in the so-called anomaly zone, (iii) singularity on the likelihood surface, and (iv) deterioration of performance upon addition of more data.
We discuss the challenges and strategies of species tree inference for distantly related species when the molecular clock is violated, and highlight the need for improving the computational efficiency and model realism of the likelihood methods as well as the statistical efficiency of the summary methods. PMID:27927902
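The gene tree-species tree conflict underlying the discussion above can be quantified in the simplest three-species case: for species tree ((A,B),C) with internal branch length T in coalescent units, the probability that a gene tree is topologically concordant with the species tree is the standard result 1 - (2/3)e^(-T). A minimal simulation check of that formula (our own sketch, not the authors' code):

```python
import numpy as np

rng = np.random.default_rng(7)
T = 1.0        # internal branch length of ((A,B),C), coalescent units
n = 200_000    # simulated loci

# Lineages a and b coalesce inside the AB ancestor with probability
# 1 - exp(-T); those gene trees are concordant with the species tree.
coalesced = rng.random(n) < 1 - np.exp(-T)
# Otherwise all three lineages reach the root population, where the first
# coalescence picks a random pair: only 1 of the 3 pairs is concordant.
deep = ~coalesced
concordant = coalesced | (deep & (rng.random(n) < 1 / 3))

sim = concordant.mean()
theory = 1 - (2 / 3) * np.exp(-T)
```

For short internal branches (T near 0) the two discordant topologies each approach probability 1/3, which is exactly the regime where summary methods struggle and full-likelihood averaging over gene trees pays off.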
75 FR 60563 - Family Day, 2010
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-30
... Part III: The President, Proclamation 8570--Family Day, 2010, Presidential Documents... Family Day, 2010. By the President of the United States of America: A Proclamation. Committed families shape... likelihood their loved ones will use alcohol and illicit drugs. On Family Day, we honor the devotion of...
Code of Federal Regulations, 2010 CFR
2010-01-01
... that the facts that caused the deficient share-asset ratio no longer exist; and (ii) The likelihood of further depreciation of the share-asset ratio is not probable; and (iii) The return of the share-asset ratio to its normal limits within a reasonable time for the credit union concerned is probable; and (iv...
NASA Astrophysics Data System (ADS)
Kutzleb, C. D.
1997-02-01
The high incidence of recidivism (repeat offenders) in the criminal population makes the use of the IAFIS III/FBI criminal database an important tool in law enforcement. The problems and solutions employed by IAFIS III/FBI criminal subject searches are discussed for the following topics: (1) subject search selectivity and reliability; (2) the difficulty and limitations of identifying subjects whose anonymity may be a prime objective; (3) database size, search workload, and search response time; (4) techniques and advantages of normalizing the variability in an individual's name and identifying features into identifiable and discrete categories; and (5) the use of database demographics to estimate the likelihood of a match between a search subject and database subjects.
Muhoozi, Grace K M; Atukunda, Prudence; Mwadime, Robert; Iversen, Per Ole; Westerberg, Ane C
2016-01-01
Undernutrition continues to pose challenges to Uganda's children, but there is limited knowledge on its association with physical and intellectual development. In this cross-sectional study, we assessed the nutritional status and milestone development of 6- to 8-month-old children and associated factors in two districts of southwestern Uganda. Five hundred and twelve households with mother-infant (6-8 months) pairs were randomly sampled. Data about background variables (e.g. household characteristics, poverty likelihood, and child dietary diversity scores (CDDS)) were collected using questionnaires. Bayley Scales of Infant and Toddler Development (BSID III) and Ages and Stages questionnaires (ASQ) were used to collect data on child development. Anthropometric measures were used to determine z-scores for weight-for-age (WAZ), length-for-age (LAZ), weight-for-length (WLZ), head circumference (HCZ), and mid-upper arm circumference. Chi-square tests, correlation coefficients, and linear regression analyses were used to relate background variables, nutritional status indicators, and infant development. The prevalence of underweight, stunting, and wasting was 12.1, 24.6, and 4.7%, respectively. Household head education, gender, sanitation, household size, maternal age and education, birth order, poverty likelihood, and CDDS were associated (p<0.05) with WAZ, LAZ, and WLZ. Regression analysis showed that gender, sanitation, CDDS, and likelihood to be below the poverty line were predictors (p<0.05) of undernutrition. BSID III indicated development delay of 1.3% in cognitive and language, and 1.6% in motor development. The ASQ indicated delayed development of 24, 9.1, 25.2, 12.2, and 15.1% in communication, fine motor, gross motor, problem solving, and personal social ability, respectively. All nutritional status indicators except HCZ were positively and significantly associated with development domains. WAZ was the main predictor for all development domains. 
Undernutrition among infants living in impoverished rural Uganda was associated with household sanitation, poverty, and low dietary diversity. Development domains were positively and significantly associated with nutritional status. Nutritional interventions might add value to improvement of child growth and development.
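Anthropometric indicators such as WAZ and LAZ are standard-deviation scores against a growth reference. WHO growth standards actually use the LMS method; the normal approximation below, with made-up reference values, is only a sketch of the idea:

```python
def zscore(value, ref_median, ref_sd):
    """Simplified anthropometric z-score: (observed - reference median) / SD.
    WHO growth standards use the LMS method; this normal approximation is a
    sketch only."""
    return (value - ref_median) / ref_sd

# Hypothetical reference values for illustration (not WHO figures):
waz = zscore(5.9, ref_median=7.9, ref_sd=0.9)   # weight-for-age z-score
underweight = waz < -2.0                        # conventional -2 SD cutoff
```

With these illustrative numbers the z-score falls below -2, the cutoff that defines underweight (and, for LAZ and WLZ, stunting and wasting).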
Nonlinear Peculiar-Velocity Analysis and PCA
NASA Astrophysics Data System (ADS)
Dekel, Avishai; Eldar, Amiram; Silberman, Lior; Zehavi, Idit
We allow for nonlinear effects in the likelihood analysis of peculiar velocities, and obtain ~35% lower values for the cosmological density parameter and for the amplitude of mass-density fluctuations. The power spectrum in the linear regime is assumed to be of the flat ΛCDM model (h = 0.65, n = 1) with only Ω_m free. Since the likelihood is driven by the nonlinear regime, we "break" the power spectrum at k_b ~ 0.2 (h^{-1}Mpc)^{-1} and fit a two-parameter power law at k > k_b. This allows for an unbiased fit in the linear regime. Tests using improved mock catalogs demonstrate a reduced bias and a better fit. We find for the Mark III and SFI data Ω_m = 0.35 ± 0.09 with σ_8 Ω_m^{0.6} = 0.55 ± 0.10 (90% errors). When allowing deviations from ΛCDM, we find an indication for a wiggle in the power spectrum in the form of an excess near k ~ 0.05 and a deficiency at k ~ 0.1 (h^{-1}Mpc)^{-1} - a "cold flow" which may be related to a feature indicated from redshift surveys and the second peak in the CMB anisotropy. A χ^2 test applied to principal modes demonstrates that the nonlinear procedure improves the goodness of fit. The Principal Component Analysis (PCA) helps identify spatial features of the data and fine-tune the theoretical and error models. We address the potential for optimal data compression using PCA.
Liu, Dungang; Liu, Regina; Xie, Minge
2014-01-01
Meta-analysis has been widely used to synthesize evidence from multiple studies for common hypotheses or parameters of interest. However, it has not yet been fully developed for incorporating heterogeneous studies, which arise often in applications due to different study designs, populations or outcomes. For heterogeneous studies, the parameter of interest may not be estimable for certain studies, and in such a case, these studies are typically excluded from conventional meta-analysis. The exclusion of part of the studies can lead to a non-negligible loss of information. This paper introduces a meta-analysis approach for heterogeneous studies by combining the confidence density functions derived from the summary statistics of individual studies, hence referred to as the CD approach. It includes all the studies in the analysis and makes use of all information, direct as well as indirect. Under a general likelihood inference framework, this new approach is shown to have several desirable properties, including: i) it is asymptotically as efficient as the maximum likelihood approach using individual participant data (IPD) from all studies; ii) unlike the IPD analysis, it suffices to use summary statistics to carry out the CD approach. Individual-level data are not required; and iii) it is robust against misspecification of the working covariance structure of the parameter estimates. Besides its own theoretical significance, the last property also substantially broadens the applicability of the CD approach. All the properties of the CD approach are further confirmed by data simulated from a randomized clinical trials setting as well as by real data on aircraft landing performance. Overall, one obtains a unifying approach for combining summary statistics, subsuming many of the existing meta-analysis methods as special cases. PMID:26190875
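In the special case where each study's summary is a normal estimate with a standard error, combining confidence densities reduces to familiar inverse-variance weighting. A minimal sketch of that special case (illustrative numbers, not from the paper):

```python
import math

def combine_normal(estimates, ses):
    """Inverse-variance pooling of independent normal summary estimates --
    the special case to which combining normal confidence densities reduces.
    Returns the pooled estimate and its standard error."""
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * est for w, est in zip(weights, estimates)) / sum(weights)
    return pooled, math.sqrt(1.0 / sum(weights))

# Three hypothetical study summaries (estimate, standard error):
est, se = combine_normal([0.42, 0.55, 0.30], [0.10, 0.20, 0.15])
```

The pooled standard error is always smaller than that of any single study, reflecting the gain from using all the information.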
28 CFR 50.9 - Policy with regard to open judicial proceedings.
Code of Federal Regulations, 2011 CFR
2011-07-01
... safety of parties, witnesses, or other persons; or (iii) A substantial likelihood that ongoing... judicial proceedings pursuant to 18 U.S.C. 3509 (d) and (e) for the protection of child victims or child witnesses. (f) Because of the vital public interest in open judicial proceedings, the records of any...
Decisions Under Uncertainty III: Rationality Issues, Sex Stereotypes, and Sex Role Appropriateness.
ERIC Educational Resources Information Center
Bonoma, Thomas V.
The explanatory cornerstone of most currently viable social theories is a strict cost-gain assumption. The clearest formal explication of this view is contained in subjective expected utility models (SEU), in which individuals are assumed to scale their subjective likelihood estimates of decisional consequences and the personalistic worth or…
Identification of the remains of King Richard III.
King, Turi E; Fortes, Gloria Gonzalez; Balaresque, Patricia; Thomas, Mark G; Balding, David; Maisano Delser, Pierpaolo; Neumann, Rita; Parson, Walther; Knapp, Michael; Walsh, Susan; Tonasso, Laure; Holt, John; Kayser, Manfred; Appleby, Jo; Forster, Peter; Ekserdjian, David; Hofreiter, Michael; Schürer, Kevin
2014-12-02
In 2012, a skeleton was excavated at the presumed site of the Grey Friars friary in Leicester, the last-known resting place of King Richard III. Archaeological, osteological and radiocarbon dating data were consistent with these being his remains. Here we report DNA analyses of both the skeletal remains and living relatives of Richard III. We find a perfect mitochondrial DNA match between the sequence obtained from the remains and one living relative, and a single-base substitution when compared with a second relative. Y-chromosome haplotypes from male-line relatives and the remains do not match, which could be attributed to a false-paternity event occurring in any of the intervening generations. DNA-predicted hair and eye colour are consistent with Richard's appearance in an early portrait. We calculate likelihood ratios for the non-genetic and genetic data separately, and combined, and conclude that the evidence for the remains being those of Richard III is overwhelming.
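Combining independent lines of evidence as the authors describe amounts to multiplying likelihood ratios in the odds form of Bayes' theorem. A sketch with purely illustrative numbers (not the study's published figures):

```python
def posterior_prob(prior_prob, likelihood_ratios):
    """Bayes' theorem in odds form: posterior odds equal prior odds times
    the product of independent likelihood ratios."""
    odds = prior_prob / (1.0 - prior_prob)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)

# Three hypothetical independent lines of evidence (illustrative values only):
p = posterior_prob(0.5, [6.7, 478.0, 16.0])
```

Even a noncommittal prior is driven to near-certainty when several strong likelihood ratios multiply, which is the logic behind calling the combined evidence "overwhelming".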
Kastorini, Christina-Maria; Panagiotakos, Demosthenes B; Chrysohoou, Christina; Georgousopoulou, Ekavi; Pitaraki, Evangelia; Puddu, Paolo Emilio; Tousoulis, Dimitrios; Stefanadis, Christodoulos; Pitsavos, Christos
2016-03-01
To better understand the metabolic syndrome (MS) spectrum through principal components analysis and further evaluate the role of the Mediterranean diet on MS presence. During 2001-2002, 1514 men and 1528 women (>18 y) without any clinical evidence of CVD or any other chronic disease at baseline, living in the greater Athens area, Greece, were enrolled. In 2011-2012, the 10-year follow-up was performed in 2583 participants (15% of the participants were lost to follow-up). Incidence of fatal or non-fatal CVD was defined according to WHO-ICD-10 criteria. MS was defined by the National Cholesterol Education Program Adult Treatment Panel III (revised NCEP ATP III) definition. Adherence to the Mediterranean diet was assessed using the MedDietScore (range 0-55). Five principal components were derived, explaining 73.8% of the total variation, characterized by: a) body weight and lipid profile, b) blood pressure, c) lipid profile, d) glucose profile, e) inflammatory factors. All components were associated with higher likelihood of CVD incidence. After adjusting for various potential confounding factors, each 10% increase in the MedDietScore was associated with 15% lower odds of CVD incidence (95% CI: 0.71-1.06). For the participants with low adherence to the Mediterranean diet, all five components were significantly associated with increased likelihood of CVD incidence. However, for those closely following the Mediterranean pattern, positive yet nonsignificant associations were observed. Results of the present work propose a wider MS definition, while highlighting the beneficial role of the Mediterranean dietary pattern. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
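Principal components of standardized risk factors, as used above, are the eigenvectors of the correlation matrix, and the eigenvalues give each component's share of the total variation. A sketch on synthetic data standing in for the clinical variables (all values simulated):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for metabolic-risk variables (the study used real
# clinical measurements): 300 subjects x 6 correlated risk factors driven
# by two latent factors plus noise.
latent = rng.normal(size=(300, 2))
X = latent @ rng.normal(size=(2, 6)) + 0.5 * rng.normal(size=(300, 6))

# PCA on standardized variables = eigendecomposition of the correlation matrix.
R = np.corrcoef(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(R)       # eigh returns ascending order
eigvals = eigvals[::-1]                    # sort descending
explained = eigvals / eigvals.sum()        # proportion of variance per component
cum_two = explained[:2].sum()              # share captured by first two PCs
```

Components are retained until the cumulative share is judged adequate; the study's five components explained 73.8% of the total variation.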
Tibau, Ariadna; Anguera, Geòrgia; Andrés-Pretel, Fernando; Templeton, Arnoud J; Seruga, Bostjan; Barnadas, Agustí; Amir, Eitan; Ocana, Alberto
2018-03-13
Clinical research is conducted by academia, cooperative groups (CGs) or pharmaceutical industry. Here, we evaluate the role of CGs and funding sources in the development of guidelines for breast cancer therapies. We identified 94 studies. CGs were involved in 28 (30%) studies while industry either partially or fully sponsored 64 (68%) studies. The number of industry-funded studies increased over time (from 0% in 1976 to 100% in 2014; p for trend = 0.048). Only 10 (11%) government or academic studies were identified. Studies conducted by CGs included a greater number of subjects (median 448 vs. 284; p = 0.015), were more common in the neo/adjuvant setting (p < 0.0001), and were more often randomized (p = 0.018) phase III (p < 0.0001) trials. Phase III trial remained a significant predictor for CG-sponsored trials (OR 7.1, p = 0.004) in a multivariable analysis. Industry funding was associated with higher likelihood of positive outcomes favoring the sponsored experimental arm (p = 0.013) but this relationship was not seen for CG-sponsored trials (p = 0.53). ASCO, ESMO, and NCCN guidelines were searched to identify systemic anti-cancer therapies for early-stage and metastatic breast cancer. Trial characteristics and outcomes were collected. We identified sponsors and/or the funding source(s) and determined whether CGs, industry, or government or academic institutions were involved. Chi-square tests were used for comparison between studies. Industry funding is present in the majority of studies providing the basis for which recommendations about treatment of breast cancer are made. Industry funding, but not CG-based funding, was associated with higher likelihood of positive outcomes in clinical studies supporting guidelines for systemic therapy.
Katz, Brian S.; McMullan, Jason T.; Sucharew, Heidi; Adeoye, Opeolu; Broderick, Joseph P.
2015-01-01
Background and Purpose We derived and validated the Cincinnati Prehospital Stroke Severity Scale (CPSSS) to identify patients with severe strokes and large vessel occlusion (LVO). Methods CPSSS was developed with regression tree analysis, objectivity, anticipated ease in administration by EMS personnel, and the presence of cortical signs. We derived and validated the tool using the two NINDS t-PA Stroke Study trials and IMS III Trial cohorts, respectively, to predict severe stroke [NIH stroke scale (NIHSS) ≥15] and LVO. Standard test characteristics were determined and receiver operator curves were generated and summarized by the area under the curve (AUC). Results CPSSS score ranges from 0-4; it is composed and scored by individual NIHSS items: 2 points for presence of conjugate gaze (NIHSS ≥1); 1 point for presence of arm weakness (NIHSS ≥2); and 1 point for presence of abnormal level of consciousness (LOC) commands and questions (NIHSS LOC ≥1 each). In the derivation set, CPSSS had an AUC of 0.89; score ≥2 was 89% sensitive and 73% specific in identifying NIHSS ≥15. Validation results were similar with an AUC of 0.83; score ≥2 was 92% sensitive, 51% specific, a positive likelihood ratio (PLR) of 3.3 and a negative likelihood ratio (NLR) of 0.15 in predicting severe stroke. For 222/303 IMS III subjects with LVO, CPSSS had an AUC of 0.67; a score ≥2 was 83% sensitive, 40% specific, PLR of 1.4, and NLR of 0.4 in predicting LVO. Conclusions CPSSS can identify stroke patients with NIHSS ≥15 and LVO. Prospective prehospital validation is warranted. PMID:25899242
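Likelihood ratios of this kind follow directly from the test characteristics: LR+ = sensitivity/(1 - specificity) and LR- = (1 - sensitivity)/specificity. A sketch using the LVO figures above (83% sensitive, 40% specific):

```python
def likelihood_ratios(sensitivity, specificity):
    """Diagnostic likelihood ratios from test characteristics:
    LR+ = sens / (1 - spec);  LR- = (1 - sens) / spec."""
    return (sensitivity / (1.0 - specificity),
            (1.0 - sensitivity) / specificity)

# The LVO figures reported above for CPSSS score >= 2:
plr, nlr = likelihood_ratios(0.83, 0.40)   # roughly 1.4 and 0.4
```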
Shih, Weichung Joe; Li, Gang; Wang, Yining
2016-03-01
Sample size plays a crucial role in clinical trials. Flexible sample-size designs, as part of the more general category of adaptive designs that utilize interim data, have been a popular topic in recent years. In this paper, we give a comparative review of four related methods for such a design. The likelihood method uses the likelihood ratio test with an adjusted critical value. The weighted method adjusts the test statistic with given weights rather than the critical value. The dual test method requires both the likelihood ratio statistic and the weighted statistic to be greater than the unadjusted critical value. The promising zone approach uses the likelihood ratio statistic with the unadjusted value and other constraints. All four methods preserve the type-I error rate. In this paper we explore their properties and compare their relationships and merits. We show that the sample size rules for the dual test are in conflict with the rules of the promising zone approach. We delineate what is necessary to specify in the study protocol to ensure the validity of the statistical procedure and what can be kept implicit in the protocol so that more flexibility can be attained for confirmatory phase III trials in meeting regulatory requirements. We also prove that under mild conditions, the likelihood ratio test still preserves the type-I error rate when the actual sample size is larger than the re-calculated one. Copyright © 2015 Elsevier Inc. All rights reserved.
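Type-I error preservation of the kind compared above is typically verified by simulating under the null hypothesis and checking that the rejection rate stays near the nominal level. A generic sketch (a fixed-sample z-test, not any of the four adaptive methods in the paper):

```python
import random
import statistics

def ztest_reject(sample, mu0=0.0, sigma=1.0, z_crit=1.96):
    """Two-sided z-test with known sigma: reject H0 when |z| exceeds z_crit."""
    n = len(sample)
    z = (statistics.fmean(sample) - mu0) / (sigma / n ** 0.5)
    return abs(z) > z_crit

rng = random.Random(42)
n_trials, n_per_trial = 10000, 50
rejections = sum(
    ztest_reject([rng.gauss(0.0, 1.0) for _ in range(n_per_trial)])
    for _ in range(n_trials)
)
type1_rate = rejections / n_trials  # should hover near the nominal 0.05
```

For an adaptive design, the same check is run with the interim sample-size rule embedded in each simulated trial, which is how a claimed adjusted critical value or weighting scheme is validated.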
Multi-Stage Maximum Likelihood Target Estimator
2006-03-31
i. Compute the range at t_i with respect to the sensor associated with the ith measurement (R_i): R_i = sqrt(Rx^2 + Ry^2) (3.2.2). ii. Compute the range at t_j with respect to the sensor associated with the ith measurement (R_j): R_j = sqrt(Rx^2 + Ry^2) (3.2.3). iii. ...
Figler, Bradley D; Mack, Christopher D; Kaufman, Robert; Wessells, Hunter; Bulger, Eileen; Smith, Thomas G; Voelzke, Bryan
2014-03-01
The National Highway Traffic Safety Administration's New Car Assessment Program (NCAP) implemented side-impact crash testing on all new vehicles since 1998 to assess the likelihood of major thoracoabdominal injuries during a side-impact crash. Higher crash test rating is intended to indicate a safer car, but the real-world applicability of these ratings is unknown. Our objective was to determine the relationship between a vehicle's NCAP side-impact crash test rating and the risk of major thoracoabdominal injury among the vehicle's occupants in real-world side-impact motor vehicle crashes. The National Automotive Sampling System Crashworthiness Data System contains detailed crash and injury data in a sample of major crashes in the United States. For model years 1998 to 2010 and crash years 1999 to 2010, 68,124 occupants were identified in the Crashworthiness Data System database. Because 47% of cases were missing crash severity (ΔV), multiple imputation was used to estimate the missing values. The primary predictor of interest was the occupant vehicle's NCAP side-impact crash test rating, and the outcome of interest was the presence of major (Abbreviated Injury Scale [AIS] score ≥ 3) thoracoabdominal injury. In multivariate analysis, increasing NCAP crash test rating was associated with lower likelihood of major thoracoabdominal injury at high (odds ratio [OR], 0.8; 95% confidence interval [CI], 0.7-0.9; p < 0.01) and medium (OR, 0.9; 95% CI, 0.8-1.0; p < 0.05) crash severity (ΔV), but not at low ΔV (OR, 0.95; 95% CI, 0.8-1.2; p = 0.55). In our model, older age and absence of seat belt use were associated with greater likelihood of major thoracoabdominal injury at low and medium ΔV (p < 0.001), but not at high ΔV (p ≥ 0.09). Among adults in model year 1998 to 2010 vehicles involved in medium and high severity motor vehicle crashes, a higher NCAP side-impact crash test rating is associated with a lower likelihood of major thoracoabdominal trauma. 
Epidemiologic study, level III.
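Odds ratios with confidence intervals of the kind reported above can be computed from a 2x2 table on the log-odds scale (in the study itself they come from a multivariate model). A sketch with hypothetical counts (not data from the study):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio from a 2x2 table (a, b = exposed with/without injury;
    c, d = unexposed with/without injury) with a Wald 95% CI computed on
    the log-odds scale."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, (lo, hi)

# Hypothetical counts for illustration (not the study's data):
or_, (lo, hi) = odds_ratio_ci(40, 160, 50, 150)
```

An OR below 1 with a CI excluding 1 would indicate a protective association, such as the 0.8 (0.7-0.9) reported for higher crash test ratings at high crash severity.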
Estimating the variance for heterogeneity in arm-based network meta-analysis.
Piepho, Hans-Peter; Madden, Laurence V; Roger, James; Payne, Roger; Williams, Emlyn R
2018-04-19
Network meta-analysis can be implemented by using arm-based or contrast-based models. Here we focus on arm-based models and fit them using generalized linear mixed model procedures. Full maximum likelihood (ML) estimation leads to biased trial-by-treatment interaction variance estimates for heterogeneity. Thus, our objective is to investigate alternative approaches to variance estimation that reduce bias compared with full ML. Specifically, we use penalized quasi-likelihood/pseudo-likelihood and hierarchical (h) likelihood approaches. In addition, we consider a novel model modification that yields estimators akin to the residual maximum likelihood estimator for linear mixed models. The proposed methods are compared by simulation, and 2 real datasets are used for illustration. Simulations show that penalized quasi-likelihood/pseudo-likelihood and h-likelihood reduce bias and yield satisfactory coverage rates. Sum-to-zero restriction and baseline contrasts for random trial-by-treatment interaction effects, as well as a residual ML-like adjustment, also reduce bias compared with an unconstrained model when ML is used, but coverage rates are not quite as good. Penalized quasi-likelihood/pseudo-likelihood and h-likelihood are therefore recommended. Copyright © 2018 John Wiley & Sons, Ltd.
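The ML bias that motivates the paper appears already in the simplest possible case: after estimating a mean, the ML variance estimator divides by n and is biased downward by the factor (n-1)/n, while a REML-style estimator divides by n-1 and removes the bias. A quick Monte Carlo check of this toy analogue (not the paper's mixed-model setting):

```python
import random
import statistics

rng = random.Random(1)
true_var, n, reps = 4.0, 5, 20000

ml_est, reml_est = [], []
for _ in range(reps):
    x = [rng.gauss(0.0, true_var ** 0.5) for _ in range(n)]
    m = statistics.fmean(x)
    ss = sum((xi - m) ** 2 for xi in x)
    ml_est.append(ss / n)          # full ML: biased downward by (n - 1) / n
    reml_est.append(ss / (n - 1))  # REML-style: unbiased in this simple case

ml_mean = statistics.fmean(ml_est)      # approx true_var * (n - 1) / n = 3.2
reml_mean = statistics.fmean(reml_est)  # approx true_var = 4.0
```

The bias worsens as the number of estimated fixed effects grows relative to the data, which is exactly the situation with trial-by-treatment interaction variances in arm-based network meta-analysis.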
Structural effect of size on interracial friendship
Cheng, Siwei; Xie, Yu
2013-01-01
Social contexts exert structural effects on individuals’ social relationships, including interracial friendships. In this study, we posit that, net of group composition, total context size has a distinct effect on interracial friendship. Under the assumptions of (i) maximization of preference in choosing a friend, (ii) multidimensionality of preference, and (iii) preference for same-race friends, we conducted analyses using microsimulation that yielded three main findings. First, increased context size decreases the likelihood of forming an interracial friendship. Second, the size effect increases with the number of preference dimensions. Third, the size effect is diluted by noise, i.e., the random component affecting friendship formation. Analysis of actual friendship data among 4,745 American high school students yielded results consistent with the main conclusion that increased context size promotes racial segregation and discourages interracial friendship. PMID:23589848
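The size effect can be reproduced in a toy version of such a microsimulation: if a chooser befriends the candidate maximizing a random multidimensional fit plus a fixed same-race bonus, the bonus dominates as the pool grows, because the spread of the maximum of many random draws shrinks. All parameters below are illustrative, not the paper's:

```python
import random

def interracial_rate(n_candidates, same_frac=0.5, race_bonus=1.0,
                     trials=2000, seed=7):
    """Toy microsimulation: a chooser picks the candidate maximizing
    (random multidimensional fit + a fixed same-race bonus). Returns the
    fraction of simulated choices that are interracial."""
    rng = random.Random(seed)
    interracial = 0
    for _ in range(trials):
        best_u, best_same = float("-inf"), True
        for _ in range(n_candidates):
            same = rng.random() < same_frac
            u = rng.gauss(0.0, 1.0) + (race_bonus if same else 0.0)
            if u > best_u:
                best_u, best_same = u, same
        if not best_same:
            interracial += 1
    return interracial / trials

small = interracial_rate(5)     # small context: chooser often compromises on race
large = interracial_rate(200)   # large context: the same-race bonus dominates
```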
Determining the accuracy of maximum likelihood parameter estimates with colored residuals
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.; Klein, Vladislav
1994-01-01
An important part of building high fidelity mathematical models based on measured data is calculating the accuracy associated with statistical estimates of the model parameters. Indeed, without some idea of the accuracy of parameter estimates, the estimates themselves have limited value. In this work, an expression based on theoretical analysis was developed to properly compute parameter accuracy measures for maximum likelihood estimates with colored residuals. This result is important because experience from the analysis of measured data reveals that the residuals from maximum likelihood estimation are almost always colored. The calculations involved can be appended to conventional maximum likelihood estimation algorithms. Simulated data runs were used to show that the parameter accuracy measures computed with this technique accurately reflect the quality of the parameter estimates from maximum likelihood estimation without the need for analysis of the output residuals in the frequency domain or heuristically determined multiplication factors. The result is general, although the application studied here is maximum likelihood estimation of aerodynamic model parameters from flight test data.
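A simple way to see why colored residuals matter: when residuals are autocorrelated, the white-noise formula for a parameter's standard error can be badly optimistic. The Monte Carlo sketch below estimates a mean under AR(1) noise and compares the naive standard error with the actual sampling spread (a toy analogue, not the authors' aerodynamic application):

```python
import random
import statistics

def ar1_series(n, phi, rng):
    """AR(1) noise e_t = phi * e_{t-1} + w_t: a simple 'colored' residual."""
    e, out = 0.0, []
    for _ in range(n):
        e = phi * e + rng.gauss(0.0, 1.0)
        out.append(e)
    return out

rng = random.Random(3)
n, phi, reps = 200, 0.8, 2000
estimates, naive_ses = [], []
for _ in range(reps):
    y = ar1_series(n, phi, rng)      # data = true mean 0 + colored noise
    estimates.append(statistics.fmean(y))
    naive_ses.append(statistics.stdev(y) / n ** 0.5)  # white-noise SE formula

actual_sd = statistics.stdev(estimates)  # true sampling spread of the estimate
naive_sd = statistics.fmean(naive_ses)   # what the white-noise formula claims
ratio = actual_sd / naive_sd             # > 1: the naive SE is too optimistic
```

With phi = 0.8 the naive formula understates the uncertainty roughly threefold, the kind of error the corrected accuracy measures are designed to avoid.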
Phylogenetic Analyses of Meloidogyne Small Subunit rDNA.
De Ley, Irma Tandingan; De Ley, Paul; Vierstraete, Andy; Karssen, Gerrit; Moens, Maurice; Vanfleteren, Jacques
2002-12-01
Phylogenies were inferred from nearly complete small subunit (SSU) 18S rDNA sequences of 12 species of Meloidogyne and 4 outgroup taxa (Globodera pallida, Nacobbus abberans, Subanguina radicicola, and Zygotylenchus guevarai). Alignments were generated manually from a secondary structure model, and computationally using ClustalX and Treealign. Trees were constructed using distance, parsimony, and likelihood algorithms in PAUP* 4.0b4a. Obtained tree topologies were stable across algorithms and alignments, supporting 3 clades: clade I = [M. incognita (M. javanica, M. arenaria)]; clade II = M. duytsi and M. maritima in an unresolved trichotomy with (M. hapla, M. microtyla); and clade III = (M. exigua (M. graminicola, M. chitwoodi)). Monophyly of [(clade I, clade II) clade III] was given maximal bootstrap support (mbs). M. artiellia was always a sister taxon to this joint clade, while M. ichinohei was consistently placed with mbs as a basal taxon within the genus. Affinities with the outgroup taxa remain unclear, although G. pallida and S. radicicola were never placed as closest relatives of Meloidogyne. Our results show that SSU sequence data are useful in addressing deeper phylogeny within Meloidogyne, and that both M. ichinohei and M. artiellia are credible outgroups for phylogenetic analysis of speciations among the major species.
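Distance methods like those used above start from corrected pairwise distances; the simplest correction is the Jukes-Cantor formula d = -(3/4) ln(1 - 4p/3), with p the proportion of differing aligned sites. A sketch with toy sequences (not the study's SSU rDNA data):

```python
import math

def jc69_distance(seq1, seq2):
    """Jukes-Cantor corrected distance between two aligned DNA sequences:
    d = -(3/4) * ln(1 - (4/3) * p), where p is the proportion of differing
    sites. A distance sketch only -- the study built trees with distance,
    parsimony, and likelihood algorithms in PAUP*."""
    assert len(seq1) == len(seq2)
    diffs = sum(a != b for a, b in zip(seq1, seq2))
    p = diffs / len(seq1)
    return -0.75 * math.log(1.0 - (4.0 / 3.0) * p)

d = jc69_distance("ACGTACGTAC", "ACGTACGAAC")   # one difference in ten sites
```

The correction inflates the raw proportion (here p = 0.1 becomes d ≈ 0.107) to account for multiple substitutions at the same site.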
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Chun Chieh, E-mail: anna.lin@cancer.org; Bruinooge, Suanna S.; Kirkwood, M. Kelsey
Purpose: Trimodality therapy (chemoradiation and surgery) is the standard of care for stage II/III rectal cancer but nearly one third of patients do not receive radiation therapy (RT). We examined the relationship between the density of radiation oncologists and the travel distance to receipt of RT. Methods and Materials: A retrospective study based on the National Cancer Data Base identified 26,845 patients aged 18 to 80 years with stage II/III rectal cancer diagnosed from 2007 to 2010. Radiation oncologists were identified through the Physician Compare dataset. Generalized estimating equations clustering by hospital service area was used to examine the association between geographic access and receipt of RT, controlling for patient sociodemographic and clinical characteristics. Results: Of the 26,845 patients, 70% received RT within 180 days of diagnosis or within 90 days of surgery. Compared with a travel distance of <12.5 miles, patients diagnosed at a reporting facility who traveled ≥50 miles had a decreased likelihood of receipt of RT (50-249 miles, adjusted odds ratio 0.75, P<.001; ≥250 miles, adjusted odds ratio 0.46; P=.002), all else being equal. The density level of radiation oncologists was not significantly associated with the receipt of RT. Patients who were female, nonwhite, and aged ≥50 years and had comorbidities were less likely to receive RT (P<.05). Patients who were uninsured but self-paid for their medical services, initially diagnosed elsewhere but treated at a reporting facility, or resided in the Midwest had an increased likelihood of receipt of RT (P<.05). Conclusions: An increased travel burden was associated with a decreased likelihood of receiving RT for patients with stage II/III rectal cancer, all else being equal; however, radiation oncologist density was not.
Further research of geographic access and establishing transportation assistance programs or lodging services for patients with an unmet need might help decrease geographic barriers and improve the quality of rectal cancer care.
Lin, Chun Chieh; Bruinooge, Suanna S; Kirkwood, M Kelsey; Hershman, Dawn L; Jemal, Ahmedin; Guadagnolo, B Ashleigh; Yu, James B; Hopkins, Shane; Goldstein, Michael; Bajorin, Dean; Giordano, Sharon H; Kosty, Michael; Arnone, Anna; Hanley, Amy; Stevens, Stephanie; Olsen, Christine
2016-03-15
Trimodality therapy (chemoradiation and surgery) is the standard of care for stage II/III rectal cancer but nearly one third of patients do not receive radiation therapy (RT). We examined the relationship of radiation oncologist density and travel distance to receipt of RT. A retrospective study based on the National Cancer Data Base identified 26,845 patients aged 18 to 80 years with stage II/III rectal cancer diagnosed from 2007 to 2010. Radiation oncologists were identified through the Physician Compare dataset. Generalized estimating equations, clustering by hospital service area, were used to examine the association between geographic access and receipt of RT, controlling for patient sociodemographic and clinical characteristics. Of the 26,845 patients, 70% received RT within 180 days of diagnosis or within 90 days of surgery. Compared with a travel distance of <12.5 miles, patients diagnosed at a reporting facility who traveled ≥50 miles had a decreased likelihood of receipt of RT (50-249 miles, adjusted odds ratio 0.75, P<.001; ≥250 miles, adjusted odds ratio 0.46; P=.002), all else being equal. The density level of radiation oncologists was not significantly associated with the receipt of RT. Patients who were female, nonwhite, and aged ≥50 years and had comorbidities were less likely to receive RT (P<.05). Patients who were uninsured but self-paid for their medical services, were initially diagnosed elsewhere but treated at a reporting facility, or resided in the Midwest had an increased likelihood of receipt of RT (P<.05). An increased travel burden was associated with a decreased likelihood of receiving RT for patients with stage II/III rectal cancer, all else being equal; however, radiation oncologist density was not. Further research on geographic access, along with transportation assistance programs or lodging services for patients with unmet needs, might help decrease geographic barriers and improve the quality of rectal cancer care.
Copyright © 2016 Elsevier Inc. All rights reserved.
Eng, Kenny; Carlisle, Daren M.; Wolock, David M.; Falcone, James A.
2013-01-01
An approach is presented in this study to aid water-resource managers in characterizing streamflow alteration at ungauged rivers. Such approaches can be used to take advantage of the substantial amounts of biological data collected at ungauged rivers to evaluate the potential ecological consequences of altered streamflows. National-scale random forest statistical models are developed to predict the likelihood that ungauged rivers have altered streamflows (relative to expected natural condition) for five hydrologic metrics (HMs) representing different aspects of the streamflow regime. The models use human disturbance variables, such as number of dams and road density, to predict the likelihood of streamflow alteration. For each HM, separate models are derived to predict the likelihood that the observed metric is greater than (‘inflated’) or less than (‘diminished’) natural conditions. The utility of these models is demonstrated by applying them to all river segments in the South Platte River in Colorado, USA, and for all 10-digit hydrologic units in the conterminous United States. In general, the models successfully predicted the likelihood of alteration to the five HMs at the national scale as well as in the South Platte River basin. However, the models predicting the likelihood of diminished HMs consistently outperformed models predicting inflated HMs, possibly because of fewer sites across the conterminous United States where HMs are inflated. The results of these analyses suggest that the primary predictors of altered streamflow regimes across the Nation are (i) the residence time of annual runoff held in storage in reservoirs, (ii) the degree of urbanization measured by road density and (iii) the extent of agricultural land cover in the river basin.
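The random-forest idea above can be sketched in miniature: an ensemble of bootstrapped decision stumps whose vote fraction serves as a likelihood of streamflow alteration. The two predictors (dam count, road density) and all data values below are hypothetical illustrations, not the study's models.

```python
import random

def fit_stump(X, y):
    """Fit a one-feature threshold classifier minimizing training error."""
    best = None
    f = random.randrange(len(X[0]))          # random feature, as in a random forest
    for t in sorted({row[f] for row in X}):
        for flip in (False, True):
            pred = [(row[f] > t) != flip for row in X]
            err = sum(p != yi for p, yi in zip(pred, y))
            if best is None or err < best[0]:
                best = (err, f, t, flip)
    _, f, t, flip = best
    return lambda row: (row[f] > t) != flip

def fit_forest(X, y, n_trees=25):
    forest = []
    for _ in range(n_trees):
        idx = [random.randrange(len(X)) for _ in range(len(X))]   # bootstrap sample
        forest.append(fit_stump([X[i] for i in idx], [y[i] for i in idx]))
    return forest

def alteration_likelihood(forest, row):
    """Fraction of trees voting 'altered' -- a crude likelihood score."""
    return sum(tree(row) for tree in forest) / len(forest)

random.seed(0)
# Hypothetical disturbance predictors per basin: [number of dams, road density]
X = [[8, 5.0], [9, 6.1], [7, 5.5], [10, 6.8], [8, 5.9], [9, 5.2], [11, 6.3], [7, 6.6],
     [1, 0.4], [0, 0.9], [2, 0.7], [1, 1.1], [0, 0.5], [2, 1.3], [1, 0.8], [0, 0.6]]
y = [True] * 8 + [False] * 8        # True = streamflow metric altered

forest = fit_forest(X, y)
print(alteration_likelihood(forest, [9, 6.0]))   # heavily disturbed basin
print(alteration_likelihood(forest, [0, 0.5]))   # near-natural basin
```

The vote fraction plays the role of the predicted likelihood of alteration; production models would of course use full decision trees and far more predictors.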
On the complex quantification of risk: systems-based perspective on terrorism.
Haimes, Yacov Y
2011-08-01
This article highlights the complexity of the quantification of the multidimensional risk function, develops five systems-based premises on quantifying the risk of terrorism to a threatened system, and advocates the quantification of vulnerability and resilience through the states of the system. The five premises are: (i) There exists interdependence between a specific threat to a system by terrorist networks and the states of the targeted system, as represented through the system's vulnerability, resilience, and criticality-impact. (ii) A specific threat, its probability, its timing, the states of the targeted system, and the probability of consequences can be interdependent. (iii) The two questions in the risk assessment process: "What is the likelihood?" and "What are the consequences?" can be interdependent. (iv) Risk management policy options can reduce both the likelihood of a threat to a targeted system and the associated likelihood of consequences by changing the states (including both vulnerability and resilience) of the system. (v) The quantification of risk to a vulnerable system from a specific threat must be built on a systemic and repeatable modeling process, by recognizing that the states of the system constitute an essential step to construct quantitative metrics of the consequences based on intelligence gathering, expert evidence, and other qualitative information. The fact that the states of all systems are functions of time (among other variables) makes the time frame pivotal in each component of the process of risk assessment, management, and communication. Thus, risk to a system, caused by an initiating event (e.g., a threat) is a multidimensional function of the specific threat, its probability and time frame, the states of the system (representing vulnerability and resilience), and the probabilistic multidimensional consequences. © 2011 Society for Risk Analysis.
Guiberson, Mark; Rodríguez, Barbara L
2010-08-01
To describe the concurrent validity and classification accuracy of 2 Spanish parent surveys of language development, the Spanish Ages and Stages Questionnaire (ASQ; Squires, Potter, & Bricker, 1999) and the Pilot Inventario-III (Pilot INV-III; Guiberson, 2008a). Forty-eight Spanish-speaking parents of preschool-age children participated. Twenty-two children had expressive language delays, and 26 had typical language development. The parents completed the Spanish ASQ and the Pilot INV-III at home, and the Preschool Language Scale, Fourth Edition: Spanish Edition (PLS-4 Spanish; Zimmerman, Steiner, & Pond, 2002) was administered to the children at preschool centers. The Spanish ASQ and Pilot INV-III were significantly correlated with the PLS-4 Spanish, establishing concurrent validity. On both surveys, children with expressive language delays scored significantly lower than children with typical development. The Spanish ASQ demonstrated unacceptably low sensitivity (59%) and good specificity (92%), while the Pilot INV-III demonstrated fair sensitivity (82%) and specificity (81%). Likelihood ratios and posttest probability revealed that the Pilot INV-III may assist in detection of expressive language delays, but viewed alone it is insufficient to make an unconditional screening determination. Results suggest that Spanish parent surveys hold promise for screening language delay in Spanish-speaking preschool children; however, further refinement of these tools is needed.
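The likelihood-ratio and posttest-probability arithmetic used above is straightforward; a minimal sketch using the Pilot INV-III figures reported (82% sensitivity, 81% specificity):

```python
def likelihood_ratios(sensitivity, specificity):
    """LR+ = sens / (1 - spec); LR- = (1 - sens) / spec."""
    return sensitivity / (1 - specificity), (1 - sensitivity) / specificity

def posttest_probability(pretest_prob, lr):
    """Convert a pretest probability to a posttest probability via odds."""
    odds = pretest_prob / (1 - pretest_prob) * lr
    return odds / (1 + odds)

# Figures reported for the Pilot INV-III: sensitivity 82%, specificity 81%.
lr_pos, lr_neg = likelihood_ratios(0.82, 0.81)
print(round(lr_pos, 2))                              # 4.32
print(round(posttest_probability(0.5, lr_pos), 2))   # 0.81 (assuming a 50% pretest probability)
```

An LR+ around 4 shifts a 50% pretest probability to roughly 81%, which is why the survey "may assist in detection" but cannot settle a screening decision on its own.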
Koshy, Matthew; Malik, Renuka; Spiotto, Michael; Mahmood, Usama; Rusthoven, Chad G; Sher, David J
2017-06-01
To determine the effect of radiotherapy (RT) technique on treatment compliance and overall survival (OS) in patients with stage III non-small cell lung cancer (NSCLC) treated with definitive chemoradiotherapy (CRT). This study included patients with stage III NSCLC in the National Cancer Database treated between 2003 and 2011 with definitive CRT to 60-63 Gray (Gy). Radiation treatment interruption (RTI) was defined as a break of ≥4 days. Treatment technique was dichotomized as intensity modulated RT (IMRT) or non-IMRT. Of the cohort of 7492 patients, 35% had an RTI and 10% received IMRT. With a median follow-up of surviving patients of 32 months, the median survival for those with non-IMRT vs. IMRT was 18.2 months vs. 20 months (p<0.0001). Median survival for those with and without an RTI ≥4 days was 16.1 months vs. 19.8 months (p<0.0001). Use of IMRT predicted for a decreased likelihood of RTI (odds ratio 0.84, p=0.04). On multivariable analysis for OS, IMRT had a HR of 0.89 (95% confidence interval (CI): 0.80-0.98, p=0.01) and RTI had a HR of 1.2 (95% CI: 1.14-1.27, p=0.001). IMRT was associated with a small but significant survival advantage for patients with stage III NSCLC treated with CRT. An RTI led to inferior survival, and both IMRT and RTI were independently associated with OS. Additional research should investigate whether improved tolerability, reduced normal tissue exposure, or superior coverage drives the association between IMRT and improved survival. Copyright © 2017 Elsevier B.V. All rights reserved.
Crisi, Girolamo; Filice, Silvano; Michiara, Maria; Crafa, Pellegrino; Lana, Silvia
The objective of this study was to assess the performance of short echo time magnetic resonance spectroscopy (short TE MRS) for 2HG detection as a biomarker of isocitrate dehydrogenase (IDH) status in all-grade glioma (GL). A total of 82 GL patients were prospectively investigated by short TE MRS at 3.0 T as part of a multimodal magnetic resonance imaging study protocol. Spectral analysis was performed using a linear combination model. Tumor specimens were diagnosed as IDH mutant or wild type according to the 2016 World Health Organization (WHO) classification of brain tumors. Spectra were analyzed for the presence of 2HG. The performance of short TE MRS was evaluated in terms of sensitivity, specificity, and positive and negative likelihood ratio on the overall sample and on GL WHO grades II and III and glioblastoma separately. The specificity and sensitivity estimated on the overall sample were 88% and 77%, respectively. In GL WHO grades II and III, 100% specificity and 75% sensitivity were estimated. We reiterate the feasibility of identifying the IDH status of brain GL using short TE MRS at 3.0 T. The method can correctly detect 2HG as an expression of IDH mutation in WHO grades II and III GL with 100% specificity but only 75% sensitivity. In the evaluation of glioblastoma, short TE MRS performs poorly, with a 17% false-positive rate.
Hurdle models for multilevel zero-inflated data via h-likelihood.
Molas, Marek; Lesaffre, Emmanuel
2010-12-30
Count data often exhibit overdispersion. One type of overdispersion arises when there is an excess of zeros in comparison with the standard Poisson distribution. Zero-inflated Poisson and hurdle models have been proposed to perform a valid likelihood-based analysis to account for the surplus of zeros. Further, data often arise in clustered, longitudinal or multiple-membership settings. The proper analysis needs to reflect the design of a study. Typically random effects are used to account for dependencies in the data. We examine the h-likelihood estimation and inference framework for hurdle models with random effects for complex designs. We extend the h-likelihood procedures to fit hurdle models, thereby extending h-likelihood to truncated distributions. Two applications of the methodology are presented. Copyright © 2010 John Wiley & Sons, Ltd.
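A Poisson hurdle model of the kind discussed above combines a point mass at zero with a zero-truncated Poisson for the positive counts; a minimal sketch (without random effects or the h-likelihood machinery):

```python
import math

def hurdle_logpmf(y, p_zero, lam):
    """Log-probability under a Poisson hurdle model:
    P(Y=0) = p_zero; for y > 0, a zero-truncated Poisson with rate lam."""
    if y == 0:
        return math.log(p_zero)
    log_trunc_poisson = (y * math.log(lam) - lam - math.lgamma(y + 1)
                         - math.log1p(-math.exp(-lam)))   # divide by P(Y>0) = 1 - e^-lam
    return math.log(1 - p_zero) + log_trunc_poisson

# The hurdle pmf is a proper distribution: it sums to one over its support.
total = sum(math.exp(hurdle_logpmf(y, 0.4, 2.0)) for y in range(50))
print(round(total, 6))   # 1.0
```

Because the zero part and the truncated count part factorize, their parameters can be estimated separately, which is what makes the hurdle formulation convenient to extend with random effects.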
Ozge, Aynur; Aydinlar, Elif; Tasdelen, Bahar
2015-01-01
Exploring clinical characteristics and migraine covariates may be useful in the diagnosis of migraine without aura. To evaluate the diagnostic value of the International Classification of Headache Disorders (ICHD)-III beta-based diagnosis of migraine without aura; to explore the covariates of possible migraine without aura using an analysis of grey zones in this area; and, finally, to make suggestions for the final version of the ICHD-III. A total of 1365 patients (mean [± SD] age 38.5±10.4 years, 82.8% female) diagnosed with migraine without aura according to the criteria of the ICHD-III beta were included in the present tertiary care-based retrospective study. Patients meeting all of the criteria of the ICHD-III beta were classified as having full migraine without aura, while those who did not meet one, two or ≥3 of the diagnostic criteria were classified as zones I, II and III, respectively. The diagnostic value of the clinical characteristics and covariates of migraine was determined. Full migraine without aura was evident in 25.7% of the migraineurs. A higher likelihood of zone I classification was shown for an attack lasting 4 h to 72 h (OR 1.560; P=0.002), with pulsating quality (OR 4.096; P<0.001), concomitant nausea/vomiting (OR 2.300; P<0.001) and photophobia/phonophobia (OR 4.865; P<0.001). The first-rank determinants for full migraine without aura were sleep irregularities (OR 1.596; P=0.005) and periodic vomiting (OR 1.464; P=0.026). However, even if not mentioned in the ICHD-III beta, the authors determined that motion sickness, abdominal pain or infantile colic attacks in childhood, associated dizziness and osmophobia have important diagnostic value.
In cases that do not fulfill all of the diagnostic criteria although they are largely consistent with the characteristics of migraine in clinical terms, the authors believe that a history of infantile colic; periodic vomiting (but not periodic vomiting syndrome); recurrent abdominal pain; the presence of motion sickness or vertigo, dizziness or osmophobia accompanying the pain; and comorbid atopic disorder are characteristics that should be discussed and considered as additional diagnostic criteria (covariates) in the preparation of the final version of the ICHD-III.
Assessment of parametric uncertainty for groundwater reactive transport modeling
Shi, Xiaoqing; Ye, Ming; Curtis, Gary P.; Miller, Geoffery L.; Meyer, Philip D.; Kohler, Matthias; Yabusaki, Steve; Wu, Jichun
2014-01-01
The validity of using Gaussian assumptions for model residuals in uncertainty quantification of a groundwater reactive transport model was evaluated in this study. Least squares regression methods explicitly assume Gaussian residuals, and the assumption leads to Gaussian likelihood functions, model parameters, and model predictions. While the Bayesian methods do not explicitly require the Gaussian assumption, Gaussian residuals are widely used. This paper shows that the residuals of the reactive transport model are non-Gaussian, heteroscedastic, and correlated in time; characterizing them requires using a generalized likelihood function such as the formal generalized likelihood function developed by Schoups and Vrugt (2010). For the surface complexation model considered in this study for simulating uranium reactive transport in groundwater, parametric uncertainty is quantified using the least squares regression methods and Bayesian methods with both Gaussian and formal generalized likelihood functions. While the least squares methods and Bayesian methods with Gaussian likelihood function produce similar Gaussian parameter distributions, the parameter distributions of Bayesian uncertainty quantification using the formal generalized likelihood function are non-Gaussian. In addition, predictive performance of the formal generalized likelihood function is superior to that of the least squares regression and Bayesian methods with Gaussian likelihood function. The Bayesian uncertainty quantification is conducted using the differential evolution adaptive Metropolis (DREAM(ZS)) algorithm; as a Markov chain Monte Carlo (MCMC) method, it is a robust tool for quantifying uncertainty in groundwater reactive transport models. For the surface complexation model, the regression-based local sensitivity analysis and the Morris- and DREAM(ZS)-based global sensitivity analyses yield almost identical rankings of parameter importance.
The uncertainty analysis may help select appropriate likelihood functions, improve model calibration, and reduce predictive uncertainty in other groundwater reactive transport and environmental modeling.
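The contrast between a Gaussian residual likelihood and a heavier-tailed alternative can be illustrated simply. The formal generalized likelihood of Schoups and Vrugt additionally models heteroscedasticity and autocorrelation; the Laplace density below is only a stand-in to show how the choice of likelihood changes the treatment of outlying residuals:

```python
import math

def gaussian_loglik(residuals, sigma=1.0):
    """Log-likelihood of i.i.d. N(0, sigma^2) residuals."""
    n = len(residuals)
    return (-0.5 * n * math.log(2 * math.pi * sigma**2)
            - sum(r * r for r in residuals) / (2 * sigma**2))

def laplace_loglik(residuals, b=1.0):
    """Log-likelihood of i.i.d. Laplace(0, b) residuals (heavier tails)."""
    n = len(residuals)
    return -n * math.log(2 * b) - sum(abs(r) for r in residuals) / b

# A single large outlier is penalized quadratically by the Gaussian model
# but only linearly by the Laplace model.
print(gaussian_loglik([0.1, -0.2, 5.0]))
print(laplace_loglik([0.1, -0.2, 5.0]))
```

Under MCMC sampling, this difference in penalty is what pulls Gaussian-likelihood posteriors toward fitting outliers and yields the non-Gaussian parameter distributions the abstract describes.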
Prigerson, H G; Shear, M K; Bierhals, A J; Zonarich, D L; Reynolds, C F
1996-01-01
The purpose of this study was to examine the ways in which childhood adversity, attachment and personality styles influenced the likelihood of having an anxiety disorder among aged caregivers for terminally ill spouses. We also sought to determine how childhood adversity and attachment/personality styles jointly influenced the likelihood of developing an anxiety disorder among aged caregivers. Data were derived from semistructured interviews with 50 spouses (aged 60 and above) of terminally ill patients. The Childhood Experience of Care and Abuse (CECA) record provided retrospective, behaviorally based information on childhood adversity. Measures of attachment and personality styles were obtained from self-report questionnaires, and the Structured Clinical Interview for the DSM-III-R (SCID) was used to determine diagnoses for anxiety disorders. Logistic regression models estimated the effects of childhood adversity, attachment/personality disturbances, and the interaction between the two on the likelihood of having an anxiety disorder. Results indicated that childhood adversity and paranoid, histrionic and self-defeating styles all directly increase the odds of having an anxiety disorder as an elderly spousal caregiver. In addition, childhood adversity in conjunction with borderline, antisocial and excessively dependent styles increased the likelihood of having an anxiety disorder. The results indicate the need to investigate further the interaction between childhood experiences and current attachment/personality styles in their effects on the development of anxiety disorders.
Hock, Sabrina; Hasenauer, Jan; Theis, Fabian J
2013-01-01
Diffusion is a key component of many biological processes such as chemotaxis, developmental differentiation and tissue morphogenesis. Recently, it has become possible to assess the spatial gradients caused by diffusion in vitro and in vivo using microscopy-based imaging techniques. The resulting time series of two-dimensional, high-resolution images, in combination with mechanistic models, enable the quantitative analysis of the underlying mechanisms. However, such a model-based analysis is still challenging due to measurement noise and sparse observations, which result in uncertainties in the model parameters. We introduce a likelihood function for image-based measurements with log-normally distributed noise. Based upon this likelihood function, we formulate the maximum likelihood estimation problem, which is solved using PDE-constrained optimization methods. To assess the uncertainty and practical identifiability of the parameters we introduce profile likelihoods for diffusion processes. As proof of concept, we model certain aspects of the guidance of dendritic cells towards lymphatic vessels, an example of haptotaxis. Using a realistic set of artificial measurement data, we estimate the five kinetic parameters of this model and compute profile likelihoods. Our novel approach for estimating model parameters from image data, as well as the proposed identifiability analysis, is widely applicable to diffusion processes. The profile likelihood based method provides more rigorous uncertainty bounds, in contrast to local approximation methods.
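A profile likelihood of the sort introduced above can be sketched for a toy exponential-decay model: the nuisance amplitude is optimized out analytically for each value of the rate parameter of interest (all values below are hypothetical):

```python
import math

def profile_rss(k, times, obs):
    """Profile out the linear amplitude a in y = a*exp(-k*t) and return
    the residual sum of squares at the conditionally optimal a."""
    basis = [math.exp(-k * t) for t in times]
    a_hat = sum(b * y for b, y in zip(basis, obs)) / sum(b * b for b in basis)
    return sum((y - a_hat * b) ** 2 for b, y in zip(basis, obs))

# Noise-free synthetic data from a = 2, k = 0.5 (hypothetical values).
times = [0.0, 0.5, 1.0, 1.5, 2.0, 3.0]
obs = [2.0 * math.exp(-0.5 * t) for t in times]

# Under Gaussian noise the profile log-likelihood is, up to a constant,
# -RSS/(2*sigma^2), so minimizing the profile RSS maximizes it.
grid = [0.1 * i for i in range(1, 21)]
best_k = min(grid, key=lambda k: profile_rss(k, times, obs))
print(best_k)   # 0.5
```

Plotting `profile_rss` over the grid gives the profile likelihood curve whose width (relative to a chi-square threshold) yields the rigorous uncertainty bounds the abstract contrasts with local approximations.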
1982-07-01
palladium acetate and the appropriate phosphine. This procedure is known to be effective for bromoarenes. In the early screening runs, 4...Delaware), he indicated that he also had screened many phosphines, and the likelihood of success was very small. Dr. Heck reported that the palladium...any simple modification of the palladium phosphine catalyst system will effect the desired reaction. 5 III. PREPARATION OF OLIGOMERIC BENZILS AND
MANTIS: a phylogenetic framework for multi-species genome comparisons.
Tzika, Athanasia C; Helaers, Raphaël; Van de Peer, Yves; Milinkovitch, Michel C
2008-01-15
Practitioners of comparative genomics face huge analytical challenges as whole genome sequences and functional/expression data accumulate. Furthermore, the field would greatly benefit from a better integration of this wealth of data with evolutionary concepts. Here, we present MANTIS, a relational database for the analysis of (i) gains and losses of genes on specific branches of the metazoan phylogeny, (ii) reconstructed genome content of ancestral species and (iii) over- or under-representation of functions/processes and tissue specificity of gained, duplicated and lost genes. MANTIS estimates the most likely positions of gene losses on the true phylogeny using a maximum-likelihood function. A user-friendly interface and an extensive query system allow users to investigate questions pertaining to gene identity, phylogenetic mapping and function/expression parameters. MANTIS is freely available at http://www.mantisdb.org and constitutes the missing link between multi-species genome comparisons and functional analyses.
NASA Astrophysics Data System (ADS)
Gritsan, Andrei V.; Röntsch, Raoul; Schulze, Markus; Xiao, Meng
2016-09-01
In this paper, we investigate anomalous interactions of the Higgs boson with heavy fermions, employing shapes of kinematic distributions. We study the processes pp → tt̄ + H, bb̄ + H, tq + H, and pp → H → τ+τ−, and present applications of event generation, reweighting techniques for fast simulation of anomalous couplings, as well as matrix element techniques for optimal sensitivity. We extend the matrix element likelihood approach (MELA) technique, which proved to be a powerful matrix element tool for Higgs boson discovery and characterization during Run I of the LHC, and implement all analysis tools in the JHU generator framework. A next-to-leading-order QCD description of the pp → tt̄ + H process allows us to investigate the performance of the MELA in the presence of extra radiation. Finally, projections for LHC measurements through the end of Run III are presented.
THESEUS: maximum likelihood superpositioning and analysis of macromolecular structures
Theobald, Douglas L.; Wuttke, Deborah S.
2008-01-01
THESEUS is a command line program for performing maximum likelihood (ML) superpositions and analysis of macromolecular structures. While conventional superpositioning methods use ordinary least-squares (LS) as the optimization criterion, ML superpositions provide substantially improved accuracy by down-weighting variable structural regions and by correcting for correlations among atoms. ML superpositioning is robust and insensitive to the specific atoms included in the analysis, and thus it does not require subjective pruning of selected variable atomic coordinates. Output includes both likelihood-based and frequentist statistics for accurate evaluation of the adequacy of a superposition and for reliable analysis of structural similarities and differences. THESEUS performs principal components analysis for analyzing the complex correlations found among atoms within a structural ensemble. PMID:16777907
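The effect of down-weighting variable regions can be sketched with a weighted least-squares (Kabsch) superposition; THESEUS's actual ML estimation is considerably richer, and the weights here are hypothetical stand-ins for its estimated precisions:

```python
import numpy as np

def weighted_superpose(X, Y, w):
    """Rotate/translate X onto Y minimizing sum_i w_i ||R(X_i - xc) + t - Y_i||^2
    (weighted Kabsch algorithm)."""
    w = w / w.sum()
    xc, yc = w @ X, w @ Y                    # weighted centroids
    A, B = X - xc, Y - yc
    H = (A * w[:, None]).T @ B               # weighted 3x3 covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    Xs = A @ R.T + yc                        # X superposed onto Y
    rmsd = np.sqrt((w * ((Xs - Y) ** 2).sum(axis=1)).sum())
    return Xs, rmsd

rng = np.random.default_rng(1)
X = rng.normal(size=(12, 3))                 # mock C-alpha coordinates
theta = 0.6
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta),  np.cos(theta), 0],
               [0, 0, 1]])
Y = X @ Rz.T + np.array([1.0, -2.0, 0.5])    # rigidly moved copy
_, rmsd = weighted_superpose(X, Y, np.ones(12))
print(rmsd)   # ≈ 0: the exact rigid motion is recovered

# Down-weighting one corrupted atom keeps the superposition tight elsewhere.
Y_out = Y.copy()
Y_out[0] += 50.0                             # a "variable region" outlier
w = np.ones(12)
w[0] = 1e-12
_, rmsd_w = weighted_superpose(X, Y_out, w)
print(rmsd_w)   # still ≈ 0
```

With uniform weights the outlier would drag the rotation and inflate the fit everywhere; down-weighting it reproduces, in miniature, why ML superpositions need no subjective pruning of variable atoms.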
DOUBLE-PEAKED NARROW-LINE ACTIVE GALACTIC NUCLEI. II. THE CASE OF EQUAL PEAKS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, K. L.; Shields, G. A.; Salviander, S.
Active galactic nuclei (AGNs) with double-peaked narrow lines (DPAGNs) may be caused by kiloparsec-scale binary AGNs, bipolar outflows, or rotating gaseous disks. We examine the class of DPAGNs in which the two narrow-line components have closely similar intensity as being especially likely to involve disks or jets. Two spectroscopic indicators support this likelihood. For DPAGNs from Smith et al., the 'equal-peaked' objects (EPAGNs) have [Ne V]/[O III] ratios lower than for a control sample of non-double-peaked AGNs. This is unexpected for a pair of normal AGNs in a galactic merger, but may be consistent with [O III] emission from a rotating ring with relatively little gas at small radii. Also, [O III]/Hβ ratios of the redshifted and blueshifted systems in the EPAGN are more similar to each other than in a control sample, suggestive of a single ionizing source and inconsistent with the binary interpretation.
Kuch, Ulrich; Keogh, J Scott; Weigel, John; Smith, Laurie A; Mebs, Dietrich
2005-03-01
King brown snakes or mulga snakes (Pseudechis australis) are the largest and among the most dangerous and wide-ranging venomous snakes in Australia and New Guinea. They occur in diverse habitats, are important predators, and exhibit considerable morphological variation. We infer the relationships and historical biogeography of P. australis based on phylogenetic analysis of 1,249 base pairs from the mitochondrial cytochrome b, NADH dehydrogenase subunit 4 and three adjacent tRNA genes using Bayesian, maximum-likelihood, and maximum-parsimony methods. All methods reveal deep phylogenetic structure with four strongly supported clades comprising snakes from New Guinea (I), localities all over Australia (II), the Kimberleys of Western Australia (III), and north-central Australia (IV), suggesting a much more ancient radiation than previously believed. This conclusion is robust to different molecular clock estimations indicating divergence in Pliocene or Late Miocene, after landbridge dispersal to New Guinea had occurred. While members of clades I, III and IV are medium-sized, slender snakes, those of clade II attain large sizes and a robust build, rendering them top predators in their ecosystems. Genetic differentiation within clade II is low and haplotype distribution largely incongruent with geography or colour morphs, suggesting Pleistocene dispersal and recent ecomorph evolution. Significant haplotype diversity exists in clades III and IV, implying that clade IV comprises two species. Members of clade II are broadly sympatric with members of both northern Australian clades. Thus, our data support the recognition of at least five species from within P. australis (auct.) under various criteria. We discuss biogeographical, ecological and medical implications of our findings.
Pettinger, L.R.
1982-01-01
This paper documents the procedures, results, and final products of a digital analysis of Landsat data used to produce a vegetation and landcover map of the Blackfoot River watershed in southeastern Idaho. Resource classes were identified at two levels of detail: generalized Level I classes (for example, forest land and wetland) and detailed Levels II and III classes (for example, conifer forest, aspen, wet meadow, and riparian hardwoods). Training set statistics were derived using a modified clustering approach. Environmental stratification that separated uplands from lowlands improved discrimination between resource classes having similar spectral signatures. Digital classification was performed using a maximum likelihood algorithm. Classification accuracy was determined on a single-pixel basis from a random sample of 25-pixel blocks. These blocks were transferred to small-scale color-infrared aerial photographs, and the image area corresponding to each pixel was interpreted. Classification accuracy, expressed as percent agreement of digital classification and photo-interpretation results, was 83.0 ± 2.1 percent (0.95 probability level) for generalized (Level I) classes and 52.2 ± 2.8 percent (0.95 probability level) for detailed (Levels II and III) classes. After the classified images were geometrically corrected, two types of maps were produced of Level I and Levels II and III resource classes: color-coded maps at a 1:250,000 scale, and flatbed-plotter overlays at a 1:24,000 scale. The overlays are more useful because of their larger scale, familiar format to users, and compatibility with other types of topographic and thematic maps of the same scale.
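A maximum likelihood classifier of the kind used for the Landsat pixels above assigns each pixel to the class whose fitted multivariate Gaussian gives the highest log-likelihood; the band values and class names below are hypothetical, and equal class priors are assumed:

```python
import numpy as np

def fit_gaussian_classes(samples):
    """Per-class mean vector and covariance matrix from training pixels."""
    stats = {}
    for label, X in samples.items():
        X = np.asarray(X, dtype=float)
        stats[label] = (X.mean(axis=0), np.cov(X, rowvar=False))
    return stats

def classify(pixel, stats):
    """Assign the class maximizing the Gaussian log-likelihood
    -0.5 * (log det(cov) + (x - mu)^T cov^-1 (x - mu)), equal priors assumed."""
    best_label, best_ll = None, -np.inf
    for label, (mu, cov) in stats.items():
        diff = np.asarray(pixel, dtype=float) - mu
        _, logdet = np.linalg.slogdet(cov)
        ll = -0.5 * (logdet + diff @ np.linalg.inv(cov) @ diff)
        if ll > best_ll:
            best_label, best_ll = label, ll
    return best_label

# Hypothetical two-band training pixels for two cover classes.
train = {
    "forest":  [[10, 10], [12, 11], [9, 13], [11, 9], [10, 12]],
    "wetland": [[50, 60], [52, 61], [49, 63], [51, 59], [50, 62]],
}
stats = fit_gaussian_classes(train)
print(classify([11, 11], stats))   # forest
print(classify([51, 60], stats))   # wetland
```

Because the covariance enters the score, the classifier accounts for each class's spectral spread rather than just distance to the class mean, which is what distinguishes maximum likelihood classification from minimum-distance labeling.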
Thomson, Jessica L; Tussing-Humphreys, Lisa M; Zoellner, Jamie M; Goodman, Melissa H
2016-08-01
Evaluating an intervention's theoretical basis can inform design modifications to produce more effective interventions. Hence the present study's purpose was to determine if effects from a multicomponent lifestyle intervention were mediated by changes in the psychosocial constructs decisional balance, self-efficacy and social support. Delta Body and Soul III, conducted from August 2011 to May 2012, was a 6-month, church-based, lifestyle intervention designed to improve diet quality and increase physical activity. Primary outcomes, diet quality and aerobic and strength/flexibility physical activity, as well as psychosocial constructs, were assessed via self-report, interviewer-administered surveys at baseline and post intervention. Mediation analyses were conducted using ordinary least squares (continuous outcomes) and maximum likelihood logistic (dichotomous outcomes) regression path analysis. Churches (five intervention and three control) were recruited from four counties in the Lower Mississippi Delta region of the USA. Rural, Southern, primarily African-American adults (n 321). Based upon results from the multiple mediation models, there was no evidence that treatment (intervention v. control) indirectly influenced changes in diet quality or physical activity through its effects on decisional balance, self-efficacy and social support. However, there was evidence for direct effects of social support for exercise on physical activity and of self-efficacy for sugar-sweetened beverages on diet quality. Results do not support the hypothesis that the psychosocial constructs decisional balance, self-efficacy and social support were the theoretical mechanisms by which the Delta Body and Soul III intervention influenced changes in diet quality and physical activity.
PAMLX: a graphical user interface for PAML.
Xu, Bo; Yang, Ziheng
2013-12-01
This note announces pamlX, a graphical user interface/front end for the paml (for Phylogenetic Analysis by Maximum Likelihood) program package (Yang Z. 1997. PAML: a program package for phylogenetic analysis by maximum likelihood. Comput Appl Biosci. 13:555-556; Yang Z. 2007. PAML 4: Phylogenetic analysis by maximum likelihood. Mol Biol Evol. 24:1586-1591). pamlX is written in C++ using the Qt library and communicates with paml programs through files. It can be used to create, edit, and print control files for paml programs and to launch paml runs. The interface is available for free download at http://abacus.gene.ucl.ac.uk/software/paml.html.
Robust Multipoint Water-Fat Separation Using Fat Likelihood Analysis
Yu, Huanzhou; Reeder, Scott B.; Shimakawa, Ann; McKenzie, Charles A.; Brittain, Jean H.
2016-01-01
Fat suppression is an essential part of routine MRI scanning. Multiecho chemical shift-based water-fat separation methods estimate and correct for B0 field inhomogeneity. However, they must contend with the intrinsic challenge of water-fat ambiguity that can result in water-fat swapping. This problem arises because the signals from two chemical species, when both are modeled as a single discrete spectral peak, may appear indistinguishable in the presence of B0 off-resonance. In conventional methods, the water-fat ambiguity is typically removed by enforcing field map smoothness using region-growing based algorithms. In reality, the fat spectrum has multiple spectral peaks. Using this spectral complexity, we introduce a novel concept that identifies water and fat for multiecho acquisitions by exploiting the spectral differences between water and fat. A fat likelihood map is produced to indicate whether a pixel is likely to be water-dominant or fat-dominant by comparing the fitting residuals of two different signal models. The fat likelihood analysis and field map smoothness provide complementary information, and we designed an algorithm (Fat Likelihood Analysis for Multiecho Signals) to exploit both mechanisms. It is demonstrated in a wide variety of data that the Fat Likelihood Analysis for Multiecho Signals algorithm offers highly robust water-fat separation for 6-echo acquisitions, particularly in some previously challenging applications. PMID:21842498
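The core idea, comparing the fitting residuals of a single-peak (water-like) model against a multi-peak (fat-like) model at each pixel, can be sketched with a linear least-squares fit of complex amplitudes. The echo times, fat peak frequencies, and amplitudes below are illustrative stand-ins, not the published spectral model:

```python
import numpy as np

te = np.arange(6) * 1.6e-3                      # 6 echo times (s), illustrative
fat_freqs = np.array([-420.0, -318.0, 94.0])    # hypothetical fat peak offsets (Hz)
fat_amps = np.array([0.62, 0.28, 0.10])         # hypothetical relative amplitudes

def fit_residual(signal, basis):
    """Linear least-squares fit of a complex amplitude; returns residual norm."""
    coef, *_ = np.linalg.lstsq(basis, signal, rcond=None)
    return np.linalg.norm(signal - basis @ coef)

water_basis = np.ones((len(te), 1), dtype=complex)          # single on-resonance peak
fat_basis = (fat_amps * np.exp(2j * np.pi * np.outer(te, fat_freqs))).sum(
    axis=1, keepdims=True)                                   # multi-peak fat model

# simulate a fat-dominant pixel: mostly fat signal plus a little water
sig = 0.9 * fat_basis[:, 0] + 0.1 * np.ones(len(te))
likely_fat = fit_residual(sig, fat_basis) < fit_residual(sig, water_basis)
```

The pixel is flagged fat-dominant because the multi-peak model explains its echo evolution with a much smaller residual; the published method combines such a map with field-map smoothness.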
Peyrot, W J; Lee, S H; Milaneschi, Y; Abdellaoui, A; Byrne, E M; Esko, T; de Geus, E J C; Hemani, G; Hottenga, J J; Kloiber, S; Levinson, D F; Lucae, S; Martin, N G; Medland, S E; Metspalu, A; Milani, L; Noethen, M M; Potash, J B; Rietschel, M; Rietveld, C A; Ripke, S; Shi, J; Willemsen, G; Zhu, Z; Boomsma, D I; Wray, N R; Penninx, B W J H
2015-06-01
An association between lower educational attainment (EA) and an increased risk for depression has been confirmed in various western countries. This study examines whether pleiotropic genetic effects contribute to this association. Data were analyzed from a total of 9662 major depressive disorder (MDD) cases and 14,949 controls (with no lifetime MDD diagnosis) from the Psychiatric Genomics Consortium with additional Dutch and Estonian data. The association of EA and MDD was assessed with logistic regression in 15,138 individuals, indicating a significantly negative association in our sample with an odds ratio for MDD of 0.78 (0.75-0.82) per standard deviation increase in EA. With data on 884,105 autosomal common single-nucleotide polymorphisms (SNPs), three methods were applied to test for pleiotropy between MDD and EA: (i) genetic profile risk scores (GPRS) derived from training data for EA (independent meta-analysis on ~120,000 subjects) and MDD (using a 10-fold leave-one-out procedure in the current sample), (ii) bivariate genomic-relationship-matrix restricted maximum likelihood (GREML) and (iii) SNP effect concordance analysis (SECA). With these methods, we found (i) that the EA-GPRS did not predict MDD status and the MDD-GPRS did not predict EA, (ii) a weak negative genetic correlation with bivariate GREML analyses, but this correlation was not consistently significant, and (iii) no evidence for concordance of MDD and EA SNP effects with SECA analysis. To conclude, our study confirms an association of lower EA and MDD risk, but this association was not explained by measurable pleiotropic genetic effects, which suggests that environmental factors could be involved, for example, socioeconomic status.
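A genetic profile risk score of the kind used in method (i) is, at its simplest, a weighted sum of risk-allele counts, with per-SNP weights taken from effect sizes estimated in independent training data. A toy sketch with simulated genotypes; all sizes and names are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
n_subjects, n_snps = 500, 1000

# 0/1/2 counts of the effect allele at each SNP for each subject (simulated)
genotypes = rng.integers(0, 3, size=(n_subjects, n_snps))

# per-SNP effect sizes from an independent training GWAS (simulated here)
train_betas = rng.normal(0.0, 0.01, size=n_snps)

# genetic profile risk score: weighted allele-count sum per subject
gprs = genotypes @ train_betas
```

In the study's design, the predictive value of such scores for the other trait (EA score predicting MDD status, and vice versa) is the pleiotropy test; here neither direction was predictive.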
Scale of Unpredictability Beliefs: Reliability and Validity.
Ross, Lisa Thomson; Short, Stephen D; Garofano, Marina
2016-11-16
Experiencing unpredictability in the environment has a variety of negative outcomes. However, these are difficult to ascertain due to the lack of a psychometrically sound measure of unpredictability beliefs. This article summarizes the development of the Scale of Unpredictability Beliefs (SUB), which assesses perceptions about unpredictability in one's life, in other people, and in the world. In Study I, college students (N = 305) responded to 68 potential items as well as other scales. Exploratory factor analysis yielded three internally consistent subscales (Self, People, and World; 16 items total). Higher SUB scores correlated with more childhood family unpredictability, greater likelihood of parental alcohol abuse, stronger causal uncertainty, and lower self-efficacy. In Study II, a confirmatory factor analysis supported the three-factor solution (N = 186 college students). SUB scores correlated with personality, childhood family unpredictability, and control beliefs. In most instances the SUB predicted family unpredictability and control beliefs beyond existing unpredictability measures. Study III confirmed the factor structure and replicated family unpredictability associations in an adult sample (N = 483). This article provides preliminary support for this new multi-dimensional, self-report assessment of unpredictability beliefs, and ideas for future research are discussed.
A Study of Item Bias for Attitudinal Measurement Using Maximum Likelihood Factor Analysis.
ERIC Educational Resources Information Center
Mayberry, Paul W.
A technique for detecting item bias that is responsive to attitudinal measurement considerations is a maximum likelihood factor analysis procedure comparing multivariate factor structures across various subpopulations, often referred to as SIFASP. The SIFASP technique allows for factorial model comparisons in the testing of various hypotheses…
Personality patterns and Smoking behavior among students in Tabriz, Iran
Fakharri, Ali; Jahani, Ali; Sadeghi-Bazargani, Homayoun; Farahbakhsh, Mostafa; Asl, Asghar Mohammadpour
2017-01-01
Introduction Psychological factors have long been considered for their role in risk-taking behaviors such as substance abuse, risky driving and smoking. The aim of this study was to determine the association between smoking behavior and potential personality patterns among high school students in Tabriz, Iran. Methods Through multistage sampling in a cross-sectional study, 1000 students were enrolled to represent the final-grade high school student population of Tabriz, Iran in 2013. Personality patterns, smoking status and background information were collected through standard questionnaires, including the Millon Clinical Multiaxial Inventory-III (MCMI-III), which assesses fourteen personality patterns and ten clinical syndromes. ANOVA and Kruskal-Wallis tests were used to compare numeric scales among the study participants with respect to their smoking status. The Stata version 13 statistical software package was used to analyze the data. Multivariate logistic regression was used to predict the likelihood of smoking by personality status. Results Two logistic models were developed, in both of which male sex was identified as a determinant of regular smoking (1st model) and ever-smoking (2nd model). Depressive personality increased the likelihood of being a regular smoker by 2.8 times (OR=2.8, 95% CI: 1.3–6.1). The second personality disorder included in the model was sadistic personality, with an odds ratio of 7.9 (95% CI: 1.2–53.0). Histrionic personality increased the likelihood of ever-smoking by 2.2 times (OR=2.2, 95% CI: 1.6–3.1), followed by borderline personality (OR=2.8, 95% CI: 0.97–8.1). Conclusion Histrionic and depressive personalities could be considered strong associates of smoking, followed by borderline and sadistic personalities. A causal relationship cannot be assumed unless well-controlled longitudinal studies reach the same findings using psychiatric interviews. PMID:28461869
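The odds ratios quoted above come from logistic regression; for a single binary predictor the same quantity can be computed directly from a 2x2 table, with a Wald confidence interval on the log-odds scale. The counts below are hypothetical, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR for a 2x2 table [[a, b], [c, d]] with a Wald 95% CI on the log scale."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# hypothetical counts: exposed smokers / exposed non-smokers,
# unexposed smokers / unexposed non-smokers
or_, lo, hi = odds_ratio_ci(20, 80, 10, 110)  # OR = (20*110)/(80*10) = 2.75
```

The wide interval reported for sadistic personality (1.2–53.0) is what such a Wald interval looks like when one cell of the table is small.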
NASA Astrophysics Data System (ADS)
Alsing, Justin; Wandelt, Benjamin; Feeney, Stephen
2018-07-01
Many statistical models in cosmology can be simulated forwards but have intractable likelihood functions. Likelihood-free inference methods allow us to perform Bayesian inference from these models using only forward simulations, free from any likelihood assumptions or approximations. Likelihood-free inference generically involves simulating mock data and comparing to the observed data; this comparison in data space suffers from the curse of dimensionality and requires compression of the data to a small number of summary statistics to be tractable. In this paper, we use massive asymptotically optimal data compression to reduce the dimensionality of the data space to just one number per parameter, providing a natural and optimal framework for summary statistic choice for likelihood-free inference. Secondly, we present the first cosmological application of Density Estimation Likelihood-Free Inference (DELFI), which learns a parametrized model for the joint distribution of data and parameters, yielding both the parameter posterior and the model evidence. This approach is conceptually simple, requires less tuning than traditional Approximate Bayesian Computation approaches to likelihood-free inference and can give high-fidelity posteriors from orders of magnitude fewer forward simulations. As an additional bonus, it enables parameter inference and Bayesian model comparison simultaneously. We demonstrate DELFI with massive data compression on an analysis of the joint light-curve analysis supernova data, as a simple validation case study. We show that high-fidelity posterior inference is possible for full-scale cosmological data analyses with as few as ~10^4 simulations, with substantial scope for further improvement, demonstrating the scalability of likelihood-free inference to large and complex cosmological data sets.
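DELFI fits a parametric density model to (compressed data, parameter) pairs; the simpler simulate-compress-compare pattern it improves on is rejection ABC. A toy sketch for inferring a Gaussian mean using one summary statistic per parameter; everything here is illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

# "observed" data from a Gaussian with unknown mean (truth = 1.5 here)
observed = rng.normal(1.5, 1.0, size=100)
obs_summary = observed.mean()   # one compressed summary per parameter

# rejection ABC: draw parameters from the prior, simulate forwards,
# and keep draws whose simulated summary lands close to the observed one
prior_draws = rng.uniform(-5.0, 5.0, size=20000)
sims = rng.normal(prior_draws[:, None], 1.0, size=(20000, 100))
accepted = prior_draws[np.abs(sims.mean(axis=1) - obs_summary) < 0.05]
posterior_mean = accepted.mean()
```

Most simulations are wasted on rejected draws; density-estimation approaches like DELFI instead use every simulation to fit the joint density, which is where the orders-of-magnitude savings come from.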
NASA Technical Reports Server (NTRS)
Hoffbeck, Joseph P.; Landgrebe, David A.
1994-01-01
Many analysis algorithms for high-dimensional remote sensing data require that the remotely sensed radiance spectra be transformed to approximate reflectance to allow comparison with a library of laboratory reflectance spectra. In maximum likelihood classification, however, the remotely sensed spectra are compared to training samples, thus a transformation to reflectance may or may not be helpful. The effect of several radiance-to-reflectance transformations on maximum likelihood classification accuracy is investigated in this paper. We show that the empirical line approach, LOWTRAN7, flat-field correction, single spectrum method, and internal average reflectance are all non-singular affine transformations, and that non-singular affine transformations have no effect on discriminant analysis feature extraction and maximum likelihood classification accuracy. (An affine transformation is a linear transformation with an optional offset.) Since the Atmosphere Removal Program (ATREM) and the log residue method are not affine transformations, experiments with Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) data were conducted to determine the effect of these transformations on maximum likelihood classification accuracy. The average classification accuracy of the data transformed by ATREM and the log residue method was slightly less than the accuracy of the original radiance data. Since the radiance-to-reflectance transformations allow direct comparison of remotely sensed spectra with laboratory reflectance spectra, they can be quite useful in labeling the training samples required by maximum likelihood classification, but these transformations have only a slight effect or no effect at all on discriminant analysis and maximum likelihood classification accuracy.
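The invariance result can be checked directly: a Gaussian maximum likelihood classifier retrained and applied after a non-singular affine transformation y = Ax + b produces identical labels, because Mahalanobis distances are unchanged and the log-determinant shifts by the same constant for every class. A small numerical sketch on simulated data, not AVIRIS:

```python
import numpy as np

rng = np.random.default_rng(3)

def ml_labels(train, labels, test):
    """Gaussian maximum likelihood classifier: argmax of class log-likelihood."""
    scores = []
    for k in np.unique(labels):
        x = train[labels == k]
        mu, cov = x.mean(axis=0), np.cov(x, rowvar=False)
        diff = test - mu
        inv = np.linalg.inv(cov)
        maha = np.einsum('ij,jk,ik->i', diff, inv, diff)     # Mahalanobis distances
        scores.append(-0.5 * (maha + np.log(np.linalg.det(cov))))
    return np.argmax(np.array(scores), axis=0)

train = np.vstack([rng.normal(0, 1, (100, 3)), rng.normal(2, 1, (100, 3))])
labels = np.repeat([0, 1], 100)
test = rng.normal(1, 1, (50, 3))

A = rng.normal(size=(3, 3)) + 3 * np.eye(3)   # non-singular linear part
b = rng.normal(size=3)                        # offset
same = np.array_equal(ml_labels(train, labels, test),
                      ml_labels(train @ A.T + b, labels, test @ A.T + b))
```

`same` comes out true: the affine transformation changes neither the training-sample discriminants nor the resulting classification map, which is the paper's point about the empirical line, LOWTRAN7, and related corrections.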
Likelihood-Based Confidence Intervals in Exploratory Factor Analysis
ERIC Educational Resources Information Center
Oort, Frans J.
2011-01-01
In exploratory or unrestricted factor analysis, all factor loadings are free to be estimated. In oblique solutions, the correlations between common factors are free to be estimated as well. The purpose of this article is to show how likelihood-based confidence intervals can be obtained for rotated factor loadings and factor correlations, by…
Vocational Qualifications, Employment Status and Income: 2006 Census Analysis. Technical Paper
ERIC Educational Resources Information Center
Daly, Anne
2011-01-01
Two features of the labour market for vocationally qualified workers are explored in this technical paper: the likelihood of self-employment versus wage employment and the determinants of income. The analysis showed that demographic, occupational and local labour market characteristics all influence the likelihood of self-employment. Self-employed…
High-Dimensional Exploratory Item Factor Analysis by a Metropolis-Hastings Robbins-Monro Algorithm
ERIC Educational Resources Information Center
Cai, Li
2010-01-01
A Metropolis-Hastings Robbins-Monro (MH-RM) algorithm for high-dimensional maximum marginal likelihood exploratory item factor analysis is proposed. The sequence of estimates from the MH-RM algorithm converges with probability one to the maximum likelihood solution. Details on the computer implementation of this algorithm are provided. The…
Reeder-Hayes, Katherine; Peacock Hinton, Sharon; Meng, Ke; Carey, Lisa A; Dusetzina, Stacie B
2016-06-10
Trastuzumab is a key component of adjuvant therapy for stage I to III human epidermal growth factor receptor 2 (HER2)-positive breast cancer. The rates and patterns of trastuzumab use have never been described in a population-based sample. The recent addition of HER2 information to the SEER-Medicare database offers an opportunity to examine patterns of trastuzumab use and to evaluate possible disparities in receipt of trastuzumab. We examined a national cohort of Medicare beneficiaries with incident stage I to III HER2-positive breast cancer diagnosed in 2010 and 2011 (n = 1,362). We used insurance claims data to track any use of trastuzumab in the 12 months after diagnosis as well as to identify chemotherapy drugs used in partnership with trastuzumab. We used modified Poisson regression analysis to evaluate the independent effect of race on the likelihood of receiving trastuzumab by controlling for clinical need, comorbidity, and community-level socioeconomic status. Overall, 50% of white women and 40% of black women received some trastuzumab therapy. Among women with stage III disease, 74% of whites and 56% of blacks received trastuzumab. After adjustment for tumor characteristics, poverty, and comorbidity, black women were 25% less likely to receive trastuzumab within 1 year of diagnosis than white women (risk ratio, 0.745; 95% CI, 0.60 to 0.93). Approximately one half of patients 65 years of age and older with stage I to III breast cancer do not receive trastuzumab-based therapy, which includes many with locally advanced disease. Significant racial disparities exist in the receipt of this highly effective therapy. Further research that identifies barriers to use and increases uptake of trastuzumab could potentially improve recurrence and survival outcomes in this population, particularly among minority women. © 2016 by American Society of Clinical Oncology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eifler, Tim; Krause, Elisabeth; Dodelson, Scott
2014-05-28
Systematic uncertainties that have been subdominant in past large-scale structure (LSS) surveys are likely to exceed statistical uncertainties of current and future LSS data sets, potentially limiting the extraction of cosmological information. Here we present a general framework (PCA marginalization) to consistently incorporate systematic effects into a likelihood analysis. This technique naturally accounts for degeneracies between nuisance parameters and can substantially reduce the dimension of the parameter space that needs to be sampled. As a practical application, we apply PCA marginalization to account for baryonic physics as an uncertainty in cosmic shear tomography. Specifically, we use CosmoLike to run simulated likelihood analyses on three independent sets of numerical simulations, each covering a wide range of baryonic scenarios differing in cooling, star formation, and feedback mechanisms. We simulate a Stage III (Dark Energy Survey) and Stage IV (Large Synoptic Survey Telescope/Euclid) survey and find a substantial bias in cosmological constraints if baryonic physics is not accounted for. We then show that PCA marginalization (employing at most 3 to 4 nuisance parameters) removes this bias. Our study demonstrates that it is possible to obtain robust, precise constraints on the dark energy equation of state even in the presence of large levels of systematic uncertainty in astrophysical processes. We conclude that the PCA marginalization technique is a powerful, general tool for addressing many of the challenges facing the precision cosmology program.
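A stripped-down version of the PCA idea: collect data-vector deviations under different baryonic scenarios, extract their leading principal components, and remove those directions from a contaminated data vector. The full method instead marginalizes over the PC amplitudes inside the likelihood; all vectors below are synthetic stand-ins:

```python
import numpy as np

rng = np.random.default_rng(4)
n_bins = 40
baseline = np.linspace(1.0, 2.0, n_bins)   # fiducial (dark-matter-only) data vector

# hypothetical baryonic scenarios: baseline plus scenario-specific distortions
scenarios = baseline + rng.normal(0, 0.05, size=(6, n_bins))
deviations = scenarios - baseline

# principal components of the systematic deviations (rows of vt are orthonormal)
_, _, vt = np.linalg.svd(deviations, full_matrices=False)
pcs = vt[:3]                               # leading 3 nuisance directions

# project a contaminated data vector onto the complement of the PCs
data = baseline + 0.8 * deviations[0]
resid = data - baseline
cleaned = baseline + resid - pcs.T @ (pcs @ resid)
improved = np.linalg.norm(cleaned - baseline) < np.linalg.norm(data - baseline)
```

Removing only the leading few components is the dimensionality saving the abstract refers to: 3 to 4 nuisance directions stand in for the full space of baryonic scenarios.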
Daher, Aqil Mohammad; Noor Khan Nor-Ashikin, Mohamed; Mat-Nasir, Nafiza; Keat Ng, Kien; Ambigga, Krishnapillai S.; Ariffin, Farnaza; Yasin Mazapuspavina, Md; Abdul-Razak, Suraya; Abdul-Hamid, Hasidah; Abd-Majid, Fadhlina; Abu-Bakar, Najmin; Nawawi, Hapizah; Yusoff, Khalid
2013-01-01
Metabolic syndrome (MetS) is a steering force for the cardiovascular diseases epidemic in Asia. This study aimed to compare the prevalence of MetS in Malaysian adults using NCEP-ATP III, IDF, and JIS definitions, identify the demographic factors associated with MetS, and determine the level of agreement between these definitions. The analytic sample consisted of 8,836 adults aged ≥30 years recruited at baseline in 2007–2011 from the Cardiovascular Risk Prevention Study (CRisPS), an ongoing, prospective cohort study involving 18 urban and 22 rural communities in Malaysia. JIS definition gave the highest overall prevalence (43.4%) compared to NCEP-ATP III (26.5%) and IDF (37.4%), P < 0.001. Indians had significantly higher age-adjusted prevalence compared to other ethnic groups across all MetS definitions (30.1% by NCEP-ATP III, 50.8% by IDF, and 56.5% by JIS). The likelihood of having MetS amongst the rural and urban populations was similar across all definitions. A high level of agreement between the IDF and JIS was observed (Kappa index = 0.867), while there was a lower level of agreement between the IDF and NCEP-ATP III (Kappa index = 0.580). JIS definition identified more Malaysian adults with MetS and therefore should be recommended as the preferred diagnostic criterion. PMID:24175300
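The agreement statistics quoted (Kappa index) come from a 2x2 cross-classification of two MetS definitions over the same subjects: observed agreement corrected for the agreement expected by chance. A short sketch with hypothetical counts, not the study's data:

```python
def cohens_kappa(table):
    """Cohen's kappa from a 2x2 agreement table [[both_yes, a_only], [b_only, both_no]]."""
    (a, b), (c, d) = table
    n = a + b + c + d
    observed = (a + d) / n                                      # raw agreement
    expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2   # chance agreement
    return (observed - expected) / (1 - expected)

# hypothetical agreement counts for two definitions applied to 1000 adults
kappa = cohens_kappa([[300, 20], [30, 650]])
```

A kappa near 0.87, as between IDF and JIS, reflects near-identical classifications; values near 0.58, as between IDF and NCEP-ATP III, indicate the definitions disagree on a substantial minority of subjects.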
Long-Term Incisal Relationships After Palatoplasty in Patients With Isolated Cleft Palate.
Odom, Elizabeth B; Woo, Albert S; Mendonca, Derick A; Huebener, Donald V; Nissen, Richard J; Skolnick, Gary B; Patel, Kamlesh B
2016-06-01
Various palatoplasty techniques have limited incisions in the hard palate due to concerns that these incisions may limit maxillary growth. There is little convincing long-term evidence to support this. Our purpose is to determine incisal relationships, an indicator for future orthognathic procedure, in patients after repair of an isolated cleft of the secondary palate. Our craniofacial database was used to identify patients aged 10 years or greater with an isolated cleft of the secondary palate who underwent palatoplasty between 1985 and 2002. Data collected included age at palatoplasty and follow-up, cleft type, associated syndrome, Robin sequence, surgeon, repair technique, number of operations, and occlusion. Incisal relationship was determined through clinical observation by a pediatric dentist and orthodontist. Seventy eligible patients operated on by 9 surgeons were identified. Class III incisal relationship was seen in 5 patients (7.1%). Palatoplasty techniques over the hard palate (63 of 70 patients) included 2-flap palatoplasty, VY-pushback, and Von Langenbeck repair. There was an association between class III incisal relationship and syndromic diagnosis (P <0.001). Other study variables were not associated with class III incisal relationships. In patients with an isolated cleft of the secondary palate, there was no association between class III incisal relationship and surgeon, age at repair, cleft type, palatoplasty technique, or number of operations. Increased likelihood of class III incisal relationship was associated primarily with syndromic diagnosis.
THESEUS: maximum likelihood superpositioning and analysis of macromolecular structures.
Theobald, Douglas L; Wuttke, Deborah S
2006-09-01
THESEUS is a command line program for performing maximum likelihood (ML) superpositions and analysis of macromolecular structures. While conventional superpositioning methods use ordinary least-squares (LS) as the optimization criterion, ML superpositions provide substantially improved accuracy by down-weighting variable structural regions and by correcting for correlations among atoms. ML superpositioning is robust and insensitive to the specific atoms included in the analysis, and thus it does not require subjective pruning of selected variable atomic coordinates. Output includes both likelihood-based and frequentist statistics for accurate evaluation of the adequacy of a superposition and for reliable analysis of structural similarities and differences. THESEUS performs principal components analysis for analyzing the complex correlations found among atoms within a structural ensemble. ANSI C source code and selected binaries for various computing platforms are available under the GNU open source license from http://monkshood.colorado.edu/theseus/ or http://www.theseus3d.org.
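For contrast with the ML superpositions THESEUS performs, the conventional ordinary least-squares baseline it improves on is the Kabsch algorithm: center both coordinate sets, then recover the optimal rotation from an SVD of their cross-covariance. A compact sketch on synthetic coordinates (this is the LS method, not THESEUS's weighted ML criterion):

```python
import numpy as np

rng = np.random.default_rng(5)

def kabsch_rmsd(P, Q):
    """Least-squares superposition of P onto Q (Kabsch); returns the post-fit RMSD."""
    Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)   # remove translations
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)
    d = np.sign(np.linalg.det(U @ Vt))                # guard against improper rotation
    R = U @ np.diag([1.0, 1.0, d]) @ Vt
    return np.sqrt(((Pc @ R - Qc) ** 2).sum() / len(P))

coords = rng.normal(size=(30, 3))
theta = 0.7
rot = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                [np.sin(theta),  np.cos(theta), 0.0],
                [0.0, 0.0, 1.0]])
rmsd = kabsch_rmsd(coords @ rot.T + 5.0, coords)      # rotated + translated copy
```

The RMSD recovers to zero for an exact rigid-body copy; the abstract's point is that unweighted LS like this is dominated by variable regions, which ML down-weighting avoids.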
On the Likelihood Ratio Test for the Number of Factors in Exploratory Factor Analysis
ERIC Educational Resources Information Center
Hayashi, Kentaro; Bentler, Peter M.; Yuan, Ke-Hai
2007-01-01
In the exploratory factor analysis, when the number of factors exceeds the true number of factors, the likelihood ratio test statistic no longer follows the chi-square distribution due to a problem of rank deficiency and nonidentifiability of model parameters. As a result, decisions regarding the number of factors may be incorrect. Several…
Tavaglione, Nicolas; Martin, Angela K; Mezger, Nathalie; Durieux-Paillard, Sophie; François, Anne; Jackson, Yves; Hurst, Samia A
2015-02-01
In the literature on medical ethics, it is generally admitted that vulnerable persons or groups deserve special attention, care or protection. One can define vulnerable persons as those having a greater likelihood of being wronged - that is, of being denied adequate satisfaction of certain legitimate claims. The conjunction of these two points entails what we call the Special Protection Thesis. It asserts that persons with a greater likelihood of being denied adequate satisfaction of their legitimate claims deserve special attention, care or protection. Such a thesis remains vague, however, as long as we do not know what legitimate claims are. This article aims at dispelling this vagueness by exploring what claims we have in relation to health care - thus fleshing out a claim-based conception of vulnerability. We argue that the Special Protection Thesis must be enriched as follows: If individual or group X has a greater likelihood of being denied adequate satisfaction of some of their legitimate claims to (i) physical integrity, (ii) autonomy, (iii) freedom, (iv) social provision, (v) impartial quality of government, (vi) social bases of self-respect or (vii) communal belonging, then X deserves special attention, care or protection. With this improved understanding of vulnerability, vulnerability talk in healthcare ethics can escape vagueness and serve as an adequate basis for practice. © 2013 John Wiley & Sons Ltd.
Cheng, Steven K; Dietrich, Mary S; Dilts, David M
2010-11-15
Postactivation barriers to oncology clinical trial accruals are well documented; however, potential barriers prior to trial opening are not. We investigate one such barrier: trial development time. National Cancer Institute Cancer Therapy Evaluation Program (CTEP)-sponsored trials for all therapeutic, nonpediatric phase I, I/II, II, and III studies activated between 2000 and 2004 were investigated for an 8-year period (n = 419). Successful trials were those achieving 100% of minimum accrual goal. Time to open a study was the calendar time from initial CTEP submission to trial activation. Multivariate logistic regression analysis was used to calculate unadjusted and adjusted odds ratios (OR), controlling for study phase and size of expected accruals. Among the CTEP-approved oncology trials, 37.9% (n = 221) failed to attain the minimum accrual goals, with 70.8% (n = 14) of phase III trials resulting in poor accrual. A total of 16,474 patients (42.5% of accruals) accrued to those studies were unable to achieve the projected minimum accrual goal. Trials requiring less than 12 months of development were significantly more likely to achieve accrual goals (OR, 2.15; 95% confidence interval, 1.29-3.57, P = 0.003) than trials with the median development times of 12 to 18 months. Trials requiring a development time of greater than 24 months were significantly less likely to achieve accrual goals (OR, 0.40; 95% confidence interval, 0.20-0.78; P = 0.011) than trials with the median development time. A large percentage of oncology clinical trials do not achieve minimum projected accruals. Trial development time appears to be one important predictor of the likelihood of successfully achieving the minimum accrual goals. ©2010 AACR.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pinkham, Mark B., E-mail: mark.pinkham@health.qld.gov.au; University of Queensland, Brisbane; Foote, Matthew C.
Purpose: To describe the anatomic distribution of regionally recurrent disease in patients with stage III melanoma in the axilla after curative-intent surgery with and without adjuvant radiation therapy. Methods and Materials: A single-institution, retrospective analysis of a prospective database of 277 patients undergoing curative-intent treatment for stage III melanoma in the axilla between 1992 and 2012 was completed. For patients who received radiation therapy and those who did not, patterns of regional recurrence were analyzed, and univariate analyses were performed to assess for potential factors associated with location of recurrence. Results: There were 121 patients who received adjuvant radiation therapy because their clinicopathologic features conferred a greater risk of regional recurrence. There were 156 patients who received no radiation therapy. The overall axillary control rate was 87%. There were 37 patients with regional recurrence; 17 patients had received adjuvant radiation therapy (14%), and 20 patients (13%) had not. The likelihood of in-field nodal recurrence was significantly less in the adjuvant radiation therapy group (P=.01) and significantly greater in sites adjacent to the axilla (P=.02). Patients with high-risk clinicopathologic features who did not receive adjuvant radiation therapy also tended to experience in-field failure rather than adjacent-field failure. Conclusions: Patients who received adjuvant radiation therapy were more likely to experience recurrence in the adjacent-field regions rather than in the in-field regions. This may not simply reflect higher-risk pathology. Using this data, it may be possible to improve outcomes by reducing the number of adjacent-field recurrences after adjuvant radiation therapy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wagstaff, J.; Phadke, K.; Adam, N.
1982-02-01
Of patients with Stage II and III malignant melanoma, 34.7% display reversal of the liver-spleen ratio on technetium-99m sulfur colloid isotope scans. Such an occurrence does not suggest a greater likelihood of relapse or a worse survival. The phenomenon is more common in female patients, and there is a significant relationship between the presence of a "hot spleen" and a high IgM level. Patients with Stage II disease and high IgM levels have relapses more quickly than do those with normal IgM levels. Lymphopenia is common in patients with Stage II and III disease, and the survival of these patients is worse than that of those with normal lymphocyte counts. In this report, the data are discussed together with results from other investigations, and a unifying hypothesis is presented which explains the phenomenon and relates it to increased activity of macrophages as a result of the presence of the tumor. The usefulness of isotope liver scanning in Stage III malignant melanoma is also discussed.
Scarpassa, Vera Margarete; Cunha-Machado, Antonio Saulo; Saraiva, José Ferreira
2016-04-12
Anopheles nuneztovari sensu lato comprises cryptic species in northern South America, and the Brazilian populations encompass distinct genetic lineages within the Brazilian Amazon region. This study investigated, based on two molecular markers, whether these lineages might actually deserve species status. Specimens were collected in five localities of the Brazilian Amazon, including Manaus, Careiro Castanho and Autazes, in the State of Amazonas; Tucuruí, in the State of Pará; and Abacate da Pedreira, in the State of Amapá, and analysed for the COI gene (Barcode region) and 12 microsatellite loci. Phylogenetic analyses were performed using the maximum likelihood (ML) approach. Intra- and inter-sample genetic diversity was estimated using population genetics analyses, and the genetic groups were identified by means of the ML, Bayesian and factorial correspondence analyses and the Bayesian analysis of population structure. The Barcode region dataset (N = 103) generated 27 haplotypes. The haplotype network suggested three lineages. The ML tree retrieved five monophyletic groups. Group I clustered all specimens from Manaus and Careiro Castanho, the majority of Autazes and a few from Abacate da Pedreira. Group II clustered most of the specimens from Abacate da Pedreira and a few from Autazes and Tucuruí. Group III clustered only specimens from Tucuruí (lineage III), strongly supported (97%). Groups IV and V clustered specimens of A. nuneztovari s.s. and A. dunhami, strongly (98%) and weakly (70%) supported, respectively. In the second phylogenetic analysis, the sequences from GenBank, identified as A. goeldii, clustered with groups I and II, but not with group III. Genetic distances (Kimura 2-parameter) among the groups ranged from 1.60% (between I and II) to 2.32% (between I and III). Microsatellite data revealed very high intra-population genetic variability.
Genetic distances showed the highest and significant values (P = 0.005) between Tucuruí and all the other samples, and between Abacate da Pedreira and all the other samples. Genetic distances, Bayesian (Structure and BAPS) analyses and FCA suggested three distinct biological groups, supporting the barcode region results. The two markers revealed three genetic lineages for A. nuneztovari s.l. in the Brazilian Amazon region. Lineages I and II may represent genetically distinct groups or species within A. goeldii. Lineage III may represent a new species, distinct from the A. goeldii group, and may be the most ancestral in the Brazilian Amazon. They may have differences in Plasmodium susceptibility and should therefore be investigated further.
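The Kimura 2-parameter distances reported above correct separately for transitions and transversions: with P and Q the proportions of transitional and transversional differences between two aligned sequences, d = -(1/2)ln(1-2P-Q) - (1/4)ln(1-2Q). A short sketch on toy sequences, not the study's COI data:

```python
import math

PURINES = {"A", "G"}

def k2p_distance(seq1, seq2):
    """Kimura 2-parameter distance between two aligned, gap-free sequences."""
    n = len(seq1)
    transitions = transversions = 0
    for x, y in zip(seq1, seq2):
        if x == y:
            continue
        if (x in PURINES) == (y in PURINES):
            transitions += 1      # A<->G or C<->T
        else:
            transversions += 1
    P, Q = transitions / n, transversions / n
    return -0.5 * math.log(1 - 2 * P - Q) - 0.25 * math.log(1 - 2 * Q)

# two transitions out of 10 aligned sites, no transversions
d = k2p_distance("AAGGCCTTAC", "AAAGCCTTGC")
```

Distances of 1.60% to 2.32% between lineages, as reported, sit in the range often used to flag candidate cryptic species in barcode studies.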
Empirical Likelihood in Nonignorable Covariate-Missing Data Problems.
Xie, Yanmei; Zhang, Biao
2017-04-20
Missing covariate data occurs often in regression analysis, which frequently arises in the health and social sciences as well as in survey sampling. We study methods for the analysis of a nonignorable covariate-missing data problem in an assumed conditional mean function when some covariates are completely observed but other covariates are missing for some subjects. We adopt the semiparametric perspective of Bartlett et al. (Improving upon the efficiency of complete case analysis when covariates are MNAR. Biostatistics 2014;15:719-30) on regression analyses with nonignorable missing covariates, in which they have introduced the use of two working models, the working probability model of missingness and the working conditional score model. In this paper, we study an empirical likelihood approach to nonignorable covariate-missing data problems with the objective of effectively utilizing the two working models in the analysis of covariate-missing data. We propose a unified approach to constructing a system of unbiased estimating equations, where there are more equations than unknown parameters of interest. One useful feature of these unbiased estimating equations is that they naturally incorporate the incomplete data into the data analysis, making it possible to seek efficient estimation of the parameter of interest even when the working regression function is not specified to be the optimal regression function. We apply the general methodology of empirical likelihood to optimally combine these unbiased estimating equations. We propose three maximum empirical likelihood estimators of the underlying regression parameters and compare their efficiencies with other existing competitors. We present a simulation study to compare the finite-sample performance of various methods with respect to bias, efficiency, and robustness to model misspecification. 
The proposed empirical likelihood method is also illustrated by an analysis of a data set from the US National Health and Nutrition Examination Survey (NHANES).
Han, Jubong; Lee, K B; Lee, Jong-Man; Park, Tae Soon; Oh, J S; Oh, Pil-Jei
2016-03-01
We discuss a new method to incorporate Type B uncertainty into least-squares procedures. The new method is based on an extension of the likelihood function from which a conventional least-squares function is derived. The extended likelihood function is the product of the original likelihood function with additional PDFs (probability density functions) that characterize the Type B uncertainties. The PDFs describe one's incomplete knowledge of correction factors, which are treated as nuisance parameters. We use the extended likelihood function to make point and interval estimates of parameters in essentially the same way as in the conventional least-squares method. Since the nuisance parameters are not of interest and should be prevented from appearing in the final result, we eliminate them by using the profile likelihood. As an example, we present a case study of a linear regression analysis with a common component of Type B uncertainty. In this example we compare the results obtained using our procedure with those from conventional methods. Copyright © 2015. Published by Elsevier Ltd.
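The construction above can be sketched numerically. The following is a minimal illustration (not the authors' implementation), assuming a straight-line fit with one common additive offset whose Type B knowledge is a Gaussian PDF; the offset is profiled out as a nuisance parameter:

```python
import numpy as np
from scipy.optimize import minimize

# Simulated data: straight line plus statistical (Type A) noise and one common
# additive offset whose size is only known through a Type B PDF.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 20)
sigma = 0.5                      # Type A standard uncertainty of each point
u_B = 0.3                        # Type B standard uncertainty of the common offset
y = 1.0 + 2.0 * x + rng.normal(0.0, sigma, x.size) + rng.normal(0.0, u_B)

def neg_log_extended_lik(a, b, delta):
    # least-squares term (original likelihood) times the Type B PDF of the
    # nuisance offset delta, written as a negative log-likelihood
    resid = y - delta - (a + b * x)
    return 0.5 * np.sum(resid**2) / sigma**2 + 0.5 * delta**2 / u_B**2

def profile_nll(ab):
    # profile likelihood: minimize over the nuisance parameter at fixed (a, b)
    inner = minimize(lambda d: neg_log_extended_lik(ab[0], ab[1], d[0]), [0.0])
    return inner.fun

fit = minimize(profile_nll, [0.0, 1.0], method="Nelder-Mead")
a_hat, b_hat = fit.x
```

Here the inner minimization has a closed form (the extended likelihood is quadratic in the offset), but numeric profiling generalizes to non-Gaussian Type B PDFs.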
ERIC Educational Resources Information Center
Khattab, Ali-Maher; And Others
1982-01-01
A causal modeling system, using confirmatory maximum likelihood factor analysis with the LISREL IV computer program, evaluated the construct validity underlying the higher order factor structure of a given correlation matrix of 46 structure-of-intellect tests emphasizing the product of transformations. (Author/PN)
Assarroudi, Abdolghader; Heshmati Nabavi, Fatemeh; Ebadi, Abbas; Esmaily, Habibollah
2017-06-01
Rescuers' psychological competence, particularly their motivation, can improve cardiopulmonary resuscitation outcomes. Data were collected using semistructured interviews with 24 cardiopulmonary resuscitation team members and analyzed through deductive content analysis based on Vroom's expectancy theory. Nine generic categories were developed: (i) estimation of the chance of survival; (ii) estimation of self-efficacy; (iii) looking for a sign of effectiveness; (iv) supportive organizational structure; (v) revival; (vi) acquisition of external incentives; (vii) individual drives; (viii) commitment to personal values; and (ix) avoiding undesirable social outcomes. When professional rescuers were called to perform cardiopulmonary resuscitation, they subjectively evaluated the patient's chance of survival, the likelihood of achieving the desired outcome, and their ability to perform cardiopulmonary resuscitation interventions. If their evaluations were positive, and the consequences of cardiopulmonary resuscitation were considered favorable, they were strongly motivated to perform it. Beyond the scientific aspects, the motivation to perform cardiopulmonary resuscitation was influenced by intuitive, emotional, and spiritual aspects. © 2017 John Wiley & Sons Australia, Ltd.
COSMIC MICROWAVE BACKGROUND LIKELIHOOD APPROXIMATION FOR BANDED PROBABILITY DISTRIBUTIONS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gjerløw, E.; Mikkelsen, K.; Eriksen, H. K.
We investigate sets of random variables that can be arranged sequentially such that a given variable only depends conditionally on its immediate predecessor. For such sets, we show that the full joint probability distribution may be expressed exclusively in terms of uni- and bivariate marginals. Under the assumption that the cosmic microwave background (CMB) power spectrum likelihood only exhibits correlations within a banded multipole range, Δl_C, we apply this expression to two outstanding problems in CMB likelihood analysis. First, we derive a statistically well-defined hybrid likelihood estimator, merging two independent (e.g., low- and high-l) likelihoods into a single expression that properly accounts for correlations between the two. Applying this expression to the Wilkinson Microwave Anisotropy Probe (WMAP) likelihood, we verify that the effect of correlations in the transition region on cosmological parameters is negligible for WMAP; the largest relative shift seen for any parameter is 0.06σ. However, because this may not hold for other experimental setups (e.g., for different instrumental noise properties or analysis masks), but must rather be verified on a case-by-case basis, we recommend our new hybridization scheme for future experiments for statistical self-consistency reasons. Second, we use the same expression to improve the convergence rate of the Blackwell-Rao likelihood estimator, reducing the required number of Monte Carlo samples by several orders of magnitude, and thereby extend it to high-l applications.
Program for Weibull Analysis of Fatigue Data
NASA Technical Reports Server (NTRS)
Krantz, Timothy L.
2005-01-01
A Fortran computer program has been written for performing statistical analyses of fatigue-test data that are assumed to be adequately represented by a two-parameter Weibull distribution. This program calculates the following: (1) Maximum-likelihood estimates of the Weibull distribution parameters; (2) Data for contour plots of relative likelihood for two parameters; (3) Data for contour plots of joint confidence regions; (4) Data for the profile likelihood of the Weibull-distribution parameters; (5) Data for the profile likelihood of any percentile of the distribution; and (6) Likelihood-based confidence intervals for parameters and/or percentiles of the distribution. The program can account for tests that are suspended without failure (the statistical term for such suspension of tests is "censoring"). The analytical approach followed in this program is valid for type-I censoring, which is the removal of unfailed units at pre-specified times. Confidence regions and intervals are calculated by use of the likelihood-ratio method.
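The core calculation, maximum-likelihood estimation of the two Weibull parameters with type-I censored (suspended) tests, can be sketched in a few lines. This is an illustrative Python sketch on simulated data, not the Fortran program itself:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

# Simulated fatigue lives from a two-parameter Weibull; the test is suspended
# (type-I censored) at a fixed time for units that have not yet failed.
rng = np.random.default_rng(1)
k_true, lam_true = 2.0, 100.0
life = weibull_min.rvs(k_true, scale=lam_true, size=200, random_state=rng)
t_suspend = 120.0
failed = life < t_suspend
obs = np.minimum(life, t_suspend)

def nll(params):
    k, lam = params
    if k <= 0.0 or lam <= 0.0:
        return np.inf
    # failures contribute the log-density, suspensions the log-survival function
    return -(weibull_min.logpdf(obs[failed], k, scale=lam).sum()
             + weibull_min.logsf(obs[~failed], k, scale=lam).sum())

fit = minimize(nll, [1.0, float(np.median(obs))], method="Nelder-Mead")
k_hat, lam_hat = fit.x
```

Likelihood-ratio confidence intervals, as in the program, would then be obtained by profiling this same negative log-likelihood over one parameter at a time.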
Battersby, Nick J; Dattani, Mit; Rao, Sheela; Cunningham, David; Tait, Diana; Adams, Richard; Moran, Brendan J; Khakoo, Shelize; Tekkis, Paris; Rasheed, Shahnawaz; Mirnezami, Alex; Quirke, Philip; West, Nicholas P; Nagtegaal, Iris; Chong, Irene; Sadanandam, Anguraj; Valeri, Nicola; Thomas, Karen; Frost, Michelle; Brown, Gina
2017-08-29
Pre-operative chemoradiotherapy (CRT) for MRI-defined, locally advanced rectal cancer is primarily intended to reduce local recurrence rates by downstaging tumours, enabling an improved likelihood of curative resection. However, in a subset of patients complete tumour regression occurs implying that no viable tumour is present within the surgical specimen. This raises the possibility that surgery may have been avoided. It is also recognised that response to CRT is a key determinant of prognosis. Recent radiological advances enable this response to be assessed pre-operatively using the MRI tumour regression grade (mrTRG). Potentially, this allows modification of the baseline MRI-derived treatment strategy. Hence, in a 'good' mrTRG responder, with little or no evidence of tumour, surgery may be deferred. Conversely, a 'poor response' identifies an adverse prognostic group which may benefit from additional pre-operative therapy. TRIGGER is a multicentre, open, interventional, randomised control feasibility study with an embedded phase III design. Patients with MRI-defined, locally advanced rectal adenocarcinoma deemed to require CRT will be eligible for recruitment. During CRT, patients will be randomised (1:2) between conventional management, according to baseline MRI, versus mrTRG-directed management. The primary endpoint of the feasibility phase is to assess the rate of patient recruitment and randomisation. Secondary endpoints include the rate of unit recruitment, acute drug toxicity, reproducibility of mrTRG reporting, surgical morbidity, pathological circumferential resection margin involvement, pathology regression grade, residual tumour cell density and surgical/specimen quality rates. The phase III trial will focus on long-term safety, regrowth rates, oncological survival analysis, quality of life and health economics analysis. 
The TRIGGER trial aims to determine whether patients with locally advanced rectal cancer can be recruited and subsequently randomised into a controlled trial that offers MRI-directed patient management according to radiological response to CRT (mrTRG). The feasibility study will inform a phase III trial design investigating stratified treatment of good and poor responders according to 3-year disease-free survival and colostomy-free survival, as well as an increase in cases managed without a major resection. ClinicalTrials.gov, ID: NCT02704520. Registered on 5 February 2016.
Mitchell, Jonathan S.; Chang, Jonathan
2017-01-01
Bayesian analysis of macroevolutionary mixtures (BAMM) is a statistical framework that uses reversible jump Markov chain Monte Carlo to infer complex macroevolutionary dynamics of diversification and phenotypic evolution on phylogenetic trees. A recent article by Moore et al. (MEA) reported a number of theoretical and practical concerns with BAMM. Major claims from MEA are that (i) BAMM’s likelihood function is incorrect, because it does not account for unobserved rate shifts; (ii) the posterior distribution on the number of rate shifts is overly sensitive to the prior; and (iii) diversification rate estimates from BAMM are unreliable. Here, we show that these and other conclusions from MEA are generally incorrect or unjustified. We first demonstrate that MEA’s numerical assessment of the BAMM likelihood is compromised by their use of an invalid likelihood function. We then show that “unobserved rate shifts” appear to be irrelevant for biologically plausible parameterizations of the diversification process. We find that the purportedly extreme prior sensitivity reported by MEA cannot be replicated with standard usage of BAMM v2.5, or with any other version when conventional Bayesian model selection is performed. Finally, we demonstrate that BAMM performs very well at estimating diversification rate variation across the ~20% of simulated trees in MEA’s data set for which it is theoretically possible to infer rate shifts with confidence. 
Due to ascertainment bias, the remaining 80% of their purportedly variable-rate phylogenies are statistically indistinguishable from those produced by a constant-rate birth–death process and were thus poorly suited for the summary statistics used in their performance assessment. We demonstrate that inferences about diversification rates have been accurate and consistent across all major previous releases of the BAMM software. We recognize an acute need to address the theoretical foundations of rate-shift models for phylogenetic trees, and we expect BAMM and other modeling frameworks to improve in response to mathematical and computational innovations. However, we remain optimistic that the imperfect tools currently available to comparative biologists have provided and will continue to provide important insights into the diversification of life on Earth. PMID:28334223
Rabosky, Daniel L; Mitchell, Jonathan S; Chang, Jonathan
2017-07-01
Bayesian analysis of macroevolutionary mixtures (BAMM) is a statistical framework that uses reversible jump Markov chain Monte Carlo to infer complex macroevolutionary dynamics of diversification and phenotypic evolution on phylogenetic trees. A recent article by Moore et al. (MEA) reported a number of theoretical and practical concerns with BAMM. Major claims from MEA are that (i) BAMM's likelihood function is incorrect, because it does not account for unobserved rate shifts; (ii) the posterior distribution on the number of rate shifts is overly sensitive to the prior; and (iii) diversification rate estimates from BAMM are unreliable. Here, we show that these and other conclusions from MEA are generally incorrect or unjustified. We first demonstrate that MEA's numerical assessment of the BAMM likelihood is compromised by their use of an invalid likelihood function. We then show that "unobserved rate shifts" appear to be irrelevant for biologically plausible parameterizations of the diversification process. We find that the purportedly extreme prior sensitivity reported by MEA cannot be replicated with standard usage of BAMM v2.5, or with any other version when conventional Bayesian model selection is performed. Finally, we demonstrate that BAMM performs very well at estimating diversification rate variation across the ~20% of simulated trees in MEA's data set for which it is theoretically possible to infer rate shifts with confidence. Due to ascertainment bias, the remaining 80% of their purportedly variable-rate phylogenies are statistically indistinguishable from those produced by a constant-rate birth-death process and were thus poorly suited for the summary statistics used in their performance assessment. We demonstrate that inferences about diversification rates have been accurate and consistent across all major previous releases of the BAMM software. 
We recognize an acute need to address the theoretical foundations of rate-shift models for phylogenetic trees, and we expect BAMM and other modeling frameworks to improve in response to mathematical and computational innovations. However, we remain optimistic that the imperfect tools currently available to comparative biologists have provided and will continue to provide important insights into the diversification of life on Earth. © The Author(s) 2017. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
NASA Technical Reports Server (NTRS)
Cash, W.
1979-01-01
Many problems in the experimental estimation of parameters for models can be solved through use of the likelihood ratio test. Applications of the likelihood ratio, with particular attention to photon counting experiments, are discussed. The procedures presented solve a greater range of problems than those currently in use, yet are no more difficult to apply. The procedures are proved analytically, and examples from current problems in astronomy are discussed.
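A minimal sketch of a likelihood-ratio test for Poisson-distributed photon counts, in the spirit of the approach discussed here (the simulated data and the choice of source window are arbitrary, illustrative assumptions):

```python
import numpy as np
from scipy.stats import poisson, chi2

# Binned photon counts with a constant background; we ask whether allowing a
# separate mean rate in a candidate "source" window improves the fit.
rng = np.random.default_rng(2)
counts = rng.poisson(5.0, size=50)           # pure background in this simulation

def loglik(n, mu):
    return poisson.logpmf(n, mu).sum()

in_src = np.zeros(50, dtype=bool)
in_src[20:25] = True                         # hypothetical source window

# null model: one common mean; alternative: separate means inside/outside
ll_null = loglik(counts, counts.mean())
ll_alt = (loglik(counts[in_src], counts[in_src].mean())
          + loglik(counts[~in_src], counts[~in_src].mean()))
C = 2.0 * (ll_alt - ll_null)                 # likelihood-ratio statistic
p_value = chi2.sf(C, df=1)                   # asymptotic chi-square, 1 extra dof
```

Because the null model is nested in the alternative, the statistic is non-negative and, asymptotically, chi-square distributed with one degree of freedom for the one extra fitted parameter.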
ATAC Autocuer Modeling Analysis.
1981-01-01
the analysis of the simple rectangular segmentation (1) is based on detection and estimation theory (2). This approach uses the concept of maximum ...continuous wave forms. In order to develop the principles of maximum likelihood, it is convenient to develop the principles for the "classical...the concept of maximum likelihood is significant in that it provides the optimum performance of the detection/estimation problem. With a knowledge of
Long-Term Incisal Relationships after Palatoplasty in Patients with Isolated Cleft Palate
Odom, Elizabeth B.; Woo, Albert S.; Mendonca, Derick A.; Huebener, Donald V.; Nissen, Richard J.; Skolnick, Gary B.; Patel, Kamlesh B.
2016-01-01
Purpose: Various palatoplasty techniques have limited incisions in the hard palate due to concerns that these incisions may limit maxillary growth. There is little convincing long-term evidence to support this. Our purpose is to determine incisal relationships, an indicator for future orthognathic procedure, in patients after repair of an isolated cleft of the secondary palate. Methods: Our craniofacial database was used to identify patients aged ten years or greater with an isolated cleft of the secondary palate who underwent palatoplasty between 1985 and 2002. Data collected included age at palatoplasty and follow-up, cleft type, associated syndrome, Robin sequence, surgeon, repair technique, number of operations, and occlusion. Incisal relationship was determined through clinical observation by a pediatric dentist and orthodontist. Results: Seventy eligible patients operated on by 9 surgeons were identified. Class III incisal relationship was seen in 5 patients (7.1%). Palatoplasty techniques over the hard palate (63 of 70 patients) included two-flap palatoplasty, VY-pushback, and Von Langenbeck repair. There was an association between class III incisal relationship and syndromic diagnosis (p < 0.001). Other study variables were not associated with class III incisal relationships. Conclusion: In patients with an isolated cleft of the secondary palate, there was no association between class III incisal relationship and surgeon, age at repair, cleft type, palatoplasty technique, or number of operations. Increased likelihood of class III incisal relationship was associated primarily with syndromic diagnosis. PMID:27171942
Jeon, Jihyoun; Hsu, Li; Gorfine, Malka
2012-07-01
Frailty models are useful for measuring unobserved heterogeneity in risk of failures across clusters, providing cluster-specific risk prediction. In a frailty model, the latent frailties shared by members within a cluster are assumed to act multiplicatively on the hazard function. In order to obtain parameter and frailty variate estimates, we consider the hierarchical likelihood (H-likelihood) approach (Ha, Lee and Song, 2001. Hierarchical-likelihood approach for frailty models. Biometrika 88, 233-243) in which the latent frailties are treated as "parameters" and estimated jointly with other parameters of interest. We find that the H-likelihood estimators perform well when the censoring rate is low; however, they are substantially biased when the censoring rate is moderate to high. In this paper, we propose a simple and easy-to-implement bias correction method for the H-likelihood estimators under a shared frailty model. We also extend the method to a multivariate frailty model, which incorporates complex dependence structure within clusters. We conduct an extensive simulation study and show that the proposed approach performs very well for censoring rates as high as 80%. We also illustrate the method with a breast cancer data set. Since the H-likelihood is the same as the penalized likelihood function, the proposed bias correction method is also applicable to the penalized likelihood estimators.
Ting, Chih-Chung; Yu, Chia-Chen; Maloney, Laurence T.
2015-01-01
In Bayesian decision theory, knowledge about the probabilities of possible outcomes is captured by a prior distribution and a likelihood function. The prior reflects past knowledge and the likelihood summarizes current sensory information. The two combined (integrated) form a posterior distribution that allows estimation of the probability of different possible outcomes. In this study, we investigated the neural mechanisms underlying Bayesian integration using a novel lottery decision task in which both prior knowledge and likelihood information about reward probability were systematically manipulated on a trial-by-trial basis. Consistent with Bayesian integration, as sample size increased, subjects tended to weigh likelihood information more compared with prior information. Using fMRI in humans, we found that the medial prefrontal cortex (mPFC) correlated with the mean of the posterior distribution, a statistic that reflects the integration of prior knowledge and likelihood of reward probability. Subsequent analysis revealed that both prior and likelihood information were represented in mPFC and that the neural representations of prior and likelihood in mPFC reflected changes in the behaviorally estimated weights assigned to these different sources of information in response to changes in the environment. Together, these results establish the role of mPFC in prior-likelihood integration and highlight its involvement in representing and integrating these distinct sources of information. PMID:25632152
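The prior-likelihood weighting described here can be illustrated with a conjugate Beta-binomial sketch (illustrative numbers only, not the study's actual task parameters): as sample size grows, the posterior mean of the reward probability moves from the prior mean toward the sample proportion, mirroring the greater weight given to likelihood information.

```python
# Beta prior over reward probability (conjugate to the binomial likelihood)
a0, b0 = 6.0, 4.0                 # prior with mean 0.6
p_obs = 0.2                       # proportion of rewarded outcomes in the sample

posterior_means = []
for n in (5, 50, 500):            # increasing sample size
    k = p_obs * n                 # rewarded outcomes in the sample
    a_post, b_post = a0 + k, b0 + (n - k)
    # posterior mean: a weighted compromise between prior mean and sample proportion
    posterior_means.append(a_post / (a_post + b_post))
```

With n = 5 the posterior mean stays close to the prior mean of 0.6; with n = 500 it sits near the observed proportion of 0.2.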
Royle, J. Andrew; Sutherland, Christopher S.; Fuller, Angela K.; Sun, Catherine C.
2015-01-01
We develop a likelihood analysis framework for fitting spatial capture-recapture (SCR) models to data collected on class structured or stratified populations. Our interest is motivated by the necessity of accommodating the problem of missing observations of individual class membership. This is particularly problematic in SCR data arising from DNA analysis of scat, hair or other material, which frequently yields individual identity but fails to identify the sex. Moreover, this can represent a large fraction of the data and, given the typically small sample sizes of many capture-recapture studies based on DNA information, utilization of the data with missing sex information is necessary. We develop the class structured likelihood for the case of missing covariate values, and then we address the scaling of the likelihood so that models with and without class structured parameters can be formally compared regardless of missing values. We apply our class structured model to black bear data collected in New York in which sex could be determined for only 62 of 169 uniquely identified individuals. The models containing sex-specificity of both the intercept of the SCR encounter probability model and the distance coefficient, and including a behavioral response are strongly favored by log-likelihood. Estimated population sex ratio is strongly influenced by sex structure in model parameters illustrating the importance of rigorous modeling of sex differences in capture-recapture models.
Maximum likelihood decoding analysis of accumulate-repeat-accumulate codes
NASA Technical Reports Server (NTRS)
Abbasfar, A.; Divsalar, D.; Yao, K.
2004-01-01
In this paper, the performance of repeat-accumulate codes with maximum-likelihood (ML) decoding is analyzed and compared to random codes by very tight bounds. Some simple codes are shown to perform very close to the Shannon limit with maximum likelihood decoding.
Ma, Zhaoxu; Zhao, Shanshan; Cao, Tingting; Liu, Chongxi; Huang, Ying; Gao, Yuhang; Yan, Kai; Xiang, Wensheng; Wang, Xiangjing
2016-12-01
A novel actinobacterium, designated strain NEAU-QY3T, was isolated from the leaves of Sonchus oleraceus L. and examined using a polyphasic taxonomic approach. The organism formed single spores with smooth surface on substrate mycelia. Phylogenetic analysis based on the 16S rRNA gene sequence indicated that the strain had a close association with the genus Verrucosispora and shared the highest sequence similarity with Verrucosispora qiuiae RtIII47T (99.17 %), an association that was supported by a bootstrap value of 94 % in the neighbour-joining tree and also recovered with the maximum-likelihood algorithm. The strain also showed high 16S rRNA gene sequence similarities to Xiangella phaseoli NEAU-J5T (98.78 %), Jishengella endophytica 202201T (98.51 %), Micromonospora eburnea LK2-10T (98.28 %), Verrucosispora lutea YIM 013T (98.23 %) and Salinispora pacifica CNR-114T (98.23 %). Furthermore, phylogenetic analysis based on the gyrB gene sequences supported the conclusion that strain NEAU-QY3T should be assigned to the genus Verrucosispora. However, the DNA-DNA hybridization relatedness values between strain NEAU-QY3T and V. qiuiae RtIII47T and V. lutea YIM 013T were below 70 %. With reference to phenotypic characteristics, phylogenetic data and DNA-DNA hybridization results, strain NEAU-QY3T was readily distinguished from its most closely related strains and classified as a new species, for which the name Verrucosispora sonchi sp. nov. is proposed. The type strain is NEAU-QY3T (=CGMCC 4.7312T=DSM 101530T).
Dahabreh, Issa J; Trikalinos, Thomas A; Lau, Joseph; Schmid, Christopher H
2017-03-01
To compare statistical methods for meta-analysis of sensitivity and specificity of medical tests (e.g., diagnostic or screening tests). We constructed a database of PubMed-indexed meta-analyses of test performance from which 2 × 2 tables for each included study could be extracted. We reanalyzed the data using univariate and bivariate random effects models fit with inverse variance and maximum likelihood methods. Analyses were performed using both normal and binomial likelihoods to describe within-study variability. The bivariate model using the binomial likelihood was also fit using a fully Bayesian approach. We use two worked examples (thoracic computerized tomography to detect aortic injury, and rapid prescreening of Papanicolaou smears to detect cytological abnormalities) to highlight that different meta-analysis approaches can produce different results. We also present results from reanalysis of 308 meta-analyses of sensitivity and specificity. Models using the normal approximation produced sensitivity and specificity estimates closer to 50% and smaller standard errors compared to models using the binomial likelihood; absolute differences of 5% or greater were observed in 12% and 5% of meta-analyses for sensitivity and specificity, respectively. Results from univariate and bivariate random effects models were similar, regardless of estimation method. Maximum likelihood and Bayesian methods produced almost identical summary estimates under the bivariate model; however, Bayesian analyses indicated greater uncertainty around those estimates. Bivariate models produced imprecise estimates of the between-study correlation of sensitivity and specificity. Differences between methods were larger with increasing proportion of studies that were small or required a continuity correction. The binomial likelihood should be used to model within-study variability. Univariate and bivariate models give similar estimates of the marginal distributions for sensitivity and specificity. 
Bayesian methods fully quantify uncertainty and their ability to incorporate external evidence may be useful for imprecisely estimated parameters. Copyright © 2017 Elsevier Inc. All rights reserved.
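A toy numerical illustration (hypothetical counts, not the paper's bivariate model) of why the normal-approximation route pulls estimates toward 50%: it works with a 0.5 continuity correction added to extreme cells, which shifts the estimate most visibly in small studies, whereas the exact binomial maximum-likelihood estimate of sensitivity is simply tp/n.

```python
# Single-study sensitivity: exact binomial MLE vs the continuity-corrected
# proportion used on the way to a normal-approximation (logit-scale) analysis.
def corrected(tp, n):
    tp_c = tp + 0.5               # 0.5 continuity correction on each cell
    fn_c = (n - tp) + 0.5
    return tp_c / (tp_c + fn_c)

exact_small, approx_small = 19 / 20, corrected(19, 20)      # small study
exact_large, approx_large = 190 / 200, corrected(190, 200)  # 10x larger study
```

The corrected estimate lies below the exact one (pulled toward 0.5), and the gap shrinks as the study grows, consistent with the reported pattern that differences between methods were largest for small studies requiring a correction.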
Hospital mergers and market overlap.
Brooks, G R; Jones, V G
1997-01-01
OBJECTIVE: To address two questions: What are the characteristics of hospitals that affect the likelihood of their being involved in a merger? What characteristics of particular pairs of hospitals affect the likelihood of the pair engaging in a merger? DATA SOURCES/STUDY SETTING: Hospitals in the 12 county region surrounding the San Francisco Bay during the period 1983 to 1992 were the focus of the study. Data were drawn from secondary sources, including the Lexis/Nexis database, the American Hospital Association, and the Office of Statewide Health Planning and Development of the State of California. STUDY DESIGN: Seventeen hospital mergers during the study period were identified. A random sample of pairs of hospitals that did not merge was drawn to establish a statistically efficient control set. Models constructed from hypotheses regarding hospital and market characteristics believed to be related to merger likelihood were tested using logistic regression analysis. DATA COLLECTION: See Data Sources/Study Setting. PRINCIPAL FINDINGS: The analysis shows that the likelihood of a merger between a particular pair of hospitals is positively related to the degree of market overlap that exists between them. Furthermore, market overlap and performance difference interact in their effect on merger likelihood. In an analysis of individual hospitals, conditions of rivalry, hospital market share, and hospital size were not found to influence the likelihood that a hospital will engage in a merger. CONCLUSIONS: Mergers between hospitals are not driven directly by considerations of market power or efficiency as much as by the existence of specific merger opportunities in the hospitals' local markets. Market overlap is a condition that enables a merger to occur, but other factors, such as the relative performance levels of the hospitals in question and their ownership and teaching status, also play a role in influencing the likelihood that a merger will in fact take place. 
PMID:9018212
Co-morbid substance use behaviors among youth: any impact of school environment?
Costello, Mary Jean E; Leatherdale, Scott T; Ahmed, Rashid; Church, Dana L; Cunningham, John A
2012-03-01
Substance use is common among youth; however, our understanding of co-morbid tobacco, alcohol and marijuana use remains limited. The school environment may play an important role in the likelihood that a student engages in high-risk substance use behaviors, including co-morbid use. This study aims to: (i) describe the prevalence of co-morbid substance use behaviors among youth; (ii) identify and compare the characteristics of youth who currently use a single substance, any two substances, and all three substances; (iii) examine if the likelihood of co-morbid use varies by school; and (iv) examine what factors are associated with co-morbid use. This study used nationally representative data collected from students in grades 9 to 12 (n = 41,886) as part of the 2006-2007 Canadian Youth Smoking Survey (YSS). Demographic and behavioral data were collected, including current cigarette, alcohol and marijuana use. Overall, 6.5% (n = 107,000) of students reported current use of all three substances and 20.3% (n = 333,000) reported current use of any two substances. Multi-level analysis revealed significant between-school variability in the odds that a student used all three substances or any two substances, accounting for 16.9% and 13.5% of the variability, respectively. Co-morbid use was associated with sex, grade, amount of available spending money and perceived academic performance. Co-morbid substance use is high among youth; however, not all schools share the same prevalence. Knowing the school characteristics that place particular schools at risk for student substance use is important for tailoring drug and alcohol education programs. Interventions that target the prevention of co-morbid substance use are required.
On the validity of cosmological Fisher matrix forecasts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wolz, Laura; Kilbinger, Martin; Weller, Jochen
2012-09-01
We present a comparison of Fisher matrix forecasts for cosmological probes with Monte Carlo Markov Chain (MCMC) posterior likelihood estimation methods. We analyse the performance of future Dark Energy Task Force (DETF) stage-III and stage-IV dark-energy surveys using supernovae, baryon acoustic oscillations and weak lensing as probes. We concentrate in particular on the dark-energy equation of state parameters w_0 and w_a. For purely geometrical probes, and especially when marginalising over w_a, we find considerable disagreement between the two methods, since in this case the Fisher matrix cannot reproduce the highly non-elliptical shape of the likelihood function. More quantitatively, the Fisher method underestimates the marginalized errors for purely geometrical probes by 30%-70%. For cases including structure formation such as weak lensing, we find that the posterior probability contours from the Fisher matrix estimation are in good agreement with the MCMC contours, with the forecasted errors changing only at the 5% level. We then explore non-linear transformations resulting in physically-motivated parameters and investigate whether these parameterisations exhibit a Gaussian behaviour. We conclude that for the purely geometrical probes and, more generally, in cases where it is not known whether the likelihood is close to Gaussian, the Fisher matrix is not the appropriate tool to produce reliable forecasts.
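The mechanics of a Fisher forecast can be illustrated with a toy model that is linear in its parameters, where the Gaussian approximation is exact; the abstract's point is that for non-linear parameters such as w_0 and w_a the approximation can fail. A minimal sketch (not the paper's survey setup):

```python
import numpy as np

# Toy Fisher-matrix forecast for a model linear in its parameters,
# m(x; a, b) = a*x + b, observed at points x_k with Gaussian noise sigma.
# For a linear model the likelihood is exactly Gaussian, so Fisher errors
# are exact; for non-linear parameters the ellipse can be a poor proxy.
x = np.linspace(0.0, 1.0, 20)
sigma = 0.1

dm = np.vstack([x, np.ones_like(x)])      # dm/da = x, dm/db = 1
F = dm @ dm.T / sigma ** 2                # F_ij = sum_k dm_i dm_j / sigma^2

cov = np.linalg.inv(F)
marg_errors = np.sqrt(np.diag(cov))       # marginalized 1-sigma forecasts
cond_errors = 1.0 / np.sqrt(np.diag(F))   # other parameter held fixed
print(marg_errors, cond_errors)           # marginalizing never shrinks errors
```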
Chaudhuri, Shomesh E; Merfeld, Daniel M
2013-03-01
Psychophysics generally relies on estimating a subject's ability to perform a specific task as a function of an observed stimulus. For threshold studies, the fitted functions are called psychometric functions. While fitting psychometric functions to data acquired using adaptive sampling procedures (e.g., "staircase" procedures), investigators have encountered a bias in the spread ("slope" or "threshold") parameter that has been attributed to the serial dependency of the adaptive data. Using simulations, we confirm this bias for cumulative Gaussian parametric maximum likelihood fits on data collected via adaptive sampling procedures, and then present a bias-reduced maximum likelihood fit that substantially reduces the bias without reducing the precision of the spread parameter estimate and without reducing the accuracy or precision of the other fit parameters. As a separate topic, we explain how to implement this bias reduction technique using generalized linear model fits as well as other numeric maximum likelihood techniques such as the Nelder-Mead simplex. We then provide a comparison of the iterative bootstrap and observed information matrix techniques for estimating parameter fit variance from adaptive sampling procedure data sets. The iterative bootstrap technique is shown to be slightly more accurate; however, the observed information technique executes in a small fraction (0.005 %) of the time required by the iterative bootstrap technique, which is an advantage when a real-time estimate of parameter fit variance is required.
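The baseline that such a bias-reduction modifies is an ordinary maximum likelihood fit of a cumulative Gaussian, for instance via the Nelder-Mead simplex the abstract mentions. A minimal sketch on simulated (non-adaptive) data with illustrative threshold and spread values; the bias-reduction step itself is not reproduced:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)

# Simulate yes/no responses from a cumulative-Gaussian psychometric
# function with illustrative threshold mu=0 and spread sigma=1; the
# paper's bias-reduced fit is not reproduced here.
stimuli = rng.uniform(-3, 3, 500)
responses = rng.random(500) < norm.cdf(stimuli, loc=0.0, scale=1.0)

def neg_log_lik(params):
    mu, log_sigma = params                # log-sigma keeps the spread positive
    p = norm.cdf(stimuli, loc=mu, scale=np.exp(log_sigma))
    p = np.clip(p, 1e-9, 1 - 1e-9)        # guard against log(0)
    return -np.sum(np.where(responses, np.log(p), np.log(1 - p)))

# Nelder-Mead simplex, one of the numeric ML options the abstract mentions
fit = minimize(neg_log_lik, x0=[0.5, 0.5], method="Nelder-Mead")
mu_hat, sigma_hat = fit.x[0], float(np.exp(fit.x[1]))
print(round(mu_hat, 2), round(sigma_hat, 2))
```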
Measurement of CIB power spectra with CAM-SPEC from Planck HFI maps
NASA Astrophysics Data System (ADS)
Mak, Suet Ying; Challinor, Anthony; Efstathiou, George; Lagache, Guilaine
2015-08-01
We present new measurements of the cosmic infrared background (CIB) anisotropies and its first likelihood using Planck HFI data at 353, 545, and 857 GHz. The measurements are based on cross-frequency power spectra and likelihood analysis using the CAM-SPEC package, rather than map-based template removal of foregrounds as done in previous Planck CIB analysis. We construct the likelihood of the CIB temperature fluctuations, an extension of the CAM-SPEC likelihood used in CMB analysis to higher frequencies, and use it to derive the best estimate of the CIB power spectrum over three decades in multipole moment, l, covering 50 ≤ l ≤ 2500. We adopt parametric models of the CIB and foreground contaminants (Galactic cirrus, infrared point sources, and cosmic microwave background anisotropies), and calibrate the dataset uniformly across frequencies with known Planck beam and noise properties in the likelihood construction. We validate our likelihood through simulations and an extensive suite of consistency tests, and assess the impact of instrumental and data selection effects on the final CIB power spectrum constraints. Two approaches are developed for interpreting the CIB power spectrum. The first is a simple parametric model that describes the cross-frequency power using amplitudes, correlation coefficients, and a known multipole dependence. The second is based on physical models for galaxy clustering and the evolution of infrared emission of galaxies. Both approaches fit all auto- and cross-power spectra very well, with a best fit of χ²_ν = 1.04 (parametric model). Using the best foreground solution, we find that the cleaned CIB power spectra are in good agreement with previous Planck and Herschel measurements.
NASA Astrophysics Data System (ADS)
Jaranowski, Piotr; Królak, Andrzej
2000-03-01
We develop the analytic and numerical tools for data analysis of the continuous gravitational-wave signals from spinning neutron stars for ground-based laser interferometric detectors. The statistical data analysis method that we investigate is maximum likelihood detection, which for the case of Gaussian noise reduces to matched filtering. We study in detail the statistical properties of the optimum functional that needs to be calculated in order to detect the gravitational-wave signal and estimate its parameters. We find it particularly useful to divide the parameter space into elementary cells such that the values of the optimal functional are statistically independent in different cells. We derive formulas for false alarm and detection probabilities both for the optimal and the suboptimal filters. We assess the computational requirements needed to do the signal search. We compare a number of criteria to build sufficiently accurate templates for our data analysis scheme. We verify the validity of our concepts and formulas by means of Monte Carlo simulations. We present algorithms by which one can estimate the parameters of the continuous signals accurately. We find, confirming earlier work of other authors, that given 100 Gflops of computational power, an all-sky search for an observation time of 7 days and a directed search for an observation time of 120 days are possible, whereas an all-sky search for 120 days of observation time is computationally prohibitive.
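The reduction of maximum likelihood detection to matched filtering can be illustrated in the simplest white-noise case. This toy sketch (a stand-in for the colored-noise, parameter-space search described in the abstract; the waveform and amplitude are illustrative) correlates data with a known template and normalizes so the statistic is unit-variance under noise alone:

```python
import numpy as np

rng = np.random.default_rng(1)

# White-noise matched filter: the optimal linear statistic for a known
# signal shape in Gaussian noise. Waveform and amplitude are illustrative
# placeholders, not a physical neutron-star signal model.
n = 4096
t = np.arange(n) / n
template = np.sin(2 * np.pi * 60 * t) * np.exp(-3 * t)

sigma, amplitude = 1.0, 0.5
data = amplitude * template + rng.normal(0.0, sigma, n)

# Normalized so that snr ~ N(0, 1) when the data contain no signal
snr = data @ template / (sigma * np.sqrt(template @ template))
print(round(snr, 1))
```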
Cohn, T.A.; Lane, W.L.; Baier, W.G.
1997-01-01
This paper presents the expected moments algorithm (EMA), a simple and efficient method for incorporating historical and paleoflood information into flood frequency studies. EMA can utilize three types of at-site flood information: systematic stream gage record; information about the magnitude of historical floods; and knowledge of the number of years in the historical period when no large flood occurred. EMA employs an iterative procedure to compute method-of-moments parameter estimates. Initial parameter estimates are calculated from systematic stream gage data. These moments are then updated by including the measured historical peaks and the expected moments, given the previously estimated parameters, of the below-threshold floods from the historical period. The updated moments result in new parameter estimates, and the last two steps are repeated until the algorithm converges. Monte Carlo simulations compare EMA, Bulletin 17B's [United States Water Resources Council, 1982] historically weighted moments adjustment, and maximum likelihood estimators when fitting the three parameters of the log-Pearson type III distribution. These simulations demonstrate that EMA is more efficient than the Bulletin 17B method, and that it is nearly as efficient as maximum likelihood estimation (MLE). The experiments also suggest that EMA has two advantages over MLE when dealing with the log-Pearson type III distribution: It appears that EMA estimates always exist and that they are unique, although neither result has been proven. EMA can be used with binomial or interval-censored data and with any distributional family amenable to method-of-moments estimation.
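The expected-moments iteration can be sketched in a simplified form, substituting a normal distribution (whose censored moments have closed forms) for the log-Pearson type III fitted in the paper; the threshold, record lengths, and parameter values below are illustrative only:

```python
import numpy as np
from scipy.stats import norm

# EM-style expected-moments iteration, substituting a normal distribution
# for the log-Pearson type III used in the paper. Inputs: systematic
# record `gage`, the historical peaks `hist` that exceeded a perception
# threshold T, and the total number of historical years h, so that the
# below-threshold historical years enter only through their expected
# moments given the current parameter estimates.
def ema_normal(gage, hist, h, T, iters=100):
    mu, var = np.mean(gage), np.var(gage)
    n, k = len(gage), len(hist)
    for _ in range(iters):
        s = np.sqrt(var)
        beta = (T - mu) / s
        lam = norm.pdf(beta) / norm.cdf(beta)
        e1 = mu - s * lam                         # E[X | X < T]
        v1 = var * (1 - beta * lam - lam ** 2)    # Var[X | X < T]
        e2 = v1 + e1 ** 2                         # E[X^2 | X < T]
        m = h - k                                 # below-threshold years
        mu = (gage.sum() + hist.sum() + m * e1) / (n + h)
        ex2 = ((gage ** 2).sum() + (hist ** 2).sum() + m * e2) / (n + h)
        var = ex2 - mu ** 2
    return mu, np.sqrt(var)

rng = np.random.default_rng(2)
true_mu, true_s, T = 10.0, 2.0, 13.0
gage = rng.normal(true_mu, true_s, 40)        # 40-year systematic record
hist_all = rng.normal(true_mu, true_s, 100)   # 100 historical years
hist = hist_all[hist_all > T]                 # only large floods were noted
mu_hat, s_hat = ema_normal(gage, hist, 100, T)
print(round(mu_hat, 2), round(s_hat, 2))
```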
Drinking despite health problems among individuals with liver disease across the United States.
Elliott, Jennifer C; Stohl, Malka; Hasin, Deborah S
2017-07-01
Heavy drinking is harmful for individuals with liver disease. However, some of these individuals drink despite knowledge of the risks. The current study aims to identify factors underlying drinking despite health problems among individuals with liver disease. The current study utilizes a subsample of individuals reporting past-year liver disease and at least one drink in the past year (n=331), taken from the National Epidemiologic Survey on Alcohol and Related Conditions-III (NESARC-III), a large nationally representative survey of the United States. Participants reported on drinking despite health problems, symptoms of psychopathology, and family history of alcohol problems in a cross-sectional survey. Drug use disorders (Adjusted Odds Ratio [AOR]=2.68), as well as borderline, antisocial, and schizotypal personality disorders (AORs=2.50-4.10), were associated with increased likelihood of drinking despite health problems among individuals with liver disease, all ps<0.05. Any anxiety disorder trended toward significance (AOR=2.22), p=0.06, but major depressive disorder was not associated with increased risk (AOR=0.99), p=0.97. Individuals with a family history of alcohol problems were also more likely to drink despite health problems (AOR=2.79), p<0.05. Several types of psychopathology, as well as a family history of alcohol problems, increased the likelihood of drinking despite health problems among individuals with liver disease. These findings highlight the need to intervene with heavily drinking individuals with liver disease, who may be drinking due to familial risk and/or comorbid psychopathology. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Núñez, M.; Robie, T.; Vlachos, D. G.
2017-10-01
Kinetic Monte Carlo (KMC) simulation provides insights into catalytic reactions unobtainable with either experiments or mean-field microkinetic models. Sensitivity analysis of KMC models assesses the robustness of the predictions to parametric perturbations and identifies rate determining steps in a chemical reaction network. Stiffness in the chemical reaction network, a ubiquitous feature, demands lengthy run times for KMC models and renders efficient sensitivity analysis based on the likelihood ratio method unusable. We address the challenge of efficiently conducting KMC simulations and performing accurate sensitivity analysis in systems with unknown time scales by employing two acceleration techniques: rate constant rescaling and parallel processing. We develop statistical criteria that ensure sufficient sampling of non-equilibrium steady state conditions. Our approach provides the twofold benefit of accelerating the simulation itself and enabling likelihood ratio sensitivity analysis, which provides further speedup relative to finite difference sensitivity analysis. As a result, the likelihood ratio method can be applied to real chemistry. We apply our methodology to the water-gas shift reaction on Pt(111).
A Walk on the Wild Side: The Impact of Music on Risk-Taking Likelihood.
Enström, Rickard; Schmaltz, Rodney
2017-01-01
From a marketing perspective, there has been substantial interest in the role of risk perception in consumer behavior. Specific 'problem music' like rap and heavy metal has long been associated with delinquent behavior, including violence, drug use, and promiscuous sex. Although individuals' risk preferences have been investigated across a range of decision-making situations, there has been little empirical work demonstrating the direct role music may have on the likelihood of engaging in risky activities. In the exploratory study reported here, we assessed the impact of listening to different styles of music on risk-taking likelihood, using a psychometric scale. Risk-taking likelihood was measured across ethical, financial, health and safety, recreational and social domains. Through the means of a canonical correlation analysis, the multivariate relationship between different music styles and individual risk-taking likelihood across the different domains is discussed. Our results indicate that listening to different types of music does influence risk-taking likelihood, though not in areas of health and safety.
PMID:28539908
Tests for detecting overdispersion in models with measurement error in covariates.
Yang, Yingsi; Wong, Man Yu
2015-11-30
Measurement error in covariates can affect the accuracy in count data modeling and analysis. In overdispersion identification, the true mean-variance relationship can be obscured under the influence of measurement error in covariates. In this paper, we propose three tests for detecting overdispersion when covariates are measured with error: a modified score test and two score tests based on the proposed approximate likelihood and quasi-likelihood, respectively. The proposed approximate likelihood is derived under the classical measurement error model, and the resulting approximate maximum likelihood estimator is shown to have superior efficiency. Simulation results also show that the score test based on approximate likelihood outperforms the test based on quasi-likelihood and other alternatives in terms of empirical power. By analyzing a real dataset containing the health-related quality-of-life measurements of a particular group of patients, we demonstrate the importance of the proposed methods by showing that the analyses with and without measurement error correction yield significantly different results. Copyright © 2015 John Wiley & Sons, Ltd.
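For context, the classical (no measurement error) score test for overdispersion in an intercept-only Poisson model takes a simple closed form; the paper's contribution is correcting such tests for covariate measurement error, which this sketch does not attempt:

```python
import numpy as np
from scipy.stats import norm

# Classical score test for overdispersion in an intercept-only Poisson
# model, in the Dean-Lawless form: under the Poisson null the statistic
# is asymptotically standard normal. The measurement-error-corrected
# tests proposed in the paper build on this idea.
def overdispersion_score_test(y):
    y = np.asarray(y, dtype=float)
    n, mu = len(y), y.mean()
    t = np.sum((y - mu) ** 2 - y) / (mu * np.sqrt(2 * n))
    return t, 1 - norm.cdf(t)        # one-sided p-value (overdispersion)

rng = np.random.default_rng(3)
t_pois, p_pois = overdispersion_score_test(rng.poisson(5.0, 1000))
# Gamma-mixed Poisson: variance exceeds the mean, so the test should reject
t_over, p_over = overdispersion_score_test(rng.poisson(rng.gamma(2.0, 2.5, 1000)))
print(t_pois, p_pois)
print(t_over, p_over)
```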
A general methodology for maximum likelihood inference from band-recovery data
Conroy, M.J.; Williams, B.K.
1984-01-01
A numerical procedure is described for obtaining maximum likelihood estimates and associated maximum likelihood inference from band- recovery data. The method is used to illustrate previously developed one-age-class band-recovery models, and is extended to new models, including the analysis with a covariate for survival rates and variable-time-period recovery models. Extensions to R-age-class band- recovery, mark-recapture models, and twice-yearly marking are discussed. A FORTRAN program provides computations for these models.
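A heavily simplified one-age-class band-recovery likelihood can be written down directly: constant annual survival S and recovery rate f give multinomial cell probabilities S^(j-i) f, which a generic numerical optimizer can maximize. This Python sketch is illustrative only (the paper describes a FORTRAN implementation of richer models):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(5)

# One-age-class band-recovery sketch with constant annual survival S and
# recovery rate f: a bird banded in year i is recovered in year j with
# probability S**(j - i) * f. A generic simplex optimizer maximizes the
# multinomial likelihood over (S, f).
T, n_band = 5, 2000
S_true, f_true = 0.6, 0.1

m = np.zeros((T, T))                         # recoveries m[i, j]
for i in range(T):
    probs = [S_true ** (j - i) * f_true for j in range(i, T)]
    probs.append(1.0 - sum(probs))           # never-recovered cell
    counts = rng.multinomial(n_band, probs)
    m[i, i:T] = counts[:-1]

def neg_log_lik(theta):
    S, f = expit(theta)                      # keep parameters in (0, 1)
    ll = 0.0
    for i in range(T):
        ps = np.array([S ** (j - i) * f for j in range(i, T)])
        tail = 1.0 - ps.sum()                # never-recovered probability
        if tail <= 0:
            return 1e12                      # penalize invalid cells
        ll += m[i, i:T] @ np.log(ps) + (n_band - m[i, i:T].sum()) * np.log(tail)
    return -ll

fit = minimize(neg_log_lik, x0=[0.0, -2.0], method="Nelder-Mead")
S_hat, f_hat = expit(fit.x)
print(round(S_hat, 3), round(f_hat, 3))
```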
Surgical débridement and parenteral antibiotics in infected revision total knee arthroplasty.
Chiu, Fang-Yao; Chen, Chuan-Mu
2007-08-01
Whether surgical débridement and parenteral antibiotics with prosthesis retention for infected revision TKA eradicates infection is not well established. We sought to determine the prevalence of reinfection. Between 1992 and 2003, we prospectively followed 40 consecutive patients with deep infection after revision TKA. These patients had no prosthesis loosening or malalignment. Using the classification of Tsukayama et al, 10, 20, and 10 patients had Types I (acute postoperative), II (late chronic), and III (acute hematogenous) infections, respectively. All had surgical débridement and parenteral antibiotics with retention of their existing prostheses. The patients were followed for a minimum of 3 years (range, 36-143 months). Successful implant salvage was achieved in 12 of the 40 patients (30%). However, likelihood of success depended on the type of infection: patients with Type I infections (seven of 10) and patients with Type III infections (five of 10) retained their prostheses more often than patients with Type II infections (zero of 20). We recommend early débridement and retention of the prosthesis with Type I or Type III infections in revised TKAs, but primary removal for Type II infections.
The association of insurance status on the probability of transfer for pediatric trauma patients.
Hamilton, Emma C; Miller, Charles C; Cotton, Bryan A; Cox, Charles; Kao, Lillian S; Austin, Mary T
2016-12-01
The purpose of this study was to evaluate the association of insurance status on the probability of transfer of pediatric trauma patients to level I/II centers after initial evaluation at lower level centers. A retrospective review of all pediatric trauma patients (age < 16 years) registered in the 2007-2012 National Trauma Data Bank was performed. Multiple regression techniques controlling for clustering at the hospital level were used to determine the impact of insurance status on the probability of transfer to level I/II trauma centers. Of 38,205 patients, 33% (12,432) were transferred from lower level centers to level I/II trauma centers. Adjusting for demographics and injury characteristics, children with no insurance had a higher likelihood of transfer than children with private insurance. Children with public or unknown insurance status were no more likely to be transferred than privately insured children. There were no variable interactions with insurance status. Among pediatric trauma patients, lack of insurance is an independent predictor of transfer to a major trauma center. While burn patients, the severely injured, and younger patients remain the most likely to be transferred, these findings suggest a triage bias influenced by insurance status. Additional policies may be needed to avoid unnecessary transfer of uninsured pediatric trauma patients. Case-control study, level III. Copyright © 2016 Elsevier Inc. All rights reserved.
Maximum Likelihood Analysis in the PEN Experiment
NASA Astrophysics Data System (ADS)
Lehman, Martin
2013-10-01
The experimental determination of the π⁺ → e⁺ν(γ) decay branching ratio currently provides the most accurate test of lepton universality. The PEN experiment at PSI, Switzerland, aims to improve the present world average experimental precision of 3.3×10⁻³ to 5×10⁻⁴ using a stopped beam approach. During runs in 2008-10, PEN has acquired over 2×10⁷ πe2 events. The experiment includes active beam detectors (degrader, mini TPC, target), central MWPC tracking with plastic scintillator hodoscopes, and a spherical pure CsI electromagnetic shower calorimeter. The final branching ratio will be calculated using a maximum likelihood analysis. This analysis assigns each event a probability for 5 processes (π⁺ → e⁺ν, π⁺ → μ⁺ν, decay-in-flight, pile-up, and hadronic events) using Monte Carlo verified probability distribution functions of our observables (energies, times, etc). A progress report on the PEN maximum likelihood analysis will be presented. Work supported by NSF grant PHY-0970013.
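The event-by-event likelihood construction can be illustrated with a toy two-component version, with two hypothetical Gaussian PDFs standing in for PEN's five Monte Carlo-derived distributions, fitting a single process fraction by maximum likelihood:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

rng = np.random.default_rng(4)

# Toy event-by-event maximum likelihood fit: each event's observable gets
# a probability under each process PDF, and the process fraction maximizes
# the summed log-likelihood. Both Gaussian "processes" are hypothetical
# placeholders for the experiment's Monte Carlo-derived distributions.
pdf_sig = lambda x: norm.pdf(x, 2.0, 0.5)
pdf_bkg = lambda x: norm.pdf(x, 0.0, 1.0)

true_frac, n = 0.3, 5000
is_sig = rng.random(n) < true_frac
x = np.where(is_sig, rng.normal(2.0, 0.5, n), rng.normal(0.0, 1.0, n))

def nll(f):
    # Negative log-likelihood of the two-component mixture
    return -np.sum(np.log(f * pdf_sig(x) + (1 - f) * pdf_bkg(x)))

fit = minimize_scalar(nll, bounds=(1e-6, 1 - 1e-6), method="bounded")
print(round(fit.x, 3))               # estimated signal fraction
```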
Robbins, L G
2000-01-01
Graduate school programs in genetics have become so full that courses in statistics have often been eliminated. In addition, typical introductory statistics courses for the "statistics user" rather than the nascent statistician are laden with methods for analysis of measured variables while genetic data are most often discrete numbers. These courses are often seen by students and genetics professors alike as largely irrelevant cookbook courses. The powerful methods of likelihood analysis, although commonly employed in human genetics, are much less often used in other areas of genetics, even though current computational tools make this approach readily accessible. This article introduces the MLIKELY.PAS computer program and the logic of do-it-yourself maximum-likelihood statistics. The program itself, course materials, and expanded discussions of some examples that are only summarized here are available at http://www.unisi.it/ricerca/dip/bio_evol/sitomlikely/mlikely.html. PMID:10628965
Prioritizing conservation investments for mammal species globally
Wilson, Kerrie A.; Evans, Megan C.; Di Marco, Moreno; Green, David C.; Boitani, Luigi; Possingham, Hugh P.; Chiozza, Federica; Rondinini, Carlo
2011-01-01
We need to set priorities for conservation because we cannot do everything, everywhere, at the same time. We determined priority areas for investment in threat abatement actions, in both a cost-effective and spatially and temporally explicit way, for the threatened mammals of the world. Our analysis presents the first fine-resolution prioritization analysis for mammals at a global scale that accounts for the risk of habitat loss, the actions required to abate this risk, the costs of these actions and the likelihood of investment success. We evaluated the likelihood of success of investments using information on the past frequency and duration of legislative effectiveness at a country scale. The establishment of new protected areas was the action receiving the greatest investment, while restoration was never chosen. The resolution of the analysis and the incorporation of likelihood of success made little difference to this result, but affected the spatial location of these investments. PMID:21844046
An analysis of crash likelihood : age versus driving experience
DOT National Transportation Integrated Search
1995-05-01
The study was designed to determine the crash likelihood of drivers in Michigan as a function of two independent variables: driver age and driving experience. The age variable had eight levels (18, 19, 20, 21, 22, 23, 24, and 25 years old) and the ex...
Bradford, W David; Kleit, Andrew N; Nietert, Paul J; Ornstein, Steven
2006-12-01
Although highly controversial, direct-to-consumer (DTC) television advertising for prescription drugs is an established practice in the US health care industry. While the US Food and Drug Administration is currently reexamining its regulatory stance, little evidence exists regarding the impact of DTC advertising on patient health outcomes. The objective of this research was to study the relationship between heavy television promotion of 3 major hydroxymethylglutaryl coenzyme A reductase inhibitors ("statins") and the frequency with which patients are able to attain low-density lipoprotein cholesterol (LDL-C) blood-level goals after treatment with any statin. We used logistic regression to determine achievement of LDL-C goals at 6 months after statin treatment, using electronic medical record extract data from patients in geographically dispersed primary care practices in the United States. We identified LDL-C blood levels as being at or less than goal, as defined by risk-adjusted guidelines published by the National Heart, Lung, and Blood Institute from the Adult Treatment Panel III (ATP III) data. A total of 50,741 patients, identified from 88 practices, were diagnosed with hyperlipidemia and had begun therapy with any statin medication during the 1998-2004 time period. In addition, total dollars spent each month on television advertising at the national and local levels for atorvastatin, pravastatin, and simvastatin were obtained. DTC advertising data were merged by the local media market where the physician practice was located and by the month in which the patient was first prescribed a statin. The models were run for all patients who initiated therapy, and also on a subsample of patients who continued to receive prescriptions for the drugs for at least 6 months. Logistic regressions were used to predict the likelihood that each patient attained the ATP III LDL-C blood-level goals as a function of DTC advertising and other factors.
High levels of national DTC advertising when therapy was initiated were found to increase the likelihood that patients attained LDL-C goals at 6 months by 6% (P < 0.001), although the effect was concentrated among patients with the least-restrictive ATP III LDL-C goals (
Epidemiological investigation of bovine tuberculosis herd breakdowns in Spain 2009/2011.
Guta, Sintayehu; Casal, Jordi; Napp, Sebastian; Saez, Jose Luis; Garcia-Saenz, Ariadna; Perez de Val, Bernat; Romero, Beatriz; Alvarez, Julio; Allepuz, Alberto
2014-01-01
We analyzed the most likely cause of 687 bovine tuberculosis (bTB) breakdowns detected in Spain between 2009 and 2011 (i.e., 22% of the total number of breakdowns detected during this period). Seven possible causes were considered: i) residual infection; ii) introduction of infected cattle from other herds; iii) sharing of pastures with infected herds; iv) contiguous spread from infected neighbor herds; v) presence of infected goats on the farm; vi) interaction with wildlife reservoirs; and vii) contact with an infected human. For each possible cause a decision tree was developed and key questions were included in each of them. Answers to these key questions lead to different events within each decision tree. In order to assess the likelihood of occurrence of the different events, a qualitative risk assessment approach was used. For this purpose, an expert opinion workshop was organized and ordinal values, ranging from 0 to 9 (i.e., null to very high likelihood of occurrence), were assigned. The analysis identified residual infection as the most frequent cause of bTB breakdowns (22.3%; 95%CI: 19.4-25.6), followed by interaction with wildlife reservoirs (13.1%; 95%CI: 10.8-15.8). The introduction of infected cattle, sharing of pastures and contiguous spread from infected neighbor herds were also identified as relevant causes. In 41.6% (95%CI: 38.0-45.4) of the breakdowns the origin of infection remained unknown. Veterinary officers conducting bTB breakdown investigations have to state their opinion about the possible cause of each breakdown. Comparison between the results of our analysis and the opinions of veterinary officers revealed only slight concordance. This slight agreement might reflect a lack of harmonized criteria for assessing the most likely cause of bTB breakdowns as well as differing perceptions of the importance of the possible causes. This is especially relevant in the case of the role of wildlife reservoirs.
Gelpi-Hammerschmidt, Francisco; Tinay, Ilker; Allard, Christopher B; Su, Li-Ming; Preston, Mark A; Trinh, Quoc-Dien; Kibel, Adam S; Wang, Ye; Chung, Benjamin I; Chang, Steven L
2016-02-01
We evaluate the contemporary incidence and consequences of postoperative rhabdomyolysis after extirpative renal surgery. We conducted a population based, retrospective cohort study of patients who underwent extirpative renal surgery with a diagnosis of a renal mass or renal cell carcinoma in the United States between 2004 and 2013. Regression analysis was performed to evaluate 90-day mortality (Clavien grade V), nonfatal major complications (Clavien grade III-IV), hospital readmission rates, direct costs and length of stay. The final weighted cohort included 310,880 open, 174,283 laparoscopic and 69,880 robotic extirpative renal surgery cases during the 10-year study period, with 745 (0.13%) experiencing postoperative rhabdomyolysis. The presence of postoperative rhabdomyolysis led to a significantly higher incidence of 90-day nonfatal major complications (34.7% vs 7.3%, p <0.05) and higher 90-day mortality (4.4% vs 1.02%, p <0.05). Length of stay was twice as long for patients with postoperative rhabdomyolysis (incidence risk ratio 1.83, 95% CI 1.56-2.15, p <0.001). The robotic approach was associated with a higher likelihood of postoperative rhabdomyolysis (vs laparoscopic approach, OR 2.43, p <0.05). Adjusted 90-day median direct hospital costs were USD 7,515 higher for patients with postoperative rhabdomyolysis (p <0.001). Our model revealed that the combination of obesity and prolonged surgery (more than 5 hours) was associated with a higher likelihood of postoperative rhabdomyolysis developing. Our study confirms that postoperative rhabdomyolysis is an uncommon complication among patients undergoing extirpative renal surgery, but has a potentially detrimental impact on surgical morbidity, mortality and costs. Male gender, comorbidities, obesity, prolonged surgery (more than 5 hours) and a robotic approach appear to place patients at higher risk for postoperative rhabdomyolysis. Copyright © 2016 American Urological Association Education and Research, Inc. 
Published by Elsevier Inc. All rights reserved.
Tan, Chongqing; Peng, Liubao; Zeng, Xiaohui; Li, Jianhe; Wan, Xiaomin; Chen, Gannong; Yi, Lidan; Luo, Xia; Zhao, Ziying
2013-01-01
First-line postoperative adjuvant chemotherapies with S-1 and capecitabine and oxaliplatin (XELOX) were first recommended for resectable gastric cancer patients in the 2010 and 2011 Chinese NCCN Clinical Practice Guidelines in Oncology: Gastric Cancer; however, their economic impact in China is unknown. The aim of this study was to compare the cost-effectiveness of adjuvant chemotherapy with XELOX, with S-1 and no treatment after a gastrectomy with extended (D2) lymph-node dissection among patients with stage II-IIIB gastric cancer. A Markov model, based on data from two clinical phase III trials, was developed to analyse the cost-effectiveness of patients in the XELOX group, S-1 group and surgery only (SO) group. The costs were estimated from the perspective of the Chinese healthcare system. The utilities were assumed on the basis of previously published reports. Costs, quality-adjusted life-years (QALYs) and incremental cost-effectiveness ratios (ICERs) were calculated over a lifetime horizon. One-way and probabilistic sensitivity analyses were performed. For the base case, XELOX had the lowest total cost ($44,568) and cost-effectiveness ratio ($7,360/QALY). The scenario analyses showed that SO was dominated by XELOX and that the ICER of S-1 was $58,843/QALY compared with XELOX. The one-way sensitivity analysis showed that the most influential parameter was the utility of disease-free survival. The probabilistic sensitivity analysis predicted a 75.8% likelihood that the ICER for XELOX would be less than $13,527 compared with S-1. When the willingness-to-pay threshold exceeded $38,000, the likelihood that S-1 was cost-effective was greater than 50%. Our results suggest that, for patients in China with resectable disease, first-line adjuvant chemotherapy with XELOX after a D2 gastrectomy is the best option compared with S-1 and SO; S-1 might become the better choice at higher willingness-to-pay thresholds.
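The structure of such a Markov cost-effectiveness comparison can be sketched with a three-state cohort model; every transition probability, cost, and utility below is a hypothetical placeholder, not a value from the trials analysed in the paper:

```python
import numpy as np

# Minimal three-state Markov cohort model (disease-free, relapse, dead).
# All transition probabilities, costs and utilities are hypothetical
# placeholders used only to show how an ICER is assembled.
def markov_ce(p_matrix, cycle_costs, utilities, cycles=40, disc=0.03):
    state = np.array([1.0, 0.0, 0.0])            # cohort starts disease-free
    cycle_costs, utilities = np.asarray(cycle_costs), np.asarray(utilities)
    cost = qaly = 0.0
    for t in range(cycles):
        d = 1.0 / (1.0 + disc) ** t              # annual discounting
        cost += d * state @ cycle_costs
        qaly += d * state @ utilities
        state = state @ p_matrix                 # advance one cycle
    return cost, qaly

# Rows: from-state; columns: to-state (disease-free, relapse, dead)
p_treat = np.array([[0.90, 0.07, 0.03],
                    [0.00, 0.75, 0.25],
                    [0.00, 0.00, 1.00]])
p_none = np.array([[0.82, 0.13, 0.05],
                   [0.00, 0.75, 0.25],
                   [0.00, 0.00, 1.00]])

c_treat, q_treat = markov_ce(p_treat, [8000.0, 15000.0, 0.0], [0.85, 0.60, 0.0])
c_none, q_none = markov_ce(p_none, [1000.0, 15000.0, 0.0], [0.85, 0.60, 0.0])
icer = (c_treat - c_none) / (q_treat - q_none)   # cost per QALY gained
print(round(icer))
```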
Silverman, Merav H.; Jedd, Kelly; Luciana, Monica
2015-01-01
Behavioral responses to, and the neural processing of, rewards change dramatically during adolescence and may contribute to observed increases in risk-taking during this developmental period. Functional MRI (fMRI) studies suggest differences between adolescents and adults in neural activation during reward processing, but findings are contradictory, and effects have been found in non-predicted directions. The current study uses an activation likelihood estimation (ALE) approach for quantitative meta-analysis of functional neuroimaging studies to: 1) confirm the network of brain regions involved in adolescents’ reward processing, 2) identify regions involved in specific stages (anticipation, outcome) and valence (positive, negative) of reward processing, and 3) identify differences in activation likelihood between adolescent and adult reward-related brain activation. Results reveal a subcortical network of brain regions involved in adolescent reward processing similar to that found in adults with major hubs including the ventral and dorsal striatum, insula, and posterior cingulate cortex (PCC). Contrast analyses find that adolescents exhibit greater likelihood of activation in the insula while processing anticipation relative to outcome and greater likelihood of activation in the putamen and amygdala during outcome relative to anticipation. While processing positive compared to negative valence, adolescents show increased likelihood for activation in the posterior cingulate cortex (PCC) and ventral striatum. Contrasting adolescent reward processing with the existing ALE of adult reward processing (Liu et al., 2011) reveals increased likelihood for activation in limbic, frontolimbic, and striatal regions in adolescents compared with adults. Unlike adolescents, adults also activate executive control regions of the frontal and parietal lobes. These findings support hypothesized elevations in motivated activity during adolescence. PMID:26254587
Janssen, Eva; van Osch, Liesbeth; Lechner, Lilian; Candel, Math; de Vries, Hein
2012-01-01
Despite the increased recognition of affect in guiding probability estimates, perceived risk has been mainly operationalised in a cognitive way and the differentiation between rational and intuitive judgements is largely unexplored. This study investigated the validity of a measurement instrument differentiating cognitive and affective probability beliefs and examined whether behavioural decision making is mainly guided by cognition or affect. Data were obtained from four surveys focusing on smoking (N=268), fruit consumption (N=989), sunbed use (N=251) and sun protection (N=858). Correlational analyses showed that affective likelihood was more strongly correlated with worry compared to cognitive likelihood and confirmatory factor analysis provided support for a two-factor model of perceived likelihood instead of a one-factor model (i.e. cognition and affect combined). Furthermore, affective likelihood was significantly associated with the various outcome variables, whereas the association for cognitive likelihood was absent in three studies. The findings provide support for the construct validity of the measures used to assess cognitive and affective likelihood. Since affective likelihood might be a better predictor of health behaviour than the commonly used cognitive operationalisation, both dimensions should be considered in future research.
The Impact of Negative Pressure Wound Therapy on Orthopaedic Infection.
Webb, Lawrence X
2017-04-01
By hastening the resolution of edema and improving local microcirculation, topical negative pressure wound therapy (TNP) aids the establishment of early wound coverage. Its use in the setting of type III open fractures is reviewed. The author's initial use of TNP for closed surgical incisions, and how this practice evolved into its application to closed surgical wounds at heightened risk of infection, is presented. Several case studies illustrate the role of, and the technique for, management of acute or subacute infections involving bone and implants. Copyright © 2017 Elsevier Inc. All rights reserved.
On Muthen's Maximum Likelihood for Two-Level Covariance Structure Models
ERIC Educational Resources Information Center
Yuan, Ke-Hai; Hayashi, Kentaro
2005-01-01
Data in social and behavioral sciences are often hierarchically organized. Special statistical procedures that take into account the dependence of such observations have been developed. Among procedures for 2-level covariance structure analysis, Muthen's maximum likelihood (MUML) has the advantage of easier computation and faster convergence. When…
Mixture Rasch Models with Joint Maximum Likelihood Estimation
ERIC Educational Resources Information Center
Willse, John T.
2011-01-01
This research provides a demonstration of the utility of mixture Rasch models. Specifically, a model capable of estimating a mixture partial credit model using joint maximum likelihood is presented. Like the partial credit model, the mixture partial credit model has the beneficial feature of being appropriate for analysis of assessment data…
An EM Algorithm for Maximum Likelihood Estimation of Process Factor Analysis Models
ERIC Educational Resources Information Center
Lee, Taehun
2010-01-01
In this dissertation, an Expectation-Maximization (EM) algorithm is developed and implemented to obtain maximum likelihood estimates of the parameters and the associated standard error estimates characterizing temporal flows for the latent variable time series following stationary vector ARMA processes, as well as the parameters defining the…
Maximum likelihood solution for inclination-only data in paleomagnetism
NASA Astrophysics Data System (ADS)
Arason, P.; Levi, S.
2010-08-01
We have developed a new robust maximum likelihood method for estimating the unbiased mean inclination from inclination-only data. In paleomagnetic analysis, the arithmetic mean of inclination-only data is known to introduce a shallowing bias. Several methods have been introduced to estimate the unbiased mean inclination of inclination-only data together with measures of the dispersion. Some inclination-only methods were designed to maximize the likelihood function of the marginal Fisher distribution. However, the exact analytical form of the maximum likelihood function is fairly complicated, and all the methods require various assumptions and approximations that are often inappropriate. For some steep and dispersed data sets, these methods provide estimates that are significantly displaced from the peak of the likelihood function toward systematically shallower inclinations. The problem of locating the maximum of the likelihood function is partly due to difficulties in accurately evaluating the function for all values of interest, because some elements of the likelihood function increase exponentially as precision parameters increase, leading to numerical instabilities. In this study, we succeeded in analytically cancelling the exponential elements from the log-likelihood function, and we are now able to calculate its value anywhere in the parameter space and for any inclination-only data set. Furthermore, we can now calculate the partial derivatives of the log-likelihood function with the desired accuracy, and locate the maximum of the likelihood without the assumptions required by previous methods. To assess the reliability and accuracy of our method, we generated large numbers of random Fisher-distributed data sets, for which we calculated mean inclinations and precision parameters. The comparisons show that our new robust Arason-Levi maximum likelihood method is the most reliable, and the mean inclination estimates are the least biased towards shallow values.
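The stabilization described above can be sketched in a few lines. This is an illustrative reimplementation under stated assumptions (the standard marginal Fisher inclination density and a plain grid search; it is not the authors' Arason-Levi code). The exponential factors are cancelled analytically by using the exponentially scaled Bessel function and a stable form of log(2 sinh κ).

```python
import numpy as np
from scipy.special import ive   # exponentially scaled modified Bessel I_0

def log_marginal_fisher(inc, mean_inc, kappa):
    """Stable log density of inclinations inc (radians) under a Fisher
    distribution with mean inclination mean_inc and precision kappa."""
    a = kappa * np.sin(inc) * np.sin(mean_inc)
    b = kappa * np.cos(inc) * np.cos(mean_inc)     # >= 0 for |inc| < 90 deg
    # log I0(b) = b + log ive(0, b); log(2 sinh k) = k + log1p(-exp(-2k)),
    # so the combined exponent a + b - k = k*(cos(angular distance) - 1) <= 0
    # and never overflows.
    return (np.log(kappa) - kappa - np.log1p(-np.exp(-2.0 * kappa))
            + np.log(np.cos(inc)) + a + b + np.log(ive(0, b)))

def fit_inclination_only(inc):
    """Maximum likelihood (mean_inc, kappa) over a simple parameter grid."""
    mean_grid = np.radians(np.linspace(-89.0, 89.0, 357))
    kappa_grid = np.logspace(-1, 3, 200)
    best, best_ll = None, -np.inf
    for k in kappa_grid:
        ll = log_marginal_fisher(inc[:, None], mean_grid[None, :], k).sum(axis=0)
        j = int(np.argmax(ll))
        if ll[j] > best_ll:
            best_ll, best = ll[j], (mean_grid[j], k)
    return best
```

The grid search stands in for the derivative-based maximization the authors describe; the point is that, in the stabilized form, the log-likelihood can be evaluated anywhere in the parameter space without overflow.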
A Primer on Risks, Issues and Opportunities
2016-08-01
likelihood or consequence. A risk has three main parts: a future root cause, a likelihood and a consequence. The future root cause is determined through root cause analysis, which is the most important part of any risk management effort. Root cause analysis gets to the heart of the risk. Why does the risk exist? What is its nature? How will the risk occur? What should be
NASA Astrophysics Data System (ADS)
Morse, Brad S.; Pohll, Greg; Huntington, Justin; Rodriguez Castillo, Ramiro
2003-06-01
In 1992, Mexican researchers discovered concentrations of arsenic in excess of World Health Organization (WHO) standards in several municipal wells in the Zimapan Valley of Mexico. This study describes a method to delineate a capture zone for one of the most highly contaminated wells to aid in future well siting. A stochastic approach was used to model the capture zone because of the high level of uncertainty in several input parameters. Two stochastic techniques were performed and compared: "standard" Monte Carlo analysis and the generalized likelihood uncertainty estimator (GLUE) methodology. The GLUE procedure differs from standard Monte Carlo analysis in that it incorporates a goodness of fit (termed a likelihood measure) in evaluating the model. This allows for more information (in this case, head data) to be used in the uncertainty analysis, resulting in smaller prediction uncertainty. Two likelihood measures are tested in this study to determine which is in better agreement with the observed heads. While the standard Monte Carlo approach does not aid in parameter estimation, the GLUE methodology indicates best fit models when hydraulic conductivity is approximately 10^-6.5 m/s, with vertically isotropic conditions and large quantities of interbasin flow entering the basin. Probabilistic isochrones (capture zone boundaries) are then presented, and as predicted, the GLUE-derived capture zones are significantly smaller in area than those from the standard Monte Carlo approach.
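The GLUE weighting step can be sketched on a toy model. The likelihood measure, rejection threshold and model below are placeholder assumptions (not the study's groundwater model): Monte Carlo parameter draws are weighted by a goodness-of-fit likelihood measure against observations, rather than treated as equally plausible.

```python
import numpy as np

def glue(model, observed, prior_sampler, n_runs=2000, threshold=0.05, seed=0):
    rng = np.random.default_rng(seed)
    params = np.array([prior_sampler(rng) for _ in range(n_runs)])
    sims = np.array([model(p) for p in params])
    sse = ((sims - observed) ** 2).sum(axis=1)
    like = np.exp(-sse / sse.min())          # one possible likelihood measure
    keep = like > threshold * like.max()     # retain "behavioral" runs only
    w = like[keep] / like[keep].sum()        # normalized likelihood weights
    return params[keep], w

# Toy model y = a*x, with "observed heads" generated at a = 2
x = np.linspace(0.0, 1.0, 10)
obs = 2.0 * x
params, w = glue(lambda a: a * x, obs, lambda rng: rng.uniform(0.0, 5.0))
posterior_mean = float((params * w).sum())
```

Predictions from the retained runs, weighted by `w`, give the probabilistic bounds analogous to the capture-zone isochrones; standard Monte Carlo corresponds to keeping every run with equal weight.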
Hypothesis testing and earthquake prediction.
Jackson, D D
1996-04-30
Requirements for testing include advance specification of the conditional rate density (probability per unit time, area, and magnitude) or, alternatively, probabilities for specified intervals of time, space, and magnitude. Here I consider testing fully specified hypotheses, with no parameter adjustments or arbitrary decisions allowed during the test period. Because it may take decades to validate prediction methods, it is worthwhile to formulate testable hypotheses carefully in advance. Earthquake prediction generally implies that the probability will be temporarily higher than normal. Such a statement requires knowledge of "normal behavior"--that is, it requires a null hypothesis. Hypotheses can be tested in three ways: (i) by comparing the number of actual earthquakes to the number predicted, (ii) by comparing the likelihood score of actual earthquakes to the predicted distribution, and (iii) by comparing the likelihood ratio to that of a null hypothesis. The first two tests are purely self-consistency tests, while the third is a direct comparison of two hypotheses. Predictions made without a statement of probability are very difficult to test, and any test must be based on the ratio of earthquakes in and out of the forecast regions.
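The three tests reduce to simple computations once a forecast specifies an expected rate per space-time-magnitude bin. The sketch below uses hypothetical rates and assumes independent Poisson bins, a common forecast-testing form; it is not taken from the paper itself.

```python
import math

def poisson_loglike(rates, counts):
    """Log-likelihood of observed counts under independent Poisson bins."""
    return sum(-r + c * math.log(r) - math.lgamma(c + 1)
               for r, c in zip(rates, counts))

forecast = [0.5, 1.2, 0.1, 2.0]   # hypothetical expected earthquakes per bin
null     = [0.9, 0.9, 0.9, 0.9]   # hypothetical "normal behavior" rates
observed = [1, 1, 0, 3]

n_test = (sum(observed), sum(forecast))        # (i) observed vs predicted number
ll = poisson_loglike(forecast, observed)       # (ii) likelihood score
llr = ll - poisson_loglike(null, observed)     # (iii) likelihood ratio vs null
```

A positive log-likelihood ratio favors the forecast over the null; in practice the likelihood score (ii) would also be compared against its distribution under simulated catalogs.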
Exact likelihood evaluations and foreground marginalization in low resolution WMAP data
NASA Astrophysics Data System (ADS)
Slosar, Anže; Seljak, Uroš; Makarov, Alexey
2004-06-01
The large scale anisotropies of Wilkinson Microwave Anisotropy Probe (WMAP) data have attracted a lot of attention and have been a source of controversy, with many favorite cosmological models being apparently disfavored by the power spectrum estimates at low l. All the existing analyses of theoretical models are based on approximations for the likelihood function, which are likely to be inaccurate on large scales. Here we present exact evaluations of the likelihood of the low multipoles by direct inversion of the theoretical covariance matrix for low resolution WMAP maps. We project out the unwanted galactic contaminants using the WMAP derived maps of these foregrounds. This improves over the template based foreground subtraction used in the original analysis, which can remove some of the cosmological signal and may lead to a suppression of power. As a result we find an increase in power at low multipoles. For the quadrupole the maximum likelihood values are rather uncertain and vary between 140 and 220 μK2. On the other hand, the probability distribution away from the peak is robust and, assuming a uniform prior between 0 and 2000 μK2, the probability of having the true value above 1200 μK2 (as predicted by the simplest cold dark matter model with a cosmological constant) is 10%, a factor of 2.5 higher than predicted by the WMAP likelihood code. We do not find the correlation function to be unusual beyond the low quadrupole value. We develop a fast likelihood evaluation routine that can be used instead of WMAP routines for low l values. We apply it to the Markov chain Monte Carlo analysis to compare the cosmological parameters between the two cases. The new analysis of WMAP either alone or jointly with the Sloan Digital Sky Survey (SDSS) and the Very Small Array (VSA) data reduces the evidence for running to less than 1σ, giving αs=-0.022±0.033 for the combined case. 
The new analysis prefers about a 1σ lower value of Ωm, a consequence of an increased integrated Sachs-Wolfe (ISW) effect contribution required by the increase in the spectrum at low l. These results suggest that the details of foreground removal and full likelihood analysis are important for parameter estimation from the WMAP data. They are robust in the sense that they do not change significantly with frequency, mask, or details of foreground template marginalization. The marginalization approach presented here is the most conservative method to remove the foregrounds and should be particularly useful in the analysis of polarization, where foreground contamination may be much more severe.
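The central operation described above, evaluating the exact likelihood through the theory covariance matrix, can be sketched in a generic, assumed textbook form (this is not the authors' WMAP routine): a Gaussian log-likelihood computed stably through a Cholesky factorization instead of an explicit inverse and determinant.

```python
import numpy as np

def gaussian_loglike(data, cov):
    """log N(data | 0, cov) = -1/2 (d^T C^-1 d + log det C + n log 2*pi)."""
    L = np.linalg.cholesky(cov)                 # C = L L^T
    z = np.linalg.solve(L, data)                # so z @ z = d^T C^-1 d
    logdet = 2.0 * np.log(np.diag(L)).sum()     # log det C from diag(L)
    return -0.5 * (z @ z + logdet + data.size * np.log(2.0 * np.pi))
```

In the real analysis `data` would be a low-resolution map with foreground templates projected out and `cov` the theoretical pixel covariance for a given power spectrum; scanning the spectrum amplitude then traces out the likelihood curves discussed above.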
Ma, Chunming; Liu, Yue; Lu, Qiang; Lu, Na; Liu, Xiaoli; Tian, Yiming; Wang, Rui; Yin, Fuzai
2016-02-01
The blood pressure-to-height ratio (BPHR) has been shown to be an accurate index for screening hypertension in children and adolescents. The aim of the present study was to perform a meta-analysis to assess the performance of the BPHR for the assessment of hypertension. Electronic and manual searches were performed to identify studies of the BPHR. After methodological quality assessment and data extraction, pooled estimates of the sensitivity, specificity, positive likelihood ratio, negative likelihood ratio, diagnostic odds ratio, area under the receiver operating characteristic curve and summary receiver operating characteristics were assessed systematically. The extent of heterogeneity was assessed. Six studies were identified for analysis. The pooled sensitivity, specificity, positive likelihood ratio, negative likelihood ratio and diagnostic odds ratio of the BPHR for the assessment of hypertension were 96% [95% confidence interval (CI) 95-97%], 90% (95% CI 90-91%), 10.68 (95% CI 8.03-14.21), 0.04 (95% CI 0.03-0.07) and 247.82 (95% CI 114.50-536.34), respectively. The area under the receiver operating characteristic curve was 0.9472. The BPHR had high diagnostic accuracy for identifying hypertension in children and adolescents.
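The indices pooled above all derive from 2x2 diagnostic tables. A minimal sketch with a single hypothetical table follows (counts chosen to mirror the pooled sensitivity and specificity; the likelihood ratios then follow from this one table, not from the separately pooled meta-analytic values).

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic indices from a 2x2 table of test-vs-reference counts."""
    sens = tp / (tp + fn)                 # sensitivity
    spec = tn / (tn + fp)                 # specificity
    lr_pos = sens / (1.0 - spec)          # positive likelihood ratio
    lr_neg = (1.0 - sens) / spec          # negative likelihood ratio
    dor = lr_pos / lr_neg                 # diagnostic odds ratio
    return sens, spec, lr_pos, lr_neg, dor

# Hypothetical counts: 100 hypertensive, 100 normotensive children
sens, spec, lr_pos, lr_neg, dor = diagnostic_metrics(tp=96, fp=10, fn=4, tn=90)
```

In a bivariate meta-analysis these quantities are pooled across studies while modeling the correlation between sensitivity and specificity, which is why pooled LRs need not equal the ratio of the pooled sensitivity and specificity.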
Accurate Structural Correlations from Maximum Likelihood Superpositions
Theobald, Douglas L; Wuttke, Deborah S
2008-01-01
The cores of globular proteins are densely packed, resulting in complicated networks of structural interactions. These interactions in turn give rise to dynamic structural correlations over a wide range of time scales. Accurate analysis of these complex correlations is crucial for understanding biomolecular mechanisms and for relating structure to function. Here we report a highly accurate technique for inferring the major modes of structural correlation in macromolecules using likelihood-based statistical analysis of sets of structures. This method is generally applicable to any ensemble of related molecules, including families of nuclear magnetic resonance (NMR) models, different crystal forms of a protein, and structural alignments of homologous proteins, as well as molecular dynamics trajectories. Dominant modes of structural correlation are determined using principal components analysis (PCA) of the maximum likelihood estimate of the correlation matrix. The correlations we identify are inherently independent of the statistical uncertainty and dynamic heterogeneity associated with the structural coordinates. We additionally present an easily interpretable method (“PCA plots”) for displaying these positional correlations by color-coding them onto a macromolecular structure. Maximum likelihood PCA of structural superpositions, and the structural PCA plots that illustrate the results, will facilitate the accurate determination of dynamic structural correlations analyzed in diverse fields of structural biology. PMID:18282091
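The final analysis step named above, PCA of the estimated correlation matrix, reduces to an eigendecomposition. A generic sketch follows (the maximum likelihood superposition and correlation estimation themselves are not reproduced here):

```python
import numpy as np

def principal_modes(corr, n_modes=2):
    """Leading eigenvalues/eigenvectors of a symmetric correlation matrix."""
    vals, vecs = np.linalg.eigh(corr)          # ascending eigenvalues
    order = np.argsort(vals)[::-1]             # reorder: dominant modes first
    return vals[order][:n_modes], vecs[:, order[:n_modes]]
```

Each retained eigenvector assigns a loading to every atomic position; color-coding those loadings onto the structure gives the "PCA plot" visualization the abstract describes.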
Maximum Likelihood Analysis of Nonlinear Structural Equation Models with Dichotomous Variables
ERIC Educational Resources Information Center
Song, Xin-Yuan; Lee, Sik-Yum
2005-01-01
In this article, a maximum likelihood approach is developed to analyze structural equation models with dichotomous variables that are common in behavioral, psychological and social research. To assess nonlinear causal effects among the latent variables, the structural equation in the model is defined by a nonlinear function. The basic idea of the…
Contributions to the Underlying Bivariate Normal Method for Factor Analyzing Ordinal Data
ERIC Educational Resources Information Center
Xi, Nuo; Browne, Michael W.
2014-01-01
A promising "underlying bivariate normal" approach was proposed by Jöreskog and Moustaki for use in the factor analysis of ordinal data. This was a limited information approach that involved the maximization of a composite likelihood function. Its advantage over full-information maximum likelihood was that very much less computation was…
ERIC Educational Resources Information Center
Adank, Patti
2012-01-01
The role of speech production mechanisms in difficult speech comprehension is the subject of on-going debate in speech science. Two Activation Likelihood Estimation (ALE) analyses were conducted on neuroimaging studies investigating difficult speech comprehension or speech production. Meta-analysis 1 included 10 studies contrasting comprehension…
John Hogland; Nedret Billor; Nathaniel Anderson
2013-01-01
Discriminant analysis, referred to as maximum likelihood classification within popular remote sensing software packages, is a common supervised technique used by analysts. Polytomous logistic regression (PLR), also referred to as multinomial logistic regression, is an alternative classification approach that is less restrictive, more flexible, and easy to interpret. To...
A time series intervention analysis (TSIA) of dendrochronological data to infer the tree growth-climate-disturbance relations and forest disturbance history is described. Maximum likelihood is used to estimate the parameters of a structural time series model with components for ...
Robust analysis of semiparametric renewal process models
Lin, Feng-Chang; Truong, Young K.; Fine, Jason P.
2013-01-01
Summary A rate model is proposed for a modulated renewal process comprising a single long sequence, where the covariate process may not capture the dependencies in the sequence as in standard intensity models. We consider partial likelihood-based inferences under a semiparametric multiplicative rate model, which has been widely studied in the context of independent and identical data. Under an intensity model, gap times in a single long sequence may be used naively in the partial likelihood with variance estimation utilizing the observed information matrix. Under a rate model, the gap times cannot be treated as independent and studying the partial likelihood is much more challenging. We employ a mixing condition in the application of limit theory for stationary sequences to obtain consistency and asymptotic normality. The estimator's variance is quite complicated owing to the unknown gap times dependence structure. We adapt block bootstrapping and cluster variance estimators to the partial likelihood. Simulation studies and an analysis of a semiparametric extension of a popular model for neural spike train data demonstrate the practical utility of the rate approach in comparison with the intensity approach. PMID:24550568
Can, Seda; van de Schoot, Rens; Hox, Joop
2015-06-01
Because variables may be correlated in the social and behavioral sciences, multicollinearity might be problematic. This study investigates the effect of collinearity manipulated at the within and between levels of a two-level confirmatory factor analysis by Monte Carlo simulation. Furthermore, the influence on the convergence rate of the size of the intraclass correlation coefficient (ICC) and of the estimation method (maximum likelihood estimation with robust chi-squares and standard errors versus Bayesian estimation) is investigated. The other variables of interest were the rate of inadmissible solutions and the relative parameter and standard error bias at the between level. The results showed that inadmissible solutions were obtained when there was between-level collinearity and the estimation method was maximum likelihood. In the within-level multicollinearity condition, all of the solutions were admissible but the bias values were higher compared with the between-level collinearity condition. Bayesian estimation appeared to be robust in obtaining admissible parameters, but the relative bias was higher than for maximum likelihood estimation. Finally, as expected, high ICC produced less biased results compared to medium ICC conditions.
Porta, Alberto; Marchi, Andrea; Bari, Vlasta; Heusser, Karsten; Tank, Jens; Jordan, Jens; Barbic, Franca; Furlan, Raffaello
2015-01-01
We propose a symbolic analysis framework for the quantitative characterization of complex dynamical systems. It allows the description of the time course of a single variable, the assessment of joint interactions and an analysis triggered by a conditioning input. The framework was applied to spontaneous variability of heart period (HP), systolic arterial pressure (SAP) and integrated muscle sympathetic nerve activity (MSNA) with the aim of characterizing cardiovascular control and nonlinear influences of respiration at rest in the supine position, during orthostatic challenge induced by 80° head-up tilt (TILT) and about 3 min before evoked pre-syncope signs (PRESY). The approach detected (i) the exaggerated sympathetic modulation and vagal withdrawal from HP variability and the increased presence of fast MSNA variability components during PRESY compared with TILT; (ii) the increase of the SAP-HP coordination occurring at slow temporal scales and a decrease of that occurring at faster time scales during PRESY compared with TILT; (iii) the reduction of the coordination between fast MSNA and SAP patterns during TILT and PRESY; (iv) the nonlinear influences of respiration leading to an increased likelihood of observing the abovementioned findings during expiration compared with inspiration. The framework provided simple, quantitative indexes able to distinguish experimental conditions characterized by different states of the autonomic nervous system and to detect the early signs of a life-threatening situation such as postural syncope. PMID:25548269
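The symbolization step common to such frameworks can be sketched minimally. Uniform quantization over a fixed alphabet is an assumption here; the paper's exact symbolization scheme is not reproduced. A series is coarse-grained into a small set of symbols and short patterns are counted.

```python
import numpy as np
from collections import Counter

def symbolize(x, n_levels=6, pattern_len=3):
    """Quantize series x into n_levels symbols; count patterns of length pattern_len."""
    lo, hi = x.min(), x.max()
    sym = np.minimum((n_levels * (x - lo) / (hi - lo)).astype(int), n_levels - 1)
    return Counter(tuple(sym[i:i + pattern_len])
                   for i in range(len(sym) - pattern_len + 1))
```

Pattern frequencies (e.g. the share of monotonically varying versus alternating patterns) then serve as the simple quantitative indexes compared across experimental conditions.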
Snyder, Susan M; Rubenstein, Casey
2014-01-01
This study examined how incest, depression, parental drinking, relationship status, and living with parents affect patterns of substance use among emerging adults, 18 to 25 years old. The study sample included (n = 11,546) individuals who participated in Waves I, II, and III of the National Longitudinal Study of Adolescent Health (Add Health). The study used separate latent class analysis for males and females to determine how patterns of substance use clustered together. The study identified the following three classes of substance use: heavy, moderate, and normative substance use patterns. Multinomial logistic regression indicated that, for females only, incest histories also nearly doubled the risk of heavy-use class membership. In addition, experiencing depression, being single, and not living with parents serve as risk factors for males and females in the heavy-use group. Conversely, being Black, Hispanic, or living with parents lowered the likelihood of being in the group with the most substance use behaviors (i.e., heavy use). Findings highlight the need for interventions that target depression and female survivors of incest among emerging adults.
The association between diet quality, dietary patterns and depression in adults: a systematic review
2013-01-01
Background Recent evidence suggests that diet modifies key biological factors associated with the development of depression; however, associations between diet quality and depression are not fully understood. We performed a systematic review to evaluate existing evidence regarding the association between diet quality and depression. Method A computer-aided literature search was conducted using Medline, CINAHL, and PsycINFO, January 1965 to October 2011, and a best-evidence analysis performed. Results Twenty-five studies from nine countries met eligibility criteria. Our best-evidence analyses found limited evidence to support an association between traditional diets (Mediterranean or Norwegian diets) and depression. We also observed a conflicting level of evidence for associations between (i) a traditional Japanese diet and depression, (ii) a “healthy” diet and depression, (iii) a Western diet and depression, and (iv) individuals with depression and the likelihood of eating a less healthy diet. Conclusion To our knowledge, this is the first review to synthesize and critically analyze evidence regarding diet quality, dietary patterns and depression. Further studies are urgently required to elucidate whether a true causal association exists. PMID:23802679
VizieR Online Data Catalog: Kepler pipeline transit signal recovery. III. (Christiansen+, 2016)
NASA Astrophysics Data System (ADS)
Christiansen, J. L.; Clarke, B. D.; Burke, C. J.; Jenkins, J. M.; Bryson, S. T.; Coughlin, J. L.; Mullally, F.; Thompson, S. E.; Twicken, J. D.; Batalha, N. M.; Haas, M. R.; Catanzarite, J.; Campbell, J. R.; Uddin, A. K.; Zamudio, K.; Smith, J. C.; Henze, C. E.
2018-03-01
Here we describe the third transit injection experiment, which tests the entire Kepler observing baseline (Q1-Q17) for the first time across all 84 CCD channels. It was performed to measure the sensitivity of the Kepler pipeline used to generate the Q1-Q17 Data Release 24 (DR24) catalog of Kepler Objects of Interest (Coughlin et al. 2016, J/ApJS/224/12) available at the NASA Exoplanet Archive (Akeson et al. 2013PASP..125..989A). The average detection efficiency describes the likelihood that the Kepler pipeline would successfully recover a given transit signal. To measure this property we perform a Monte Carlo experiment where we inject the signatures of simulated transiting planets around 198154 target stars, one per star, across the focal plane starting with the Q1-Q17 DR24 calibrated pixels. The simulated transits are generated using the Mandel & Agol (2002ApJ...580L.171M) model. Of the injections, 159013 resulted in three or more injected transits (the minimum required for detection by the pipeline) and were used for the subsequent analysis. (1 data file).
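The quantity being measured, detection efficiency, is simply the recovered fraction of injected signals. The toy sketch below uses synthetic numbers and an assumed scalar strength metric (the real experiment injects transit models into calibrated pixels and bins by quantities such as expected signal-to-noise):

```python
import numpy as np

def detection_efficiency(strength, recovered, bins):
    """Mean recovery flag per strength bin; NaN where a bin is empty."""
    idx = np.digitize(strength, bins)
    return np.array([recovered[idx == i].mean() if (idx == i).any() else np.nan
                     for i in range(1, len(bins))])
```

Fitting a smooth curve through such binned fractions yields the pipeline completeness function used in occurrence-rate calculations.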
A Gene Signature to Determine Metastatic Behavior in Thymomas
Gökmen-Polar, Yesim; Wilkinson, Jeff; Maetzold, Derek; Stone, John F.; Oelschlager, Kristen M.; Vladislav, Ioan Tudor; Shirar, Kristen L.; Kesler, Kenneth A.; Loehrer, Patrick J.; Badve, Sunil
2013-01-01
Purpose Thymoma represents one of the rarest of all malignancies. Stage and completeness of resection have been used to ascertain postoperative therapeutic strategies albeit with limited prognostic accuracy. A molecular classifier would be useful to improve the assessment of metastatic behaviour and optimize patient management. Methods qRT-PCR assay for 23 genes (19 test and four reference genes) was performed on multi-institutional archival primary thymomas (n = 36). Gene expression levels were used to compute a signature, classifying tumors into classes 1 and 2, corresponding to low or high likelihood for metastases. The signature was validated in an independent multi-institutional cohort of patients (n = 75). Results A nine-gene signature that can predict metastatic behavior of thymomas was developed and validated. Using radial basis machine modeling in the training set, 5-year and 10-year metastasis-free survival rates were 77% and 26% for predicted low (class 1) and high (class 2) risk of metastasis (P = 0.0047, log-rank), respectively. For the validation set, 5-year metastasis-free survival rates were 97% and 30% for predicted low- and high-risk patients (P = 0.0004, log-rank), respectively. The 5-year metastasis-free survival rates for the validation set were 49% and 41% for Masaoka stages I/II and III/IV (P = 0.0537, log-rank), respectively. In univariate and multivariate Cox models evaluating common prognostic factors for thymoma metastasis, the nine-gene signature was the only independent indicator of metastases (P = 0.036). Conclusion A nine-gene signature was established and validated which predicts the likelihood of metastasis more accurately than traditional staging. This further underscores the biologic determinants of the clinical course of thymoma and may improve patient management. PMID:23894276
Preoperative nomogram to predict the likelihood of complications after radical nephroureterectomy.
Raman, Jay D; Lin, Yu-Kuan; Shariat, Shahrokh F; Krabbe, Laura-Maria; Margulis, Vitaly; Arnouk, Alex; Lallas, Costas D; Trabulsi, Edouard J; Drouin, Sarah J; Rouprêt, Morgan; Bozzini, Gregory; Colin, Pierre; Peyronnet, Benoit; Bensalah, Karim; Bailey, Kari; Canes, David; Klatte, Tobias
2017-02-01
To construct a nomogram based on preoperative variables to better predict the likelihood of complications occurring within 30 days of radical nephroureterectomy (RNU). The charts of 731 patients undergoing RNU at eight academic medical centres between 2002 and 2014 were reviewed. Preoperative clinical, demographic and comorbidity indices were collected. Complications occurring within 30 days of surgery were graded using the modified Clavien-Dindo scale. Multivariate logistic regression determined the association between preoperative variables and post-RNU complications. A nomogram was created from the reduced multivariate model with internal validation using the bootstrapping technique with 200 repetitions. A total of 408 men and 323 women with a median age of 70 years and a body mass index of 27 kg/m² were included. Seventy-five percent of the cohort was white, 18% had an Eastern Cooperative Oncology Group (ECOG) performance status ≥2, 20% had a Charlson comorbidity index (CCI) score >5 and 50% had baseline chronic kidney disease (CKD) ≥ stage III. Overall, 279 patients (38%) experienced a complication, including 61 events (22%) with Clavien grade ≥ III. A multivariate model identified five variables associated with complications: patient age, race, ECOG performance status, CKD stage and CCI score. A preoperative nomogram incorporating these risk factors was constructed, with an area under the curve of 72.2%. Using standard preoperative variables from this multi-institutional RNU experience, we constructed and validated a nomogram for predicting peri-operative complications after RNU. Such information may permit more accurate risk stratification on an individual-case basis before major surgery. © 2016 The Authors. BJU International © 2016 BJU International. Published by John Wiley & Sons Ltd.
Ananworanich, Jintanat; Eller, Leigh Anne; Pinyakorn, Suteeraporn; Kroon, Eugene; Sriplenchan, Somchai; Fletcher, James Lk; Suttichom, Duanghathai; Bryant, Christopher; Trichavaroj, Rapee; Dawson, Peter; Michael, Nelson; Phanuphak, Nittaya; Robb, Merlin L
2017-06-26
The extent of viral replication during acute HIV infection (AHI) influences HIV disease progression. However, information comparing viral load (VL) kinetics with and without antiretroviral therapy (ART) in AHI is limited. The knowledge gained could inform preventive strategies aimed at reducing VL during AHI and therapeutic strategies to alter the viral kinetics that may enhance the likelihood of achieving HIV remission. The analysis utilized VL data captured during the first year of HIV infection from two studies in Thailand: the RV217 study (untreated AHI, 30 participants and 412 visits) and the RV254 study (treated AHI, 235 participants and 2803 visits). Fiebig stages were I/II (HIV RNA+, HIV IgM-) and Fiebig III/IV (HIV IgM+, Western blot-/indeterminate). Data were modelled utilizing spline effects within a linear mixed model, with a random intercept and slope to allow for between-subject variability and adjustment for the differences in variability between studies. The number of knots in the quadratic spline basis functions was determined by comparing models with differing numbers of knots via the Akaike Information Criterion. Models were fit using PROC GLIMMIX in SAS v9.3. At enrolment, there were 24 Fiebig I/II and 6 Fiebig III/IV individuals in the untreated group and 137 Fiebig I/II and 98 Fiebig III/IV individuals in the treated group. Overall, the median age was 27.5 years old, most were male (89%), and CRF01_AE was the most common HIV clade (76%). By day 12 (4 days after ART in RV254), the untreated group had a 2.7-fold higher predicted mean VL level compared to those treated (predicted log VL 6.19 for RV217 and 5.76 for RV254, p = 0.05). These differences increased to 135-fold by day 30 (predicted log VL 4.89 for RV217 and 2.76 for RV254) and 1148-fold by day 120 (predicted log VL 4.68 for RV217 and 1.63 for RV254) (p < 0.0001 for both) until both curves were similarly flat at about day 150 (p = 0.17 between days 150 and 160).
The VL trajectories were significantly different between Fiebig I/II and Fiebig III/IV participants when comparing the two groups and within the treated group (p < 0.001 for both). Initiating ART in AHI dramatically changed the trajectory of VL very early in the course of infection, which could have implications for reducing transmission potential and enhancing responses to future HIV remission strategies. There is an urgency to initiate ART as soon as acute infection is identified. New and inexpensive strategies to engage and test individuals at high risk for HIV, as well as immediate treatment access, will be needed to improve the treatment of acute infection globally. NCT00796146 and NCT00796263.
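The knot-selection step described in the abstract above (a quadratic spline basis, with the number of knots chosen by comparing AIC across candidate models) can be sketched in simplified form. This is not the study's SAS PROC GLIMMIX mixed model; it is a minimal fixed-effects analogue on invented data, using a truncated-power quadratic spline basis and Gaussian AIC.

```python
import math
import random

def basis(u, knots):
    """Quadratic truncated-power spline basis: 1, u, u^2, (u - k)_+^2."""
    return [1.0, u, u * u] + [max(u - k, 0.0) ** 2 for k in knots]

def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [M[r][j] - f * M[c][j] for j in range(n + 1)]
    return [M[i][n] / M[i][i] for i in range(n)]

def aic(us, ys, knots):
    """Gaussian AIC (up to a constant) of an OLS quadratic-spline fit."""
    X = [basis(u, knots) for u in us]
    p = len(X[0])
    XtX = [[sum(r[a] * r[b] for r in X) for b in range(p)] for a in range(p)]
    Xty = [sum(X[i][a] * ys[i] for i in range(len(us))) for a in range(p)]
    beta = solve(XtX, Xty)
    rss = sum((ys[i] - sum(bb * xx for bb, xx in zip(beta, X[i]))) ** 2
              for i in range(len(us)))
    return len(us) * math.log(rss / len(us)) + 2 * p

def true_curve(u):
    """Synthetic trajectory with curvature changes (kinks) at u = 0.3 and 1.2."""
    return (6 - 4 * u + 0.5 * u ** 2
            + 8 * max(u - 0.3, 0.0) ** 2 - 5 * max(u - 1.2, 0.0) ** 2)

random.seed(1)
us = [i / 25 for i in range(46)]                      # grid on [0, 1.8]
ys = [true_curve(u) + random.gauss(0, 0.1) for u in us]

candidates = {0: [], 1: [0.75], 2: [0.3, 1.2]}        # knot configurations
scores = {m: aic(us, ys, k) for m, k in candidates.items()}
best = min(scores, key=scores.get)                    # AIC favours the kinked model
```

The mixed-model version adds per-subject random intercepts and slopes, but the AIC comparison over knot counts works the same way.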
Approximate likelihood calculation on a phylogeny for Bayesian estimation of divergence times.
dos Reis, Mario; Yang, Ziheng
2011-07-01
The molecular clock provides a powerful way to estimate species divergence times. If information on some species divergence times is available from the fossil or geological record, it can be used to calibrate a phylogeny and estimate divergence times for all nodes in the tree. The Bayesian method provides a natural framework to incorporate different sources of information concerning divergence times, such as information in the fossil and molecular data. Current models of sequence evolution are intractable in a Bayesian setting, and Markov chain Monte Carlo (MCMC) is used to generate the posterior distribution of divergence times and evolutionary rates. This method is computationally expensive, as it involves the repeated calculation of the likelihood function. Here, we explore the use of Taylor expansion to approximate the likelihood during MCMC iteration. The approximation is much faster than conventional likelihood calculation. However, the approximation is expected to be poor when the proposed parameters are far from the likelihood peak. We explore the use of parameter transforms (square root, logarithm, and arcsine) to improve the approximation to the likelihood curve. We found that the new methods, particularly the arcsine-based transform, provided very good approximations under relaxed clock models and also under the global clock model when the global clock is not seriously violated. The approximation is poorer when the global clock assumption is seriously violated, in which case it should not be used. The results suggest that the approximate method may be useful for Bayesian dating analysis using large data sets.
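The core idea, a second-order Taylor (quadratic) expansion of the log-likelihood around its peak, and the benefit of a variance-stabilizing transform can be illustrated on a one-parameter binomial likelihood. This toy example is mine, not the paper's phylogenetic likelihood: in φ = arcsin(√p), the binomial curvature is a constant −4n, so the quadratic approximation stays accurate far from the peak.

```python
import math

def loglik(p, k, n):
    """Binomial log-likelihood (up to the constant binomial coefficient)."""
    return k * math.log(p) + (n - k) * math.log(1 - p)

def quad_approx(p, k, n):
    """Second-order Taylor expansion of loglik around the MLE p_hat = k/n."""
    ph = k / n
    d2 = -k / ph ** 2 - (n - k) / (1 - ph) ** 2   # second derivative at p_hat
    return loglik(ph, k, n) + 0.5 * d2 * (p - ph) ** 2

def quad_approx_arcsine(p, k, n):
    """Same expansion, but in phi = arcsin(sqrt(p)); curvature there is -4n."""
    ph = k / n
    dphi = math.asin(math.sqrt(p)) - math.asin(math.sqrt(ph))
    return loglik(ph, k, n) - 2 * n * dphi ** 2

k, n, p = 5, 20, 0.5            # evaluate far from the MLE (p_hat = 0.25)
exact = loglik(p, k, n)
err_plain = abs(exact - quad_approx(p, k, n))
err_arcsine = abs(exact - quad_approx_arcsine(p, k, n))
# The arcsine-transformed expansion tracks the true curve much more closely.
```

With these numbers the plain expansion is off by roughly 0.7 log-units while the arcsine version is off by about 0.1, mirroring the paper's finding that the arcsine transform gives the best approximation away from the likelihood peak.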
2013-01-01
Background Chronic kidney disease (CKD) poses a financial burden on patients and their households. This descriptive study measures the prevalence of economic hardship and out-of-pocket costs in an Australian CKD population. Methods A cross-sectional study of patients receiving care for CKD (stage III-V) in Western Sydney, Australia using a structured questionnaire. Data collection occurred between November 2010 and April 2011. Multivariate analyses assessed the relationships between economic hardship and individual, household and health system characteristics. Results The study included 247 prevalent CKD patients. A mean of AUD$907 per three months was paid out-of-pocket resulting in 71% (n=153) of participants experiencing financial catastrophe (out-of-pocket costs exceeding 10% of household income). Fifty-seven percent (n=140) of households reported economic hardship. The adjusted risk factors that decreased the likelihood of hardship included: home ownership (OR: 0.32, 95% CI: 0.14-0.71), access to financial resources (OR: 0.24, 95% CI: 0.11-0.50) and quality of life (OR: 0.12, 95% CI: 0.02-0.56). The factors that increased the likelihood of hardship included if income was negatively impacted by CKD (OR: 4.80, 95% CI: 2.17-10.62) and concessional status (i.e. receiving government support) (OR: 3.09, 95% CI: 1.38-6.91). Out-of-pocket costs and financial catastrophe were not found to be significantly associated with hardship in this analysis. Conclusions This study describes the poorer economic circumstances of households affected by CKD and reinforces the inter-relationships between chronic illness, economic well-being and quality of life for this patient population. PMID:23305212
Olarte-Castillo, Ximena A.; Hofer, Heribert; Goller, Katja V.; Martella, Vito; Moehlman, Patricia D.; East, Marion L.
2016-01-01
The genus Sapovirus, in the family Caliciviridae, includes enteric viruses of humans and domestic animals. Information on sapovirus infection of wildlife is limited and is currently lacking for any free-ranging wildlife species in Africa. By screening a large number of predominantly fecal samples (n = 631) obtained from five carnivore species in the Serengeti ecosystem, East Africa, sapovirus RNA was detected in the spotted hyena (Crocuta crocuta, family Hyaenidae), African lion (Panthera leo, family Felidae), and bat-eared fox (Otocyon megalotis, family Canidae), but not in golden or silver-backed jackals (Canis aureus and C. mesomelas, respectively, family Canidae). A phylogenetic analysis based on partial RNA-dependent RNA polymerase (RdRp) gene sequences placed the sapovirus strains from African carnivores in a monophyletic group. Within this monophyletic group, sapovirus strains from spotted hyenas formed one independent sub-group, and those from bat-eared fox and African lion a second sub-group. The percentage nucleotide similarity between sapoviruses from African carnivores and those from other species was low (< 70.4%). Long-term monitoring of sapovirus in a population of individually known spotted hyenas from 2001 to 2012 revealed: i) a relatively high overall infection prevalence (34.8%); ii) the circulation of several genetically diverse variants; iii) large fluctuations in infection prevalence across years, indicative of outbreaks; iv) no significant difference in the likelihood of infection between animals in different age categories. The likelihood of sapovirus infection decreased with increasing hyena group size, suggesting an encounter reduction effect, but was independent of socially mediated ano-genital contact, or the extent of the area over which an individual roamed. PMID:27661997
Changes in prescription contraceptive use, 1995-2002: the effect of insurance status.
Culwell, Kelly R; Feinglass, Joe
2007-12-01
To examine changes in prescription contraception use between 1995 and 2002 by insurance status among women at risk for unintended pregnancy. Data from the National Survey of Family Growth, including 4,767 women at risk of unintended pregnancy in 1995 and 3,569 in 2002, were used to evaluate changes in primary contraception methods by health insurance status and year of survey. Logistic regression models tested differences in the likelihood of prescription contraceptive use among privately insured, publicly insured, and uninsured women in each year, after controlling for age, race and ethnicity, education, income, employment, marital status, number of children, religion, and self reported overall health. Overall prescription contraceptive use increased between 1995 and 2002 by 3% (48.9% to 51.9%, P=.049). Nonuse of contraception also increased (11.6% to 16.1%, P<.001). The change in the likelihood of prescription contraceptive use was greatest and only significant among privately insured women (+5.5%, P=.002). In multiple regression analysis, women in 1995 were 10% less likely to report use of prescription contraceptives compared with women in 2002 (relative risk 0.90, 95% confidence interval 0.82-0.98), and uninsured women were more than 20% less likely to report prescription contraceptive use compared with privately insured women (relative risk 0.78, 95% confidence interval 0.67-0.90). Prescription contraceptive use increased most significantly among privately insured women between 1995 and 2002, potentially reflecting state mandates enacted during that period requiring contraceptive coverage by private insurers. It is important for clinicians to understand these differences and address issues of insurance coverage with patients when discussing contraceptive options. III.
Stummer, Walter; Rodrigues, Floriano; Schucht, Philippe; Preuss, Matthias; Wiewrodt, Dorothee; Nestler, Ulf; Stein, Marco; Artero, José Manuel Cabezudo; Platania, Nunzio; Skjøth-Rasmussen, Jane; Della Puppa, Alessandro; Caird, John; Cortnum, Søren; Eljamel, Sam; Ewald, Christian; González-García, Laura; Martin, Andrew J; Melada, Ante; Peraud, Aurelia; Brentrup, Angela; Santarius, Thomas; Steiner, Hans Herbert
2014-12-01
Five-aminolevulinic acid (Gliolan, medac, Wedel, Germany, 5-ALA) is approved for fluorescence-guided resections of adult malignant gliomas. Case reports indicate that 5-ALA can be used for children, yet no prospective study has been conducted to date. As a basis for such a study, we conducted a survey among certified European Gliolan users to collect data on their experiences with children. Information on patient characteristics, MRI characteristics of tumors, histology, fluorescence qualities, and outcomes was requested. Surgeons were further asked to indicate whether fluorescence was "useful", i.e., leading to changes in surgical strategy or identification of residual tumor. Recursive partitioning analysis (RPA) was used for defining cohorts with high or low likelihoods of useful fluorescence. Data on 78 patients <18 years of age were submitted by 20 centers. Fluorescence was found useful in 12 of 14 glioblastomas (85 %), four of five anaplastic astrocytomas (60 %), and eight of ten ependymomas grades II and III (80 %). Fluorescence was found inconsistently useful in PNETs (three of seven; 43 %), gangliogliomas (two of five; 40 %), medulloblastomas (two of eight, 25 %) and pilocytic astrocytomas (two of 13; 15 %). RPA of pre-operative factors showed that tumors with supratentorial location, strong contrast enhancement and first operation had a likelihood of useful fluorescence of 64.3 %, as opposed to 23.1 % for infratentorial tumors at first surgery. Our survey demonstrates that 5-ALA is being used in pediatric brain tumors. 5-ALA may be especially useful for contrast-enhancing supratentorial tumors. These data indicate that controlled studies are necessary and provide a basis for planning such a study.
Statistical analyses support power law distributions found in neuronal avalanches.
Klaus, Andreas; Yu, Shan; Plenz, Dietmar
2011-01-01
The size distribution of neuronal avalanches in cortical networks has been reported to follow a power law distribution with exponent close to -1.5, which is a reflection of long-range spatial correlations in spontaneous neuronal activity. However, identifying power law scaling in empirical data can be difficult and sometimes controversial. In the present study, we tested the power law hypothesis for neuronal avalanches by using more stringent statistical analyses. In particular, we performed the following steps: (i) analysis of finite-size scaling to identify scale-free dynamics in neuronal avalanches, (ii) model parameter estimation to determine the specific exponent of the power law, and (iii) comparison of the power law to alternative model distributions. Consistent with critical state dynamics, avalanche size distributions exhibited robust scaling behavior in which the maximum avalanche size was limited only by the spatial extent of sampling ("finite size" effect). This scale-free dynamics suggests the power law as a model for the distribution of avalanche sizes. Using both the Kolmogorov-Smirnov statistic and a maximum likelihood approach, we found the slope to be close to -1.5, which is in line with previous reports. Finally, the power law model for neuronal avalanches was compared to the exponential and to various heavy-tail distributions based on the Kolmogorov-Smirnov distance and by using a log-likelihood ratio test. Both the power law distribution without and with exponential cut-off provided significantly better fits to the cluster size distributions in neuronal avalanches than the exponential, the lognormal and the gamma distribution. In summary, our findings strongly support the power law scaling in neuronal avalanches, providing further evidence for critical state dynamics in superficial layers of cortex.
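The maximum-likelihood and Kolmogorov-Smirnov machinery for continuous power laws used in this kind of analysis (in the spirit of Clauset et al.) has a compact closed form. A self-contained sketch on synthetic data, not the avalanche recordings: the MLE of the exponent is α̂ = 1 + n / Σ ln(xᵢ/xmin), and the KS distance measures the gap between the empirical and fitted CDFs.

```python
import math
import random

def sample_power_law(n, alpha, xmin, rng):
    """Inverse-CDF sampling from p(x) ~ x^-alpha for x >= xmin."""
    return [xmin * (1 - rng.random()) ** (-1 / (alpha - 1)) for _ in range(n)]

def mle_exponent(xs, xmin):
    """Closed-form continuous MLE: alpha = 1 + n / sum(log(x / xmin))."""
    return 1 + len(xs) / sum(math.log(x / xmin) for x in xs)

def ks_distance(xs, alpha, xmin):
    """Max gap between the empirical CDF and the fitted power-law CDF."""
    xs = sorted(xs)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        model = 1 - (x / xmin) ** (1 - alpha)      # fitted CDF at x
        d = max(d, abs((i + 1) / n - model), abs(i / n - model))
    return d

rng = random.Random(42)
xs = sample_power_law(5000, alpha=1.5, xmin=1.0, rng=rng)
alpha_hat = mle_exponent(xs, xmin=1.0)    # close to the true exponent 1.5
d = ks_distance(xs, alpha_hat, xmin=1.0)  # small when the model fits well
```

Model comparison against exponential or lognormal alternatives, as in the paper, would then fit each candidate by maximum likelihood and compare via a log-likelihood ratio test.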
Onukwugha, Eberechukwu; Osteen, Phillip; Jayasekera, Jinani; Mullins, C Daniel; Mair, Christine A; Hussain, Arif
2014-11-01
Factors contributing to the lower likelihood of urologist follow-up among African American (AA) men diagnosed with prostate cancer may not be strictly related to patient factors. The authors investigated the relationship between crime, poverty, and poor housing, among others, and postdiagnosis urologist visits among AA and white men. The authors used linked cancer registry and Medicare claims data from 1999 through 2007 for men diagnosed with American Joint Committee on Cancer stage I to III prostate cancer. The USA Counties and County Business Patterns data sets provided county-level data. Variance components models reported the percentage of variation attributed to county of residence. Postdiagnosis urologist visits for AA and white men were investigated using logistic and modified Poisson regression models. A total of 65,635 patients were identified; 87% of whom were non-Hispanic white and 9.3% of whom were non-Hispanic AA. Approximately 16% of men diagnosed with stage I to III prostate cancer did not visit a urologist within 1 year after diagnosis (22% of AA men and 15% of white men). County of residence accounted for 10% of the variation in the visit outcome (13% for AA men and 10% for white men). AA men were more likely to live in counties ranked highest in terms of poverty, occupied housing units with no telephone, and crime. AA men were less likely to see a urologist (odds ratio, 0.65 [95% confidence interval, 0.6-0.71]; rate ratio, 0.94 [95% confidence interval, 0.92-0.95]). The sign and magnitude of the coefficients for the county-level measures differed across race-specific regression models of urologist visits. Among older men diagnosed with stage I to III prostate cancer, the social environment appears to contribute to some of the disparities in postdiagnosis urologist visits between AA and white men. © 2014 American Cancer Society.
Cosmological parameters from a re-analysis of the WMAP 7 year low-resolution maps
NASA Astrophysics Data System (ADS)
Finelli, F.; De Rosa, A.; Gruppuso, A.; Paoletti, D.
2013-06-01
Cosmological parameters from Wilkinson Microwave Anisotropy Probe (WMAP) 7 year data are re-analysed by substituting a pixel-based likelihood estimator for the one delivered publicly by the WMAP team. Our pixel-based estimator handles intensity and polarization exactly and jointly, allowing us to use low-resolution maps and noise covariance matrices in T, Q, U at the same resolution, which in this work is 3.6°. We describe the features and the performance of the code implementing our pixel-based likelihood estimator. We perform a battery of tests on the application of our pixel-based likelihood routine to the publicly available WMAP low-resolution foreground-cleaned products, in combination with the WMAP high-ℓ likelihood, reporting the differences in cosmological parameters relative to those evaluated by the full WMAP likelihood public package. The differences are due not only to the treatment of polarization, but also to the marginalization over monopole and dipole uncertainties present in the WMAP pixel likelihood code for temperature. The central values of the cosmological parameters change by less than 1σ with respect to the evaluation by the full WMAP 7 year likelihood code, with the largest difference being a shift to smaller values of the scalar spectral index nS.
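For reference, an exact pixel-based CMB likelihood of this kind has the standard multivariate Gaussian form over the joint temperature-polarization data vector m = (T, Q, U) (this is the textbook expression, not quoted from the paper):

```latex
-2\ln\mathcal{L}(\theta)
  = \mathbf{m}^{\mathsf T}\,\mathbf{C}(\theta)^{-1}\,\mathbf{m}
  + \ln\det\mathbf{C}(\theta) + N\ln 2\pi,
\qquad
\mathbf{C}(\theta) = \mathbf{S}(\theta) + \mathbf{N}_{\mathrm{noise}},
```

where S(θ) is the signal covariance built from the model angular power spectra, N_noise is the pixel noise covariance, and N is the length of the data vector. Evaluating the inverse and determinant of the full covariance scales as the cube of the number of pixels, which is why the exact treatment is restricted to low-resolution maps such as the 3.6° products used here.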
Heersink, Daniel K; Caley, Peter; Paini, Dean R; Barry, Simon C
2016-05-01
The cost of an uncontrolled incursion of invasive alien species (IAS) arising from undetected entry through ports can be substantial, and knowledge of port-specific risks is needed to help allocate limited surveillance resources. Quantifying the establishment likelihood of such an incursion requires quantifying the ability of a species to enter, establish, and spread. Estimation of the approach rate of IAS into ports provides a measure of likelihood of entry. Data on the approach rate of IAS are typically sparse, and the combinations of risk factors relating to country of origin and port of arrival diverse. This presents challenges to making formal statistical inference on establishment likelihood. Here we demonstrate how these challenges can be overcome with judicious use of mixed-effects models when estimating the incursion likelihood into Australia of the European (Apis mellifera) and Asian (A. cerana) honeybees, along with the invasive parasites of biosecurity concern they host (e.g., Varroa destructor). Our results demonstrate how skewed the establishment likelihood is, with one-tenth of the ports accounting for 80% or more of the likelihood for both species. These results have been utilized by biosecurity agencies in the allocation of resources to the surveillance of maritime ports. © 2015 Society for Risk Analysis.
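The headline result, that roughly one-tenth of ports carry 80% or more of the establishment likelihood, is a simple cumulative-share computation once per-port likelihoods are estimated. A sketch with invented numbers (not the study's estimates):

```python
def ports_for_share(likelihoods, share=0.8):
    """Smallest number of ports (taken in decreasing order of estimated
    establishment likelihood) whose likelihoods sum to `share` of the total."""
    ranked = sorted(likelihoods, reverse=True)
    total = sum(ranked)
    running, count = 0.0, 0
    for v in ranked:
        running += v
        count += 1
        if running >= share * total:
            return count
    return count

# Hypothetical per-port establishment likelihoods (arbitrary units):
ports = [50, 30, 10, 5, 3, 1, 0.5, 0.3, 0.1, 0.1]
k = ports_for_share(ports)   # here the top 2 of 10 ports already reach 80%
```

Under a uniform distribution the same function would need 8 of 10 ports, which is what makes the observed concentration useful for targeting surveillance resources.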
Burns, Linda J.; Logan, Brent R.; Chitphakdithai, Pintip; Miller, John P.; Drexler, Rebecca; Spellman, Stephen; Switzer, Galen E.; Wingard, John R.; Anasetti, Claudio; Confer, Dennis L.
2016-01-01
We report a comparison of time to recovery, side effects, and change in blood counts from baseline to post-donation of unrelated donors who participated in the Blood and Marrow Transplant Clinical Trials Network (BMT CTN) phase III randomized, multicenter trial (0201) in which donor/recipient pairs were randomized to either peripheral blood stem cell (PBSC) or bone marrow (BM) donation. Of the entire cohort, 262 donated PBSC and 264 donated BM; 372 (71%) donors were from domestic and 154 (29%) from international centers (145 German and 9 Canadian). PBSC donors recovered in less time with a median time to recovery of 1 week compared to 2.3 weeks for BM donors. The number of donors reporting full recovery was significantly greater for donors of PBSC than of BM at 1, 2, and 3 weeks and 3 months post-donation. Multivariate analysis showed that PBSC donors were more likely to recover at any time post donation compared to BM donors (HR 2.08 [95% CI 1.73–2.50], p<0.001). Other characteristics that significantly increased the likelihood of complete recovery were being an international donor and donation in more recent years. Donors of BM were more likely to report grade 2–4 skeletal pain, body symptoms and fatigue at 1 week post donation. In logistic regression analysis of domestic donors only in which toxicities at peri-collection time points (day 5 filgrastim for PBSC donors and day 2 post-collection of BM donors) could be analyzed, no variable was significantly associated with grade 2–4 skeletal pain, including product donated (BM vs PBSC, OR 1.13 [95% CI 0.74–1.74], p=0.556). Blood counts were impacted by product donated, with mean change from baseline to post-donation being greater for white blood cells, neutrophils, mononuclear cells and platelets in PBSC donors whereas BM donors experienced a greater mean change in hemoglobin. This analysis provided an enhanced understanding of donor events as product donated was independent of physician bias or donor preference. 
PMID:27013014
Chaikriangkrai, Kongkiat; Jhun, Hye Yeon; Shantha, Ghanshyam Palamaner Subash; Abdulhak, Aref Bin; Tandon, Rudhir; Alqasrawi, Musab; Klappa, Anthony; Pancholy, Samir; Deshmukh, Abhishek; Bhama, Jay; Sigurdsson, Gardar
2018-07-01
In aortic stenosis patients referred for surgical and transcatheter aortic valve replacement (AVR), the evidence of diagnostic accuracy of coronary computed tomography angiography (CCTA) has been limited. The objective of this study was to investigate the diagnostic accuracy of CCTA for significant coronary artery disease (CAD) in patients referred for AVR using invasive coronary angiography (ICA) as the gold standard. We searched databases for all diagnostic studies of CCTA in patients referred for AVR, which reported diagnostic testing characteristics on patient-based analysis required to pool summary sensitivity, specificity, positive-likelihood ratio, and negative-likelihood ratio. Significant CAD in both CCTA and ICA was defined by >50% stenosis in any coronary artery, coronary stent, or bypass graft. Thirteen studies evaluated 1498 patients (mean age, 74 y; 47% men; 76% transcatheter AVR). The pooled prevalence of significant stenosis determined by ICA was 43%. Hierarchical summary receiver-operating characteristic analysis demonstrated a summary area under curve of 0.96. The pooled sensitivity, specificity, and positive-likelihood and negative-likelihood ratios of CCTA in identifying significant stenosis determined by ICA were 95%, 79%, 4.48, and 0.06, respectively. In subgroup analysis, the diagnostic profiles of CCTA were comparable between surgical and transcatheter AVR. Despite the higher prevalence of significant CAD in patients with aortic stenosis than with other valvular heart diseases, our meta-analysis has shown that CCTA has a suitable diagnostic accuracy profile as a gatekeeper test for ICA. Our study illustrates a need for further study of the potential role of CCTA in preoperative planning for AVR.
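The relationship between the pooled quantities is worth making explicit: LR+ = sensitivity / (1 − specificity) and LR− = (1 − sensitivity) / specificity. Plugging in the pooled 95% and 79% gives values close to, but not identical with, the reported 4.48 and 0.06, since the meta-analysis pools within a hierarchical model rather than deriving the ratios from the summary operating point.

```python
def likelihood_ratios(sensitivity, specificity):
    """Positive and negative likelihood ratios of a binary diagnostic test."""
    lr_pos = sensitivity / (1 - specificity)
    lr_neg = (1 - sensitivity) / specificity
    return lr_pos, lr_neg

lr_pos, lr_neg = likelihood_ratios(0.95, 0.79)
# lr_pos ~ 4.52 and lr_neg ~ 0.06, in line with the reported 4.48 and 0.06
```

An LR− near 0.06 is what justifies the "gatekeeper" framing: a negative CCTA shifts the post-test odds of significant CAD down by a factor of about 16.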
Overweight among primary school-age children in Malaysia.
Naidu, Balkish Mahadir; Mahmud, Siti Zuraidah; Ambak, Rashidah; Sallehuddin, Syafinaz Mohd; Mutalip, Hatta Abdul; Saari, Riyanti; Sahril, Norhafizah; Hamid, Hamizatul Akmal Abdul
2013-01-01
This study is a secondary data analysis from the National Health Morbidity Survey III, a population-based study conducted in 2006. A total of 7,749 children between 7 and 12 years old were recruited into the study. This study reports the prevalence of overweight (including obesity) among children in Malaysia using the international cut-off point and identifies its associated key social determinants. The results show that the overall prevalence of overweight among children in Malaysia was 19.9%. Urban residents, males, Chinese children, those from wealthier families, and those with overweight or educated guardians showed a higher prevalence of overweight. In multivariable analysis, a higher likelihood of being overweight was observed among those with advancing age (OR=1.15), urban residents (OR=1.16, 95% CI: 1.01-1.36), the Chinese (OR=1.45, 95% CI: 1.19-1.77), boys (OR=1.23, 95% CI: 1.08-1.41), and those who came from higher-income families. In conclusion, one out of five 7-12-year-old children in Malaysia was overweight. Locality of residence, ethnicity, gender, guardian education, and having an overweight guardian were likely predictors of this alarming issue. Societal and public health efforts are needed in order to reduce the burden of disease associated with obesity.
Kim, Sora; Kaila, Lauri; Lee, Seunghwan
2016-08-01
Phylogenetic relationships within the family Oecophoridae have been poorly understood, and consequently the subfamily- and genus-level classifications within this family are problematic. A comprehensive phylogenetic analysis of Oecophoridae, the concealer moths, was performed based on analysis of 4444 base pairs of mitochondrial COI, nuclear ribosomal RNA genes (18S and 28S) and nuclear protein-coding genes (IDH, MDH, Rps5, EF1a and wingless) for 82 taxa. Data were analyzed using maximum likelihood (ML), parsimony (MP) and Bayesian (BP) phylogenetic frameworks. Phylogenetic analyses indicated that (i) the genera Casmara, Tyrolimnas and Pseudodoxia did not belong to Oecophoridae, suggesting that Oecophoridae s. auct. is not monophyletic; (ii) the other oecophorids, comprising the two subfamilies Pleurotinae and Oecophorinae, were nested within the same clade; and (iii) Martyringa, Acryptolechia and Periacmini clustered with core Xyloryctidae and appeared to be a sister lineage to core Oecophoridae. BayesTraits was used to reconstruct ancestral character states in order to infer historical microhabitat patterns and larval sheltering strategies. Reconstruction of the ancestral microhabitat indicated that oecophorids may have evolved from feeders on dried plant material and subsequently specialized convergently. The ancestral larval sheltering strategy may have shifted from mining leaves to constructing a silk tube. Copyright © 2016 Elsevier Inc. All rights reserved.
Climate Informed Low Flow Frequency Analysis Using Nonstationary Modeling
NASA Astrophysics Data System (ADS)
Liu, D.; Guo, S.; Lian, Y.
2014-12-01
Stationarity is often assumed for frequency analysis of low flows in water resources management and planning. However, many studies have shown that flow characteristics, particularly the frequency spectrum of extreme hydrologic events, were modified by climate change and human activities, and that conventional frequency analysis that ignores these non-stationary characteristics may lead to costly designs. The analysis presented in this paper was based on more than 100 years of daily flow data from the Yichang gaging station, 44 kilometers downstream of the Three Gorges Dam. The Mann-Kendall trend test under the scaling hypothesis showed that the annual low flows had a significant monotonic trend, and an abrupt change point was identified in 1936 by the Pettitt test. Climate-informed low flow frequency analysis and the divided-and-combined method were employed to account for the impacts of related climate variables and the nonstationarities in annual low flows. Without prior knowledge of the probability density function for the gaging station, six distribution functions, including the Generalized Extreme Value (GEV), Pearson Type III, Gumbel, Gamma, Lognormal, and Weibull distributions, were tested to find the best fit, with the local likelihood method used to estimate the parameters. Analyses showed that the GEV had the best fit for the observed low flows. This study also showed that climate-informed low flow frequency analysis is able to exploit the link between climate indices and low flows, accounting for dynamic features relevant to reservoir management and providing more accurate and reliable designs for infrastructure and water supply.
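The distribution-comparison step described in this abstract can be sketched with SciPy. This is an illustrative sketch only: the flow series below is synthetic (the Yichang record is not reproduced here), and a global maximum likelihood fit with AIC comparison stands in for the paper's local likelihood method.

```python
import numpy as np
from scipy import stats

# Hypothetical annual low-flow series (m^3/s); synthetic stand-in data
flows = stats.genextreme.rvs(c=0.1, loc=3000, scale=400, size=100,
                             random_state=np.random.default_rng(0))

candidates = {
    "GEV": stats.genextreme,
    "Pearson III": stats.pearson3,
    "Gumbel": stats.gumbel_r,
    "Gamma": stats.gamma,
    "Lognormal": stats.lognorm,
    "Weibull": stats.weibull_min,
}

results = {}
for name, dist in candidates.items():
    params = dist.fit(flows)                      # maximum likelihood fit
    loglik = np.sum(dist.logpdf(flows, *params))
    results[name] = 2 * len(params) - 2 * loglik  # AIC: lower is better

best = min(results, key=results.get)
print(best, round(results[best], 1))
```

Comparing fitted distributions by AIC (rather than raw likelihood) penalizes the extra shape parameter of three-parameter families such as the GEV against the two-parameter Gumbel.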
Simultaneous Control of Error Rates in fMRI Data Analysis
Kang, Hakmook; Blume, Jeffrey; Ombao, Hernando; Badre, David
2015-01-01
The key idea of statistical hypothesis testing is to fix, and thereby control, the Type I error (false positive) rate across samples of any size. Multiple comparisons inflate the global (family-wise) Type I error rate and the traditional solution to maintaining control of the error rate is to increase the local (comparison-wise) Type II error (false negative) rates. However, in the analysis of human brain imaging data, the number of comparisons is so large that this solution breaks down: the local Type II error rate ends up being so large that scientifically meaningful analysis is precluded. Here we propose a novel solution to this problem: allow the Type I error rate to converge to zero along with the Type II error rate. It works because when the Type I error rate per comparison is very small, the accumulation (or global) Type I error rate is also small. This solution is achieved by employing the Likelihood paradigm, which uses likelihood ratios to measure the strength of evidence on a voxel-by-voxel basis. In this paper, we provide theoretical and empirical justification for a likelihood approach to the analysis of human brain imaging data. In addition, we present extensive simulations that show the likelihood approach is viable, leading to ‘cleaner’ looking brain maps and operational superiority (lower average error rate). Finally, we include a case study on cognitive control related activation in the prefrontal cortex of the human brain. PMID:26272730
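The voxel-by-voxel evidence measure described above can be sketched as a simple likelihood ratio of two Gaussian models per voxel. The effect size, noise model, and benchmark threshold below are illustrative assumptions, not the paper's actual fMRI pipeline.

```python
import numpy as np

def voxel_likelihood_ratio(y, effect=1.0, sigma=1.0):
    """Likelihood ratio L(mu = effect) / L(mu = 0) for one voxel's samples y,
    assuming i.i.d. Normal(mu, sigma^2) data. Parameter values are illustrative."""
    y = np.asarray(y, dtype=float)
    ll1 = -0.5 * np.sum((y - effect) ** 2) / sigma**2  # log-likelihood under H1
    ll0 = -0.5 * np.sum(y**2) / sigma**2               # log-likelihood under H0
    return np.exp(ll1 - ll0)

rng = np.random.default_rng(1)
active = rng.normal(1.0, 1.0, size=20)   # voxel with a true effect
null = rng.normal(0.0, 1.0, size=20)     # voxel with no effect

lr_active = voxel_likelihood_ratio(active)
lr_null = voxel_likelihood_ratio(null)
# Against a conventional benchmark such as LR >= 8 ("fairly strong evidence"),
# only the active voxel should typically exceed the threshold.
```

Because the ratio compares two fully specified models, its false-positive probability per voxel shrinks as sample size grows, which is the mechanism behind the paper's claim that both error rates can converge to zero together.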
Ruilong, Zong; Daohai, Xie; Li, Geng; Xiaohong, Wang; Chunjie, Wang; Lei, Tian
2017-01-01
To carry out a meta-analysis on the performance of fluorine-18-fluorodeoxyglucose (¹⁸F-FDG) PET/computed tomography (PET/CT) for the evaluation of solitary pulmonary nodules. In the meta-analysis, we performed searches of several electronic databases for relevant studies, including Google Scholar, PubMed, Cochrane Library, and several Chinese databases. The quality of all included studies was assessed by Quality Assessment of Diagnostic Accuracy Studies (QUADAS-2). Two observers independently extracted data of eligible articles. For the meta-analysis, the total sensitivity, specificity, positive likelihood ratio, negative likelihood ratio, and diagnostic odds ratios were pooled. A summary receiver operating characteristic curve was constructed. The I² test was performed to assess the impact of study heterogeneity on the results of the meta-analysis. Meta-regression and subgroup analysis were carried out to investigate the potential covariates that might have considerable impacts on heterogeneity. Overall, 12 studies were included in this meta-analysis, including a total of 1297 patients and 1301 pulmonary nodules. The pooled sensitivity, specificity, positive likelihood ratio, and negative likelihood ratio with corresponding 95% confidence intervals (CIs) were 0.82 (95% CI, 0.76-0.87), 0.81 (95% CI, 0.66-0.90), 4.3 (95% CI, 2.3-7.9), and 0.22 (95% CI, 0.16-0.30), respectively. Significant heterogeneity was observed in sensitivity (I²=81.1%) and specificity (I²=89.6%). Subgroup analysis showed that the best results for sensitivity (0.90; 95% CI, 0.68-0.86) and accuracy (0.93; 95% CI, 0.90-0.95) were present in a prospective study. The results of our analysis suggest that PET/CT is a useful tool for detecting malignant pulmonary nodules qualitatively. Although current evidence showed moderate accuracy for PET/CT in differentiating malignant from benign solitary pulmonary nodules, further work needs to be carried out to improve its reliability.
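The pooling step in a meta-analysis like this can be sketched in simplified form. The sketch below pools per-study sensitivities with fixed-effect inverse-variance weighting on the logit scale; the study counts are hypothetical, and published meta-analyses typically use a bivariate random-effects model rather than this minimal version.

```python
import numpy as np

# Hypothetical per-study counts: (true positives, false negatives)
studies = [(45, 8), (60, 12), (33, 5), (80, 20)]

logits, weights = [], []
for tp, fn in studies:
    # 0.5 continuity correction keeps logits finite even with zero cells
    p = (tp + 0.5) / (tp + fn + 1.0)
    logits.append(np.log(p / (1 - p)))
    # inverse-variance weight of a logit-transformed proportion
    weights.append(1.0 / (1.0 / (tp + 0.5) + 1.0 / (fn + 0.5)))

pooled_logit = np.average(logits, weights=weights)
pooled_sens = 1.0 / (1.0 + np.exp(-pooled_logit))
```

Working on the logit scale keeps the pooled estimate inside (0, 1) and makes the normal approximation behind the weights more reasonable than averaging raw proportions.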
Maximum Likelihood Analysis of a Two-Level Nonlinear Structural Equation Model with Fixed Covariates
ERIC Educational Resources Information Center
Lee, Sik-Yum; Song, Xin-Yuan
2005-01-01
In this article, a maximum likelihood (ML) approach for analyzing a rather general two-level structural equation model is developed for hierarchically structured data that are very common in educational and/or behavioral research. The proposed two-level model can accommodate nonlinear causal relations among latent variables as well as effects…
Detecting Growth Shape Misspecifications in Latent Growth Models: An Evaluation of Fit Indexes
ERIC Educational Resources Information Center
Leite, Walter L.; Stapleton, Laura M.
2011-01-01
In this study, the authors compared the likelihood ratio test and fit indexes for detection of misspecifications of growth shape in latent growth models through a simulation study and a graphical analysis. They found that the likelihood ratio test, MFI, and root mean square error of approximation performed best for detecting model misspecification…
ERIC Educational Resources Information Center
Petty, Richard E.; And Others
1987-01-01
Answers James Stiff's criticism of the Elaboration Likelihood Model (ELM) of persuasion. Corrects certain misperceptions of the ELM and criticizes Stiff's meta-analysis that compares ELM predictions with those derived from Kahneman's elastic capacity model. Argues that Stiff's presentation of the ELM and the conclusions he draws based on the data…
Constrained Maximum Likelihood Estimation for Two-Level Mean and Covariance Structure Models
ERIC Educational Resources Information Center
Bentler, Peter M.; Liang, Jiajuan; Tang, Man-Lai; Yuan, Ke-Hai
2011-01-01
Maximum likelihood is commonly used for the estimation of model parameters in the analysis of two-level structural equation models. Constraints on model parameters could be encountered in some situations such as equal factor loadings for different factors. Linear constraints are the most common ones and they are relatively easy to handle in…
ERIC Educational Resources Information Center
Kelderman, Henk
1992-01-01
Describes algorithms used in the computer program LOGIMO for obtaining maximum likelihood estimates of the parameters in loglinear models. These algorithms are also useful for the analysis of loglinear item-response theory models. Presents modified versions of the iterative proportional fitting and Newton-Raphson algorithms. Simulated data…
Likelihood of Suicidality at Varying Levels of Depression Severity: A Re-Analysis of NESARC Data
ERIC Educational Resources Information Center
Uebelacker, Lisa A.; Strong, David; Weinstock, Lauren M.; Miller, Ivan W.
2010-01-01
Although it is clear that increasing depression severity is associated with more risk for suicidality, less is known about at what levels of depression severity the risk for different suicide symptoms increases. We used item response theory to estimate the likelihood of endorsing suicide symptoms across levels of depression severity in an…
Is Immigrant Status Relevant in School Violence Research? An Analysis with Latino Students
ERIC Educational Resources Information Center
Peguero, Anthony A.
2008-01-01
Background: The role of race and ethnicity is consistently found to be linked to the likelihood of students experiencing school violence-related outcomes; however, the findings are not always consistent. The variation of likelihood, as well as the type, of student-related school violence outcome among the Latino student population may be…
Design of simplified maximum-likelihood receivers for multiuser CPM systems.
Bing, Li; Bai, Baoming
2014-01-01
A class of simplified maximum-likelihood receivers designed for continuous phase modulation based multiuser systems is proposed. The presented receiver is built upon a front end employing mismatched filters and a maximum-likelihood detector defined in a low-dimensional signal space. The performance of the proposed receivers is analyzed and compared to some existing receivers. Some schemes are designed to implement the proposed receivers and to reveal the roles of different system parameters. Analysis and numerical results show that the proposed receivers can approach the optimum multiuser receivers with significantly (even exponentially in some cases) reduced complexity and marginal performance degradation.
Rabi cropped area forecasting of parts of Banaskatha District,Gujarat using MRS RISAT-1 SAR data
NASA Astrophysics Data System (ADS)
Parekh, R. A.; Mehta, R. L.; Vyas, A.
2016-10-01
Radar sensors can be used for large-scale vegetation mapping and monitoring using backscatter coefficients in different polarisations and wavelength bands. Due to cloud and haze interference, optical images are not always available at all phenological stages important for crop discrimination. Moreover, in cloud-prone areas, an exclusively SAR-based approach would provide an operational solution. This paper presents the results of classifying cropped and non-cropped areas using multi-temporal SAR images. Dual-polarised C-band RISAT MRS (Medium Resolution ScanSAR mode) data were acquired on 9th Dec. 2012, 28th Jan. 2013 and 22nd Feb. 2013 at 18 m spatial resolution. Intensity images of two polarisations (HH, HV) were extracted and converted into backscattering coefficient images. Cross-polarisation ratio (CPR) images and Radar fractional vegetation density index (RFDI) images were created from the temporal data and integrated with the multi-temporal images. Signatures of cropped and un-cropped areas were used for maximum likelihood supervised classification. Separability of the cropped and uncropped classes using different polarisation combinations was examined, and classification accuracy analysis was carried out. An FCC (False Color Composite) prepared using the best three SAR polarisations in the data set was compared with a LISS-III (Linear Imaging Self-Scanning System-III) image. The acreage under rabi crops was estimated. The methodology was developed for the rabi cropped area owing to the availability of SAR data for the rabi season, though the approach is even more relevant for acreage estimation of kharif crops, when frequent cloud cover prevails during the monsoon season and optical sensors fail to deliver good-quality images.
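Maximum likelihood supervised classification, as used above, assigns each pixel to the class whose multivariate Gaussian (estimated from training signatures) gives the highest likelihood. A minimal sketch with hypothetical two-band (HH, HV) backscatter values in dB, not the study's actual signatures:

```python
import numpy as np

def fit_class(samples):
    """Mean vector and covariance matrix of one class's training pixels."""
    return samples.mean(axis=0), np.cov(samples, rowvar=False)

def log_likelihood(x, mu, cov):
    """Log of the multivariate normal density at pixel x (constant term dropped)."""
    d = x - mu
    return -0.5 * (np.log(np.linalg.det(cov)) + d @ np.linalg.solve(cov, d))

rng = np.random.default_rng(2)
# Hypothetical 2-band (HH, HV) training pixels, in dB
cropped = rng.normal([-8.0, -14.0], 1.0, size=(200, 2))
uncropped = rng.normal([-12.0, -20.0], 1.0, size=(200, 2))

classes = {name: fit_class(s)
           for name, s in {"cropped": cropped, "uncropped": uncropped}.items()}

def classify(pixel):
    """Assign the pixel to the class with the highest Gaussian log-likelihood."""
    return max(classes, key=lambda c: log_likelihood(pixel, *classes[c]))

print(classify(np.array([-8.5, -14.5])))   # pixel near the cropped signature
```

With equal class priors this is equivalent to quadratic discriminant analysis; per-class priors could be folded in by adding their logs to the log-likelihoods.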
NASA Astrophysics Data System (ADS)
Pan, Zhen; Anderes, Ethan; Knox, Lloyd
2018-05-01
One of the major targets for next-generation cosmic microwave background (CMB) experiments is the detection of the primordial B-mode signal. Planning is under way for Stage-IV experiments that are projected to have instrumental noise small enough to make lensing and foregrounds the dominant source of uncertainty for estimating the tensor-to-scalar ratio r from polarization maps. This makes delensing a crucial part of future CMB polarization science. In this paper we present a likelihood method for estimating the tensor-to-scalar ratio r from CMB polarization observations, which combines the benefits of a full-scale likelihood approach with the tractability of the quadratic delensing technique. This method is a pixel space, all order likelihood analysis of the quadratic delensed B modes, and it essentially builds upon the quadratic delenser by taking into account all order lensing and pixel space anomalies. Its tractability relies on a crucial factorization of the pixel space covariance matrix of the polarization observations, which allows one to compute the full Gaussian approximate likelihood profile, as a function of r, at the same computational cost as a single likelihood evaluation.
Analyzing user behavior of the micro-blogging website Sina Weibo during hot social events
NASA Astrophysics Data System (ADS)
Guan, Wanqiu; Gao, Haoyu; Yang, Mingmin; Li, Yuan; Ma, Haixin; Qian, Weining; Cao, Zhigang; Yang, Xiaoguang
2014-02-01
The spread and resonance of users’ opinions on Sina Weibo, the most popular micro-blogging website in China, are tremendously influential, having significantly affected the processes of many real-world hot social events. We select 21 hot events that were widely discussed on Sina Weibo in 2011, and do some statistical analyses. Our main findings are that (i) male users are more likely to be involved, (ii) messages that contain pictures and those posted by verified users are more likely to be reposted, while those with URLs are less likely, (iii) the gender factor, for most events, presents no significant difference in reposting likelihood.
Wald Sequential Probability Ratio Test for Analysis of Orbital Conjunction Data
NASA Technical Reports Server (NTRS)
Carpenter, J. Russell; Markley, F. Landis; Gold, Dara
2013-01-01
We propose a Wald Sequential Probability Ratio Test for analysis of commonly available predictions associated with spacecraft conjunctions. Such predictions generally consist of a relative state and relative state error covariance at the time of closest approach, under the assumption that prediction errors are Gaussian. We show that under these circumstances, the likelihood ratio of the Wald test reduces to an especially simple form, involving the current best estimate of collision probability, and a similar estimate of collision probability that is based on prior assumptions about the likelihood of collision.
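The generic Wald SPRT machinery underlying the proposal can be sketched as follows: accumulate log-likelihood-ratio increments and stop when they cross thresholds set by the desired error rates. This toy Gaussian-mean test is only an illustration of the general framework, not the conjunction-specific likelihood ratio derived in the paper.

```python
import math
import numpy as np

def sprt(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    """Wald SPRT for H0: mean = mu0 vs H1: mean = mu1, Gaussian data with
    known sigma. Returns ('H0' | 'H1' | 'continue', samples consumed)."""
    upper = math.log((1 - beta) / alpha)   # cross above: accept H1
    lower = math.log(beta / (1 - alpha))   # cross below: accept H0
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # per-observation log-likelihood-ratio increment for Gaussian data
        llr += (x * (mu1 - mu0) - 0.5 * (mu1**2 - mu0**2)) / sigma**2
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "continue", len(samples)

# Samples drawn under H1 should usually trigger an early H1 decision
rng = np.random.default_rng(3)
decision, n_used = sprt(rng.normal(1.0, 1.0, size=200))
```

The appeal for conjunction assessment is the same as in any sequential setting: the test typically terminates after far fewer observations than a fixed-sample test with the same error rates.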
Tian, Xian-Liang; Guan, Xian
2015-01-01
Objective: The objective of this paper is to examine the impact of Hurricane Katrina on displaced students’ behavioral disorder. Methods: First, we determine displaced students’ likelihood of discipline infraction each year relative to non-evacuees using all K12 student records of the U.S. state of Louisiana during the period of 2000–2008. Second, we investigate the impact of the hurricane on evacuee students’ in-school behavior in a difference-in-difference framework. The quasi-experimental nature of the hurricane makes this framework appropriate with the advantage that the problem of endogeneity is of least concern and the causal effect of interest can be reasonably identified. Results: Preliminary analysis demonstrates a sharp increase in displaced students’ relative likelihood of discipline infraction around 2005, when the hurricane occurred. Further, formal difference-in-difference analysis confirms the results. To be specific, post Katrina, displaced students’ relative likelihood of any discipline infraction increased by 7.3%, whereas the increase in the relative likelihood for status offense, offense against person, offense against property and serious crime is 4%, 1.5%, 3.8% and 2.1%, respectively. Conclusion: When disasters occur, as was the case with Hurricane Katrina, in addition to assistance for adult evacuees, governments, in cooperation with schools, should also provide aid and assistance to displaced children to support their mental health and in-school behavior. PMID:26006127
Wang, Lina; Li, Hao; Yang, Zhongyuan; Guo, Zhuming; Zhang, Quan
2015-07-01
This study was designed to assess the efficiency of the serum thyrotropin to thyroglobulin ratio for thyroid nodule evaluation in euthyroid patients. Cross-sectional study. Sun Yat-sen University Cancer Center, State Key Laboratory of Oncology in South China. Retrospective analysis was performed for 400 previously untreated cases presenting with thyroid nodules. Thyroid function was tested with commercially available radioimmunoassays. The receiver operating characteristic curves were constructed to determine cutoff values. The efficacy of the thyrotropin:thyroglobulin ratio and thyroid-stimulating hormone for thyroid nodule evaluation was evaluated in terms of sensitivity, specificity, positive predictive value, positive likelihood ratio, negative likelihood ratio, and odds ratio. In receiver operating characteristic curve analysis, the area under the curve was 0.746 for the thyrotropin:thyroglobulin ratio and 0.659 for thyroid-stimulating hormone. With a cutoff point value of 24.97 IU/g for the thyrotropin:thyroglobulin ratio, the sensitivity, specificity, positive predictive value, positive likelihood ratio, and negative likelihood ratio were 78.9%, 60.8%, 75.5%, 2.01, and 0.35, respectively. The odds ratio for the thyrotropin:thyroglobulin ratio indicating malignancy was 5.80. With a cutoff point value of 1.525 µIU/mL for thyroid-stimulating hormone, the sensitivity, specificity, positive predictive value, positive likelihood ratio, and negative likelihood ratio were 74.0%, 53.2%, 70.8%, 1.58, and 0.49, respectively. The odds ratio indicating malignancy for thyroid-stimulating hormone was 3.23. Increasing preoperative serum thyrotropin:thyroglobulin ratio is a risk factor for thyroid carcinoma, and the correlation of the thyrotropin:thyroglobulin ratio to malignancy is higher than that for serum thyroid-stimulating hormone. © American Academy of Otolaryngology—Head and Neck Surgery Foundation 2015.
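The metrics reported above all derive from a single 2×2 table once the thyrotropin:thyroglobulin ratio is dichotomized at the chosen cutoff. A minimal sketch with hypothetical counts (the study's raw table is not given; only the headline metrics are):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, LR+, LR-, and diagnostic odds ratio
    from a 2x2 table produced by dichotomizing a marker at a cutoff."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    lr_pos = sens / (1 - spec)          # positive likelihood ratio
    lr_neg = (1 - sens) / spec          # negative likelihood ratio
    dor = (tp * tn) / (fp * fn)         # diagnostic odds ratio
    return sens, spec, ppv, lr_pos, lr_neg, dor

# Hypothetical counts for 400 nodules at the 24.97 IU/g cutoff; illustrative
# values chosen to roughly echo the reported metrics, not the study's raw data.
sens, spec, ppv, lr_pos, lr_neg, dor = diagnostic_metrics(tp=179, fp=69, fn=48, tn=104)
```

Note that LR+ and LR- are functions of sensitivity and specificity only, so unlike PPV they do not depend on the malignancy prevalence in the sample.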
Theofilatos, Athanasios
2017-06-01
The effective treatment of road accidents, and thus the enhancement of road safety, is a major concern to societies due to the losses in human lives and the economic and social costs. The investigation of road accident likelihood and severity by utilizing real-time traffic and weather data has recently received significant attention from researchers. However, collected data mainly stem from freeways and expressways. Consequently, the aim of the present paper is to add to the current knowledge by investigating accident likelihood and severity using real-time traffic and weather data collected from urban arterials in Athens, Greece. Random Forests (RF) are first applied for preliminary analysis, to rank candidate variables according to their relative importance and provide a first insight into the potentially significant variables. Then, Bayesian logistic regression as well as finite mixture and mixed-effects logit models are applied to further explore factors associated with accident likelihood and severity, respectively. Regarding accident likelihood, the Bayesian logistic regression showed that variations in traffic significantly influence accident occurrence. On the other hand, the accident severity analysis revealed a generally mixed influence of traffic variations on accident severity, although the international literature states that traffic variations increase severity. Lastly, weather parameters were not found to have a direct influence on accident likelihood or severity. The study added to the current knowledge by incorporating real-time traffic and weather data from urban arterials to investigate accident occurrence and accident severity mechanisms. The identification of risk factors can lead to the development of effective traffic management strategies to reduce accident occurrence and the severity of injuries in urban arterials. Copyright © 2017 Elsevier Ltd and National Safety Council. All rights reserved.
Thirumala, Parthasarathy D; Krishnaiah, Balaji; Crammond, Donald J; Habeych, Miguel E; Balzer, Jeffrey R
2014-04-01
Intraoperative monitoring of brain stem auditory evoked potentials during microvascular decompression (MVD) helps prevent hearing loss (HL). Previous studies have shown that changes in wave III (wIII) are an early and sensitive sign of auditory nerve injury. Our aim was to evaluate changes in the amplitude and latency of wIII of the brain stem auditory evoked potential during MVD and their association with postoperative HL. Hearing loss was classified by American Academy of Otolaryngology - Head and Neck Surgery (AAO-HNS) criteria, based on changes in pure tone audiometry and speech discrimination score. Retrospective analysis of wIII in patients who underwent intraoperative monitoring with brain stem auditory evoked potentials during MVD was performed. A univariate logistic regression analysis was performed on the independent variables amplitude of wIII and latency of wIII at maximal change and On-Skin, or a final recording at the time of skin closure. A further analysis of the same variables was performed adjusting for the loss of the wave. The latency of wIII was not found to be significantly different between groups I and II. The amplitude of wIII was significantly decreased in the group with HL. Regression analysis did not find any increased odds of HL with changes in the amplitude of wIII. Changes in wave III did not increase the odds of HL in patients who underwent brain stem auditory evoked potentials during MVD. This information might be valuable in evaluating the use of wIII as an alarm criterion during MVD to prevent HL.
Costa, Rui Miguel; Miller, Geoffrey F; Brody, Stuart
2012-12-01
Research indicates that (i) women's orgasm during penile-vaginal intercourse (PVI) is influenced by fitness-related male partner characteristics, (ii) penis size is important for many women, and (iii) preference for a longer penis is associated with greater vaginal orgasm consistency (triggered by PVI without concurrent clitoral masturbation). To test the hypothesis that vaginal orgasm frequency is associated with women's reporting that a longer than average penis is more likely to provoke their PVI orgasm. Three hundred twenty-three women reported in an online survey their past month frequency of various sexual behaviors (including PVI, vaginal orgasm, and clitoral orgasm), the effects of a longer than average penis on likelihood of orgasm from PVI, and the importance they attributed to PVI and to noncoital sex. Univariate analyses of covariance with dependent variables being frequencies of various sexual behaviors and types of orgasm and with independent variable being women reporting vs. not reporting that a longer than average penis is important for their orgasm from PVI. Likelihood of orgasm with a longer penis was related to greater vaginal orgasm frequency but unrelated to frequencies of other sexual behaviors, including clitoral orgasm. In binary logistic regression, likelihood of orgasm with a longer penis was related to greater importance attributed to PVI and lesser importance attributed to noncoital sex. Women who prefer deeper penile-vaginal stimulation are more likely to have vaginal orgasm, consistent with vaginal orgasm evolving as part of a female mate choice system favoring somewhat larger than average penises. Future research could extend the findings by overcoming limitations related to more precise measurement of penis length (to the pubis and pressed close to the pubic bone) and girth, and large representative samples. 
Future experimental research might assess to what extent different penis sizes influence women's satisfaction and likelihood of vaginal orgasm. © 2012 International Society for Sexual Medicine.
Staley, Dennis M.; Negri, Jacquelyn A.; Kean, Jason W.; Laber, Jayme L.; Tillery, Anne C.; Youberg, Ann M.
2016-06-30
Wildfire can significantly alter the hydrologic response of a watershed to the extent that even modest rainstorms can generate dangerous flash floods and debris flows. To reduce public exposure to hazard, the U.S. Geological Survey produces post-fire debris-flow hazard assessments for select fires in the western United States. We use publicly available geospatial data describing basin morphology, burn severity, soil properties, and rainfall characteristics to estimate the statistical likelihood that debris flows will occur in response to a storm of a given rainfall intensity. Using an empirical database and refined geospatial analysis methods, we defined new equations for the prediction of debris-flow likelihood using logistic regression methods. We showed that the new logistic regression model outperformed previous models used to predict debris-flow likelihood.
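The logistic-regression prediction described above has the generic form p = 1 / (1 + exp(-(b0 + b·x))). The predictors and coefficients below are illustrative placeholders, not the published USGS model values, which are given in the report itself.

```python
import numpy as np

def debris_flow_probability(x, intercept, coefs):
    """Logistic-regression estimate of debris-flow likelihood for one basin.
    Coefficients here are illustrative, not the published USGS values."""
    z = intercept + np.dot(coefs, x)
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical predictors: burned-area fraction, a soil-erodibility factor,
# and peak 15-min rainfall intensity (mm/h) -- stand-ins for the geospatial
# inputs (basin morphology, burn severity, soils, rainfall) named above.
coefs = np.array([0.41, 0.67, 0.70])
basin = np.array([0.8, 0.4, 24.0])

p_low = debris_flow_probability(basin, intercept=-20.0, coefs=coefs)
storm_basin = basin.copy()
storm_basin[2] = 40.0   # a more intense storm raises the predicted likelihood
p_high = debris_flow_probability(storm_basin, intercept=-20.0, coefs=coefs)
```

Because the model is monotone in each predictor, evaluating it over a grid of rainfall intensities yields the intensity-dependent hazard curves used in post-fire assessments.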
The effects of gender, family status, and race on sentencing decisions.
Freiburger, Tina L
2010-01-01
This study sought to determine the effects of family role, gender, and race on judges' sentencing decisions. To assess these effects, factorial surveys were sent to 360 Court of Common Plea judges who presided over criminal court cases in the state. Survey administration resulted in a 51% response rate. The findings indicate that defendants who were depicted as performing caretaker roles had a significantly decreased likelihood of incarceration. Further analysis found that the reduction in likelihood of incarceration for being a caretaker was larger for males than for females. Examination of the interaction of familial role with race found that familial role equally reduced the likelihood of incarceration for black and white females. Familial responsibility, however, resulted in a significantly greater decrease in likelihood of incarceration for black men than for white men. 2009 John Wiley & Sons, Ltd.
Bayesian structural equation modeling in sport and exercise psychology.
Stenling, Andreas; Ivarsson, Andreas; Johnson, Urban; Lindwall, Magnus
2015-08-01
Bayesian statistics is on the rise in mainstream psychology, but applications in sport and exercise psychology research are scarce. In this article, the foundations of Bayesian analysis are introduced, and we will illustrate how to apply Bayesian structural equation modeling in a sport and exercise psychology setting. More specifically, we contrasted a confirmatory factor analysis on the Sport Motivation Scale II estimated with the most commonly used estimator, maximum likelihood, and a Bayesian approach with weakly informative priors for cross-loadings and correlated residuals. The results indicated that the model with Bayesian estimation and weakly informative priors provided a good fit to the data, whereas the model estimated with a maximum likelihood estimator did not produce a well-fitting model. The reasons for this discrepancy between maximum likelihood and Bayesian estimation are discussed as well as potential advantages and caveats with the Bayesian approach.
Seniors, health information, and the Internet: motivation, ability, and Internet knowledge.
Sheng, Xiaojing; Simpson, Penny M
2013-10-01
Providing health information to older adults is crucial to empowering them to better control their health, and such information is readily available on the Internet. Yet, little is known about the factors that shape seniors' Internet searches for health information. This work addresses this research deficit by examining the role of health information orientation (HIO), eHealth literacy, and Internet knowledge (IK) in affecting the likelihood of using the Internet as a source for health information. The analysis reveals that each variable in the study significantly affects Internet search likelihood. Results from the analysis also demonstrate the partial mediating role of eHealth literacy and the interaction between eHealth literacy and HIO. The findings suggest that improving seniors' IK and eHealth literacy would increase their likelihood of searching for and finding health information on the Internet, which might encourage better health behaviors.
A Gateway for Phylogenetic Analysis Powered by Grid Computing Featuring GARLI 2.0
Bazinet, Adam L.; Zwickl, Derrick J.; Cummings, Michael P.
2014-01-01
We introduce molecularevolution.org, a publicly available gateway for high-throughput, maximum-likelihood phylogenetic analysis powered by grid computing. The gateway features a garli 2.0 web service that enables a user to quickly and easily submit thousands of maximum likelihood tree searches or bootstrap searches that are executed in parallel on distributed computing resources. The garli web service allows one to easily specify partitioned substitution models using a graphical interface, and it performs sophisticated post-processing of phylogenetic results. Although the garli web service has been used by the research community for over three years, here we formally announce the availability of the service, describe its capabilities, highlight new features and recent improvements, and provide details about how the grid system efficiently delivers high-quality phylogenetic results. [garli, gateway, grid computing, maximum likelihood, molecular evolution portal, phylogenetics, web service.] PMID:24789072
ERIC Educational Resources Information Center
Yuan, Ke-Hai
2008-01-01
In the literature of mean and covariance structure analysis, noncentral chi-square distribution is commonly used to describe the behavior of the likelihood ratio (LR) statistic under alternative hypothesis. Due to the inaccessibility of the rather technical literature for the distribution of the LR statistic, it is widely believed that the…
An Improved Nested Sampling Algorithm for Model Selection and Assessment
NASA Astrophysics Data System (ADS)
Zeng, X.; Ye, M.; Wu, J.; WANG, D.
2017-12-01
The multimodel strategy is a general approach for treating model structure uncertainty in recent research. The unknown groundwater system is represented by several plausible conceptual models, each assigned a weight representing its plausibility. In the Bayesian framework, the posterior model weight is computed as the product of the model's prior weight and its marginal likelihood (also termed the model evidence). As a result, estimating marginal likelihoods is crucial for reliable model selection and assessment in multimodel analysis. The nested sampling estimator (NSE) is a newly proposed algorithm for marginal likelihood estimation. NSE searches the parameter space gradually, from low-likelihood areas to high-likelihood areas, and this evolution proceeds iteratively via a local sampling procedure; the efficiency of NSE is therefore dominated by the strength of that local sampling procedure. Currently, the Metropolis-Hastings (M-H) algorithm and its variants are often used for local sampling in NSE. However, M-H is not an efficient sampling algorithm for high-dimensional or complex likelihood functions. To improve the performance of NSE, it is feasible to integrate a more efficient and elaborate sampling algorithm, DREAMzs, into the local sampling. In addition, to overcome the computational burden of the large number of repeated model executions required for marginal likelihood estimation, an adaptive sparse grid stochastic collocation method is used to build surrogates for the original groundwater model.
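The core nested-sampling idea can be sketched on a toy one-dimensional problem: shrink the prior mass geometrically while accumulating likelihood-weighted slices of the evidence Z = ∫ L(θ)π(θ)dθ. The local step below is plain rejection sampling from the prior, standing in for the M-H or DREAMzs samplers discussed above; the Gaussian likelihood and all tuning values are illustrative.

```python
import numpy as np

def log_likelihood(theta):
    # Toy Gaussian likelihood centered at 0.5, with a Uniform(0, 1) prior
    return -0.5 * ((theta - 0.5) / 0.1) ** 2 - np.log(0.1 * np.sqrt(2 * np.pi))

def nested_sampling(n_live=100, n_iter=500, seed=0):
    """Minimal nested-sampling estimate of the marginal likelihood (evidence).
    The local step is plain prior rejection sampling, not DREAMzs."""
    rng = np.random.default_rng(seed)
    live = rng.uniform(0.0, 1.0, n_live)
    live_ll = log_likelihood(live)
    z, x_prev = 0.0, 1.0
    for i in range(1, n_iter + 1):
        worst = int(np.argmin(live_ll))
        x_i = np.exp(-i / n_live)          # expected shrinkage of prior mass
        z += np.exp(live_ll[worst]) * (x_prev - x_i)
        x_prev = x_i
        threshold = live_ll[worst]
        while True:                        # replace worst point above threshold
            theta = rng.uniform(0.0, 1.0)
            if log_likelihood(theta) > threshold:
                live[worst] = theta
                live_ll[worst] = log_likelihood(theta)
                break
    # add the mass still enclosed by the final live points
    z += x_prev * np.mean(np.exp(live_ll))
    return z

# True evidence here is ~1, since nearly all the Gaussian mass lies in [0, 1]
print(nested_sampling())
```

The rejection step is exactly where a smarter local sampler pays off: as the likelihood threshold rises, prior draws are accepted ever more rarely, which is the inefficiency the abstract proposes to remove with DREAMzs.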
Wan, Bing; Wang, Siqi; Tu, Mengqi; Wu, Bo; Han, Ping; Xu, Haibo
2017-03-01
The purpose of this meta-analysis was to evaluate the diagnostic accuracy of perfusion magnetic resonance imaging (MRI) as a method for differentiating glioma recurrence from pseudoprogression. The PubMed, Embase, Cochrane Library, and Chinese Biomedical databases were searched comprehensively for relevant studies up to August 3, 2016 according to specific inclusion and exclusion criteria. The quality of the included studies was assessed with the Quality Assessment of Diagnostic Accuracy Studies tool (QUADAS-2). After performing heterogeneity and threshold effect tests, pooled sensitivity, specificity, positive likelihood ratio, negative likelihood ratio, and diagnostic odds ratio were calculated. Publication bias was evaluated visually by a funnel plot and quantitatively using Deeks' funnel plot asymmetry test. The area under the summary receiver operating characteristic curve was calculated to demonstrate the diagnostic performance of perfusion MRI. Eleven studies covering 416 patients and 418 lesions were included in this meta-analysis. The pooled sensitivity, specificity, positive likelihood ratio, negative likelihood ratio, and diagnostic odds ratio were 0.88 (95% confidence interval [CI] 0.84-0.92), 0.77 (95% CI 0.69-0.84), 3.93 (95% CI 2.83-5.46), 0.16 (95% CI 0.11-0.22), and 27.17 (95% CI 14.96-49.35), respectively. The area under the summary receiver operating characteristic curve was 0.8899. There was no notable publication bias. Sensitivity analysis showed that the meta-analysis results were stable and credible. While perfusion MRI is not the ideal diagnostic method for differentiating glioma recurrence from pseudoprogression, it could improve diagnostic accuracy. Therefore, further research on combining perfusion MRI with other imaging modalities is warranted.
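The relationship between the reported accuracy measures can be made explicit: given pooled sensitivity and specificity, the likelihood ratios and diagnostic odds ratio follow directly. This is a sketch of the definitions only; the paper's pooled LRs come from the meta-analytic model itself, so its published values differ slightly from these point computations.

```python
def diagnostic_summary(sens, spec):
    """Likelihood ratios and diagnostic odds ratio implied by a given
    sensitivity and specificity."""
    lr_pos = sens / (1.0 - spec)           # how much a positive test raises the odds
    lr_neg = (1.0 - sens) / spec           # how much a negative test lowers the odds
    dor = lr_pos / lr_neg
    return lr_pos, lr_neg, dor

# pooled point estimates reported above
lr_pos, lr_neg, dor = diagnostic_summary(0.88, 0.77)
print(round(lr_pos, 2), round(lr_neg, 2), round(dor, 1))
```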
Gu, Lijuan; Rosenberg, Mark W; Zeng, Juxin
2017-10-01
China's rapid socioeconomic growth in recent years and the simultaneous increase in many forms of pollution are generating contradictory pictures of residents' well-being. This paper applies multilevel analysis to the 2013 China General Social Survey data on social development and health to understand this twofold phenomenon. Multilevel models are developed to investigate the impact of socioeconomic development and environmental degradation on self-reported health (SRH) and self-reported happiness (SRHP), differentiating among lower, middle, and higher income groups. The results of the logit multilevel analysis demonstrate that income, jobs, and education increased the likelihood of rating SRH and SRHP positively for the lower and middle groups but had little or no effect on the higher income group. Having basic health insurance had an insignificant effect on health but increased the likelihood of happiness among the lower income group. Provincial-level pollutants were associated with a higher likelihood of good health for all income groups, and community-level industrial pollutants increased the likelihood of good health for the lower and middle income groups. Measures of community-level pollution were robust predictors of the likelihood of unhappiness among the lower and middle income groups. Environmental hazards had a mediating effect on the relationship between socioeconomic development and health, and socioeconomic development strengthened the association between environmental hazards and happiness. These outcomes indicate that the complex interconnections among socioeconomic development and environmental degradation have differential effects on well-being among different income groups in China.
Draborg, Eva; Andersen, Christian Kronborg
2006-01-01
Health technology assessment (HTA) has been used as input to decision making worldwide for more than 25 years. However, no uniform definition of HTA or agreement on assessment methods exists, leaving open the question of what influences the choice of assessment methods in HTAs. The objective of this study is to analyze statistically a possible relationship between the methods of assessment used in practical HTAs, the type of assessed technology, the type of assessors, and the year of publication. A sample of 433 HTAs published by eleven leading institutions or agencies in nine countries was reviewed and analyzed by multiple logistic regression. The study shows that outsourcing of HTA reports to external partners is associated with a higher likelihood of using assessment methods such as meta-analysis, surveys, economic evaluations, and randomized controlled trials, and with a lower likelihood of using literature reviews and "other methods". Later year of publication was associated with a decreasing likelihood of including economic evaluations. When pharmaceuticals were the assessed technology, the likelihood of including economic evaluations, surveys, and "other methods" was lower. During the period from 1989 to 2002, no major developments in the assessment methods used in practical HTAs were shown statistically in this sample of 433 HTAs worldwide. Outsourcing to external assessors has a statistically significant influence on the choice of assessment methods.
Two models for evaluating landslide hazards
Davis, J.C.; Chung, C.-J.; Ohlmacher, G.C.
2006-01-01
Two alternative procedures for estimating landslide hazards were evaluated using data on topographic digital elevation models (DEMs) and bedrock lithologies in an area adjacent to the Missouri River in Atchison County, Kansas, USA. The two procedures are based on the likelihood ratio model but utilize different assumptions. The empirical likelihood ratio model is based on non-parametric empirical univariate frequency distribution functions under an assumption of conditional independence, while the multivariate logistic discriminant model assumes that likelihood ratios can be expressed in terms of logistic functions. The relative hazards of occurrence of landslides were estimated by an empirical likelihood ratio model and by multivariate logistic discriminant analysis. Predictor variables consisted of grids containing topographic elevations, slope angles, and slope aspects calculated from a 30-m DEM. An integer grid of coded bedrock lithologies taken from digitized geologic maps was also used as a predictor variable. Both statistical models yield relative estimates in the form of the proportion of total map area predicted to already contain or to be the site of future landslides. The stability of the estimates was checked by cross-validation of results from random subsamples, using each of the two procedures. Cell-by-cell comparisons of hazard maps made by the two models show that the two sets of estimates are virtually identical. This suggests that the empirical likelihood ratio and logistic discriminant analysis models are robust with respect to the conditional independence assumption and the logistic function assumption, respectively, and that either model can be used successfully to evaluate landslide hazards.
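A minimal sketch of the empirical likelihood-ratio idea under conditional independence, using hypothetical slope and lithology classes (the actual study used DEM-derived grids and digitized geologic maps; the training cells below are illustrative assumptions):

```python
from collections import defaultdict

def likelihood_ratio_model(cells):
    """Empirical likelihood-ratio model under conditional independence.
    cells: list of (predictor_classes, is_landslide). For each predictor,
    LR(class) = P(class | landslide) / P(class | stable); a cell's relative
    hazard is the product of its per-predictor LRs."""
    n_pred = len(cells[0][0])
    tables = [defaultdict(lambda: [0, 0]) for _ in range(n_pred)]
    n_slide = sum(1 for _, s in cells if s)
    n_stable = len(cells) - n_slide
    for classes, slide in cells:
        for k, c in enumerate(classes):
            tables[k][c][0 if slide else 1] += 1
    def score(classes):
        lr = 1.0
        for k, c in enumerate(classes):
            a, b = tables[k][c]
            lr *= (a / n_slide) / (b / n_stable)  # assumes each class occurs in stable cells
        return lr
    return score

# hypothetical training cells: (slope class, lithology class), landslide?
cells = [(("steep", "shale"), True), (("steep", "shale"), True),
         (("steep", "limestone"), True), (("gentle", "shale"), False),
         (("gentle", "limestone"), False), (("steep", "limestone"), False),
         (("gentle", "limestone"), False), (("gentle", "shale"), False)]
score = likelihood_ratio_model(cells)
print(score(("steep", "shale")) > score(("gentle", "limestone")))
```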
The impact of cosmetic breast implants on breastfeeding: a systematic review and meta-analysis
2014-01-01
Background Cosmetic breast augmentation (breast implants) is one of the most common plastic surgery procedures worldwide and uptake in high income countries has increased in the last two decades. Women need information about all associated outcomes in order to make an informed decision regarding whether to undergo cosmetic breast surgery. We conducted a systematic review to assess breastfeeding outcomes among women with breast implants compared to women without. Methods A systematic literature search of Medline, Pubmed, CINAHL and Embase databases was conducted using the earliest inclusive dates through December 2013. Eligible studies included comparative studies that reported breastfeeding outcomes (any breastfeeding, and among women who breastfed, exclusive breastfeeding) for women with and without breast implants. Pairs of reviewers extracted descriptive data, study quality, and outcomes. Rate ratios (RR) and 95% confidence intervals (CI) were pooled across studies using the random-effects model. The Newcastle-Ottawa scale (NOS) was used to critically appraise study quality, and the National Health and Medical Research Council Level of Evidence Scale to rank the level of the evidence. This systematic review has been registered with the international prospective register of systematic reviews (PROSPERO): CRD42014009074. Results Three small, observational studies met the inclusion criteria. The quality of the studies was fair (NOS 4-6) and the level of evidence was low (III-2 - III-3). There was no significant difference in attempted breastfeeding (one study, RR 0.94, 95% CI 0.76, 1.17). However, among women who breastfed, all three studies reported a reduced likelihood of exclusive breastfeeding amongst women with breast implants with a pooled rate ratio of 0.60 (95% CI 0.40, 0.90). 
Conclusions This systematic review and meta-analysis suggests that women with breast implants who breastfeed were less likely to exclusively feed their infants with breast milk compared to women without breast implants. PMID:25332722
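The random-effects pooling used in such a meta-analysis can be sketched as follows. The study-level inputs below are hypothetical illustrations, not the three included studies' actual data; the method is the standard DerSimonian-Laird estimator applied to log rate ratios.

```python
import math

def dersimonian_laird(log_rr, se):
    """DerSimonian-Laird random-effects pooling of log rate ratios.
    Returns the pooled RR and its 95% CI."""
    w = [1.0 / s ** 2 for s in se]                       # inverse-variance weights
    fixed = sum(wi * yi for wi, yi in zip(w, log_rr)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_rr))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(log_rr) - 1)) / c)         # between-study variance
    w_re = [1.0 / (s ** 2 + tau2) for s in se]
    pooled = sum(wi * yi for wi, yi in zip(w_re, log_rr)) / sum(w_re)
    half = 1.96 * math.sqrt(1.0 / sum(w_re))
    return math.exp(pooled), (math.exp(pooled - half), math.exp(pooled + half))

# hypothetical study-level rate ratios and standard errors (three studies)
rr, ci = dersimonian_laird([math.log(0.55), math.log(0.62), math.log(0.70)],
                           [0.20, 0.25, 0.30])
print(round(rr, 2), tuple(round(v, 2) for v in ci))
```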
NASA Astrophysics Data System (ADS)
Weerathunga, Thilina Shihan
2017-08-01
Gravitational waves are a fundamental prediction of Einstein's General Theory of Relativity. The first experimental proof of their existence was provided by the Nobel Prize winning discovery by Taylor and Hulse of orbital decay in a binary pulsar system. The first detection of gravitational waves incident on earth from an astrophysical source was announced in 2016 by the LIGO Scientific Collaboration, launching the new era of gravitational wave (GW) astronomy. The signal detected was from the merger of two black holes, an example of the sources called Compact Binary Coalescences (CBCs). Data analysis strategies used in the search for CBC signals are derivatives of the Maximum-Likelihood (ML) method. The ML method applied to data from a network of geographically distributed GW detectors, called fully coherent network analysis, is currently the best approach for estimating source location and GW polarization waveforms. However, in the case of CBCs, especially for lower mass systems (of order 1 solar mass) such as double neutron star binaries, fully coherent network analysis is computationally expensive. The ML method requires locating the global maximum of the likelihood function over a nine-dimensional parameter space, where the computation of the likelihood at each point requires correlations involving 10^4 to 10^6 samples between the data and the corresponding candidate signal waveform template. Approximations, such as semi-coherent coincidence searches, are currently used to circumvent the computational barrier but incur a concomitant loss in sensitivity. We explored the effectiveness of Particle Swarm Optimization (PSO), a well-known algorithm in the field of swarm intelligence, in addressing the fully coherent network analysis problem.
As an example, we used a four-detector network consisting of the two LIGO detectors at Hanford and Livingston, Virgo, and KAGRA, all having initial LIGO noise power spectral densities, and show that PSO can locate the global maximum with fewer than 240,000 likelihood evaluations for a component mass range of 1.0 to 10.0 solar masses at a realistic coherent network signal-to-noise ratio of 9.0. Our results show that PSO can successfully deliver a fully-coherent all-sky search with less than 1/10 the number of likelihood evaluations needed for a grid-based search. Used as a follow-up step, the savings in the number of likelihood evaluations may also reduce latency in obtaining ML estimates of source parameters in semi-coherent searches.
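The core PSO update that underlies such a search can be sketched on a toy surrogate. The two-dimensional test function below stands in for the nine-dimensional coherent network log-likelihood; the inertia and acceleration constants are conventional defaults, not the values tuned in the dissertation.

```python
import math, random

random.seed(7)

def log_like(p):
    x, y = p
    # toy multimodal surrogate for a network log-likelihood;
    # the global maximum sits near (2, -1)
    return -((x - 2.0) ** 2 + (y + 1.0) ** 2) + 0.1 * math.cos(5 * x) * math.cos(5 * y)

def pso(f, bounds, n_particles=30, n_iter=200, w=0.72, c1=1.49, c2=1.49):
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                          # personal bests
    pbest_val = [f(p) for p in pos]
    g_i = max(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g_i][:], pbest_val[g_i]     # swarm best
    for _ in range(n_iter):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = f(pos[i])
            if val > pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val > gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

best, best_val = pso(log_like, [(-10.0, 10.0), (-10.0, 10.0)])
print([round(c, 2) for c in best])
```

Here the swarm uses 30 + 30 × 200 = 6,030 likelihood evaluations, illustrating how the evaluation budget, the quantity emphasized in the abstract, is fixed by the swarm size and iteration count.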
Testing students' e-learning via Facebook through Bayesian structural equation modeling.
Salarzadeh Jenatabadi, Hashem; Moghavvemi, Sedigheh; Wan Mohamed Radzi, Che Wan Jasimah Bt; Babashamsi, Parastoo; Arashi, Mohammad
2017-01-01
Learning is an intentional activity, with several factors affecting students' intention to use new learning technology. Researchers have investigated technology acceptance in different contexts by developing various theories/models and testing them by a number of means. Although most theories/models developed have been examined through regression or structural equation modeling, Bayesian analysis offers more accurate data analysis results. To address this gap, the unified theory of acceptance and use of technology is re-examined in this study, in the context of e-learning via Facebook, using Bayesian analysis. The data (S1 Data) were collected from 170 students enrolled in a business statistics course at the University of Malaya, Malaysia, and tested with the maximum likelihood and Bayesian approaches. The difference between the two methods' results indicates that performance expectancy and hedonic motivation are the strongest factors influencing the intention to use e-learning via Facebook. The Bayesian estimation model exhibited better data fit than the maximum likelihood estimator model. The results of the Bayesian and maximum likelihood estimator approaches are compared and the reasons for the result discrepancy are deliberated.
Lehmann, A; Scheffler, Ch; Hermanussen, M
2010-02-01
Recent progress in modelling individual growth has been achieved by combining principal component analysis and the maximum likelihood principle. This combination models growth even in incomplete sets of data and in data obtained at irregular intervals. We re-analysed late 18th century longitudinal growth of German boys from the boarding school Carlsschule in Stuttgart. The boys, aged 6-23 years, were measured at irregular 3-12 monthly intervals during the period 1771-1793. At the age of 18 years, mean height was 1652 mm, but height variation was large: the shortest boy reached 1474 mm, the tallest 1826 mm. Measured height closely paralleled modelled height, with a mean difference of 4 mm (SD 7 mm). Seasonal height variation was found, with low growth rates in spring and high growth rates in summer and autumn. The present study demonstrates that combining principal component analysis and the maximum likelihood principle also enables growth modelling of historic height data. Copyright (c) 2009 Elsevier GmbH. All rights reserved.
Two C++ Libraries for Counting Trees on a Phylogenetic Terrace.
Biczok, R; Bozsoky, P; Eisenmann, P; Ernst, J; Ribizel, T; Scholz, F; Trefzer, A; Weber, F; Hamann, M; Stamatakis, A
2018-05-08
The presence of terraces in phylogenetic tree space, that is, a potentially large number of distinct tree topologies that have exactly the same analytical likelihood score, was first described by Sanderson et al. (2011). However, popular software tools for maximum likelihood and Bayesian phylogenetic inference do not yet routinely report whether inferred phylogenies reside on a terrace. We believe this is due to the lack of an efficient library to (i) determine if a tree resides on a terrace, (ii) calculate how many trees reside on a terrace, and (iii) enumerate all trees on a terrace. In our bioinformatics practical, which is set up as a programming contest, we developed two efficient and independent C++ implementations of the SUPERB algorithm by Constantinescu and Sankoff (1995) for counting and enumerating trees on a terrace. Both implementations yield exactly the same results, are more than one order of magnitude faster, and require one order of magnitude less memory than a previous third-party Python implementation. The source codes are available under the GNU GPL at https://github.com/terraphast. Alexandros.Stamatakis@h-its.org. Supplementary data are available at Bioinformatics online.
Fast maximum likelihood estimation of mutation rates using a birth-death process.
Wu, Xiaowei; Zhu, Hongxiao
2015-02-07
Since fluctuation analysis was first introduced by Luria and Delbrück in 1943, it has been widely used to make inference about spontaneous mutation rates in cultured cells. Under certain model assumptions, the probability distribution of the number of mutants that appear in a fluctuation experiment can be derived explicitly, which provides the basis of mutation rate estimation. It has been shown that, among various existing estimators, the maximum likelihood estimator usually demonstrates desirable properties such as consistency and lower mean squared error. However, its application to real experimental data is often hindered by slow computation of the likelihood due to the recursive form of the mutant-count distribution. We propose a fast maximum likelihood estimator of mutation rates, MLE-BD, based on a birth-death process model with a non-differential growth assumption. Simulation studies demonstrate that, compared with the conventional maximum likelihood estimator derived from the Luria-Delbrück distribution, MLE-BD achieves substantial improvement in computational speed and is applicable to arbitrarily large numbers of mutants. In addition, it retains good accuracy in point estimation. Published by Elsevier Ltd.
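For intuition about how the mutant-count distribution grounds rate estimation, the classical P0 (null-class) method of fluctuation analysis is easy to state: the number of mutations per culture m satisfies P(no mutants) = e^(-m). This is a much simpler predecessor of full maximum likelihood estimators such as MLE-BD, shown here with hypothetical counts.

```python
import math

def p0_mutation_rate(mutant_counts, cells_per_culture):
    """P0 (null-class) estimate: with mutations per culture m ~ Poisson,
    P(0 mutants) = exp(-m), so m = -ln(fraction of mutant-free cultures)."""
    p0 = sum(1 for c in mutant_counts if c == 0) / len(mutant_counts)
    m = -math.log(p0)                      # mutations per culture
    return m / cells_per_culture           # approximate mutation rate per cell

# hypothetical mutant counts from 20 parallel cultures of 1e8 cells each
counts = [0, 0, 1, 0, 3, 0, 0, 12, 0, 1, 0, 0, 2, 0, 0, 0, 5, 0, 0, 1]
rate = p0_mutation_rate(counts, 1e8)
print(rate)
```

Full maximum likelihood uses every count, not just the zero class, which is why it achieves lower mean squared error but requires evaluating the recursive mutant-count distribution.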
Likelihood-based confidence intervals for estimating floods with given return periods
NASA Astrophysics Data System (ADS)
Martins, Eduardo Sávio P. R.; Clarke, Robin T.
1993-06-01
This paper discusses aspects of the calculation of likelihood-based confidence intervals for T-year floods, with particular reference to (1) the two-parameter gamma distribution; (2) the Gumbel distribution; (3) the two-parameter log-normal distribution, and other distributions related to the normal by Box-Cox transformations. Calculation of the confidence limits is straightforward using the Nelder-Mead algorithm with a constraint incorporated, although care is necessary to ensure convergence either of the Nelder-Mead algorithm, or of the Newton-Raphson calculation of maximum-likelihood estimates. Methods are illustrated using records from 18 gauging stations in the basin of the River Itajai-Acu, State of Santa Catarina, southern Brazil. A small and restricted simulation compared likelihood-based confidence limits with those given by use of the central limit theorem; for the same confidence probability, the confidence limits of the simulation were wider than those of the central limit theorem, which failed more frequently to contain the true quantile being estimated. The paper discusses possible applications of likelihood-based confidence intervals in other areas of hydrological analysis.
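The mechanics of a likelihood-based confidence interval can be sketched with a one-parameter stand-in. Below, an exponential model replaces the paper's gamma/Gumbel fits so that the profile step reduces to a bisection: the interval is the set of parameter values whose deviance from the MLE stays below the chi-square cutoff, and it maps monotonically onto the T-year quantile. The flow series is hypothetical.

```python
import math

def profile_ci_quantile(x, T, cutoff=3.841):
    """Likelihood-based 95% CI for the T-year event under an exponential
    model (a one-parameter stand-in for the paper's gamma/Gumbel fits)."""
    n, s = len(x), sum(x)
    theta_hat = s / n                      # MLE of the exponential mean
    def loglik(t):
        return -n * math.log(t) - s / t
    def dev(t):                            # 2*(l(theta_hat) - l(theta))
        return 2.0 * (loglik(theta_hat) - loglik(t))
    def solve(outside, inside):
        # bisection for dev == cutoff; dev(outside) > cutoff > dev(inside)
        for _ in range(100):
            mid = 0.5 * (outside + inside)
            if dev(mid) > cutoff:
                outside = mid
            else:
                inside = mid
        return 0.5 * (outside + inside)
    lo = solve(theta_hat / 10.0, theta_hat)
    hi = solve(theta_hat * 10.0, theta_hat)
    q = lambda t: t * math.log(T)          # exponential T-year quantile, monotone in theta
    return q(theta_hat), (q(lo), q(hi))

# hypothetical annual-maximum flows (m^3/s)
flows = [312.0, 287.0, 405.0, 356.0, 298.0, 470.0, 325.0, 389.0, 301.0, 344.0]
q100, (q_lo, q_hi) = profile_ci_quantile(flows, T=100)
print(round(q100, 1), round(q_lo, 1), round(q_hi, 1))
```

With the two-parameter distributions used in the paper, the inner maximization over the nuisance parameter is no longer closed-form, which is where the Nelder-Mead step (with its convergence caveats) enters.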
ERIC Educational Resources Information Center
Lee, Woong-Kyu
2012-01-01
The principal objective of this study was to gain insight into attitude changes occurring during IT acceptance from the perspective of elaboration likelihood model (ELM). In particular, the primary target of this study was the process of IT acceptance through an education program. Although the Internet and computers are now quite ubiquitous, and…
Phoebe L. Zarnetske; Thomas C., Jr. Edwards; Gretchen G. Moisen
2007-01-01
Estimating species likelihood of occurrence across extensive landscapes is a powerful management tool. Unfortunately, available occurrence data for landscape-scale modeling is often lacking and usually only in the form of observed presences. Ecologically based pseudo-absence points were generated from within habitat envelopes to accompany presence-only data in habitat...
ERIC Educational Resources Information Center
Raley, R. Kelly; Bratter, Jenifer
2004-01-01
Using the 1987-1988 and 1992-1994 waves of the National Survey of Families and Households, the authors measure the association between Wave 1 responses to 12 questions on whom respondents would be "most willing to marry" and the likelihood of marriage by Wave 2. Preliminary analysis indicated that some questions about partner preferences…
Uncertainty analysis of signal deconvolution using a measured instrument response function
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hartouni, E. P.; Beeman, B.; Caggiano, J. A.
2016-10-05
A common analysis procedure minimizes the negative ln-likelihood that a set of experimental observables matches a parameterized model of the observation. The model includes a description of the underlying physical process as well as the instrument response function (IRF). Here, we investigate the National Ignition Facility (NIF) neutron time-of-flight (nTOF) spectrometers, for which the IRF is constructed from measurements and models. IRF measurements have a finite precision that can make significant contributions to the uncertainty estimate of the physical model's parameters. Finally, we apply a Bayesian analysis to properly account for IRF uncertainties in calculating the ln-likelihood function used to find the optimum physical parameters.
Statistical inference of static analysis rules
NASA Technical Reports Server (NTRS)
Engler, Dawson Richards (Inventor)
2009-01-01
Various apparatus and methods are disclosed for identifying errors in program code. Respective numbers of observances of at least one correctness rule by different code instances that relate to the at least one correctness rule are counted in the program code. Each code instance has an associated counted number of observances of the correctness rule by the code instance. Also counted are respective numbers of violations of the correctness rule by different code instances that relate to the correctness rule. Each code instance has an associated counted number of violations of the correctness rule by the code instance. A respective likelihood of validity is determined for each code instance as a function of the counted numbers of observances and violations. The likelihood of validity indicates the relative likelihood that a related code instance is required to observe the correctness rule. The violations may be output in order of the likelihood of validity of the violated correctness rule.
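One plausible scoring in the spirit of the disclosed likelihood of validity ranks each rule by a z-statistic comparing its observance fraction to a baseline probability p0 that a chance pairing would look like an observance. The baseline value and the rule names below are illustrative assumptions, not part of the patent.

```python
import math

def rank_rules(counts, p0=0.9):
    """Rank correctness rules by a likelihood-of-validity score.
    counts: rule -> (observances, violations). The score is a z-statistic
    for the observance fraction against a baseline probability p0 (an
    illustrative assumption, as are the rule names below)."""
    scored = []
    for rule, (obs, viol) in counts.items():
        n = obs + viol
        z = (obs / n - p0) / math.sqrt(p0 * (1.0 - p0) / n)
        scored.append((z, rule))
    # highest score first: their violations are the most likely real errors
    return [rule for z, rule in sorted(scored, reverse=True)]

counts = {
    "lock(a) must be followed by unlock(a)": (120, 2),
    "check the return value of malloc": (15, 14),
    "call foo_init before foo_use": (40, 1),
}
ranked = rank_rules(counts)
print(ranked)
```

A rule observed 120 times and violated twice ranks above one violated nearly as often as it is observed; the latter is probably not a real rule, so its "violations" are probably not bugs.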
Ning, Jing; Chen, Yong; Piao, Jin
2017-07-01
Publication bias occurs when published research results are systematically unrepresentative of the population of studies that have been conducted, and is a potential threat to meaningful meta-analysis. The Copas selection model provides a flexible framework for correcting estimates and offers considerable insight into publication bias. However, maximizing the observed likelihood under the Copas selection model is challenging because the observed data contain very little information on the latent variable. In this article, we study a Copas-like selection model and propose an expectation-maximization (EM) algorithm for estimation based on the full likelihood. Empirical simulation studies show that the EM algorithm and its associated inferential procedure perform well and avoid the non-convergence problem encountered when maximizing the observed likelihood. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Likelihoods for fixed rank nomination networks
HOFF, PETER; FOSDICK, BAILEY; VOLFOVSKY, ALEX; STOVEL, KATHERINE
2014-01-01
Many studies that gather social network data use survey methods that lead to censored, missing, or otherwise incomplete information. For example, the popular fixed rank nomination (FRN) scheme, often used in studies of schools and businesses, asks study participants to nominate and rank at most a small number of contacts or friends, leaving the existence of other relations uncertain. However, most statistical models are formulated in terms of completely observed binary networks. Statistical analyses of FRN data with such models ignore the censored and ranked nature of the data and could potentially result in misleading statistical inference. To investigate this possibility, we compare Bayesian parameter estimates obtained from a likelihood for complete binary networks with those obtained from likelihoods that are derived from the FRN scheme, and therefore accommodate the ranked and censored nature of the data. We show analytically and via simulation that the binary likelihood can provide misleading inference, particularly for certain model parameters that relate network ties to characteristics of individuals and pairs of individuals. We also compare these different likelihoods in a data analysis of several adolescent social networks. For some of these networks, the parameter estimates from the binary and FRN likelihoods lead to different conclusions, indicating the importance of analyzing FRN data with a method that accounts for the FRN survey design. PMID:25110586
Stark, Mitchell S.; Klein, Kerenaftali; Weide, Benjamin; Haydu, Lauren E.; Pflugfelder, Annette; Tang, Yue Hang; Palmer, Jane M.; Whiteman, David C.; Scolyer, Richard A.; Mann, Graham J.; Thompson, John F.; Long, Georgina V.; Barbour, Andrew P.; Soyer, H. Peter; Garbe, Claus; Herington, Adrian; Pollock, Pamela M.; Hayward, Nicholas K.
2015-01-01
The overall 5-year survival for melanoma is 91%. However, if distant metastasis occurs (stage IV), cure rates are < 15%. Hence, melanoma detection in earlier stages (stages I–III) maximises the chances of patient survival. We measured the expression of a panel of 17 microRNAs (miRNAs) (MELmiR-17) in melanoma tissues (stage III; n = 76 and IV; n = 10) and serum samples (collected from controls with no melanoma, n = 130; and patients with melanoma (stages I/II, n = 86; III, n = 50; and IV, n = 119)) obtained from biobanks in Australia and Germany. In melanoma tissues, members of the ‘MELmiR-17’ panel were found to be predictors of stage, recurrence, and survival. Additionally, in a minimally-invasive blood test, a seven-miRNA panel (MELmiR-7) detected the presence of melanoma (relative to controls) with high sensitivity (93%) and specificity (≥ 82%) when ≥ 4 miRNAs were expressed. Moreover, the ‘MELmiR-7’ panel characterised overall survival of melanoma patients better than both serum LDH and S100B (delta log likelihood = 11, p < 0.001). This panel was found to be superior to currently used serological markers for melanoma progression, recurrence, and survival; and would be ideally suited to monitor tumour progression in patients diagnosed with early metastatic disease (stages IIIa–c/IV M1a–b) to detect relapse following surgical or adjuvant treatment. PMID:26288839
Parent-Child Communication and Marijuana Initiation: Evidence Using Discrete-Time Survival Analysis
Nonnemaker, James M.; Silber-Ashley, Olivia; Farrelly, Matthew C.; Dench, Daniel
2012-01-01
This study supplements existing literature on the relationship between parent-child communication and adolescent drug use by exploring whether parental and/or adolescent recall of specific drug-related conversations differentially impact youth's likelihood of initiating marijuana use. Using discrete-time survival analysis, we estimated the hazard of marijuana initiation using a logit model to obtain an estimate of the relative risk of initiation. Our results suggest that parent-child communication about drug use is either not protective (no effect) or—in the case of youth reports of communication—potentially harmful (leading to increased likelihood of marijuana initiation). PMID:22958867
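The data setup behind a discrete-time survival analysis with a logit link can be sketched as a person-period expansion: each respondent contributes one row per period at risk, with the event indicator switched on only in the initiation period. The records below are hypothetical and omit the covariates in the paper's model.

```python
def person_period(records):
    """Expand per-person records into the person-period rows used to fit a
    discrete-time hazard with a logit model. records: list of
    (periods_observed, initiated). Each output row is (person, period, event),
    with event = 1 only in the final period of an initiator. A minimal sketch
    of the data setup, without covariates."""
    rows = []
    for pid, (periods, initiated) in enumerate(records):
        for t in range(1, periods + 1):
            rows.append((pid, t, 1 if (initiated and t == periods) else 0))
    return rows

# hypothetical: person 0 observed for 3 waves without initiating;
# person 1 initiated marijuana use in wave 2
rows = person_period([(3, False), (2, True)])
print(rows)
# → [(0, 1, 0), (0, 2, 0), (0, 3, 0), (1, 1, 0), (1, 2, 1)]
```

A standard logistic regression of `event` on period dummies and covariates over these rows then estimates the discrete-time hazard of initiation.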
Comparative genomic analysis of the WRKY III gene family in populus, grape, arabidopsis and rice.
Wang, Yiyi; Feng, Lin; Zhu, Yuxin; Li, Yuan; Yan, Hanwei; Xiang, Yan
2015-09-08
WRKY III genes have significant functions in regulating plant development and resistance. The WRKY gene family has been studied in many plant species; however, a comprehensive analysis of WRKY III genes is still lacking for the woody species poplar. Three representative lineages of flowering plants are incorporated in most analyses: Arabidopsis (a model annual herbaceous dicot), grape (a model perennial dicot) and rice (a model monocot). In this study, we identified 10, 6, 13 and 28 WRKY III genes in the genomes of Populus trichocarpa, grape (Vitis vinifera), Arabidopsis thaliana and rice (Oryza sativa), respectively. Phylogenetic analysis revealed that the WRKY III proteins could be divided into four clades. By microsynteny analysis, we found that the duplicated regions were more conserved between poplar and grape than between poplar and Arabidopsis or rice. We dated the duplications of Populus WRKY III genes by Ks analysis and demonstrated that all the blocks were formed after the divergence of monocots and dicots. Strong purifying selection has played a key role in the maintenance of WRKY III genes in Populus. Tissue expression analysis of the WRKY III genes in Populus revealed that five were most highly expressed in the xylem. We also performed quantitative real-time reverse transcription PCR analysis of WRKY III genes in Populus treated with salicylic acid, abscisic acid and polyethylene glycol to explore their stress-related expression patterns. This study highlighted the duplication and diversification of the WRKY III gene family in Populus and provided a comprehensive analysis of this gene family in the Populus genome. Our results indicated that the majority of WRKY III genes of Populus were expanded by large-scale gene duplication.
The expression patterns of PtrWRKYIII genes indicate that these genes play important roles in the xylem during poplar growth and development, and may play a crucial role in the defense response to drought stress. The results presented here may aid in the selection of appropriate candidate genes for further characterization of their biological functions in poplar.
The likelihood ratio as a random variable for linked markers in kinship analysis.
Egeland, Thore; Slooten, Klaas
2016-11-01
The likelihood ratio is the fundamental quantity that summarizes the evidence in forensic cases. Therefore, it is important to understand the theoretical properties of this statistic. This paper is the last in a series of three, and the first to study linked markers. We show that for all non-inbred pairwise kinship comparisons, the expected likelihood ratio in favor of a type of relatedness depends on the allele frequencies only via the number of alleles, also for linked markers, and also if the true relationship is another one than is tested for by the likelihood ratio. Exact expressions for the expectation and variance are derived for all these cases. Furthermore, we show that the expected likelihood ratio is a non-increasing function of the recombination rate as it increases from 0 to 0.5, when the actual relationship is the one investigated by the LR. Besides being of theoretical interest, exact expressions such as those obtained here can be used for software validation, as they allow correctness to be verified up to arbitrary precision. The paper also presents results and advice of practical importance. For example, we argue that the logarithm of the likelihood ratio behaves in a fundamentally different way than the likelihood ratio itself in terms of expectation and variance, in agreement with its interpretation as weight of evidence. Equipped with the results presented and freely available software, one may check calculations and software and also perform power calculations.
Li, Shi; Mukherjee, Bhramar; Batterman, Stuart; Ghosh, Malay
2013-12-01
Case-crossover designs are widely used to study short-term exposure effects on the risk of acute adverse health events. While the frequentist literature on this topic is vast, there is no Bayesian work in this general area. The contribution of this paper is twofold. First, the paper establishes Bayesian equivalence results that require characterization of the set of priors under which the posterior distributions of the risk ratio parameters based on a case-crossover and time-series analysis are identical. Second, the paper studies inferential issues under case-crossover designs in a Bayesian framework. Traditionally, a conditional logistic regression is used for inference on risk-ratio parameters in case-crossover studies. We consider instead a more general full likelihood-based approach which makes less restrictive assumptions on the risk functions. Formulation of a full likelihood leads to growth in the number of parameters proportional to the sample size. We propose a semi-parametric Bayesian approach using a Dirichlet process prior to handle the random nuisance parameters that appear in a full likelihood formulation. We carry out a simulation study to compare the Bayesian methods based on full and conditional likelihood with the standard frequentist approaches for case-crossover and time-series analysis. The proposed methods are illustrated through the Detroit Asthma Morbidity, Air Quality and Traffic study, which examines the association between acute asthma risk and ambient air pollutant concentrations. © 2013, The International Biometric Society.
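As a point of reference for the "traditional" approach the abstract mentions, the following is a minimal sketch of the conditional logistic likelihood for matched case-crossover sets; the matched-set structure, simulated exposures, and true effect size are all hypothetical, and this is the standard frequentist baseline, not the paper's Bayesian Dirichlet-process method.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
beta_true = 0.5  # hypothetical log risk-ratio per unit exposure

# Simulate 300 matched sets: 1 case day + 3 referent days per subject.
# The case day is drawn with probability proportional to exp(beta * x).
n_sets, m = 300, 4
x = rng.normal(0, 1, (n_sets, m))
p = np.exp(beta_true * x)
p /= p.sum(axis=1, keepdims=True)
case_idx = np.array([rng.choice(m, p=pi) for pi in p])

def neg_cond_loglik(beta):
    # conditional logistic log-likelihood:
    # sum_i [ beta * x_case_i - log sum_j exp(beta * x_ij) ]
    lin = beta * x
    return -(lin[np.arange(n_sets), case_idx]
             - np.log(np.exp(lin).sum(axis=1))).sum()

res = minimize_scalar(neg_cond_loglik, bounds=(-5, 5), method="bounded")
print(res.x)  # estimate should be near 0.5
```

The per-subject baseline risks cancel out of each stratum's contribution, which is exactly why the conditional likelihood avoids the nuisance-parameter growth that the full-likelihood Bayesian approach must handle instead.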
NASA Astrophysics Data System (ADS)
Shafii, M.; Tolson, B.; Matott, L. S.
2012-04-01
Hydrologic modeling has benefited from significant developments over the past two decades. These have resulted in higher levels of complexity being built into hydrologic models, which makes the model evaluation process (parameter estimation via calibration and uncertainty analysis) more challenging. To avoid unreasonable parameter estimates, many researchers have suggested implementing multi-criteria calibration schemes. Furthermore, for predictive hydrologic models to be useful, proper consideration of uncertainty is essential. Consequently, recent research has emphasized comprehensive model assessment procedures in which multi-criteria parameter estimation is combined with statistically based uncertainty analysis routines such as Bayesian inference using Markov Chain Monte Carlo (MCMC) sampling. Such a procedure relies on formal likelihood functions based on statistical assumptions; moreover, Bayesian inference built on MCMC samplers requires a considerably large number of simulations. Because of these issues, especially in complex non-linear hydrological models, a variety of alternative informal approaches have been proposed for uncertainty analysis in the multi-criteria context. This study explores a number of such informal uncertainty analysis techniques in multi-criteria calibration of hydrological models. The informal methods addressed in this study are (i) Pareto optimality, which quantifies parameter uncertainty using the Pareto solutions; (ii) DDS-AU, which uses the weighted sum of objective functions to derive the prediction limits; and (iii) GLUE, which describes the total uncertainty through identification of behavioral solutions. The main objective is to compare such methods with MCMC-based Bayesian inference with respect to factors such as computational burden and predictive capacity, evaluated using multiple comparative measures.
The measures for comparison are calculated both for calibration and evaluation periods. The uncertainty analysis methodologies are applied to a simple 5-parameter rainfall-runoff model, called HYMOD.
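The GLUE idea mentioned above can be sketched in a few lines; everything below is invented for illustration (a toy exponential "model" standing in for HYMOD, uniform priors, a Nash-Sutcliffe behavioral threshold of 0.7, and 5/95% weighted prediction limits), not the study's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a rainfall-runoff model: y(t) = a * exp(-b * t)
def model(a, b, t):
    return a * np.exp(-b * t)

t = np.linspace(0, 10, 50)
obs = model(2.0, 0.3, t) + rng.normal(0, 0.05, t.size)  # synthetic observations

# 1. Monte Carlo sampling of parameters from uniform priors
a_s = rng.uniform(0.5, 5.0, 5000)
b_s = rng.uniform(0.05, 1.0, 5000)
sims = np.array([model(a, b, t) for a, b in zip(a_s, b_s)])

# 2. Informal likelihood: Nash-Sutcliffe efficiency (NSE) of each sample
nse = 1 - np.sum((sims - obs) ** 2, axis=1) / np.sum((obs - obs.mean()) ** 2)

# 3. Behavioral solutions: samples exceeding a subjective NSE threshold
behavioral = nse > 0.7
weights = nse[behavioral] / nse[behavioral].sum()

# 4. Likelihood-weighted 5%/95% prediction limits at each time step
sb = sims[behavioral]
order = np.argsort(sb, axis=0)
lower, upper = [], []
for j in range(t.size):
    s = sb[order[:, j], j]
    w = np.cumsum(weights[order[:, j]])
    lower.append(s[np.searchsorted(w, 0.05)])
    upper.append(s[np.searchsorted(w, 0.95)])
```

The choice of informal likelihood measure and of the behavioral threshold is subjective, which is precisely the kind of methodological degree of freedom such comparisons against formal MCMC-based inference aim to assess.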
A survey of nebulae around Galactic Wolf-Rayet stars in the southern sky, 1
NASA Technical Reports Server (NTRS)
Marston, A. P.; Chu, Y.-H.; Garcia-Segura, G.
1994-01-01
Images are presented from the first half of a survey of all Galactic Wolf-Rayet stars in the catalog of van der Hucht et al. (1981) residing in the southern skies. Previous surveys used only existing broad-band photographic plates. Encouraged by successes using CCD imaging with interference filters in the LMC and northern Galaxy (Miller & Chu 1993), we have expanded the survey to the southern hemisphere. In the first half of our southern survey, H alpha and [O III] narrow-band CCD images of fields centered on known Wolf-Rayet stars have indicated the existence of six new ring nebulae, as well as revealing previously unobserved morphological features in the known ring nebulae. An example is an almost perfect ring of [O III] emission residing interior to the previously observed H alpha filaments of the Wolf-Rayet ring nebula RCW 104. Our surveys to date indicate that 21% of all Wolf-Rayet stars have ring nebulae, with WN-type Wolf-Rayet stars having a greater likelihood of an associated ring.
Pulse duration settings in subthalamic stimulation for Parkinson's disease
Steigerwald, Frank; Timmermann, Lars; Kühn, Andrea; Schnitzler, Alfons; Reich, Martin M.; Kirsch, Anna Dalal; Barbe, Michael Thomas; Visser‐Vandewalle, Veerle; Hübl, Julius; van Riesen, Christoph; Groiss, Stefan Jun; Moldovan, Alexia‐Sabine; Lin, Sherry; Carcieri, Stephen; Manola, Ljubomir
2017-01-01
ABSTRACT Background Stimulation parameters in deep brain stimulation (DBS) of the subthalamic nucleus for Parkinson's disease (PD) are rarely tested in double‐blind conditions. Evidence‐based recommendations on optimal stimulator settings are needed. Results from the CUSTOM‐DBS study are reported, comparing 2 pulse durations. Methods A total of 15 patients were programmed using a pulse width of 30 µs (test) or 60 µs (control). Efficacy and side‐effect thresholds and unified PD rating scale (UPDRS) III were measured in the meds‐off state (primary outcome). The therapeutic window was the difference between patients' efficacy and side‐effect thresholds. Results The therapeutic window was significantly larger at 30 µs than at 60 µs (P = .0009) and the efficacy (UPDRS III score) was noninferior (P = .00008). Interpretation Subthalamic neurostimulation at 30 µs versus 60 µs pulse width is equally effective on PD motor signs, is more energy efficient, and has a lower likelihood of stimulation‐related side effects. © 2017 The Authors. Movement Disorders published by Wiley Periodicals, Inc. on behalf of International Parkinson and Movement Disorder Society. PMID:29165837
Ice hockey shoulder pad design and the effect on head response during shoulder-to-head impacts.
Richards, Darrin; Ivarsson, B Johan; Scher, Irving; Hoover, Ryan; Rodowicz, Kathleen; Cripton, Peter
2016-11-01
Ice hockey body checks involving direct shoulder-to-head contact frequently result in head injury. In the current study, we examined the effect of shoulder pad style on the likelihood of head injury from a shoulder-to-head check. Shoulder-to-head body checks were simulated by swinging a modified Hybrid-III anthropomorphic test device (ATD), with and without shoulder pads, into a stationary Hybrid-III ATD at 21 km/h. Tests were conducted with three different styles of shoulder pads (traditional, integrated and tethered) and, as a control, without shoulder pads. Head response kinematics for the stationary ATD were measured. Compared to the case of no shoulder pads, the three pad styles significantly (p < 0.05) reduced peak resultant linear head accelerations of the stationary ATD by 35-56%. The integrated shoulder pads reduced linear head accelerations by an additional 18-21% beyond the other two styles. The data presented here suggest that shoulder pads can be designed to help protect the head of the struck player in a shoulder-to-head check.
Analyzing Planck and low redshift data sets with advanced statistical methods
NASA Astrophysics Data System (ADS)
Eifler, Tim
The recent ESA/NASA Planck mission has provided a key data set to constrain cosmology that is most sensitive to physics of the early Universe, such as inflation and primordial non-Gaussianity (Planck 2015 results XIII). In combination with cosmological probes of the Large-Scale Structure (LSS), the Planck data set is a powerful source of information for investigating late-time phenomena (Planck 2015 results XIV), e.g. the accelerated expansion of the Universe, the impact of baryonic physics on the growth of structure, and the alignment of galaxies in their dark matter halos. The main objective of this proposal is to re-analyze the archival Planck data, 1) with different, more recently developed statistical methods for cosmological parameter inference, and 2) by combining Planck and ground-based observations in an innovative way. We will make the corresponding analysis framework publicly available and believe that it will set a new standard for future CMB-LSS analyses. Advanced statistical methods, such as the Gibbs sampler (Jewell et al 2004, Wandelt et al 2004), have been critical in the analysis of Planck data. More recently, Approximate Bayesian Computation (ABC, see Weyant et al 2012, Akeret et al 2015, Ishida et al 2015, for cosmological applications) has matured into an interesting tool for cosmological likelihood analyses. It circumvents several assumptions that enter the standard Planck (and most LSS) likelihood analyses, most importantly the assumption that the functional form of the likelihood of the CMB observables is a multivariate Gaussian. Beyond applying new statistical methods to Planck data in order to cross-check and validate existing constraints, we plan to combine Planck and DES data in a new and innovative way and run multi-probe likelihood analyses of CMB and LSS observables. The complexity of multi-probe likelihood analyses scales (non-linearly) with the level of correlations amongst the individual probes that are included.
For the multi-probe analysis proposed here we will use the existing CosmoLike software, a computationally efficient analysis framework that is unique in its integrated ansatz of jointly analyzing probes of the large-scale structure (LSS) of the Universe. We plan to combine CosmoLike with publicly available CMB analysis software (Camb, CLASS) to include modeling capabilities for CMB temperature, polarization, and lensing measurements. The resulting analysis framework will be capable of independently and jointly analyzing data from the CMB and from various probes of the LSS of the Universe. After completion we will utilize this framework to check for consistency amongst the individual probes and subsequently run a joint likelihood analysis of probes that are not in tension. The inclusion of Planck information in a joint likelihood analysis substantially reduces DES uncertainties in cosmological parameters, and allows for unprecedented constraints on parameters that describe astrophysics. In their recent review, Observational Probes of Cosmic Acceleration, Weinberg et al (2013) emphasize the value of a balanced program that employs several of the most powerful methods in combination, both to cross-check systematic uncertainties and to take advantage of complementary information. The work we propose follows exactly this idea: 1) cross-checking existing Planck results with alternative methods in the data analysis, 2) checking for consistency of Planck and DES data, and 3) running a joint analysis to constrain cosmology and astrophysics. It is now expedient to develop and refine multi-probe analysis strategies that allow the comparison and inclusion of information from disparate probes to optimally constrain cosmology and astrophysics. Analyzing Planck and DES data poses an ideal opportunity for this purpose, and the corresponding lessons will be of great value for the science preparation of Euclid and WFIRST.
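The likelihood-free ABC idea invoked above can be illustrated with a minimal rejection-sampling sketch; the one-parameter Gaussian toy problem, summary statistic, prior range, and tolerance below are all invented for illustration and bear no resemblance to a real CMB pipeline.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy problem: infer the amplitude (standard deviation) of a Gaussian
# signal from a summary statistic, without writing down its likelihood.
true_sigma = 2.0
data = rng.normal(0, true_sigma, 500)
s_obs = np.std(data)  # summary statistic of the "observed" data

# ABC rejection: draw from the prior, simulate, keep draws whose
# simulated summary lies within a tolerance of the observed one.
prior_draws = rng.uniform(0.1, 5.0, 20_000)
accepted = []
for sigma in prior_draws:
    s_sim = np.std(rng.normal(0, sigma, 500))
    if abs(s_sim - s_obs) < 0.05:
        accepted.append(sigma)

posterior = np.array(accepted)
print(posterior.mean())  # approximate posterior mean, near 2.0
```

The key point is that no Gaussian (or any other) functional form of the likelihood is ever assumed: only the ability to simulate data at a given parameter value, which is exactly the assumption the abstract says ABC relaxes.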
MCMC multilocus lod scores: application of a new approach.
George, Andrew W; Wijsman, Ellen M; Thompson, Elizabeth A
2005-01-01
On extended pedigrees with extensive missing data, the calculation of multilocus likelihoods for linkage analysis is often beyond the computational bounds of exact methods. Growing interest therefore surrounds the implementation of Monte Carlo estimation methods. In this paper, we demonstrate the speed and accuracy of a new Markov chain Monte Carlo method for the estimation of linkage likelihoods through an analysis of real data from a study of early-onset Alzheimer's disease. For those data sets where comparison with exact analysis is possible, we achieved up to a 100-fold increase in speed. Our approach is implemented in the program lm_bayes within the framework of the freely available MORGAN 2.6 package for Monte Carlo genetic analysis (http://www.stat.washington.edu/thompson/Genepi/MORGAN/Morgan.shtml).
The Relation between Factor Score Estimates, Image Scores, and Principal Component Scores
ERIC Educational Resources Information Center
Velicer, Wayne F.
1976-01-01
Investigates the relation between factor score estimates, principal component scores, and image scores. The three methods compared are maximum likelihood factor analysis, principal component analysis, and a variant of rescaled image analysis. (RC)
Salinero-Fort, Miguel Ángel; de Burgos-Lunar, Carmen; Mostaza Prieto, José; Lahoz Rallo, Carlos; Abánades-Herranz, Juan Carlos; Gómez-Campelo, Paloma; Laguna Cuesta, Fernando; Estirado De Cabo, Eva; García Iglesias, Francisca; González Alegre, Teresa; Fernández Puntero, Belén; Montesano Sánchez, Luis; Vicent López, David; Cornejo Del Río, Víctor; Fernández García, Pedro J; Sabín Rodríguez, Concesa; López López, Silvia; Patrón Barandío, Pedro
2015-01-01
Introduction The incidence of type 2 diabetes mellitus (T2DM) is increasing worldwide. When diagnosed, many patients already have organ damage or advanced subclinical atherosclerosis. An early diagnosis could allow the implementation of lifestyle changes and treatment options aimed at delaying the progression of the disease and avoiding cardiovascular complications. Different scores for identifying undiagnosed diabetes have been reported; however, their performance in populations of southern Europe has not been sufficiently evaluated. The main objectives of our study are: to evaluate the screening performance and cut-off points of the main scores that identify the risk of undiagnosed T2DM and prediabetes in a Spanish population, and to develop and validate our own predictive models of undiagnosed T2DM (screening model) and of future T2DM after 5-year follow-up (prediction risk model). As a secondary objective, we will evaluate the atherosclerotic burden of the population with undiagnosed T2DM. Methods and analysis Population-based prospective cohort study with baseline screening, to evaluate the performance of the FINDRISC, DANISH, DESIR, ARIC and QDScore against the gold standard tests: fasting plasma glucose, oral glucose tolerance and/or HbA1c. The sample size will include 1352 participants between the ages of 45 and 74 years. Analysis: sensitivity, specificity, positive predictive value, negative predictive value, positive likelihood ratio, negative likelihood ratio, and receiver operating characteristic curves with area under the curve. Binary logistic regression will be performed for the first 700 individuals (derivation) and the last 652 (validation). All analyses will be calculated with their 95% CI; statistical significance will be p<0.05. Ethics and dissemination The study protocol has been approved by the Research Ethics Committee of the Carlos III Hospital (Madrid).
The score performance and predictive model will be presented in medical conferences, workshops, seminars and round table discussions. Furthermore, the predictive model will be published in a peer-reviewed medical journal to further increase the exposure of the scores. PMID:26220868
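All of the screening metrics planned in the protocol above follow directly from a 2x2 table of test result against gold standard; a minimal sketch with purely illustrative counts (not study data):

```python
# Hypothetical 2x2 screening table: rows = test result, columns = disease
# status by gold standard. Counts are invented for illustration only.
tp, fp, fn, tn = 90, 40, 10, 360

sens = tp / (tp + fn)          # sensitivity: P(test+ | disease)
spec = tn / (tn + fp)          # specificity: P(test- | no disease)
ppv  = tp / (tp + fp)          # positive predictive value
npv  = tn / (tn + fn)          # negative predictive value
lr_pos = sens / (1 - spec)     # positive likelihood ratio
lr_neg = (1 - sens) / spec     # negative likelihood ratio

print(sens, spec, ppv, npv, lr_pos, lr_neg)
```

Note that sensitivity, specificity, and the likelihood ratios are prevalence-free, whereas PPV and NPV depend on the disease prevalence in the screened population, which is why both families of metrics are reported.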
Changes in mode of travel to work: a natural experimental study of new transport infrastructure.
Heinen, Eva; Panter, Jenna; Mackett, Roger; Ogilvie, David
2015-06-20
New transport infrastructure may promote a shift towards active travel, thereby improving population health. The purpose of this study was to determine the effect of a major transport infrastructure project on commuters' mode of travel, trip frequency and distance travelled to work. Quasi-experimental analysis nested within a cohort study of 470 adults working in Cambridge, UK. The intervention consisted of the opening of a guided busway with a path for walking and cycling in 2011. Exposure to the intervention was defined as the negative of the square root of the shortest distance from home to busway. The outcome measures were changes in commute mode share and number of commute trips - both based on a seven-day travel-to-work record collected before (2009) and after (2012) the intervention - and change in objective commute distance. The mode share outcomes were changes in the proportions of trips (i) involving any active travel, (ii) involving any public transport, and (iii) made entirely by car. Separate multinomial regression models were estimated adjusting for commute and sociodemographic characteristics, residential settlement size and life events. Proximity to the busway predicted an increased likelihood of a large (>30 %) increase in the share of commute trips involving any active travel (relative risk ratio [RRR] 1.80, 95 % CI 1.27, 2.55) and a large decrease in the share of trips made entirely by car (RRR 2.09, 95 % CI 1.35, 3.21), as well as a lower likelihood of a small (<30 %) reduction in the share of trips involving any active travel (RRR 0.47, 95 % CI 0.28, 0.81). It was not associated with changes in the share of commute trips involving any public transport, the number of commute trips, or commute distance. The new infrastructure promoted an increase in the share of commuting trips involving active travel and a decrease in the share made entirely by car. 
Further analysis will show the extent to which the changes in commute mode share were translated into an increase in time spent in active commuting and consequent health gain.
Newman, Phil; Adams, Roger; Waddington, Gordon
2012-09-01
To examine the relationship between two clinical test results and future diagnosis of Medial Tibial Stress Syndrome (MTSS) in personnel at a military trainee establishment. Data from a preparticipation musculoskeletal screening test performed on 384 Australian Defence Force Academy Officer Cadets were compared against 693 injuries reported by 326 of the Officer Cadets in the following 16 months. Data were held in an injury surveillance database and analysed using χ² and Fisher's exact tests, and receiver operating characteristic curve analysis. Diagnosis of MTSS was confirmed by an independent blinded health practitioner. The palpation and oedema clinical tests were each found to be significant predictors of later onset of MTSS. Specifically: shin palpation test OR 4.63, 95% CI 2.5 to 8.5, positive likelihood ratio 3.38, negative likelihood ratio 0.732, Pearson χ² p<0.001; shin oedema test OR 76.1, 95% CI 9.6 to 602.7, positive likelihood ratio 7.26, negative likelihood ratio 0.095, Fisher's exact p<0.001; combined shin palpation and shin oedema tests positive likelihood ratio 7.94, negative likelihood ratio <0.001, Fisher's exact p<0.001. Female gender was found to be an independent risk factor (OR 2.97, 95% CI 1.66 to 5.31, positive likelihood ratio 2.09, negative likelihood ratio 0.703, Pearson χ² p<0.001) for developing MTSS. The tests for MTSS employed here are components of a normal clinical examination used to diagnose MTSS. This paper confirms that these tests, and female gender, can be confidently applied in predicting those in an asymptomatic population who are at greater risk of developing MTSS symptoms with activity at some point in the future.
Search for lepton flavor violation in upsilon decays.
Love, W; Savinov, V; Lopez, A; Mehrabyan, S; Mendez, H; Ramirez, J; Huang, G S; Miller, D H; Pavlunin, V; Sanghi, B; Shipsey, I P J; Xin, B; Adams, G S; Anderson, M; Cummings, J P; Danko, I; Hu, D; Moziak, B; Napolitano, J; He, Q; Insler, J; Muramatsu, H; Park, C S; Thorndike, E H; Yang, F; Artuso, M; Blusk, S; Horwitz, N; Khalil, S; Li, J; Menaa, N; Mountain, R; Nisar, S; Randrianarivony, K; Sia, R; Skwarnicki, T; Stone, S; Wang, J C; Bonvicini, G; Cinabro, D; Dubrovin, M; Lincoln, A; Asner, D M; Edwards, K W; Naik, P; Briere, R A; Ferguson, T; Tatishvili, G; Vogel, H; Watkins, M E; Rosner, J L; Adam, N E; Alexander, J P; Berkelman, K; Cassel, D G; Duboscq, J E; Ehrlich, R; Fields, L; Galik, R S; Gibbons, L; Gray, R; Gray, S W; Hartill, D L; Heltsley, B K; Hertz, D; Jones, C D; Kandaswamy, J; Kreinick, D L; Kuznetsov, V E; Mahlke-Krüger, H; Mohapatra, D; Onyisi, P U E; Patterson, J R; Peterson, D; Pivarski, J; Riley, D; Ryd, A; Sadoff, A J; Schwarthoff, H; Shi, X; Stroiney, S; Sun, W M; Wilksen, T; Athar, S B; Patel, R; Yelton, J; Rubin, P; Cawlfield, C; Eisenstein, B I; Karliner, I; Kim, D; Lowrey, N; Selen, M; White, E J; Wiss, J; Mitchell, R E; Shepherd, M R; Besson, D; Pedlar, T K; Cronin-Hennessy, D; Gao, K Y; Hietala, J; Kubota, Y; Klein, T; Lang, B W; Poling, R; Scott, A W; Smith, A; Zweber, P; Dobbs, S; Metreveli, Z; Seth, K K; Tomaradze, A; Ecklund, K M
2008-11-14
In this Letter, we describe a search for lepton flavor violation (LFV) in the bottomonium system. We search for leptonic decays Upsilon(nS)-->mutau (n=1, 2, and 3) using the data collected with the CLEO III detector. We identify the tau lepton using its leptonic decay ν_τ ν̄_e e and utilize multidimensional likelihood fitting with probability density function shapes measured from independent data samples. We report our estimates of 95% C.L. upper limits on LFV branching fractions of Upsilon mesons. We interpret our results in terms of the exclusion plot for the energy scale of a hypothetical new interaction versus its effective LFV coupling in the framework of effective field theory.
Search for Lepton Flavor Violation in Upsilon Decays
NASA Astrophysics Data System (ADS)
Love, W.; Savinov, V.; Lopez, A.; Mehrabyan, S.; Mendez, H.; Ramirez, J.; Huang, G. S.; Miller, D. H.; Pavlunin, V.; Sanghi, B.; Shipsey, I. P. J.; Xin, B.; Adams, G. S.; Anderson, M.; Cummings, J. P.; Danko, I.; Hu, D.; Moziak, B.; Napolitano, J.; He, Q.; Insler, J.; Muramatsu, H.; Park, C. S.; Thorndike, E. H.; Yang, F.; Artuso, M.; Blusk, S.; Horwitz, N.; Khalil, S.; Li, J.; Menaa, N.; Mountain, R.; Nisar, S.; Randrianarivony, K.; Sia, R.; Skwarnicki, T.; Stone, S.; Wang, J. C.; Bonvicini, G.; Cinabro, D.; Dubrovin, M.; Lincoln, A.; Asner, D. M.; Edwards, K. W.; Naik, P.; Briere, R. A.; Ferguson, T.; Tatishvili, G.; Vogel, H.; Watkins, M. E.; Rosner, J. L.; Adam, N. E.; Alexander, J. P.; Berkelman, K.; Cassel, D. G.; Duboscq, J. E.; Ehrlich, R.; Fields, L.; Galik, R. S.; Gibbons, L.; Gray, R.; Gray, S. W.; Hartill, D. L.; Heltsley, B. K.; Hertz, D.; Jones, C. D.; Kandaswamy, J.; Kreinick, D. L.; Kuznetsov, V. E.; Mahlke-Krüger, H.; Mohapatra, D.; Onyisi, P. U. E.; Patterson, J. R.; Peterson, D.; Pivarski, J.; Riley, D.; Ryd, A.; Sadoff, A. J.; Schwarthoff, H.; Shi, X.; Stroiney, S.; Sun, W. M.; Wilksen, T.; Athar, S. B.; Patel, R.; Yelton, J.; Rubin, P.; Cawlfield, C.; Eisenstein, B. I.; Karliner, I.; Kim, D.; Lowrey, N.; Selen, M.; White, E. J.; Wiss, J.; Mitchell, R. E.; Shepherd, M. R.; Besson, D.; Pedlar, T. K.; Cronin-Hennessy, D.; Gao, K. Y.; Hietala, J.; Kubota, Y.; Klein, T.; Lang, B. W.; Poling, R.; Scott, A. W.; Smith, A.; Zweber, P.; Dobbs, S.; Metreveli, Z.; Seth, K. K.; Tomaradze, A.; Ecklund, K. M.
2008-11-01
In this Letter, we describe a search for lepton flavor violation (LFV) in the bottomonium system. We search for leptonic decays Υ(nS)→μτ (n=1, 2, and 3) using the data collected with the CLEO III detector. We identify the τ lepton using its leptonic decay ν_τ ν̄_e e and utilize multidimensional likelihood fitting with probability density function shapes measured from independent data samples. We report our estimates of 95% C.L. upper limits on LFV branching fractions of Υ mesons. We interpret our results in terms of the exclusion plot for the energy scale of a hypothetical new interaction versus its effective LFV coupling in the framework of effective field theory.
ERIC Educational Resources Information Center
Molenaar, Peter C. M.; Nesselroade, John R.
1998-01-01
Pseudo-Maximum Likelihood (p-ML) and Asymptotically Distribution Free (ADF) estimation methods for estimating dynamic factor model parameters within a covariance structure framework were compared through a Monte Carlo simulation. Both methods appear to give consistent model parameter estimates, but only ADF gives standard errors and chi-square…
2017-01-01
Abstract This review traces, through psychiatric textbooks, the history of the Kraepelinian concept of paranoia in the 20th century and then relates the commonly reported symptoms and signs to the diagnostic criteria for paranoia/delusional disorder in DSM-III through DSM-5. Clinical descriptions of paranoia appearing in 10 textbooks published from 1899 to 1970 revealed 11 prominent symptoms and signs reported by 5 or more authors. Three symptoms (systematized delusions, minimal hallucinations, and prominent ideas of reference) and 2 signs (chronic course and minimal affective deterioration) were reported by 8 or 9 of the authors. Four textbook authors rejected the Kraepelinian concept of paranoia. A weak relationship was seen between the frequency with which the clinical features were reported and the likelihood of their inclusion in modern DSM manuals. Indeed, the diagnostic criteria for paranoia/delusional disorder shifted substantially from DSM-III to DSM-5. The modern operationalized criteria for paranoia/delusional disorder do not reflect well the symptoms and signs frequently reported by historical experts. In contrast to the results of similar reviews for depression, schizophrenia and mania, the clinical construct of paranoia/delusional disorder has been somewhat unstable in Western psychiatry since the turn of the 20th century, as reflected in both textbooks and the DSM editions. PMID:28003468
A tutorial on pilot studies: the what, why and how
2010-01-01
Pilot studies for phase III trials - which are comparative randomized trials designed to provide preliminary evidence on the clinical efficacy of a drug or intervention - are routinely performed in many clinical areas. Also commonly known as "feasibility" or "vanguard" studies, they are designed to assess the safety of a treatment or intervention; to assess recruitment potential; to assess the feasibility of international collaboration or coordination for multicentre trials; and to increase clinical experience with the study medication or intervention for the phase III trials. They are the best way to assess the feasibility of a large, expensive full-scale study, and in fact are an almost essential prerequisite. Conducting a pilot prior to the main study can enhance the likelihood of success of the main study and potentially help to avoid doomed main studies. The objective of this paper is to provide a detailed examination of the key aspects of pilot studies for phase III trials, including: 1) the general reasons for conducting a pilot study; 2) the relationships between pilot studies, proof-of-concept studies, and adaptive designs; 3) the challenges of and misconceptions about pilot studies; 4) the criteria for evaluating the success of a pilot study; 5) frequently asked questions about pilot studies; 6) some ethical aspects related to pilot studies; and 7) some suggestions on how to report the results of pilot investigations using the CONSORT format. PMID:20053272
Li, Zhanzhan; Zhou, Qin; Li, Yanyan; Yan, Shipeng; Fu, Jun; Huang, Xinqiong; Shen, Liangfang
2017-02-28
We conducted a meta-analysis to evaluate the diagnostic value of mean cerebral blood volume for distinguishing recurrence from radiation injury in glioma patients. We performed systematic electronic searches for eligible studies up to August 8, 2016. Bivariate mixed effects models were used to estimate the combined sensitivity, specificity, positive likelihood ratio, negative likelihood ratio, diagnostic odds ratio and their 95% confidence intervals (CIs). Fifteen studies with a total of 576 participants were enrolled. The pooled sensitivity and specificity were 0.88 (95%CI: 0.82-0.92) and 0.85 (95%CI: 0.68-0.93). The pooled positive likelihood ratio was 5.73 (95%CI: 2.56-12.81), the negative likelihood ratio was 0.15 (95%CI: 0.10-0.22), and the diagnostic odds ratio was 39.34 (95%CI: 13.96-110.84). The area under the summary receiver operating characteristic curve was 0.91 (95%CI: 0.88-0.93). However, the Deeks' plot suggested that publication bias may exist (t=2.30, P=0.039). Mean cerebral blood volume measurement appears to be highly sensitive and specific for differentiating recurrence from radiation injury in glioma patients. The results should be interpreted with caution because of the potential bias.
On Bayesian Testing of Additive Conjoint Measurement Axioms Using Synthetic Likelihood.
Karabatsos, George
2018-06-01
This article introduces a Bayesian method for testing the axioms of additive conjoint measurement. The method is based on an importance sampling algorithm that performs likelihood-free, approximate Bayesian inference using a synthetic likelihood to overcome the analytical intractability of this testing problem. This new method improves upon previous methods because it provides an omnibus test of the entire hierarchy of cancellation axioms, beyond double cancellation. It does so while accounting for the posterior uncertainty that is inherent in the empirical orderings that are implied by these axioms, together. The new method is illustrated through a test of the cancellation axioms on a classic survey data set, and through the analysis of simulated data.
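The synthetic-likelihood idea at the core of the method can be sketched on a toy problem; the Gaussian-mean example, summary statistics, simulation counts, and grid search below are all invented for illustration, and this is the generic Wood-style construction rather than the article's importance-sampling test of conjoint axioms.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic likelihood: replace an intractable likelihood with a Gaussian
# density fitted to summary statistics of model simulations at each theta.
def summaries(y):
    return np.array([y.mean(), y.std()])

def synthetic_loglik(theta, s_obs, n_sim=200, n=100):
    # simulate n_sim datasets at theta, summarize, fit a multivariate normal
    S = np.array([summaries(rng.normal(theta, 1.0, n)) for _ in range(n_sim)])
    mu = S.mean(axis=0)
    cov = np.cov(S.T) + 1e-9 * np.eye(2)  # small jitter for stability
    diff = s_obs - mu
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (logdet + diff @ np.linalg.solve(cov, diff))

y_obs = rng.normal(3.0, 1.0, 100)   # "observed" data, true theta = 3
s_obs = summaries(y_obs)

grid = np.linspace(1.0, 5.0, 41)
ll = [synthetic_loglik(th, s_obs) for th in grid]
print(grid[int(np.argmax(ll))])  # maximizer should be near 3.0
```

Because only simulations are needed, the same construction applies when the exact sampling distribution of the test statistics is analytically intractable, which is the situation the article describes for the cancellation-axiom test.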
Towers, Sherry; Mubayi, Anuj; Castillo-Chavez, Carlos
2018-01-01
When attempting to statistically distinguish between a null and an alternative hypothesis, many researchers in the life and social sciences turn to binned statistical analysis methods, or methods that are simply based on the moments of a distribution (such as the mean and variance). These methods have the advantages of simplicity of implementation and simplicity of explanation. However, when null and alternative hypotheses manifest themselves in subtle differences in patterns in the data, binned analysis methods may be insensitive to these differences, and researchers may erroneously fail to reject the null hypothesis when it is actually false, even though more sensitive statistical analysis methods might produce a different result. Here, with a focus on two recent conflicting studies of contagion in mass killings as instructive examples, we discuss how the use of unbinned likelihood methods makes optimal use of the information in the data; a fact that has long been known in statistical theory, but perhaps is not as widely appreciated amongst general researchers in the life and social sciences. In 2015, Towers et al. published a paper that quantified the long-suspected contagion effect in mass killings. However, in 2017, Lankford & Tomek subsequently published a paper, based upon the same data, that claimed to contradict the results of the earlier study. The former used unbinned likelihood methods; the latter used binned methods and comparison of distribution moments. Using these analyses, we also discuss how visualization of the data can aid in determining the most appropriate statistical analysis methods for distinguishing between a null and an alternative hypothesis.
We also discuss the importance of assessing the robustness of analysis results to methodological assumptions (for example, arbitrary choices of the number of bins and bin widths when using binned methods); an issue that is widely overlooked in the literature, but is critical to analysis reproducibility and robustness. When an analysis cannot distinguish between a null and an alternative hypothesis, care must be taken to ensure that the analysis methodology itself maximizes the use of information in the data that can distinguish between the two hypotheses. The use of binned methods by Lankford & Tomek (2017), which examined how many mass killings fell within a 14-day window of a previous mass killing, substantially reduced the sensitivity of their analysis to contagion effects. The unbinned likelihood methods used by Towers et al. (2015) did not suffer from this problem. While a binned analysis might be favorable for simplicity and clarity of presentation, unbinned likelihood methods are preferable when effects might be somewhat subtle.
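The information loss from binning that the authors describe can be made concrete with a toy Monte Carlo (hypothetical exponential inter-event times, not the mass-killings data): an estimator of the event rate built only from a "within 14 days or not" indicator is visibly noisier than the unbinned maximum-likelihood estimator, which uses every observed time exactly.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical inter-event times: exponential with mean 30 days.
lam, n, reps = 1 / 30, 100, 2000
window = 14.0   # fixed dichotomization window used by a binned analysis

unbinned, binned = [], []
for _ in range(reps):
    t = rng.exponential(1 / lam, n)
    # unbinned MLE of the rate uses the full sample of times
    unbinned.append(1 / t.mean())
    # binned analysis keeps only the indicator "time < 14 days",
    # then inverts P(t < w) = 1 - exp(-lam * w) to estimate the rate
    p_hat = np.clip((t < window).mean(), 1e-6, 1 - 1e-6)
    binned.append(-np.log(1 - p_hat) / window)

print(np.std(unbinned), np.std(binned))  # the binned estimator is noisier
```

Both estimators target the same rate, so the difference in their Monte Carlo spread isolates the efficiency lost by reducing each observation to a single yes/no bin.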
Mubayi, Anuj; Castillo-Chavez, Carlos
2018-01-01
Background When attempting to statistically distinguish between a null and an alternative hypothesis, many researchers in the life and social sciences turn to binned statistical analysis methods, or methods that are simply based on the moments of a distribution (such as the mean and variance). These methods have the advantage of simplicity of implementation, and simplicity of explanation. However, when null and alternative hypotheses manifest themselves in subtle differences in patterns in the data, binned analysis methods may be insensitive to these differences, and researchers may erroneously fail to reject the null hypothesis when in fact more sensitive statistical analysis methods might produce a different result when the null hypothesis is actually false. Here, with a focus on two recent conflicting studies of contagion in mass killings as instructive examples, we discuss how the use of unbinned likelihood methods makes optimal use of the information in the data, a fact that has long been known in statistical theory but perhaps is not as widely appreciated amongst general researchers in the life and social sciences. Methods In 2015, Towers et al. published a paper that quantified the long-suspected contagion effect in mass killings. However, in 2017, Lankford & Tomek subsequently published a paper, based upon the same data, that claimed to contradict the results of the earlier study. The former used unbinned likelihood methods, and the latter used binned methods and comparison of distribution moments. Using these analyses, we also discuss how visualization of the data can aid in determination of the most appropriate statistical analysis methods to distinguish between a null and alternative hypothesis.
We also discuss the importance of assessing the robustness of analysis results to the methodological assumptions made (for example, arbitrary choices of the number of bins and bin widths when using binned methods), an issue that is widely overlooked in the literature but is critical to analysis reproducibility and robustness. Conclusions When an analysis cannot distinguish between a null and alternative hypothesis, care must be taken to ensure that the analysis methodology itself maximizes the use of information in the data that can distinguish between the two hypotheses. The use of binned methods by Lankford & Tomek (2017), which examined how many mass killings fell within a 14-day window from a previous mass killing, substantially reduced the sensitivity of their analysis to contagion effects. The unbinned likelihood methods used by Towers et al. (2015) did not suffer from this problem. While a binned analysis might be favorable for simplicity and clarity of presentation, unbinned likelihood methods are preferable when effects might be somewhat subtle. PMID:29742115
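The information cost of binning that this abstract describes can be illustrated with a toy sketch (entirely hypothetical data, not the mass-killings dataset): estimating an exponential rate from inter-event gaps, once using every gap via the unbinned maximum likelihood estimator and once using only the count of gaps shorter than a 14-day window.

```python
# Toy illustration (hypothetical data): unbinned vs binned estimation
# of an exponential rate from inter-event gaps.
import math
import random

random.seed(42)

# 200 simulated gaps (days) from an exponential with true rate 0.1/day.
true_rate = 0.1
gaps = [random.expovariate(true_rate) for _ in range(200)]

# Unbinned MLE uses every observed gap exactly: lambda_hat = n / sum(gaps).
lam_unbinned = len(gaps) / sum(gaps)

# A binned analysis keeps only the count of gaps under 14 days and inverts
# P(gap < 14) = 1 - exp(-lambda * 14); all within-bin detail is discarded.
p_hat = sum(g < 14.0 for g in gaps) / len(gaps)
lam_binned = -math.log(1.0 - p_hat) / 14.0
```

Both estimators are consistent here, but the binned one discards within-bin detail and so carries more variance; when the effect of interest is a subtle departure from the null distribution, that extra variance is exactly what erodes sensitivity.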
NASA Astrophysics Data System (ADS)
Cheng, Qin-Bo; Chen, Xi; Xu, Chong-Yu; Reinhardt-Imjela, Christian; Schulte, Achim
2014-11-01
In this study, the likelihood functions for uncertainty analysis of hydrological models are compared and improved through the following steps: (1) the equivalent relationship between the Nash-Sutcliffe Efficiency coefficient (NSE) and the likelihood function with Gaussian independent and identically distributed residuals is proved; (2) a new estimation method of the Box-Cox transformation (BC) parameter is developed to improve the effective elimination of the heteroscedasticity of model residuals; and (3) three likelihood functions, NSE, Generalized Error Distribution with BC (BC-GED), and Skew Generalized Error Distribution with BC (BC-SGED), are applied for SWAT-WB-VSA (Soil and Water Assessment Tool - Water Balance - Variable Source Area) model calibration in the Baocun watershed, Eastern China. Performances of calibrated models are compared using the observed river discharges and groundwater levels. The result shows that the minimum variance constraint can effectively estimate the BC parameter. The form of the likelihood function significantly impacts the calibrated parameters and the simulated results of high and low flow components. SWAT-WB-VSA with the NSE approach simulates floods well but baseflow poorly, owing to the Gaussian error assumption, under which large errors have low probability while small errors around zero are nearly equiprobable. By contrast, SWAT-WB-VSA with the BC-GED or BC-SGED approach mimics baseflow well, as confirmed by the groundwater level simulation. The assumption of skewness of the error distribution may be unnecessary, because all the results of the BC-SGED approach are nearly the same as those of the BC-GED approach.
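The NSE-likelihood equivalence proved in step (1) can be sketched numerically (with invented discharge values, not the Baocun data): under Gaussian i.i.d. residuals with the variance set at its maximum likelihood estimate, the log-likelihood and the NSE are both monotone functions of the error sum of squares, so they rank candidate simulations identically.

```python
import math

def nse(obs, sim):
    # Nash-Sutcliffe Efficiency: 1 - SSE / SST.
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    sst = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / sst

def gauss_loglik(obs, sim):
    # Gaussian i.i.d. log-likelihood with the residual variance at its MLE;
    # a decreasing function of SSE, hence monotone in NSE.
    n = len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    sigma2 = sse / n
    return -0.5 * n * (math.log(2.0 * math.pi * sigma2) + 1.0)

obs = [2.0, 3.5, 5.0, 4.0, 3.0]       # invented discharges
sim_a = [2.2, 3.4, 4.8, 4.1, 3.1]     # closer simulation
sim_b = [3.0, 3.0, 3.0, 3.0, 3.0]     # poorer simulation
```

Ranking `sim_a` above `sim_b` by NSE or by the Gaussian log-likelihood gives the same answer, which is the equivalence the paper exploits.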
Sangster, George; Alström, Per; Forsmark, Emma; Olsson, Urban
2010-10-01
The chats and flycatchers (Muscicapidae) represent an assemblage of 275 species in 48 genera. Defining natural groups within this assemblage has been challenging because of its high diversity and a paucity of phylogenetically informative morphological characters. We assessed the phylogenetic relationships of 124 species and 34 genera of Muscicapidae, and 20 species of Turdidae, using molecular sequence data from one mitochondrial gene and three nuclear loci, in total 3240bp. Bayesian and maximum likelihood analyses yielded a well-resolved tree in which nearly all basal nodes were strongly supported. The traditionally defined Muscicapidae, Muscicapinae and Saxicolinae were paraphyletic. Four major clades are recognized in Muscicapidae: Muscicapinae, Niltavinae (new family-group name), Erithacinae and Saxicolinae. Interesting relationships recovered by this analysis include: (i) a clade comprising the 'blue' flycatcher genera Niltava, Cyornis, Cyanoptila and Eumyias and some species of Rhinomyias; (ii) the position of Erithacus rubecula in a clade of otherwise exclusively African species; (iii) a close relationship between the shortwing Heinrichia calligyna and the flycatcher Rhinomyias insignis; (iv) a sister-relationship between forktails Enicurus and whistling thrushes Myophonus; and (v) a sister relationship of Ficedula and the 'chats'Monticola, Phoenicurus, Saxicola and Oenanthe. A high number of traditionally defined genera was found to be paraphyletic or polyphyletic. Copyright 2010 Elsevier Inc. All rights reserved.
Phillips, Patrick P J; Dooley, Kelly E; Gillespie, Stephen H; Heinrich, Norbert; Stout, Jason E; Nahid, Payam; Diacon, Andreas H; Aarnoutse, Rob E; Kibiki, Gibson S; Boeree, Martin J; Hoelscher, Michael
2016-03-23
The standard 6-month four-drug regimen for the treatment of drug-sensitive tuberculosis has remained unchanged for decades and is inadequate to control the epidemic. Shorter, simpler regimens are urgently needed to defeat what is now the world's greatest infectious disease killer. We describe the Phase IIC Selection Trial with Extended Post-treatment follow-up (STEP) as a novel hybrid phase II/III trial design to accelerate regimen development. In the Phase IIC STEP trial, the experimental regimen is given for the duration for which it will be studied in phase III (presently 3 or 4 months) and patients are followed for clinical outcomes of treatment failure and relapse for a total of 12 months from randomisation. Operating characteristics of the trial design are explored assuming a classical frequentist framework as well as a Bayesian framework with flat and sceptical priors. A simulation study is conducted using data from the RIFAQUIN phase III trial to illustrate how such a design could be used in practice. With 80 patients per arm, and two (2.5 %) unfavourable outcomes in the STEP trial, there is a probability of 0.99 that the proportion of unfavourable outcomes in a potential phase III trial would be less than 12 % and a probability of 0.91 that the proportion of unfavourable outcomes would be less than 8 %. With six (7.5 %) unfavourable outcomes, there is a probability of 0.82 that the proportion of unfavourable outcomes in a potential phase III trial would be less than 12 % and a probability of 0.41 that it would be less than 8 %. Simulations using data from the RIFAQUIN trial show that a STEP trial with 80 patients per arm would have correctly shown that the Inferior Regimen should not proceed to phase III and would have had a high chance (0.88) of either showing that the Successful Regimen could proceed to phase III or that it might require further optimisation. 
Collection of definitive clinical outcome data in a relatively small number of participants over only 12 months provides valuable information about the likelihood of success in a future phase III trial. We strongly believe that the STEP trial design described herein is an important tool that would allow for more informed decision-making and accelerate regimen development.
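The flat-prior Bayesian reading of the STEP numbers can be sketched as follows. This is a simplification of the paper's operating-characteristic calculations: it computes only the posterior probability that the true unfavourable-outcome proportion is below 12%, ignoring the sampling variability of a future phase III trial, and assumes a flat Beta(1, 1) prior.

```python
import math

def beta_cdf(x, a, b, steps=20000):
    # Midpoint-rule integration of the Beta(a, b) density over [0, x].
    log_norm = math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
    h = x / steps
    total = 0.0
    for i in range(steps):
        t = (i + 0.5) * h
        total += math.exp(log_norm + (a - 1.0) * math.log(t)
                          + (b - 1.0) * math.log(1.0 - t))
    return total * h

# Flat Beta(1, 1) prior updated with the STEP outcomes quoted above:
# 2 unfavourable outcomes in 80 patients -> posterior Beta(3, 79);
# 6 unfavourable outcomes in 80 patients -> posterior Beta(7, 75).
prob_below_12 = beta_cdf(0.12, 3.0, 79.0)
prob_below_12_worse = beta_cdf(0.12, 7.0, 75.0)
```

The posterior probabilities land near the paper's figures (0.99 and 0.82), slightly higher in the second case because the extra uncertainty of a finite phase III sample is not modeled here.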
NASA Astrophysics Data System (ADS)
Pelle, A.; Allen, M.; Fu, J. S.
2013-12-01
With rising population and increasing urban density, it is of pivotal importance for urban planners to plan for increasing extreme precipitation events. Climate models indicate that an increase in global mean temperature will lead to increased frequency and intensity of storms of a variety of types. Analysis of results from the Coupled Model Intercomparison Project, Phase 5 (CMIP5) has demonstrated that global climate models severely underestimate precipitation, however. Preliminary results from dynamical downscaling indicate that Philadelphia, Pennsylvania is expected to experience the greatest increase of precipitation due to an increase in annual extreme events in the US. New York City, New York and Chicago, Illinois are anticipated to have similarly large increases in annual extreme precipitation events. In order to produce more accurate results, we downscale Philadelphia, Chicago, and New York City using the Weather Research and Forecasting model (WRF). We analyze historical precipitation data and WRF output utilizing a Log Pearson Type III (LP3) distribution for frequency of extreme precipitation events. This study aims to determine the likelihood of extreme precipitation in future years and its effect on the cost of stormwater management for these three cities.
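A minimal sketch of the LP3 frequency analysis mentioned above, assuming invented annual-maximum precipitation values rather than the study's station data: fit Pearson Type III to the log10 data by the method of moments, then evaluate the T-year quantile with the standard Kite frequency-factor approximation.

```python
import math

def lp3_quantile(data, return_period):
    # Fit Log-Pearson Type III by the method of moments on log10(data),
    # then evaluate the T-year quantile via the Kite frequency-factor
    # approximation.
    logs = [math.log10(v) for v in data]
    n = len(logs)
    mean = sum(logs) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in logs) / (n - 1))
    skew = n * sum((v - mean) ** 3 for v in logs) / ((n - 1) * (n - 2) * sd ** 3)
    # Standard normal quantile for exceedance probability 1/T
    # (Abramowitz & Stegun 26.2.22 rational approximation).
    p = 1.0 / return_period
    t = math.sqrt(-2.0 * math.log(p))
    z = t - (2.30753 + 0.27061 * t) / (1.0 + 0.99229 * t + 0.04481 * t * t)
    k = skew / 6.0
    kt = (z + (z * z - 1.0) * k + (z ** 3 - 6.0 * z) * k * k / 3.0
          - (z * z - 1.0) * k ** 3 + z * k ** 4 + k ** 5 / 3.0)
    return 10.0 ** (mean + kt * sd)

# Invented annual-maximum daily precipitation series (inches).
annual_maxima = [2.1, 3.4, 1.8, 2.9, 4.2, 2.5, 3.1, 5.0, 2.2, 3.8, 2.7, 4.5]
q100 = lp3_quantile(annual_maxima, 100)
```

Comparing return periods gives the expected ordering: the 100-year quantile exceeds the 10-year, which exceeds the 2-year (median) event.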
Witt, Cordelie E; Arbabi, Saman; Nathens, Avery B; Vavilala, Monica S; Rivara, Frederick P
2017-04-01
The implications of childhood obesity on pediatric trauma outcomes are not clearly established. Anthropometric data were recently added to the National Trauma Data Bank (NTDB) Research Datasets, enabling a large, multicenter evaluation of the effect of obesity on pediatric trauma patients. Children ages 2 to 19 years who required hospitalization for traumatic injury were identified in the 2013-2014 NTDB Research Datasets. Age- and gender-specific body mass indices (BMI) were calculated. Outcomes included injury patterns, operative procedures, complications, and hospital utilization parameters. Data from 149,817 pediatric patients were analyzed; higher BMI percentiles were associated with significantly more extremity injuries, and fewer injuries to the head, abdomen, thorax and spine (p values <0.001). On multivariable analysis, higher BMI percentiles were associated with significantly increased likelihood of death, deep venous thrombosis, pulmonary embolus and pneumonia, although there was no difference in risk of overall complications. Obese children also had significantly longer lengths of stay and more frequent ventilator requirement. Among children admitted after trauma, increased BMI percentile is associated with increased risk of death and potentially preventable complications. These findings suggest that obese children may require different management than nonobese counterparts to prevent complications. Level III; prognosis study. Copyright © 2017 Elsevier Inc. All rights reserved.
Hybrid pairwise likelihood analysis of animal behavior experiments.
Cattelan, Manuela; Varin, Cristiano
2013-12-01
The study of the determinants of fights between animals is an important issue in understanding animal behavior. For this purpose, tournament experiments among a set of animals are often used by zoologists. The results of these tournament experiments are naturally analyzed by paired comparison models. Proper statistical analysis of these models is complicated by the presence of dependence between the outcomes of fights because the same animal is involved in different contests. This paper discusses two different model specifications to account for between-fights dependence. Models are fitted through the hybrid pairwise likelihood method that iterates between optimal estimating equations for the regression parameters and pairwise likelihood inference for the association parameters. This approach requires the specification of means and covariances only. For this reason, the method can be applied also when the computation of the joint distribution is difficult or inconvenient. The proposed methodology is investigated by simulation studies and applied to real data about adult male Cape Dwarf Chameleons. © 2013, The International Biometric Society.
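The paired-comparison backbone of such tournament analyses is the Bradley-Terry model; the hybrid pairwise likelihood in the paper extends it to handle between-fight dependence. A minimal sketch of the independence-assuming baseline, fitted with Hunter's MM algorithm on an invented three-animal tournament:

```python
def bradley_terry(wins, n_items, iters=200):
    # wins[i][j] = number of fights animal i won against animal j.
    # MM algorithm (Hunter, 2004) for the Bradley-Terry maximum likelihood
    # estimate; it treats fights as independent, the simplification that
    # the hybrid pairwise likelihood approach relaxes.
    strength = [1.0] * n_items
    for _ in range(iters):
        new = []
        for i in range(n_items):
            total_wins = sum(wins[i])
            denom = 0.0
            for j in range(n_items):
                if j != i:
                    n_ij = wins[i][j] + wins[j][i]
                    if n_ij:
                        denom += n_ij / (strength[i] + strength[j])
            new.append(total_wins / denom if denom else strength[i])
        scale = n_items / sum(new)
        strength = [s * scale for s in new]
    return strength

# Invented tournament among three animals.
wins = [[0, 3, 4],
        [1, 0, 3],
        [0, 1, 0]]
strengths = bradley_terry(wins, 3)
```

The fitted strengths order the animals by dominance; the estimated probability that animal i beats animal j is strengths[i] / (strengths[i] + strengths[j]).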
Schminkey, Donna L; von Oertzen, Timo; Bullock, Linda
2016-08-01
With increasing access to population-based data and electronic health records for secondary analysis, missing data are common. In the social and behavioral sciences, missing data frequently are handled with multiple imputation methods or full information maximum likelihood (FIML) techniques, but healthcare researchers have not embraced these methodologies to the same extent and more often use either traditional imputation techniques or complete case analysis, which can compromise power and introduce unintended bias. This article is a review of options for handling missing data, concluding with a case study demonstrating the utility of multilevel structural equation modeling using full information maximum likelihood (MSEM with FIML) to handle large amounts of missing data. MSEM with FIML is a parsimonious and hypothesis-driven strategy to cope with large amounts of missing data without compromising power or introducing bias. This technique is relevant for nurse researchers faced with ever-increasing amounts of electronic data and decreasing research budgets. © 2016 Wiley Periodicals, Inc.
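The core idea behind FIML, that each case contributes a likelihood term for whatever it actually observed so that all cases inform the estimates under MAR, can be sketched with a bivariate-normal toy example (invented data; not MSEM, which generalizes the same principle to multilevel models):

```python
import random

random.seed(7)

# Invented bivariate data: Y = 2 + 0.8 X + noise, with Y missing whenever
# X > 0, i.e. missingness depends only on the observed X (MAR).
xs = [random.gauss(0.0, 1.0) for _ in range(5000)]
ys = [2.0 + 0.8 * x + random.gauss(0.0, 0.5) for x in xs]
observed = [(x, y) for x, y in zip(xs, ys) if x <= 0.0]

# Complete-case analysis: the mean of the observed Y is biased low.
cc_mean = sum(y for _, y in observed) / len(observed)

# Maximum likelihood under a bivariate normal (the idea behind FIML):
# fit Y ~ X on complete cases, then average predictions over every X,
# so the incomplete cases still contribute their observed information.
mx = sum(x for x, _ in observed) / len(observed)
sxy = sum((x - mx) * (y - cc_mean) for x, y in observed)
sxx = sum((x - mx) ** 2 for x, _ in observed)
slope = sxy / sxx
intercept = cc_mean - slope * mx
fiml_mean = sum(intercept + slope * x for x in xs) / len(xs)
```

The complete-case mean is pulled well below the true value of 2, while the likelihood-based estimate recovers it, which is the power and bias argument the abstract makes.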
Sustainability likelihood of remediation options for metal-contaminated soil/sediment.
Chen, Season S; Taylor, Jessica S; Baek, Kitae; Khan, Eakalak; Tsang, Daniel C W; Ok, Yong Sik
2017-05-01
Multi-criteria analysis and detailed impact analysis were carried out to assess the sustainability of four remedial alternatives for metal-contaminated soil/sediment at former timber treatment sites and harbour sediment with different scales. The sustainability was evaluated in the aspects of human health and safety, environment, stakeholder concern, and land use, under four different scenarios with varying weighting factors. The Monte Carlo simulation was performed to reveal the likelihood of accomplishing sustainable remediation with different treatment options at different sites. The results showed that in-situ remedial technologies were more sustainable than ex-situ ones, where in-situ containment demonstrated both the most sustainable result and the highest probability to achieve sustainability amongst the four remedial alternatives in this study, reflecting the lesser extent of off-site and on-site impacts. Concerns associated with ex-situ options were adverse impacts tied to all four aspects and caused by excavation, extraction, and off-site disposal. The results of this study suggested the importance of considering the uncertainties resulting from the remedial options (i.e., stochastic analysis) in addition to the overall sustainability scores (i.e., deterministic analysis). The developed framework and model simulation could serve as an assessment for the sustainability likelihood of remedial options to ensure sustainable remediation of contaminated sites. Copyright © 2017 Elsevier Ltd. All rights reserved.
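The stochastic scoring step can be sketched as follows, with invented criterion weights and score distributions (the study's actual criteria, weights, and scenarios are more detailed): draw weighted sustainability scores for two options many times and report the fraction of draws in which one option wins.

```python
import random

random.seed(1)

# Invented multi-criteria scores (mean, sd) for two remedial options on four
# aspects: human health/safety, environment, stakeholder concern, land use.
criteria_weights = [0.3, 0.3, 0.2, 0.2]
in_situ = [(8.0, 1.0), (7.0, 1.5), (6.0, 1.0), (7.0, 1.0)]
ex_situ = [(6.0, 1.5), (5.0, 2.0), (6.0, 1.5), (6.0, 1.0)]

def draw_score(option):
    # One Monte Carlo draw of the weighted overall sustainability score.
    return sum(w * random.gauss(m, s)
               for w, (m, s) in zip(criteria_weights, option))

trials = 10000
wins = sum(draw_score(in_situ) > draw_score(ex_situ) for _ in range(trials))
likelihood = wins / trials
```

This separates the deterministic question (which option has the higher mean score) from the stochastic one (how likely that ranking is to hold given the uncertainties), mirroring the paper's distinction between overall scores and sustainability likelihood.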
NASA Technical Reports Server (NTRS)
Iliff, K. W.; Maine, R. E.
1976-01-01
A maximum likelihood estimation method was applied to flight data, and procedures to facilitate the routine analysis of large amounts of flight data are described. Techniques that can be used to obtain stability and control derivatives from aircraft maneuvers that are less than ideal for this purpose are also described. The techniques involve detecting and correcting the effects of dependent or nearly dependent variables, structural vibration, data drift, inadequate instrumentation, and difficulties with the data acquisition system and the mathematical model. The use of uncertainty levels and multiple-maneuver analysis also proved to be useful in improving the quality of the estimated coefficients. The procedures used for editing the data and for overall analysis are also discussed.
Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis
NASA Technical Reports Server (NTRS)
Shortle, J. F.; Allocco, M.
2005-01-01
Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty", that describes the complexity in modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
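The proposed prioritization can be sketched as a sort over the three metrics (scenario names and scores invented for illustration): rank by risk, i.e. severity times likelihood, and break ties toward scenarios that are easier to model quantitatively.

```python
# Invented hazard scenarios: (name, severity, likelihood, modeling difficulty),
# each scored 1-5 with 1 = easiest to model quantitatively.
scenarios = [
    ("wake encounter on parallel approach", 5, 2, 2),
    ("runway incursion", 4, 3, 4),
    ("missed-approach conflict", 3, 4, 1),
]

# Prioritize candidates for quantitative analysis: highest risk
# (severity x likelihood) first, ties broken toward easier-to-model scenarios.
ranked = sorted(scenarios, key=lambda s: (-s[1] * s[2], s[3]))
```

Here the two scenarios tied on risk are separated by modeling difficulty, so quantitative effort goes first to the scenario where it is both warranted and tractable.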
Xu, Mei-Mei; Jia, Hong-Yu; Yan, Li-Li; Li, Shan-Shan; Zheng, Yue
2017-01-01
Abstract Background: This meta-analysis aimed to provide a pooled analysis of prospective controlled trials comparing the diagnostic accuracy of 22-G and 25-G needles in endoscopic ultrasound-guided fine-needle aspiration (EUS-FNA) of solid pancreatic masses. Methods: We established a rigorous study protocol according to Cochrane Collaboration recommendations. We systematically searched the PubMed and Embase databases to identify articles to include in the meta-analysis. Sensitivity, specificity, and corresponding 95% confidence intervals were calculated for 22-G and 25-G needles of individual studies from the contingency tables. Results: Eleven prospective controlled trials included a total of 837 patients (412 with 22-G vs 425 with 25-G). Our outcomes revealed that 25-G needles (92% [95% CI, 89%–95%]) have higher sensitivity than 22-G needles (88% [95% CI, 84%–91%]) on solid pancreatic mass EUS-FNA (P = 0.046). However, there were no significant differences between the 2 groups in overall diagnostic specificity (P = 0.842). The pooled positive likelihood ratio was 12.61 (95% CI, 5.65–28.14), and the negative likelihood ratio was 0.16 (95% CI, 0.12–0.21) for the 22-G needle. The pooled positive likelihood ratio was 8.44 (95% CI, 3.87–18.42), and the negative likelihood ratio was 0.13 (95% CI, 0.09–0.18) for the 25-G needle. The area under the summary receiver operating characteristic curve was 0.97 for the 22-G needle and 0.96 for the 25-G needle. Conclusion: Our analysis showed that 25-G needles have superior sensitivity to 22-G needles in the evaluation of solid pancreatic lesions by EUS-FNA. PMID:28151856
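The quantities reported in such studies all come from the standard 2x2 contingency-table formulas; a small helper (with an invented table, not the meta-analysis data) shows the relationships:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    # Standard 2x2 contingency-table quantities.
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "lr_pos": sens / (1.0 - spec),   # positive likelihood ratio
        "lr_neg": (1.0 - sens) / spec,   # negative likelihood ratio
    }

# Invented 2x2 table (not taken from the meta-analysis).
m = diagnostic_metrics(tp=88, fp=1, fn=12, tn=19)
```

The likelihood ratios are the clinically useful summaries: LR+ multiplies the pre-test odds after a positive result, and LR- after a negative one.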
Sell, Rebecca E; Sarno, Renee; Lawrence, Brenna; Castillo, Edward M; Fisher, Roger; Brainard, Criss; Dunford, James V; Davis, Daniel P
2010-07-01
The three-phase model of ventricular fibrillation (VF) arrest suggests a period of compressions to "prime" the heart prior to defibrillation attempts. In addition, post-shock compressions may increase the likelihood of return of spontaneous circulation (ROSC). The optimal intervals for shock delivery following cessation of compressions (pre-shock interval) and resumption of compressions following a shock (post-shock interval) remain unclear. To define optimal pre- and post-defibrillation compression pauses for out-of-hospital cardiac arrest (OOHCA). All patients suffering OOHCA from VF were identified over a 1-month period. Defibrillator data were abstracted and analyzed using the combination of ECG, impedance, and audio recording. Receiver operating characteristic (ROC) curve analysis was used to define the optimal pre- and post-shock compression intervals. Multiple logistic regression analysis was used to quantify the relationship between these intervals and ROSC. Covariates included cumulative number of defibrillation attempts, intubation status, and administration of epinephrine in the immediate pre-shock compression cycle. Cluster adjustment was performed due to the possibility of multiple defibrillation attempts for each patient. A total of 36 patients with 96 defibrillation attempts were included. The ROC analysis identified an optimal pre-shock interval of <3 s and an optimal post-shock interval of <6 s. Increased likelihood of ROSC was observed with a pre-shock interval <3 s (adjusted OR 6.7, 95% CI 2.0-22.3, p=0.002) and a post-shock interval of <6 s (adjusted OR 10.7, 95% CI 2.8-41.4, p=0.001). Likelihood of ROSC was substantially increased with the optimization of both pre- and post-shock intervals (adjusted OR 13.1, 95% CI 3.4-49.9, p<0.001). Decreasing pre- and post-shock compression intervals increases the likelihood of ROSC in OOHCA from VF.
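The threshold-finding step can be sketched with Youden's J statistic, one common way to pick an optimal cutoff from a ROC analysis (the abstract does not specify its exact criterion, and the data below are invented):

```python
def youden_threshold(pauses, outcomes, thresholds):
    # Pick the cutoff maximizing Youden's J = sensitivity + specificity - 1,
    # treating "pause below threshold" as the positive prediction.
    best_t, best_j = None, -1.0
    for t in thresholds:
        tp = sum(1 for x, y in zip(pauses, outcomes) if x < t and y == 1)
        fn = sum(1 for x, y in zip(pauses, outcomes) if x >= t and y == 1)
        fp = sum(1 for x, y in zip(pauses, outcomes) if x < t and y == 0)
        tn = sum(1 for x, y in zip(pauses, outcomes) if x >= t and y == 0)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t

# Invented pre-shock pauses (seconds) and ROSC outcomes (1 = ROSC).
pauses = [1.5, 2.0, 2.5, 4.0, 5.0, 6.0, 2.8, 7.0, 3.5, 1.8]
rosc = [1, 1, 1, 0, 0, 0, 1, 0, 0, 1]
best_cutoff = youden_threshold(pauses, rosc, [2.0, 3.0, 4.0, 5.0, 6.0])
```

In this toy dataset the cutoff that best separates ROSC from non-ROSC pauses is 3 s, echoing the form (though not the evidence) of the paper's <3 s finding.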
Murray, Justine V; Jansen, Cassie C; De Barro, Paul
2016-01-01
In an effort to eliminate dengue, a successful technology was developed with the stable introduction of the obligate intracellular bacterium Wolbachia pipientis into the mosquito Aedes aegypti to reduce its ability to transmit dengue through life-shortening and viral replication-inhibiting effects. An analysis of risk was required before considering release of the modified mosquito into the environment. Expert knowledge and a risk assessment framework were used to identify risk associated with the release of the modified mosquito. Individual and group expert elicitation was performed to identify potential hazards. A Bayesian network (BN) was developed to capture the relationship between hazards and the likelihood of events occurring. Risk was calculated from the expert likelihood estimates populating the BN and the consequence estimates elicited from experts. The risk model for "Don't Achieve Release" provided an estimated 46% likelihood that the release would not occur by a nominated time but generated an overall risk rating of very low. The ability to obtain compliance had the greatest influence on the likelihood of release occurring. The risk model for "Cause More Harm" provided a 12.5% likelihood that more harm would result from the release, but the overall risk was considered negligible. The efficacy of mosquito management had the most influence, with the perception that the threat of dengue fever had been eliminated, resulting in less household mosquito control, scored as the highest ranked individual hazard (albeit low risk). The risk analysis was designed to incorporate the interacting complexity of hazards that may affect the release of the technology into the environment. The risk analysis was a small, but important, implementation phase in the success of this innovative research introducing a new technology to combat dengue transmission in the environment.
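A two-node sketch of how a Bayesian network combines elicited conditional likelihoods (all probabilities invented for illustration; the study's network contained many more hazards): marginalize the release outcome over the compliance node.

```python
# Two-node sketch: P(compliance) and P(release | compliance).
# All probabilities are invented for illustration.
p_compliance = 0.70
p_release_given_compliance = 0.75
p_release_given_no_compliance = 0.10

# Marginalize over the compliance node to get P(release does not occur).
p_no_release = 1.0 - (p_compliance * p_release_given_compliance
                      + (1.0 - p_compliance) * p_release_given_no_compliance)
```

Because the marginal depends strongly on `p_compliance`, small changes to that node move the "Don't Achieve Release" likelihood substantially, which is the sense in which compliance had the greatest influence in the study's larger network.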
Quirino, Isabel G; Silva, Jose Maria P; Diniz, Jose S; Lima, Eleonora M; Rocha, Ana Cristina S; Simões e Silva, Ana Cristina; Oliveira, Eduardo A
2011-01-01
The aim of this study was to evaluate the diagnostic accuracy of dimercapto-succinic acid renal scintigraphy and renal ultrasound in identifying high grade vesicoureteral reflux in children after a first episode of urinary tract infection. A total of 533 children following a first urinary tract infection were included in the analysis. Patients were assessed by 3 diagnostic imaging studies, renal ultrasound, dimercapto-succinic acid scan and voiding cystourethrography. The main event of interest was the presence of high grade (III to V) vesicoureteral reflux. The combined and separate diagnostic accuracy of screening methods was assessed by calculation of diagnostic OR, sensitivity, specificity, positive predictive value, negative predictive value and likelihood ratio. A total of 246 patients had reflux, of whom 144 (27%) had high grade (III to V) disease. Sensitivity, negative predictive value and diagnostic OR of ultrasound for high grade reflux were 83.3%, 90.8% and 7.9, respectively. Dimercapto-succinic acid scan had the same sensitivity as ultrasound but a higher negative predictive value (91.7%) and diagnostic OR (10.9). If both tests were analyzed in parallel by using the OR rule, ie a negative diagnosis was established only when both test results were normal, sensitivity increased to 97%, negative predictive value to 97% and diagnostic OR to 25.3. Only 9 children (6.3%) with dilating reflux had an absence of alterations in both tests. Our findings support the idea that ultrasound and dimercapto-succinic acid scan used in combination are reliable predictors of dilating vesicoureteral reflux. Copyright © 2011 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
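The parallel "OR rule" combination reported above follows the usual composite-test algebra: assuming the two tests err independently given disease status, a sensitivity of 83.3% for each combines to about 97%, matching the abstract (the specificities below are invented placeholders, since the abstract does not report them individually):

```python
def parallel_or_rule(sens1, spec1, sens2, spec2):
    # "OR rule": call the combination positive if either test is positive;
    # negative only when both are negative. Assumes the two tests err
    # independently given disease status.
    sens = 1.0 - (1.0 - sens1) * (1.0 - sens2)
    spec = spec1 * spec2
    return sens, spec

# Sensitivity 83.3% for each modality, as reported above; the
# specificities are invented placeholders.
sens, spec = parallel_or_rule(0.833, 0.60, 0.833, 0.65)
```

The combination trades specificity for sensitivity: a high negative predictive value is exactly what a rule-out screen for dilating reflux needs.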
Prugger, Christof; Wellmann, Jürgen; Heidrich, Jan; De Bacquer, Dirk; De Smedt, Delphine; De Backer, Guy; Reiner, Željko; Empana, Jean-Philippe; Fras, Zlatko; Gaita, Dan; Jennings, Catriona; Kotseva, Kornelia; Wood, David; Keil, Ulrich
2017-01-01
Regular exercise lowers the risk of cardiovascular death in coronary heart disease (CHD) patients. We aimed to investigate regular exercise behaviour and intention in relation to symptoms of anxiety and depression in CHD patients across Europe. This study was based on a multicentre cross-sectional survey. In the EUROpean Action on Secondary and Primary Prevention through Intervention to Reduce Events (EUROASPIRE) III survey, 8966 CHD patients <80 years of age from 22 European countries were interviewed on average 15 months after hospitalisation. Whether patients exercised or intended to exercise regularly was assessed using the Stages of Change questionnaire in 8330 patients. Symptoms of anxiety and depression were evaluated using the Hospital Anxiety and Depression Scale. Total physical activity was measured by the International Physical Activity Questionnaire in patients from a subset of 14 countries. Overall, 50.3% of patients were not intending to exercise regularly, 15.9% were intending to exercise regularly, and 33.8% were exercising regularly. Patients with severe symptoms of depression less frequently exercised regularly than patients with symptoms in the normal range (20.2%, 95% confidence interval (CI) 14.8-26.8 vs 36.7%, 95% CI 29.8-44.2). Among patients not exercising regularly, patients with severe symptoms of depression were less likely to have an intention to exercise regularly (odds ratio 0.62, 95% CI 0.46-0.85). Symptoms of anxiety did not affect regular exercise intention. In sensitivity analysis, results were consistent when adjusting for total physical activity. Lower frequency of regular exercise and decreased likelihood of exercise intention were observed in CHD patients with severe depressive symptoms. Severe symptoms of depression may preclude CHD patients from performing regular exercise. © The European Society of Cardiology 2016.
Kress, W John; Erickson, David L; Swenson, Nathan G; Thompson, Jill; Uriarte, Maria; Zimmerman, Jess K
2010-11-09
Species number, functional traits, and phylogenetic history all contribute to characterizing the biological diversity in plant communities. The phylogenetic component of diversity has been particularly difficult to quantify in species-rich tropical tree assemblages. The compilation of previously published (and often incomplete) data on evolutionary relationships of species into a composite phylogeny of the taxa in a forest, through such programs as Phylomatic, has proven useful in building community phylogenies although often of limited resolution. Recently, DNA barcodes have been used to construct a robust community phylogeny for nearly 300 tree species in a forest dynamics plot in Panama using a supermatrix method. In that study sequence data from three barcode loci were used to generate a well-resolved species-level phylogeny. Here we expand upon this earlier investigation and present results on the use of a phylogenetic constraint tree to generate a community phylogeny for a diverse, tropical forest dynamics plot in Puerto Rico. This enhanced method of phylogenetic reconstruction insures the congruence of the barcode phylogeny with broadly accepted hypotheses on the phylogeny of flowering plants (i.e., APG III) regardless of the number and taxonomic breadth of the taxa sampled. We also compare maximum parsimony versus maximum likelihood estimates of community phylogenetic relationships as well as evaluate the effectiveness of one- versus two- versus three-gene barcodes in resolving community evolutionary history. As first demonstrated in the Panamanian forest dynamics plot, the results for the Puerto Rican plot illustrate that highly resolved phylogenies derived from DNA barcode sequence data combined with a constraint tree based on APG III are particularly useful in comparative analysis of phylogenetic diversity and will enhance research on the interface between community ecology and evolution.
A Likelihood-Based Framework for Association Analysis of Allele-Specific Copy Numbers.
Hu, Y J; Lin, D Y; Sun, W; Zeng, D
2014-10-01
Copy number variants (CNVs) and single nucleotide polymorphisms (SNPs) co-exist throughout the human genome and jointly contribute to phenotypic variations. Thus, it is desirable to consider both types of variants, as characterized by allele-specific copy numbers (ASCNs), in association studies of complex human diseases. Current SNP genotyping technologies capture the CNV and SNP information simultaneously via fluorescent intensity measurements. The common practice of calling ASCNs from the intensity measurements and then using the ASCN calls in downstream association analysis has important limitations. First, the association tests are prone to false-positive findings when differential measurement errors between cases and controls arise from differences in DNA quality or handling. Second, the uncertainties in the ASCN calls are ignored. We present a general framework for the integrated analysis of CNVs and SNPs, including the analysis of total copy numbers as a special case. Our approach combines the ASCN calling and the association analysis into a single step while allowing for differential measurement errors. We construct likelihood functions that properly account for case-control sampling and measurement errors. We establish the asymptotic properties of the maximum likelihood estimators and develop EM algorithms to implement the corresponding inference procedures. The advantages of the proposed methods over the existing ones are demonstrated through realistic simulation studies and an application to a genome-wide association study of schizophrenia. Extensions to next-generation sequencing data are discussed.
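The flavor of the EM algorithms used here can be sketched with a deliberately small problem (invented intensity data, a two-component Gaussian mixture with known spread, standing in for copy-number clusters; the paper's model additionally handles case-control sampling and differential measurement errors):

```python
import math
import random

random.seed(3)

# Invented two-cluster intensity data standing in for two copy-number
# classes; component standard deviation fixed at 0.5 for simplicity.
data = ([random.gauss(1.0, 0.5) for _ in range(300)] +
        [random.gauss(3.0, 0.5) for _ in range(300)])

mu = [0.5, 2.0]       # initial component means
pi = [0.5, 0.5]       # initial mixing weights
for _ in range(50):
    # E-step: responsibility of component 0 for each observation
    # (Gaussian kernels with variance 0.25, so the exponent divisor is 0.5).
    resp = [
        (pi[0] * math.exp(-(x - mu[0]) ** 2 / 0.5))
        / (pi[0] * math.exp(-(x - mu[0]) ** 2 / 0.5)
           + pi[1] * math.exp(-(x - mu[1]) ** 2 / 0.5))
        for x in data
    ]
    # M-step: update mixing weights and component means.
    n0 = sum(resp)
    pi = [n0 / len(data), 1.0 - n0 / len(data)]
    mu[0] = sum(r * x for r, x in zip(resp, data)) / n0
    mu[1] = sum((1.0 - r) * x for r, x in zip(resp, data)) / (len(data) - n0)
```

Keeping the soft responsibilities rather than hard cluster calls is the point the paper makes at scale: downstream inference can then propagate the calling uncertainty instead of discarding it.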
Boden, Lauren M; Boden, Stephanie A; Premkumar, Ajay; Gottschalk, Michael B; Boden, Scott D
2018-02-09
Retrospective analysis of prospectively collected data. To create a data-driven triage system stratifying patients by likelihood of undergoing spinal surgery within one year of presentation. Low back pain (LBP) and radicular lower extremity (LE) symptoms are common musculoskeletal problems. There is currently no standard data-derived triage process based on information that can be obtained prior to the initial physician-patient encounter to direct patients to the optimal physician type. We analyzed patient-reported data from 8006 patients with a chief complaint of LBP and/or LE radicular symptoms who presented to surgeons at a large multidisciplinary spine center between September 1, 2005 and June 30, 2016. Univariate and multivariate analysis identified independent risk factors for undergoing spinal surgery within one year of initial visit. A model incorporating these risk factors was created using a random sample of 80% of the total patients in our cohort, and validated on the remaining 20%. The baseline one-year surgery rate within our cohort was 39% for all patients and 42% for patients with LE symptoms. Those identified as high likelihood by the center's existing triage process had a surgery rate of 45%. The new triage scoring system proposed in this study was able to identify a high likelihood group in which 58% underwent surgery, which is a 46% higher surgery rate than in non-triaged patients and a 29% improvement from our institution's existing triage system. The data-driven triage model and scoring system derived and validated in this study (Spine Surgery Likelihood model [SSL-11]) significantly improved existing processes in predicting the likelihood of undergoing spinal surgery within one year of initial presentation. This triage system will allow centers to more selectively screen for surgical candidates and more effectively direct patients to surgeons or non-operative spine specialists. Level of Evidence: 4.
Modeling gene expression measurement error: a quasi-likelihood approach
Strimmer, Korbinian
2003-01-01
Background Using suitable error models for gene expression measurements is essential in the statistical analysis of microarray data. However, the true probabilistic model underlying gene expression intensity readings is generally not known. Instead, in currently used approaches some simple parametric model is assumed (usually a transformed normal distribution) or the empirical distribution is estimated. However, both these strategies may not be optimal for gene expression data, as the non-parametric approach ignores known structural information whereas the fully parametric models run the risk of misspecification. A further related problem is the choice of a suitable scale for the model (e.g. observed vs. log-scale). Results Here a simple semi-parametric model for gene expression measurement error is presented. In this approach inference is based on an approximate likelihood function (the extended quasi-likelihood). Only partial knowledge about the unknown true distribution is required to construct this function. In the case of gene expression, this information is available in the form of the postulated (e.g. quadratic) variance structure of the data. As the quasi-likelihood behaves (almost) like a proper likelihood, it allows for the estimation of calibration and variance parameters, and it is also straightforward to obtain corresponding approximate confidence intervals. Unlike most other frameworks, it also allows analysis on any preferred scale, i.e. both on the original linear scale as well as on a transformed scale. It can also be employed in regression approaches to model systematic (e.g. array or dye) effects. Conclusions The quasi-likelihood framework provides a simple and versatile approach to analyze gene expression data that does not make any strong distributional assumptions about the underlying error model. For several simulated as well as real data sets it provides a better fit to the data than competing models. 
In an example it also improved the power of tests to identify differential expression. PMID:12659637
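The quasi-likelihood construction described above can be sketched numerically: only the variance function V(mu) is assumed, and Q(mu; y) = integral from y to mu of (y - t)/V(t) dt then behaves like a log-likelihood kernel. A minimal illustration (function names are ours, not from the paper's software):

```python
# Numerically evaluate the quasi-(log)likelihood Q(mu; y) for a given
# variance function V, via the trapezoid rule on (y - t)/V(t).
def quasi_loglik(y, mu, V, steps=10000):
    a, b = y, mu
    h = (b - a) / steps
    total = 0.0
    for i in range(steps + 1):
        t = a + i * h
        w = 0.5 if i in (0, steps) else 1.0  # trapezoid end weights
        total += w * (y - t) / V(t)
    return total * h

# With constant variance V(t) = 1 the quasi-likelihood reduces to the
# Gaussian log-likelihood kernel -(y - mu)^2 / 2.
q = quasi_loglik(2.0, 3.5, lambda t: 1.0)
print(round(q, 6))  # -1.125
```

For the quadratic variance structure mentioned in the abstract, V(t) = t^2, the same integral has the closed form -y/mu - ln(mu) + 1 + ln(y).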
Comparative serum albumin interactions and antitumor effects of Au(III) and Ga(III) ions.
Sarioglu, Omer Faruk; Ozdemir, Ayse; Karaboduk, Kuddusi; Tekinay, Turgay
2015-01-01
In the present study, interactions of Au(III) and Ga(III) ions with human serum albumin (HSA) were studied comparatively via spectroscopic and thermal analysis methods: UV-vis absorbance spectroscopy, fluorescence spectroscopy, Fourier transform infrared (FT-IR) spectroscopy and isothermal titration calorimetry (ITC). The potential antitumor effects of these ions were studied on MCF-7 cells via the Alamar blue assay. It was found that both Au(III) and Ga(III) ions can interact with HSA; however, Au(III) ions interact with HSA more favorably and with a higher affinity. FT-IR second-derivative analysis demonstrated that high concentrations of both metal ions led to a considerable decrease in the α-helix content of HSA; while Au(III) led to a decrease of around 5% in the α-helix content at 200μM, it was around 1% for Ga(III) at the same concentration. Calorimetric analysis gave the binding kinetics of metal-HSA interactions; while the binding affinity (Ka) of Au(III)-HSA binding was around 3.87×10(5)M(-1), it was around 9.68×10(3)M(-1) for Ga(III)-HSA binding. The spectroscopy studies overall suggest that both metal ions have significant effects on the chemical structure of HSA, including secondary structure alterations. Antitumor activity studies on the MCF-7 tumor cell line with both metal ions revealed that Au(III) ions have higher antiproliferative activity than Ga(III) ions. Copyright © 2014 Elsevier GmbH. All rights reserved.
Population Synthesis of Radio and Gamma-ray Pulsars using the Maximum Likelihood Approach
NASA Astrophysics Data System (ADS)
Billman, Caleb; Gonthier, P. L.; Harding, A. K.
2012-01-01
We present the results of a pulsar population synthesis of normal pulsars from the Galactic disk using a maximum likelihood method. We seek to maximize the likelihood of a set of parameters in a Monte Carlo population statistics code to better understand their uncertainties and the confidence region of the model's parameter space. The maximum likelihood method allows for the use of more applicable Poisson statistics in the comparison of distributions of small numbers of detected gamma-ray and radio pulsars. Our code simulates pulsars at birth using Monte Carlo techniques and evolves them to the present assuming initial spatial, kick velocity, magnetic field, and period distributions. Pulsars are spun down to the present and given radio and gamma-ray emission characteristics. We select measured distributions of radio pulsars from the Parkes Multibeam survey and Fermi gamma-ray pulsars to perform a likelihood analysis of the assumed model parameters such as initial period and magnetic field, and radio luminosity. We present the results of a grid search of the parameter space as well as a search for the maximum likelihood using a Markov Chain Monte Carlo method. We express our gratitude for the generous support of the Michigan Space Grant Consortium, of the National Science Foundation (REU and RUI), the NASA Astrophysics Theory and Fundamental Program and the NASA Fermi Guest Investigator Program.
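The Poisson likelihood comparison described above can be sketched as follows: for binned counts of detected pulsars, ln L = sum over bins of (n_i ln(lambda_i) - lambda_i), with constant terms dropped, is maximized over model parameters. A toy illustration with made-up counts and a single scale parameter (the real code searches a multidimensional parameter space):

```python
import math

observed = [3, 1, 0, 2]        # detected counts per bin (toy data)
shape = [1.5, 0.8, 0.3, 1.0]   # model-predicted relative counts per bin

def poisson_loglik(scale):
    # ln L = sum(n * ln(lam) - lam), dropping the n! term
    return sum(n * math.log(scale * s) - scale * s
               for n, s in zip(observed, shape))

# crude grid search for the maximum-likelihood scale factor
grid = [0.1 * k for k in range(1, 60)]
best = max(grid, key=poisson_loglik)
print(round(best, 1))  # 1.7, near the analytic ML value sum(observed)/sum(shape) ≈ 1.67
```

In practice one would refine the grid search with an MCMC walk, as the abstract describes, rather than rely on a coarse grid.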
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wackers, F.J.; Russo, D.J.; Russo, D.
The prognostic significance of normal quantitative planar thallium-201 stress scintigraphy was evaluated in patients with a chest pain syndrome. The prevalence of cardiac events during follow-up was related to the pretest (that is, before stress scintigraphy) likelihood of coronary artery disease determined on the basis of symptoms, age, sex and stress electrocardiography. In a consecutive series of 344 patients who had adequate thallium-201 stress scintigrams, 95 had unequivocally normal studies by quantitative analysis. The pretest likelihood of coronary artery disease in the 95 patients had a bimodal distribution. During a mean follow-up period of 22 +/- 3 months, no patient died. Three patients (3%) had a cardiac event: two of these patients (pretest likelihood of coronary artery disease 54 and 94%) had a nonfatal myocardial infarction 8 and 22 months, respectively, after stress scintigraphy, and one patient (pretest likelihood 98%) underwent percutaneous transluminal coronary angioplasty 16 months after stress scintigraphy for persisting anginal complaints. Three patients were lost to follow-up; all three had a low pretest likelihood of coronary artery disease. It is concluded that patients with chest pain and normal findings on quantitative thallium-201 scintigraphy have an excellent prognosis. Cardiac events are rare (infarction rate 1% per year) and occur in patients with a moderate to high pretest likelihood of coronary artery disease.
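Pretest likelihoods like those above combine with a test's likelihood ratio in the usual odds form (posttest odds = pretest odds × LR). A minimal sketch with illustrative numbers, not values estimated in this study:

```python
# Convert a pretest probability plus a likelihood ratio into a
# posttest probability via the odds form of Bayes' theorem.
def posttest_probability(pretest, lr):
    odds = pretest / (1.0 - pretest)   # pretest odds
    post_odds = odds * lr              # posttest odds
    return post_odds / (1.0 + post_odds)

# e.g. a 54% pretest likelihood combined with a negative test of LR- = 0.2
p = posttest_probability(0.54, 0.2)
print(round(p, 3))  # 0.19
```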
el Galta, Rachid; Uitte de Willige, Shirley; de Visser, Marieke C H; Helmer, Quinta; Hsu, Li; Houwing-Duistermaat, Jeanine J
2007-09-24
In this paper, we propose a one-degree-of-freedom test for association between a candidate gene and a binary trait. This method is a generalization of Terwilliger's likelihood ratio statistic and is especially powerful for the situation of one associated haplotype. As an alternative to the likelihood ratio statistic, we derive a score statistic, which has a tractable expression. For haplotype analysis, we assume that phase is known. By means of a simulation study, we compare the performance of the score statistic to Pearson's chi-square statistic and the likelihood ratio statistic proposed by Terwilliger. We illustrate the method on three candidate genes studied in the Leiden Thrombophilia Study. We conclude that the statistic follows a chi-square distribution under the null hypothesis and that the score statistic is more powerful than Terwilliger's likelihood ratio statistic when the associated haplotype has a frequency between 0.1 and 0.4 and a small impact on the studied disorder. With regard to Pearson's chi-square statistic, the score statistic has more power when the associated haplotype has a frequency above 0.2 and the number of variants is above five.
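For reference, the Pearson chi-square statistic that the score statistic is compared against can be computed directly from a 2 × k table of haplotype counts in cases versus controls (a self-contained sketch; the counts are made up):

```python
# Pearson chi-square statistic for an r x c contingency table:
# sum over cells of (observed - expected)^2 / expected.
def pearson_chi2(table):
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    total = sum(rows)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = rows[i] * cols[j] / total
            chi2 += (obs - exp) ** 2 / exp
    return chi2

cases    = [30, 10, 10]  # haplotype counts in cases (toy data)
controls = [20, 20, 10]  # haplotype counts in controls
print(round(pearson_chi2([cases, controls]), 3))  # 5.333
```

This statistic has k - 1 degrees of freedom for k haplotypes, in contrast to the one-degree-of-freedom test the paper proposes.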
Parent-child communication and marijuana initiation: evidence using discrete-time survival analysis.
Nonnemaker, James M; Silber-Ashley, Olivia; Farrelly, Matthew C; Dench, Daniel
2012-12-01
This study supplements existing literature on the relationship between parent-child communication and adolescent drug use by exploring whether parental and/or adolescent recall of specific drug-related conversations differentially impact youth's likelihood of initiating marijuana use. Using discrete-time survival analysis, we estimated the hazard of marijuana initiation using a logit model to obtain an estimate of the relative risk of initiation. Our results suggest that parent-child communication about drug use is either not protective (no effect) or - in the case of youth reports of communication - potentially harmful (leading to increased likelihood of marijuana initiation). Copyright © 2012 Elsevier Ltd. All rights reserved.
Ro, Kyung-Han; Heo, Jae-Won; Lee, Dae-Hee
2018-05-01
Implant survivorship is reported to be lower and complications, particularly bearing dislocation, are reported to be more frequent in Asian than in Western patients with medial knee osteoarthritis (OA) undergoing Oxford® Phase III unicompartmental knee arthroplasty (UKA). To date, however, these complications have not been compared between these groups of patients. The purpose of this study was to perform a meta-analysis comparing the standardized incidence rates of (1) all-cause reoperation; (2) reoperation related to bearing dislocation; and (3) reoperation related to progression of lateral compartment arthritis in Asian and Western patients with medial knee OA who underwent Oxford Phase III UKA. We searched MEDLINE® (January 1, 1976, to May 31, 2017), EMBASE® (January 1, 1985, to May 31, 2017), and the Cochrane Library (January 1, 1987, to May 31, 2017) for studies that reported complications of Oxford Phase III UKAs. Studies were included if they reported reoperation rates attributable to bearing dislocation and/or progression of lateral knee OA after surgery with this implant. Twenty-seven studies were included in this systematic review and 16 studies with follow-ups > 5 years were included in the meta-analysis. These rates were converted to standardized incidence rates (that is, reoperations per 100 observed component years) based on mean follow-up and number of involved knees in each study. After applying prespecified inclusion and exclusion criteria, the studies were categorized into two groups, Asian and Western, based on hospital location. Twenty-five studies, containing 3152 Asian patients and 5455 Western patients, were evaluated. Study quality was assessed by the modified Coleman Methodology score (MCMS). Although all studies were Level IV, their mean MCMS score was 66.92 (SD, 8.7; 95% confidence interval [CI], 63.5-70.3), indicating fair quality. 
Because the heterogeneity of all subgroup meta-analyses was high, a random-effects model was used, with estimation by the restricted maximum likelihood method. There was no difference in the proportion of Asian versus Western patients undergoing reoperation for any cause per 100 observed component years (1.022 in 3152 Asian patients; 95% CI, 0.810-1.235 versus 1.300 in 5455 Western patients; 95% CI, 1.067-1.534; odds ratio, 0.7839; 95% CI, 0.5323-1.1545; p = 0.178). The mean reoperation rate attributable to bearing dislocation per 100 observed years was higher in Asian than in Western patients (0.525; 95% CI, 0.407-0.643 versus 0.141; 95% CI, 0.116-0.166; odds ratio, 3.7378; 95% CI, 1.694-8.248; p = 0.001). Conversely, the mean reoperation rate attributable to lateral knee OA per 100 observed years was lower in Asian than in Western patients (0.093; 95% CI, 0.070-0.115 versus 0.298; 95% CI, 0.217-0.379; odds ratio, 0.3114; 95% CI, 0.0986-0.9840; p < 0.001). Although total reoperation rates did not differ between the two populations, reoperation for bearing dislocation was more likely in Asian than in Western patients, whereas reoperation for lateral knee OA progression was more likely in Western than in Asian patients after Oxford Phase III UKA. Although possible explanations for these findings may be hypothesized, additional randomized, prospective comparative studies are needed. However, better survival outcomes after UKA may require consideration of ethnicity and lifestyle choices in addition to traditional surgical technique and perioperative care. Level III, therapeutic study.
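Random-effects pooling of the kind used above can be sketched compactly. REML (the estimator used in the study) requires iterative optimisation, so this illustration substitutes the closed-form DerSimonian-Laird estimator of the between-study variance tau^2; the effect sizes and variances are toy values, not study data:

```python
# DerSimonian-Laird random-effects meta-analysis: estimate the
# between-study variance tau^2 from Cochran's Q, then pool with
# inverse-variance weights 1 / (v_i + tau^2).
def dersimonian_laird(effects, variances):
    w = [1.0 / v for v in variances]
    sw = sum(w)
    pooled_fixed = sum(wi * e for wi, e in zip(w, effects)) / sw
    q = sum(wi * (e - pooled_fixed) ** 2 for wi, e in zip(w, effects))
    k = len(effects)
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)       # truncated at zero
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled_re = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    return tau2, pooled_re

tau2, pooled = dersimonian_laird([0.5, 0.1, 0.9], [0.04, 0.05, 0.06])
print(tau2, pooled)
```

When heterogeneity is high, as in these subgroup analyses, tau^2 dominates the weights and the pooled estimate moves toward an unweighted mean of the study effects.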
Tanasescu, Radu; Cottam, William J; Condon, Laura; Tench, Christopher R; Auer, Dorothee P
2016-09-01
Maladaptive mechanisms of pain processing in chronic pain conditions (CP) are poorly understood. We used coordinate-based meta-analysis of 266 fMRI pain studies to study functional brain reorganisation in CP and experimental models of hyperalgesia. The pattern of nociceptive brain activation was similar in CP, hyperalgesia and normalgesia in controls. However, elevated likelihood of activation was detected in the left putamen, left frontal gyrus and right insula in CP when comparing stimulation of the most painful site vs. other sites. Meta-analysis of contrast maps showed no difference between CP, controls, and mood conditions. In contrast, experimental hyperalgesia induced stronger activation in the bilateral insula, left cingulate and right frontal gyrus. Activation likelihood maps support a shared neural pain signature of cutaneous nociception in CP and controls. We also present a double dissociation between neural correlates of transient and persistent pain sensitisation, with generally increased activation intensity but an unchanged pattern in experimental hyperalgesia and, by contrast, focally increased activation likelihood, but unchanged intensity, in CP when stimulated at the most painful body part. Copyright © 2016. Published by Elsevier Ltd.
Meyer, Karin; Kirkpatrick, Mark
2005-01-01
Principal component analysis is a widely used 'dimension reduction' technique, albeit generally at a phenotypic level. It is shown that we can estimate genetic principal components directly through a simple reparameterisation of the usual linear, mixed model. This is applicable to any analysis fitting multiple, correlated genetic effects, whether effects for individual traits or sets of random regression coefficients to model trajectories. Depending on the magnitude of genetic correlation, a subset of the principal components generally suffices to capture the bulk of genetic variation. Corresponding estimates of genetic covariance matrices are more parsimonious, have reduced rank and are smoothed, with the number of parameters required to model the dispersion structure reduced from k(k + 1)/2 to m(2k - m + 1)/2 for k effects and m principal components. Estimation of these parameters, the largest eigenvalues and pertaining eigenvectors of the genetic covariance matrix, via restricted maximum likelihood using derivatives of the likelihood, is described. It is shown that reduced rank estimation can reduce computational requirements of multivariate analyses substantially. An application to the analysis of eight traits recorded via live ultrasound scanning of beef cattle is given. PMID:15588566
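The parameter-count reduction quoted above is easy to verify numerically (a small sketch; k = 8 matches the paper's eight-trait application, while the choice m = 3 is ours for illustration):

```python
# Number of free parameters in a full k x k covariance matrix
# versus a rank-m principal-component parameterisation.
def full_params(k):
    return k * (k + 1) // 2

def reduced_params(k, m):
    return m * (2 * k - m + 1) // 2

print(full_params(8))        # 36
print(reduced_params(8, 3))  # 21
print(reduced_params(8, 8))  # 36, full rank recovers the full count
```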
Neandertal admixture in Eurasia confirmed by maximum-likelihood analysis of three genomes.
Lohse, Konrad; Frantz, Laurent A F
2014-04-01
Although there has been much interest in estimating histories of divergence and admixture from genomic data, it has proved difficult to distinguish recent admixture from long-term structure in the ancestral population. Thus, recent genome-wide analyses based on summary statistics have sparked controversy about the possibility of interbreeding between Neandertals and modern humans in Eurasia. Here we derive the probability of full mutational configurations in nonrecombining sequence blocks under both admixture and ancestral structure scenarios. Dividing the genome into short blocks gives an efficient way to compute maximum-likelihood estimates of parameters. We apply this likelihood scheme to triplets of human and Neandertal genomes and compare the relative support for a model of admixture from Neandertals into Eurasian populations after their expansion out of Africa against a history of persistent structure in their common ancestral population in Africa. Our analysis allows us to conclusively reject a model of ancestral structure in Africa and instead reveals strong support for Neandertal admixture in Eurasia at a higher rate (3.4-7.3%) than suggested previously. Using analysis and simulations we show that our inference is more powerful than previous summary statistics and robust to realistic levels of recombination.
Mendoza, Maria C.B.; Burns, Trudy L.; Jones, Michael P.
2009-01-01
Objectives Case-deletion diagnostic methods are tools that allow identification of influential observations that may affect parameter estimates and model fitting conclusions. The goal of this paper was to develop two case-deletion diagnostics, the exact case deletion (ECD) and the empirical influence function (EIF), for detecting outliers that can affect results of sib-pair maximum likelihood quantitative trait locus (QTL) linkage analysis. Methods Subroutines to compute the ECD and EIF were incorporated into the maximum likelihood QTL variance estimation components of the linkage analysis program MAPMAKER/SIBS. Performance of the diagnostics was compared in simulation studies that evaluated the proportion of outliers correctly identified (sensitivity), and the proportion of non-outliers correctly identified (specificity). Results Simulations involving nuclear family data sets with one outlier showed EIF sensitivities approximated ECD sensitivities well for outlier-affected parameters. Sensitivities were high, indicating the outlier was identified a high proportion of the time. Simulations also showed the enormous computational time advantage of the EIF. Diagnostics applied to body mass index in nuclear families detected observations influential on the lod score and model parameter estimates. Conclusions The EIF is a practical diagnostic tool that has the advantages of high sensitivity and quick computation. PMID:19172086
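The exact-case-deletion idea above (refit with each observation removed and measure the change in the estimate) can be sketched on a trivial estimator; here the sample mean stands in for the lod score and variance-component estimates of the QTL setting, and the data are made up:

```python
# Leave-one-out influence: how much does the estimate (here, a mean)
# change when each observation is deleted in turn?
def loo_influence(data):
    full = sum(data) / len(data)
    out = []
    for i in range(len(data)):
        rest = data[:i] + data[i + 1:]
        out.append(sum(rest) / len(rest) - full)
    return out

infl = loo_influence([1.0, 1.2, 0.9, 1.1, 5.0])
# the outlier (5.0) has by far the largest deletion effect
print(max(range(len(infl)), key=lambda i: abs(infl[i])))  # 4
```

The paper's empirical influence function (EIF) approximates this exact recomputation from a single fit, which is why it is so much faster than the ECD.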
Xu, Maoqi; Chen, Liang
2018-01-01
The individual sample heterogeneity is one of the biggest obstacles in biomarker identification for complex diseases such as cancers. Current statistical models to identify differentially expressed genes between disease and control groups often overlook the substantial human sample heterogeneity. Meanwhile, traditional nonparametric tests lose detailed data information and sacrifice analysis power, although they are distribution free and robust to heterogeneity. Here, we propose an empirical likelihood ratio test with a mean-variance relationship constraint (ELTSeq) for the differential expression analysis of RNA sequencing (RNA-seq). As a distribution-free nonparametric model, ELTSeq handles individual heterogeneity by estimating an empirical probability for each observation without making any assumption about read-count distribution. It also incorporates a constraint for the read-count overdispersion, which is widely observed in RNA-seq data. ELTSeq demonstrates a significant improvement over existing methods such as edgeR, DESeq, t-tests, Wilcoxon tests and the classic empirical likelihood-ratio test when handling heterogeneous groups. It will significantly advance transcriptomic studies of cancers and other complex diseases. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Mapping Quantitative Traits in Unselected Families: Algorithms and Examples
Dupuis, Josée; Shi, Jianxin; Manning, Alisa K.; Benjamin, Emelia J.; Meigs, James B.; Cupples, L. Adrienne; Siegmund, David
2009-01-01
Linkage analysis has been widely used to identify from family data genetic variants influencing quantitative traits. Common approaches have both strengths and limitations. Likelihood ratio tests typically computed in variance component analysis can accommodate large families but are highly sensitive to departure from normality assumptions. Regression-based approaches are more robust but their use has primarily been restricted to nuclear families. In this paper, we develop methods for mapping quantitative traits in moderately large pedigrees. Our methods are based on the score statistic which in contrast to the likelihood ratio statistic, can use nonparametric estimators of variability to achieve robustness of the false positive rate against departures from the hypothesized phenotypic model. Because the score statistic is easier to calculate than the likelihood ratio statistic, our basic mapping methods utilize relatively simple computer code that performs statistical analysis on output from any program that computes estimates of identity-by-descent. This simplicity also permits development and evaluation of methods to deal with multivariate and ordinal phenotypes, and with gene-gene and gene-environment interaction. We demonstrate our methods on simulated data and on fasting insulin, a quantitative trait measured in the Framingham Heart Study. PMID:19278016
Elghafghuf, Adel; Dufour, Simon; Reyher, Kristen; Dohoo, Ian; Stryhn, Henrik
2014-12-01
Mastitis is a complex disease affecting dairy cows and is considered to be the most costly disease of dairy herds. The hazard of mastitis is a function of many factors, both managerial and environmental, making its control a difficult issue to milk producers. Observational studies of clinical mastitis (CM) often generate datasets with a number of characteristics which influence the analysis of those data: the outcome of interest may be the time to occurrence of a case of mastitis, predictors may change over time (time-dependent predictors), the effects of factors may change over time (time-dependent effects), there are usually multiple hierarchical levels, and datasets may be very large. Analysis of such data often requires expansion of the data into the counting-process format - leading to larger datasets - thus complicating the analysis and requiring excessive computing time. In this study, a nested frailty Cox model with time-dependent predictors and effects was applied to Canadian Bovine Mastitis Research Network data in which 10,831 lactations of 8035 cows from 69 herds were followed through lactation until the first occurrence of CM. The model was fit to the data as a Poisson model with nested normally distributed random effects at the cow and herd levels. Risk factors associated with the hazard of CM during the lactation were identified, such as parity, calving season, herd somatic cell score, pasture access, fore-stripping, and proportion of treated cases of CM in a herd. The analysis showed that most of the predictors had a strong effect early in lactation and also demonstrated substantial variation in the baseline hazard among cows and between herds. A small simulation study for a setting similar to the real data was conducted to evaluate the Poisson maximum likelihood estimation approach with both Gaussian quadrature method and Laplace approximation. 
Further, the performance of the two methods was compared with that of a widely used estimation approach for frailty Cox models based on the penalized partial likelihood. The simulation study showed good performance for the Poisson maximum likelihood approach with Gaussian quadrature, but biased variance component estimates for both the Poisson maximum likelihood approach with Laplace approximation and the penalized partial likelihood approach. Copyright © 2014. Published by Elsevier B.V.
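The Gaussian-quadrature step compared above can be sketched as follows: a Poisson likelihood with a normal random intercept u ~ N(0, sigma^2) is integrated out using Gauss-Hermite nodes and weights. The 5-point nodes and weights below are the standard physicists'-convention values; the model and parameter values are illustrative, not the study's:

```python
import math

# standard 5-point Gauss-Hermite nodes and weights (weight e^{-x^2})
NODES = [-2.020182870456086, -0.9585724646138185, 0.0,
         0.9585724646138185, 2.020182870456086]
WEIGHTS = [0.019953242059045913, 0.3936193231522412, 0.9453087204829419,
           0.3936193231522412, 0.019953242059045913]

def poisson_pmf(y, lam):
    return math.exp(-lam) * lam ** y / math.factorial(y)

def marginal_lik(y, beta0, sigma):
    """P(y) = integral of Poisson(y; exp(beta0 + u)) * N(u; 0, sigma^2) du,
    approximated by Gauss-Hermite quadrature with substitution u = sqrt(2)*sigma*x."""
    total = 0.0
    for x, w in zip(NODES, WEIGHTS):
        u = math.sqrt(2.0) * sigma * x
        total += w * poisson_pmf(y, math.exp(beta0 + u))
    return total / math.sqrt(math.pi)

# with sigma ~ 0 the marginal collapses to the ordinary Poisson likelihood
print(abs(marginal_lik(2, 0.5, 1e-8) - poisson_pmf(2, math.exp(0.5))) < 1e-6)  # True
```

The Laplace approximation replaces this sum with a single evaluation at the mode, which is what produces the variance-component bias reported in the simulation.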
Yoshimura, Etsuro; Kohdr, Hicham; Mori, Satoshi; Hider, Robert C
2011-08-01
The phytosiderophores, mugineic acid (MA) and epi-hydroxymugineic acid (HMA), together with a related compound, nicotianamine (NA), were investigated for their ability to bind Al(III). Potentiometric titration analysis demonstrated that MA and HMA bind Al(III), in contrast to NA which does not under normal physiological conditions. With MA and HMA, in addition to the Al complex (AlL), the protonated (AlLH) and deprotonated (AlLH(-1)) complexes were identified from an analysis of titration curves, where L denotes the phytosiderophore form in which all the carboxylate functions are ionized. The equilibrium formation constants of the Al(III) phytosiderophore complexes are much smaller than those of the corresponding Fe(III) complexes. The higher selectivity of phytosiderophores for Fe(III) over Al(III) facilitates Fe(III) acquisition in alkaline conditions where free Al(III) levels are higher than free Fe(III) levels.
ERIC Educational Resources Information Center
Lee, S. Y.; Jennrich, R. I.
1979-01-01
A variety of algorithms for analyzing covariance structures are considered. Additionally, two methods of estimation, maximum likelihood and weighted least squares, are considered. Comparisons are made between these algorithms and factor analysis. (Author/JKS)
Modeling the rejection probability in plant imports.
Surkov, I V; van der Werf, W; van Kooten, O; Lansink, A G J M Oude
2008-06-01
Phytosanitary inspection of imported plants and flowers is a major means for preventing pest invasions through international trade, but in a majority of countries availability of resources prevents inspection of all imports. Prediction of the likelihood of pest infestation in imported shipments could help maximize the efficiency of inspection by targeting inspection on shipments with the highest likelihood of infestation. This paper applies a multinomial logistic (MNL) regression model to data on import inspections of ornamental plant commodities in the Netherlands from 1998 to 2001 to investigate whether it is possible to predict the probability that a shipment will be (i) accepted for import, (ii) rejected for import because of detected pests, or (iii) rejected due to other reasons. Four models were estimated: (i) an all-species model, including all plant imports (136,251 shipments) in the data set; (ii) a four-species model, including records on the four ornamental commodities that accounted for 28.9% of inspected and 49.5% of rejected shipments; and two models for single commodities with large import volumes and high percentages of rejections: (iii) Dianthus (16.9% of inspected and 46.3% of rejected shipments) and (iv) Chrysanthemum (6.9% and 8.6%, respectively). All models were highly significant (P < 0.001). The models for Dianthus and Chrysanthemum and for the set of four ornamental commodities showed a better fit to the data than the model for all ornamental commodities. Variables that characterized the imported shipment's region of origin, the shipment's size, the importing company, and the season and year of import were significant in most of the estimated models. The combined results of this study suggest that the MNL model can be a useful tool for modeling the probability of rejecting imported commodities even with a small set of explanatory variables. The MNL model can be helpful in better targeting of resources for import inspection. 
The inspecting agencies could enable development of these models by appropriately recording inspection results.
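The multinomial logit (MNL) model described above assigns each shipment a linear score per outcome (accepted, rejected for pests, rejected for other reasons) and converts the scores to probabilities with a softmax. A minimal sketch with illustrative coefficients, not estimates from the paper:

```python
import math

def mnl_probs(scores):
    # softmax over the per-outcome linear predictors;
    # subtracting the max is a standard numerical-stability trick
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

# toy linear predictors for one shipment's three outcomes
p_accept, p_rej_pest, p_rej_other = mnl_probs([2.0, 0.5, -1.0])
print(round(p_accept, 3), round(p_rej_pest, 3), round(p_rej_other, 3))  # 0.786 0.175 0.039
```

In the paper's application, the linear predictors would be built from the shipment's origin region, size, importer, and season/year of import.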
López-Castro, Teresa; Hu, Mei-Chen; Papini, Santiago; Ruglass, Lesia M; Hien, Denise A
2015-05-01
Despite advances towards integration of care for women with co-occurring substance use disorder (SUD) and post-traumatic stress disorder (PTSD), low abstinence rates following SUD/PTSD treatment remain the norm. The utility of investigating distinct substance use trajectories is a critical innovation in the detection and refining of effective interventions for this clinical population. The present study reanalysed data from the largest randomised clinical trial to date for co-occurring SUD and PTSD in women (National Drug Abuse Treatment Clinical Trials Network; Women and Trauma Study). Randomised participants (n = 353) received one of two interventions in addition to treatment as usual for SUD: (i) trauma-informed integrative treatment for PTSD/SUD; or (ii) an active control psychoeducation course on women's health. The present study utilised latent growth mixture models (LGMM) with multiple groups to estimate women's substance use patterns during the 12-month follow-up period. Findings provided support for three different trajectories of substance use in the post-treatment year: (i) consistently low likelihood and use frequency; (ii) consistently high likelihood and use frequency; and (iii) high likelihood and moderate use frequency. Covariate analyses revealed improvement in PTSD severity was associated with membership in a specific substance use trajectory, although receiving trauma-informed treatment was not. Additionally, SUD severity, age and after-care efforts were shown to be related to trajectory membership. Findings highlight the necessity of accounting for heterogeneity in post-treatment substance use, relevance of trauma-informed care in SUD recovery and benefits of incorporating methodologies like LGMM when evaluating SUD treatment outcomes. © 2015 Australasian Professional Society on Alcohol and other Drugs.
Preparation of microspheric Fe(III)-ion imprinted polymer for selective solid phase extraction
NASA Astrophysics Data System (ADS)
Ara, Behisht; Muhammad, Mian; Salman, Muhammad; Ahmad, Raees; Islam, Noor; Zia, Tanveer ul Haq
2018-03-01
In this research work, an Fe(III) ion imprinted polymer (Fe(III)-IIP) was prepared using methacrylic acid as monomer, divinylbenzene as cross-linker, and azobisisobutyronitrile as initiator. The ion imprinted polymer was functionalized with an Fe(III)-8-hydroxyquinoline complex under thermal conditions by copolymerization with the monomer and the cross-linker. The prepared Fe(III)-IIP and non-ion imprinted polymer (non-IIP) were characterized by Fourier transform infrared spectroscopy, scanning electron microscopy and thermogravimetric analysis. The polymer showed good thermal stability up to a temperature of 500 °C. The size of the polymer particles obtained was 1 µm, large enough to be filtered easily. At pH 2.5, greater affinity was observed for the ion imprinted polymer than for the non-ion imprinted polymer. In the kinetic study, the pseudo-second-order model gave the most linear fit. The maximum sorption capacity of Fe(III) ions on the Fe(III)-IIP and non-IIP was 170 and 30.0 µmol g-1, respectively. The relative selectivity factor (αr) values of Fe(III)/Fe(II), Fe(III)/Al(III) and Fe(III)/Cr(III) were 151.0, 84.6 and 91.9, respectively. The preconcentration factor was found to be 240. The developed method was successfully applied to the determination of trace Fe in drinking water.
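The pseudo-second-order kinetic model referred to above has the closed form q_t = qe^2 * k2 * t / (1 + qe * k2 * t), usually linearised as t/q_t = 1/(k2*qe^2) + t/qe. A small sketch; qe is taken from the reported sorption capacity, while the rate constant k2 is made up for illustration:

```python
# Pseudo-second-order uptake q_t as a function of contact time t,
# equilibrium capacity qe, and rate constant k2.
def pso_uptake(t, qe, k2):
    return qe * qe * k2 * t / (1.0 + qe * k2 * t)

qe, k2 = 170.0, 0.001  # capacity in umol/g (reported); k2 illustrative
q1 = pso_uptake(10.0, qe, k2)
q2 = pso_uptake(1e6, qe, k2)  # long contact times approach the capacity qe
print(round(q1, 1), round(q2, 1))  # 107.0 170.0
```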
The Determinants of Place of Death: An Evidence-Based Analysis
Costa, V
2014-01-01
Background According to a conceptual model described in this analysis, place of death is determined by an interplay of factors associated with the illness, the individual, and the environment. Objectives Our objective was to evaluate the determinants of place of death for adult patients who have been diagnosed with an advanced, life-limiting condition and are not expected to stabilize or improve. Data Sources A literature search was performed using Ovid MEDLINE, Ovid MEDLINE In-Process and Other Non-Indexed Citations, Ovid Embase, EBSCO Cumulative Index to Nursing & Allied Health Literature (CINAHL), and EBM Reviews, for studies published from January 1, 2004, to September 24, 2013. Review Methods Different places of death are considered in this analysis—home, nursing home, inpatient hospice, and inpatient palliative care unit, compared with hospital. We selected factors to evaluate from a list of possible predictors—i.e., determinants—of death. We extracted the adjusted odds ratios and 95% confidence intervals of each determinant, performed a meta-analysis if appropriate, and conducted a stratified analysis if substantial heterogeneity was observed. Results From a literature search yielding 5,899 citations, we included 2 systematic reviews and 29 observational studies. Factors that increased the likelihood of home death included multidisciplinary home palliative care, patient preference, having an informal caregiver, and the caregiver's ability to cope. Factors increasing the likelihood of a nursing home death included the availability of palliative care in the nursing home and the existence of advance directives. A cancer diagnosis and the involvement of home care services increased the likelihood of dying in an inpatient palliative care unit. A cancer diagnosis and a longer time between referral to palliative care and death increased the likelihood of inpatient hospice death. The quality of the evidence was considered low. 
Limitations Our results are based on those of retrospective observational studies. Conclusions The results obtained were consistent with previously published systematic reviews. The analysis identified several factors that are associated with place of death. PMID:26351550
Dai, Cong; Jiang, Min; Sun, Ming-Jun; Cao, Qin
2018-05-01
Fecal immunochemical test (FIT) is a promising marker for assessment of inflammatory bowel disease activity. However, the utility of FIT for predicting mucosal healing (MH) in ulcerative colitis (UC) patients has yet to be clearly demonstrated. The objective of our study was to perform a diagnostic test accuracy meta-analysis evaluating the accuracy of FIT in predicting MH in UC patients. We systematically searched databases from inception to November 2017 for studies that evaluated MH in UC. The methodological quality of each study was assessed according to the Quality Assessment of Diagnostic Accuracy Studies checklist. The extracted data were pooled using a summary receiver operating characteristic curve model. A random-effects model was used to summarize the diagnostic odds ratio, sensitivity, specificity, positive likelihood ratio, and negative likelihood ratio. Six studies comprising 625 UC patients were included in the meta-analysis. The pooled sensitivity and specificity for predicting MH in UC were 0.77 (95% confidence interval [CI], 0.72-0.81) and 0.81 (95% CI, 0.76-0.85), respectively. The FIT level had a high rule-in value (positive likelihood ratio, 3.79; 95% CI, 2.85-5.03) and a moderate rule-out value (negative likelihood ratio, 0.26; 95% CI, 0.16-0.43) for predicting MH in UC. The results of the receiver operating characteristic curve analysis (area under the curve, 0.88; standard error of the mean, 0.02) and the diagnostic odds ratio (18.08; 95% CI, 9.57-34.13) also indicated good discrimination for identifying MH in UC from the FIT concentration. Our meta-analysis found that FIT is a simple, reliable, non-invasive marker for predicting MH in UC patients. © 2018 Journal of Gastroenterology and Hepatology Foundation and John Wiley & Sons Australia, Ltd.
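The relationship between pooled sensitivity/specificity and the likelihood ratios reported above can be sketched as follows. This is an illustrative snippet, not the authors' analysis; the paper's pooled LR+, LR-, and DOR come from a random-effects model, so these naive point estimates computed from the pooled sensitivity and specificity differ slightly from the reported values.

```python
# Convert pooled sensitivity and specificity into likelihood ratios and a
# diagnostic odds ratio (naive point estimates, for illustration only).

def likelihood_ratios(sens, spec):
    """Return (LR+, LR-, DOR) from sensitivity and specificity."""
    lr_pos = sens / (1.0 - spec)   # how much a positive test raises the odds
    lr_neg = (1.0 - sens) / spec   # how much a negative test lowers the odds
    dor = lr_pos / lr_neg          # diagnostic odds ratio
    return lr_pos, lr_neg, dor

# Using the pooled estimates from the abstract (sens = 0.77, spec = 0.81):
lr_pos, lr_neg, dor = likelihood_ratios(0.77, 0.81)
print(round(lr_pos, 2), round(lr_neg, 2), round(dor, 1))  # → 4.05 0.28 14.3
```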
Lin, Feng-Chang; Zhu, Jun
2012-01-01
We develop continuous-time models for the analysis of environmental or ecological monitoring data such that subjects are observed at multiple monitoring time points across space. Of particular interest are additive hazards regression models where the baseline hazard function can take on flexible forms. We consider time-varying covariates and take into account spatial dependence via autoregression in space and time. We develop statistical inference for the regression coefficients via partial likelihood. Asymptotic properties, including consistency and asymptotic normality, are established for parameter estimates under suitable regularity conditions. Feasible algorithms utilizing existing statistical software packages are developed for computation. We also consider a simpler additive hazards model with homogeneous baseline hazard and develop hypothesis testing for homogeneity. A simulation study demonstrates that the statistical inference using partial likelihood has sound finite-sample properties and offers a viable alternative to maximum likelihood estimation. For illustration, we analyze data from an ecological study that monitors bark beetle colonization of red pines in a plantation of Wisconsin.
Planck intermediate results. XVI. Profile likelihoods for cosmological parameters
NASA Astrophysics Data System (ADS)
Planck Collaboration; Ade, P. A. R.; Aghanim, N.; Arnaud, M.; Ashdown, M.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartlett, J. G.; Battaner, E.; Benabed, K.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bobin, J.; Bonaldi, A.; Bond, J. R.; Bouchet, F. R.; Burigana, C.; Cardoso, J.-F.; Catalano, A.; Chamballu, A.; Chiang, H. C.; Christensen, P. R.; Clements, D. L.; Colombi, S.; Colombo, L. P. L.; Couchot, F.; Cuttaia, F.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Dickinson, C.; Diego, J. M.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Dupac, X.; Enßlin, T. A.; Eriksen, H. K.; Finelli, F.; Forni, O.; Frailis, M.; Franceschi, E.; Galeotta, S.; Galli, S.; Ganga, K.; Giard, M.; Giraud-Héraud, Y.; González-Nuevo, J.; Górski, K. M.; Gregorio, A.; Gruppuso, A.; Hansen, F. K.; Harrison, D. L.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Holmes, W. A.; Hornstrup, A.; Hovest, W.; Huffenberger, K. M.; Jaffe, A. H.; Jaffe, T. R.; Jones, W. C.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Kisner, T. S.; Kneissl, R.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lähteenmäki, A.; Lamarre, J.-M.; Lasenby, A.; Lawrence, C. R.; Leonardi, R.; Liddle, A.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; López-Caniego, M.; Lubin, P. M.; Macías-Pérez, J. F.; Maffei, B.; Maino, D.; Mandolesi, N.; Maris, M.; Martin, P. G.; Martínez-González, E.; Masi, S.; Massardi, M.; Matarrese, S.; Mazzotta, P.; Melchiorri, A.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mitra, S.; Miville-Deschênes, M.-A.; Moneti, A.; Montier, L.; Morgante, G.; Munshi, D.; Murphy, J. A.; Naselsky, P.; Nati, F.; Natoli, P.; Noviello, F.; Novikov, D.; Novikov, I.; Oxborrow, C. 
A.; Pagano, L.; Pajot, F.; Paoletti, D.; Pasian, F.; Perdereau, O.; Perotto, L.; Perrotta, F.; Pettorino, V.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski∗, S.; Pointecouteau, E.; Polenta, G.; Popa, L.; Pratt, G. W.; Puget, J.-L.; Rachen, J. P.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Riller, T.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Roudier, G.; Rouillé d'Orfeuil, B.; Rubiño-Martín, J. A.; Rusholme, B.; Sandri, M.; Savelainen, M.; Savini, G.; Spencer, L. D.; Spinelli, M.; Starck, J.-L.; Sureau, F.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Vielva, P.; Villa, F.; Wade, L. A.; Wandelt, B. D.; White, M.; Yvon, D.; Zacchei, A.; Zonca, A.
2014-06-01
We explore the 2013 Planck likelihood function with a high-precision multi-dimensional minimizer (Minuit). This allows a refinement of the ΛCDM best-fit solution with respect to previously-released results, and the construction of frequentist confidence intervals using profile likelihoods. The agreement with the cosmological results from the Bayesian framework is excellent, demonstrating the robustness of the Planck results to the statistical methodology. We investigate the inclusion of neutrino masses, where more significant differences may appear due to the non-Gaussian nature of the posterior mass distribution. By applying the Feldman-Cousins prescription, we again obtain results very similar to those of the Bayesian methodology. However, the profile-likelihood analysis of the cosmic microwave background (CMB) combination (Planck+WP+highL) reveals a minimum well within the unphysical negative-mass region. We show that inclusion of the Planck CMB-lensing information regularizes this issue, and provide a robust frequentist upper limit ∑ mν ≤ 0.26 eV (95% confidence) from the CMB+lensing+BAO data combination.
Burns, Linda J; Logan, Brent R; Chitphakdithai, Pintip; Miller, John P; Drexler, Rebecca; Spellman, Stephen; Switzer, Galen E; Wingard, John R; Anasetti, Claudio; Confer, Dennis L
2016-06-01
We report a comparison of time to recovery, side effects, and change in blood counts from baseline to after donation from unrelated donors who participated in the Blood and Marrow Transplant Clinical Trials Network phase III randomized, multicenter trial (0201) in which donor-recipient pairs were randomized to either peripheral blood stem cell (PBSC) or bone marrow (BM) donation. Of the entire cohort, 262 donated PBSC and 264 donated BM; 372 (71%) donors were from domestic and 154 (29%) were from international centers (145 German and 9 Canadian). PBSC donors recovered in less time, with a median time to recovery of 1 week compared with 2.3 weeks for BM donors. The number of donors reporting full recovery was significantly greater for donors of PBSC than of BM at 1, 2, and 3 weeks and 3 months after donation. Multivariate analysis showed that PBSC donors were more likely to recover at any time after donation compared with BM donors (hazard ratio, 2.08; 95% confidence interval [CI], 1.73 to 2.50; P < .001). Other characteristics that significantly increased the likelihood of complete recovery were being an international donor and donation in more recent years. Donors of BM were more likely to report grades 2 to 4 skeletal pain, body symptoms, and fatigue at 1 week after donation. In logistic regression analysis of domestic donors only, in which toxicities at peri-collection time points (day 5 of filgrastim for PBSC donors and day 2 after collection for BM donors) could be analyzed, no variable was significantly associated with grades 2 to 4 skeletal pain, including product donated (BM versus PBSC; odds ratio, 1.13; 95% CI, .74 to 1.74; P = .556). Blood counts were affected by product donated, with greater mean change from baseline to after donation for white blood cells, neutrophils, mononuclear cells, and platelets in PBSC donors, whereas BM donors experienced a greater mean change in hemoglobin. 
This analysis provided an enhanced understanding of donor events as product donated was independent of physician bias or donor preference. Copyright © 2016 The American Society for Blood and Marrow Transplantation. Published by Elsevier Inc. All rights reserved.
A new model to predict weak-lensing peak counts. II. Parameter constraint strategies
NASA Astrophysics Data System (ADS)
Lin, Chieh-An; Kilbinger, Martin
2015-11-01
Context. Peak counts have been shown to be an excellent tool for extracting the non-Gaussian part of the weak lensing signal. Recently, we developed a fast stochastic forward model to predict weak-lensing peak counts. Our model is able to reconstruct the underlying distribution of observables for analysis. Aims: In this work, we explore and compare various strategies for constraining a parameter using our model, focusing on the matter density Ωm and the density fluctuation amplitude σ8. Methods: First, we examine the impact from the cosmological dependency of covariances (CDC). Second, we perform the analysis with the copula likelihood, a technique that makes a weaker assumption than does the Gaussian likelihood. Third, direct, non-analytic parameter estimations are applied using the full information of the distribution. Fourth, we obtain constraints with approximate Bayesian computation (ABC), an efficient, robust, and likelihood-free algorithm based on accept-reject sampling. Results: We find that neglecting the CDC effect enlarges parameter contours by 22% and that the covariance-varying copula likelihood is a very good approximation to the true likelihood. The direct techniques work well in spite of noisier contours. Concerning ABC, the iterative process converges quickly to a posterior distribution that is in excellent agreement with results from our other analyses. The time cost for ABC is reduced by two orders of magnitude. Conclusions: The stochastic nature of our weak-lensing peak count model allows us to use various techniques that approach the true underlying probability distribution of observables, without making simplifying assumptions. Our work can be generalized to other observables where forward simulations provide samples of the underlying distribution.
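The ABC accept-reject loop described above can be sketched on a toy model. This is an illustrative snippet with made-up parameters, not the authors' peak-count pipeline: draw a parameter from the prior, forward-simulate data, and accept the draw if a summary statistic lands within a tolerance of the observed one.

```python
import numpy as np

# Minimal ABC rejection sampler on a toy Gaussian-mean problem.
rng = np.random.default_rng(42)
observed = rng.normal(loc=0.8, scale=1.0, size=100)  # pretend observation
obs_stat = observed.mean()                           # summary statistic

accepted = []
for _ in range(20000):
    theta = rng.uniform(-2.0, 2.0)                     # draw from the prior
    sim = rng.normal(loc=theta, scale=1.0, size=100)   # forward simulation
    if abs(sim.mean() - obs_stat) < 0.05:              # accept-reject step
        accepted.append(theta)

posterior = np.array(accepted)
print(len(posterior), round(posterior.mean(), 2))
```

Real ABC implementations (including the iterative scheme used in the paper) shrink the tolerance adaptively rather than fixing it, which is what makes the approach efficient in practice.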
Guindon, Stéphane; Dufayard, Jean-François; Lefort, Vincent; Anisimova, Maria; Hordijk, Wim; Gascuel, Olivier
2010-05-01
PhyML is a phylogeny software based on the maximum-likelihood principle. Early PhyML versions used a fast algorithm performing nearest neighbor interchanges to improve a reasonable starting tree topology. Since the original publication (Guindon S., Gascuel O. 2003. A simple, fast and accurate algorithm to estimate large phylogenies by maximum likelihood. Syst. Biol. 52:696-704), PhyML has been widely used (>2500 citations in ISI Web of Science) because of its simplicity and a fair compromise between accuracy and speed. In the meantime, research around PhyML has continued, and this article describes the new algorithms and methods implemented in the program. First, we introduce a new algorithm to search the tree space with user-defined intensity using subtree pruning and regrafting topological moves. The parsimony criterion is used here to filter out the least promising topology modifications with respect to the likelihood function. The analysis of a large collection of real nucleotide and amino acid data sets of various sizes demonstrates the good performance of this method. Second, we describe a new test to assess the support of the data for internal branches of a phylogeny. This approach extends the recently proposed approximate likelihood-ratio test and relies on a nonparametric, Shimodaira-Hasegawa-like procedure. A detailed analysis of real alignments sheds light on the links between this new approach and the more classical nonparametric bootstrap method. Overall, our tests show that the last version (3.0) of PhyML is fast, accurate, stable, and ready to use. A Web server and binary files are available from http://www.atgc-montpellier.fr/phyml/.
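As a minimal illustration of the maximum-likelihood principle that PhyML implements at much larger scale, the ML distance between two sequences under the Jukes-Cantor (JC69) model has a closed form. This is a textbook calculation, not PhyML's algorithm.

```python
import math

# ML estimate of the JC69 evolutionary distance between two aligned
# sequences differing at a fraction p of sites:
#     d_hat = -(3/4) * ln(1 - (4/3) * p)

def jc69_ml_distance(n_sites, n_diff):
    p = n_diff / n_sites
    if p >= 0.75:
        raise ValueError("p >= 3/4: distance undefined under JC69")
    return -0.75 * math.log(1.0 - 4.0 * p / 3.0)

# Example: 12 observed differences over 100 aligned sites.
print(round(jc69_ml_distance(100, 12), 4))  # → 0.1308
```

Note that the ML distance (0.1308) exceeds the raw proportion of differences (0.12), since the model corrects for unobserved multiple substitutions at the same site.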
Maximum likelihood convolutional decoding (MCD) performance due to system losses
NASA Technical Reports Server (NTRS)
Webster, L.
1976-01-01
A model for predicting the computational performance of a maximum likelihood convolutional decoder (MCD) operating in a noisy carrier reference environment is described. This model is used to develop a subroutine that will be utilized by the Telemetry Analysis Program to compute the MCD bit error rate. When this computational model is averaged over noisy reference phase errors using a high-rate interpolation scheme, the results are found to agree quite favorably with experimental measurements.
Whiley, Phillip J.; Parsons, Michael T.; Leary, Jennifer; Tucker, Kathy; Warwick, Linda; Dopita, Belinda; Thorne, Heather; Lakhani, Sunil R.; Goldgar, David E.; Brown, Melissa A.; Spurdle, Amanda B.
2014-01-01
Rare exonic, non-truncating variants in known cancer susceptibility genes such as BRCA1 and BRCA2 are problematic for genetic counseling and clinical management of relevant families. This study used multifactorial likelihood analysis and/or bioinformatically-directed mRNA assays to assess pathogenicity of 19 BRCA1 or BRCA2 variants identified following patient referral to clinical genetic services. Two variants were considered to be pathogenic (Class 5). BRCA1:c.4484G>C (p.Arg1495Thr) was shown to result in aberrant mRNA transcripts predicted to encode truncated proteins. The BRCA1:c.122A>G (p.His41Arg) RING-domain variant was found from multifactorial likelihood analysis to have a posterior probability of pathogenicity of 0.995, a result consistent with existing protein functional assay data indicating lost BARD1 binding and ubiquitin ligase activity. Of the remaining variants, seven were determined to be not clinically significant (Class 1), nine were likely not pathogenic (Class 2), and one was uncertain (Class 3). These results have implications for genetic counseling and medical management of families carrying these specific variants. They also provide additional multifactorial likelihood variant classifications as reference to evaluate the sensitivity and specificity of bioinformatic prediction tools and/or functional assay data in future studies. PMID:24489791
Turesky, Ted K.; Turkeltaub, Peter E.; Eden, Guinevere F.
2016-01-01
The functional neuroanatomy of finger movements has been characterized with neuroimaging in young adults. However, less is known about the aging motor system. Several studies have contrasted movement-related activity in older versus young adults, but there is inconsistency among their findings. To address this, we conducted an activation likelihood estimation (ALE) meta-analysis on within-group data from older adults and young adults performing regularly paced right-hand finger movement tasks in response to external stimuli. We hypothesized that older adults would show a greater likelihood of activation in right cortical motor areas (i.e., ipsilateral to the side of movement) compared to young adults. ALE maps were examined for conjunction and between-group differences. Older adults showed overlapping likelihoods of activation with young adults in left primary sensorimotor cortex (SM1), bilateral supplementary motor area, bilateral insula, left thalamus, and right anterior cerebellum. Their ALE map differed from that of the young adults in right SM1 (extending into dorsal premotor cortex), right supramarginal gyrus, medial premotor cortex, and right posterior cerebellum. The finding that older adults uniquely use ipsilateral regions for right-hand finger movements and show age-dependent modulations in regions recruited by both age groups provides a foundation by which to understand age-related motor decline and motor disorders. PMID:27799910
Three regularities of recognition memory: the role of bias.
Hilford, Andrew; Maloney, Laurence T; Glanzer, Murray; Kim, Kisok
2015-12-01
A basic assumption of Signal Detection Theory is that decisions are made on the basis of likelihood ratios. In a preceding paper, Glanzer, Hilford, and Maloney (Psychonomic Bulletin & Review, 16, 431-455, 2009) showed that the likelihood ratio assumption implies that three regularities will occur in recognition memory: (1) the Mirror Effect, (2) the Variance Effect, (3) the normalized Receiver Operating Characteristic (z-ROC) Length Effect. The paper offered formal proofs and computational demonstrations that decisions based on likelihood ratios produce the three regularities. A survey of data based on group ROCs from 36 studies validated the likelihood ratio assumption by showing that its three implied regularities are ubiquitous. The study noted, however, that bias, another basic factor in Signal Detection Theory, can obscure the Mirror Effect. In this paper we examine how bias affects the regularities at the theoretical level. The theoretical analysis shows: (1) how bias obscures the Mirror Effect, not the other two regularities, and (2) four ways to counter that obscuring. We then report the results of five experiments that support the theoretical analysis. The analyses and the experimental results also demonstrate: (1) that the three regularities govern individual, as well as group, performance, (2) alternative explanations of the regularities are ruled out, and (3) that Signal Detection Theory, correctly applied, gives a simple and unified explanation of recognition memory data.
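The likelihood-ratio decision rule at the heart of this Signal Detection framework can be sketched for the equal-variance Gaussian case. This is a minimal illustration with assumed example parameters (means 0 and 1, unit variance), not the authors' model: an item with familiarity x is called "old" when the likelihood ratio f(x | old) / f(x | new) exceeds the bias criterion beta.

```python
import math

# Log likelihood ratio for two equal-variance Gaussians; for this case
# the log LR is linear in the evidence value x.
def log_lr(x, mu_old=1.0, mu_new=0.0, sigma=1.0):
    return (mu_old - mu_new) * (x - (mu_old + mu_new) / 2.0) / sigma**2

# Decision rule: respond "old" when LR(x) > beta, i.e. log LR > log(beta).
def respond_old(x, beta=1.0):
    return log_lr(x) > math.log(beta)

# With an unbiased criterion (beta = 1), the cutoff sits midway between
# the two means (x = 0.5):
print(respond_old(0.4), respond_old(0.6))  # → False True
```

Shifting beta away from 1 moves the criterion along x, which is exactly the bias factor the paper shows can obscure the Mirror Effect.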
Oviedo-Trespalacios, Oscar; Haque, Md Mazharul; King, Mark; Washington, Simon
2018-05-29
This study investigated how situational characteristics typically encountered in the transport system influence drivers' perceived likelihood of engaging in mobile phone multitasking. The impacts of mobile phone tasks, perceived environmental complexity/risk, and drivers' individual differences were evaluated as relevant individual predictors within the behavioral adaptation framework. An innovative questionnaire, which includes randomized textual and visual scenarios, was administered to collect data from a sample of 447 drivers in South East Queensland, Australia (66% female; n = 296). The likelihood of engaging in a mobile phone task across various scenarios was modeled by a random parameters ordered probit model. Results indicated that drivers who are female, are frequent users of phones for texting/answering calls, have less favorable attitudes towards safety, and are highly disinhibited were more likely to report stronger intentions of engaging in mobile phone multitasking. However, more years with a valid driving license, self-efficacy toward self-regulation in demanding traffic conditions and police enforcement, texting tasks, and demanding traffic conditions were negatively related to self-reported likelihood of mobile phone multitasking. The unobserved heterogeneity pointed to riskier groups among female drivers and among participants who need a lot of convincing to believe that multitasking while driving is dangerous. This research concludes that behavioral adaptation theory is a robust framework for explaining self-regulation of distracted drivers. © 2018 Society for Risk Analysis.
Bates, S E; Sansom, M S; Ball, F G; Ramsey, R L; Usherwood, P N
1990-01-01
Gigaohm recordings have been made from glutamate receptor channels in excised, outside-out patches of collagenase-treated locust muscle membrane. The channels in the excised patches exhibit the kinetic state switching first seen in megaohm recordings from intact muscle fibers. Analysis of channel dwell time distributions reveals that the gating mechanism contains at least four open states and at least four closed states. Dwell time autocorrelation function analysis shows that there are at least three gateways linking the open states of the channel with the closed states. A maximum likelihood procedure has been used to fit six different gating models to the single channel data. Of these models, a cooperative model yields the best fit, and accurately predicts most features of the observed channel gating kinetics. PMID:1696510
40 CFR 52.1490 - Original identification of plan.
Code of Federal Regulations, 2012 CFR
2012-07-01
... measures. (ii) A modeling analysis indicating 1982 attainment. (iii) Documentation of the modeling analysis... agencies, (ii) Additional supporting documentation for the 1982 attainment modeling analysis which included... factors for the model. (iii) A revised 1982 attainment modeling analysis and supporting documentation...
Wubs, Matthias; Bshary, Redouan; Lehmann, Laurent
2016-06-15
Cooperation based on mutual investments can occur between unrelated individuals when they are engaged in repeated interactions. Individuals then need to use a conditional strategy to deter their interaction partners from defecting. Responding to defection such that the future payoff of a defector is reduced relative to cooperating with it is called a partner control mechanism. Three main partner control mechanisms are (i) to switch from cooperation to defection when being defected ('positive reciprocity'), (ii) to actively reduce the payoff of a defecting partner ('punishment'), or (iii) to stop interacting and switch partner ('partner switching'). However, such mechanisms to stabilize cooperation are often studied in isolation from each other. In order to better understand the conditions under which each partner control mechanism tends to be favoured by selection, we here analyse by way of individual-based simulations the coevolution between positive reciprocity, punishment, and partner switching. We show that random interactions in an unstructured population and a high number of rounds increase the likelihood that selection favours partner switching. In contrast, interactions localized in small groups (without genetic structure) increase the likelihood that selection favours punishment and/or positive reciprocity. This study thus highlights the importance of comparing different control mechanisms for cooperation under different conditions. © 2016 The Author(s).
Development and validation of a parent-report measure for detection of cognitive delay in infancy.
Schafer, Graham; Genesoni, Lucia; Boden, Greg; Doll, Helen; Jones, Rosamond A K; Gray, Ron; Adams, Eleri; Jefferson, Ros
2014-12-01
To develop a brief, parent-completed instrument (ERIC - Early Report by Infant Caregivers) for detection of cognitive delay in 10- to 24-month-olds born preterm, or of low birthweight, or with perinatal complications, and to establish ERIC's diagnostic properties. Scores for ERIC were collected from the parents of 317 children meeting at least one inclusion criterion (birthweight <1500 g, gestational age <34 completed weeks, 5 min Apgar score <7, or presence of hypoxic-ischaemic encephalopathy) and no exclusion criteria. Children were assessed against a criterion score of below 80 on the Bayley Scales of Infant and Toddler Development-III cognitive scale. Items were retained according to their individual associations with delay. Sensitivity, specificity, and positive and negative predictive values were estimated, and a truncated ERIC was developed for use in children <14 months old. ERIC correctly detected developmental delay in 17 out of 18 children in the sample, with 94.4% sensitivity, 76.9% specificity, 19.8% positive predictive value, 99.6% negative predictive value, a positive likelihood ratio of 4.09, and a negative likelihood ratio of 0.07. ERIC has potential value as a quickly administered diagnostic instrument for ruling out early cognitive delay in 10- to 24-month-old preterm infants and as a screen for cognitive delay. © 2014 Mac Keith Press.
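The reported diagnostic properties can be reproduced from the 2x2 table they imply. In this sketch the counts TP=17, FN=1, FP=69, TN=230 are inferred from the abstract's percentages (17 of 18 delayed children detected; 76.9% of the 299 non-delayed children correctly classified), not taken from the paper.

```python
# Diagnostic properties from a 2x2 confusion table (counts inferred from
# the abstract, for illustration).
tp, fn, fp, tn = 17, 1, 69, 230

sens = tp / (tp + fn)          # sensitivity
spec = tn / (tn + fp)          # specificity
ppv = tp / (tp + fp)           # positive predictive value
npv = tn / (tn + fn)           # negative predictive value
lr_pos = sens / (1 - spec)     # positive likelihood ratio
lr_neg = (1 - sens) / spec     # negative likelihood ratio

print(round(100 * sens, 1), round(100 * spec, 1), round(100 * ppv, 1),
      round(100 * npv, 1), round(lr_pos, 2), round(lr_neg, 2))
# → 94.4 76.9 19.8 99.6 4.09 0.07
```

The low PPV next to the very high NPV reflects the low prevalence of delay in the sample (18/317), which is why the abstract frames ERIC as most useful for ruling delay out.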
Phylogenetic analysis of the envelope protein (domain III) of dengue 4 viruses
Mota, Javier; Ramos-Castañeda, José; Rico-Hesse, Rebeca; Ramos, Celso
2011-01-01
Objective To evaluate the genetic variability of domain III of the envelope (E) protein and to estimate phylogenetic relationships of dengue 4 (Den-4) viruses isolated in Mexico and in other endemic areas of the world. Material and Methods A phylogenetic study of domain III of the envelope (E) protein of Den-4 viruses was conducted in 1998 using virus strains from Mexico and other parts of the world, isolated in different years. Specific primers were used to amplify domain III by RT-PCR and to obtain its nucleotide sequence. Based on the nucleotide and deduced amino acid sequences, genetic variability was estimated and a phylogenetic tree was generated. To facilitate genetic analysis of the domain III region, a Restriction Fragment Length Polymorphism (RFLP) assay was performed using six restriction enzymes. Results Nucleotide and amino acid sequence analyses of domain III gave results similar to those reported for the complete E protein gene. Based on RFLP analysis of domain III using the restriction enzymes Nla III, Dde I and Cfo I, the Den-4 viruses included in this study clustered into the previously reported genotypes 1 and 2. Conclusions These results suggest that domain III may be used as a genetic marker for phylogenetic and molecular epidemiology studies of dengue viruses. The English version of this paper is also available at: http://www.insp.mx/salud/index.html PMID:12132320
Confirmatory Factor Analysis of the WISC-III with Child Psychiatric Inpatients.
ERIC Educational Resources Information Center
Tupa, David J.; Wright, Margaret O'Dougherty; Fristad, Mary A.
1997-01-01
Factor models of the Wechsler Intelligence Scale for Children-Third Edition (WISC-III) for one, two, three, and four factors were tested using confirmatory factor analysis with a sample of 177 child psychiatric inpatients. The four-factor model proposed in the WISC-III manual provided the best fit to the data. (SLD)
Chen, Yunshun; Lun, Aaron T L; Smyth, Gordon K
2016-01-01
In recent years, RNA sequencing (RNA-seq) has become a very widely used technology for profiling gene expression. One of the most common aims of RNA-seq profiling is to identify genes or molecular pathways that are differentially expressed (DE) between two or more biological conditions. This article demonstrates a computational workflow for the detection of DE genes and pathways from RNA-seq data by providing a complete analysis of an RNA-seq experiment profiling epithelial cell subsets in the mouse mammary gland. The workflow uses R software packages from the open-source Bioconductor project and covers all steps of the analysis pipeline, including alignment of read sequences, data exploration, differential expression analysis, visualization and pathway analysis. Read alignment and count quantification is conducted using the Rsubread package and the statistical analyses are performed using the edgeR package. The differential expression analysis uses the quasi-likelihood functionality of edgeR.
Schwappach, David L. B.; Gehring, Katrin
2014-01-01
Purpose To investigate the likelihood of speaking up about patient safety in oncology and to clarify the effect of clinical and situational context factors on the likelihood of voicing concerns. Patients and Methods 1013 nurses and doctors in oncology rated four clinical vignettes describing coworkers' errors and rule violations in a self-administered factorial survey (65% response rate). Multiple regression analysis was used to model the likelihood of speaking up as an outcome of vignette attributes, responders' evaluations of the situation, and personal characteristics. Results Respondents reported a high likelihood of speaking up about patient safety, but the variation between and within types of errors and rule violations was substantial. Staff without managerial function reported significantly greater decision difficulty and discomfort with speaking up. Based on the information presented in the vignettes, 74% to 96% would speak up to a supervisor failing to check a prescription, 45% to 81% would point out a missed hand disinfection to a coworker, 82% to 94% would speak up to nurses who violate a safety rule in medication preparation, and 59% to 92% would question a doctor violating a safety rule in lumbar puncture. Several vignette attributes predicted the likelihood of speaking up. Perceived potential harm, anticipated discomfort, and decision difficulty were significant predictors of the likelihood of speaking up. Conclusions Clinicians' willingness to speak up about patient safety is considerably affected by contextual factors. Physicians and nurses without managerial function report substantial discomfort with speaking up. Oncology departments should provide staff with clear guidance and training on when and how to voice safety concerns. PMID:25116338
Somers, George T; Spencer, Ryan J
2012-04-01
Do undergraduate rural clinical rotations increase the likelihood that medical students will choose a rural career once pre-existing likelihood is accounted for? A prospective, controlled quasi-experiment using self-paired scores on the SOMERS Index of rural career choice likelihood, before and after 3 years of clinical rotations in either mainly rural or mainly urban locations. Monash University medical school, Australia. Fifty-eight undergraduate-entry medical students (35% of the 2002 entry class). The SOMERS Index of rural career choice likelihood and its component indicators. There was an overall decline in SOMERS Index score (22%) and in each of its components (12-41%). Graduating students who attended rural rotations were more likely to choose a rural career on graduation (difference in SOMERS score: 24.1 (95% CI, 15.0-33.3), P<0.0001); however, at entry, students choosing rural rotations had an even greater SOMERS score (difference: 27.1 (95% CI, 18.2-36.1), P<0.0001). Self-paired pre-post reductions in likelihood were not affected by attending mainly rural or urban rotations, nor were there differences based on rural background alone or sex. While rural rotations are an important component of undergraduate medical training, it is the nature of the students choosing to study in rural locations, rather than their experiences during the course, that is the greater influence on rural career choice. To address the rural medical workforce crisis, medical schools should attract more students with a pre-existing likelihood of choosing a rural career. The SOMERS Index was found to be a useful tool for this quantitative analysis. © 2012 The Authors. Australian Journal of Rural Health © 2012 National Rural Health Alliance Inc.
Kanda, Kojun; Gomez, R. Antonio; Van Driesche, Richard; Miller, Kelly B.; Maddison, David R.
2016-01-01
Abstract Stygoporus oregonensis Larson & LaBonte is a little-known subterranean diving beetle, which, until recently, had not been collected since the type series was taken from a shallow well in western Oregon, USA, in 1984. Here we report the discovery of additional specimens collected from a nearby well in the Willamette Valley. Sequence data from four mitochondrial genes, wingless, and histone III place Stygoporus Larson & LaBonte in the predominantly Mediterranean subtribe Siettitiina of the Hydroporini. Morphological support for these results is discussed, and details of the collecting circumstances of the new specimens are presented. We argue that the biogeographic patterns of Nearctic Siettitiina highlight the likelihood of additional undiscovered subterranean dytiscids in North America. PMID:27920606
Johnson, Rebecca N; Agapow, Paul-Michael; Crozier, Ross H
2003-11-01
The ant subfamily Formicinae is a large assemblage (2458 species; J. Nat. Hist. 29 (1995) 1037), including species that weave leaf nests together with larval silk and in which the metapleural gland, the ancestrally defining ant character, has been secondarily lost. We used sequences from two mitochondrial genes (cytochrome b and cytochrome oxidase 2) from 18 formicine and 4 outgroup taxa to derive a robust phylogeny, employing a search for tree islands using 10,000 randomly constructed trees as starting points and deriving a maximum likelihood consensus tree from the ML tree and those not significantly different from it. Non-parametric bootstrapping showed that the ML consensus tree fit the data significantly better than three scenarios based on morphology, with that of Bolton (Identification Guide to the Ant Genera of the World, Harvard University Press, Cambridge, MA) being the best among these alternative trees. Trait mapping showed that weaving had arisen at least four times and possibly been lost once. A maximum likelihood analysis showed that loss of the metapleural gland is significantly associated with the weaver life-pattern. The graph of the frequencies with which trees were discovered versus their likelihood indicates that trees with high likelihoods have much larger basins of attraction than those with lower likelihoods. While this result indicates that single searches are more likely to find high- than low-likelihood tree islands, it also indicates that searching only for the single best tree may lose important information.
Quasi-likelihood generalized linear regression analysis of fatality risk data
DOT National Transportation Integrated Search
2009-01-01
Transportation-related fatality risk is a function of many interacting human, vehicle, and environmental factors. Statistically valid analysis of such data is challenged both by the complexity of plausible structural models relating fatality rates t...
Barwick, R S; Mohammed, H O; White, M E; Bryant, R B
2003-03-01
A study was conducted to identify factors associated with the likelihood of detecting Giardia spp. and Cryptosporidium spp. in the soil of dairy farms in a watershed area. A total of 37 farms were visited, and 782 soil samples were collected from targeted areas on these farms. The samples were analyzed for the presence of Cryptosporidium spp. oocysts and Giardia spp. cysts, percent moisture content, and pH. Logistic regression analysis was used to identify risk factors associated with the likelihood of the presence of these organisms. The use of the land at the sampling site was associated with the likelihood of environmental contamination with Cryptosporidium spp.: barn cleaner equipment areas and agricultural fields were associated with an increased likelihood of contamination. The risk of environmental contamination decreased with the pH of the soil and with the score of the potential likelihood of Cryptosporidium spp. The size of the sampling site (in square feet), as determined by the sampling design, was associated nonlinearly with the risk of detecting Cryptosporidium spp. The likelihood of detecting Giardia cysts in the soil increased with the prevalence of Giardia spp. in animals (i.e., 18 to 39%). As the size of the farm increased, the risk of Giardia spp. in the soil decreased, and sampling sites covered with brush or bare soil showed a decreased likelihood of detecting Giardia spp. compared with land that had managed grass. The number of cattle on the farm less than 6 mo of age was negatively associated with the risk of detecting Giardia spp. in the soil, and percent moisture content was positively associated with that risk. Our study showed that these two protozoa occur in dairy farm soil at different rates and that the risk could be modified by manipulating the pH of the soil.
A Molecular Phylogeny of the Chalcidoidea (Hymenoptera)
Munro, James B.; Heraty, John M.; Burks, Roger A.; Hawks, David; Mottern, Jason; Cruaud, Astrid; Rasplus, Jean-Yves; Jansta, Petr
2011-01-01
Chalcidoidea (Hymenoptera) are extremely diverse, with more than 23,000 species described and over 500,000 species estimated to exist. This is the first comprehensive phylogenetic analysis of the superfamily based on a molecular analysis of 18S and 28S ribosomal gene regions for 19 families, 72 subfamilies, 343 genera and 649 species. The 56 outgroups comprise Ceraphronoidea and most proctotrupomorph families, including Mymarommatidae. Data alignment and the impact of ambiguous regions are explored using a secondary structure analysis and automated (MAFFT) alignments of the core and pairing regions and regions of ambiguous alignment. Both likelihood and parsimony approaches are used to analyze the data. Overall, the alignment method has no impact, and there are few but substantial differences between the likelihood and parsimony approaches. Monophyly of Chalcidoidea and a sister-group relationship between Mymaridae and the remaining Chalcidoidea are strongly supported in all analyses. Either Mymarommatoidea or Diaprioidea is the sister group of Chalcidoidea, depending on the analysis. Likelihood analyses place Rotoitidae as the sister group of the remaining Chalcidoidea after Mymaridae, whereas parsimony nests them within Chalcidoidea. Some traditional family groups are supported as monophyletic (Agaonidae, Eucharitidae, Encyrtidae, Eulophidae, Leucospidae, Mymaridae, Ormyridae, Signiphoridae, Tanaostigmatidae and Trichogrammatidae). Several other families are paraphyletic (Perilampidae) or polyphyletic (Aphelinidae, Chalcididae, Eupelmidae, Eurytomidae, Pteromalidae, Tetracampidae and Torymidae). Evolutionary scenarios discussed for Chalcidoidea include the evolution of phytophagy, egg parasitism, sternorrhynchan parasitism, hypermetamorphic development and heteronomy. PMID:22087244
Forecasting drought risks for a water supply storage system using bootstrap position analysis
Tasker, Gary; Dunne, Paul
1997-01-01
Forecasting the likelihood of drought conditions is an integral part of managing a water supply storage and delivery system. Position analysis uses a large number of possible flow sequences as inputs to a simulation of a water supply storage and delivery system. For a given set of operating rules and water use requirements, water managers can use such a model to forecast the likelihood of specified outcomes such as reservoir levels falling below a specified level or streamflows falling below statutory passing flows a few months ahead conditioned on the current reservoir levels and streamflows. The large number of possible flow sequences are generated using a stochastic streamflow model with a random resampling of innovations. The advantages of this resampling scheme, called bootstrap position analysis, are that it does not rely on the unverifiable assumption of normality and it allows incorporation of long-range weather forecasts into the analysis.
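The resampling scheme described above can be illustrated with a minimal sketch (not the authors' implementation, and all names and the toy flow record are hypothetical): fit a lag-1 autoregressive model to a historical flow record, then build each synthetic trace by resampling the fitted model's residuals (innovations) with replacement instead of drawing normal deviates, so no normality assumption is needed.

```python
import random

def bootstrap_traces(flows, horizon, n_traces, seed=0):
    """Generate synthetic flow sequences by bootstrap resampling of
    AR(1) innovations (a sketch of 'bootstrap position analysis')."""
    rng = random.Random(seed)
    mean = sum(flows) / len(flows)
    dev = [f - mean for f in flows]
    # Lag-1 autoregression coefficient of the historical record.
    num = sum(dev[i] * dev[i + 1] for i in range(len(dev) - 1))
    den = sum(d * d for d in dev)
    phi = num / den
    # Innovations (residuals) implied by the fitted AR(1) model.
    innov = [dev[i + 1] - phi * dev[i] for i in range(len(dev) - 1)]
    traces = []
    for _ in range(n_traces):
        x, trace = dev[-1], []
        for _ in range(horizon):
            x = phi * x + rng.choice(innov)   # bootstrap, not N(0, s^2)
            trace.append(mean + x)
        traces.append(trace)
    return traces

# Toy historical record and a 6-month-ahead forecast of 500 traces.
flows = [10.0, 12.0, 9.0, 11.0, 13.0, 8.0, 10.5, 12.5, 9.5, 11.5]
traces = bootstrap_traces(flows, horizon=6, n_traces=500)
# Estimate the likelihood that flow falls below a statutory level.
p_low = sum(min(t) < 9.0 for t in traces) / len(traces)
```

The simulated traces would then drive the water-supply storage model, conditioned on current reservoir levels, exactly as in position analysis.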
NASA Astrophysics Data System (ADS)
Luo, Yi-Ming; Chen, Zhe; Tang, Rui-Ren; Xiao, Lin-Xiang; Peng, Hong-Jian
2008-02-01
A novel bis-β-diketone ligand, 1,1'-(2,6-bispyridyl)bis-3-phenyl-1,3-propanedione (L), was designed and synthesized, and its complexes with Eu(III), Tb(III), Sm(III) and Gd(III) ions were successfully prepared. The ligand and the corresponding metal complexes were characterized by elemental analysis and infrared, mass and proton nuclear magnetic resonance spectroscopy. Analysis of the IR spectra suggested that each of the lanthanide metal ions coordinated to the ligand via the carbonyl oxygen atoms and the nitrogen atom of the pyridine ring. The fluorescence properties of these complexes in the solid state were investigated, and it was found that all of the lanthanide ions could be sensitized by the ligand (L) to some extent. In particular, the Tb(III) complex was an excellent green emitter and would be a potential candidate material for applications in organic light-emitting devices (OLEDs) and medical diagnosis.
Thermal Design and Analysis of an ISS Science Payload - SAGE III on ISS
NASA Technical Reports Server (NTRS)
Liles, Kaitlin, A. K.; Amundsen, Ruth M.; Davis, Warren T.; Carrillo, Laurie Y.
2017-01-01
The Stratospheric Aerosol and Gas Experiment III (SAGE III) instrument is the fifth in a series of instruments developed for monitoring aerosols and gaseous constituents in the stratosphere and troposphere. SAGE III will be launched in the SpaceX Dragon vehicle in 2017 and mounted to an external stowage platform on the International Space Station (ISS) to begin its three-year mission. The SAGE III thermal team at NASA Langley Research Center (LaRC) worked with ISS thermal engineers to ensure that SAGE III, as an ISS payload, would meet requirements specific to ISS and the Dragon vehicle. This document presents an overview of the SAGE III thermal design and analysis efforts, focusing on aspects that are relevant for future ISS payload developers. This includes development of detailed and reduced Thermal Desktop (TD) models integrated with the ISS and launch vehicle models, definition of analysis cases necessary to verify thermal requirements considering all mission phases from launch through installation and operation on-orbit, and challenges associated with thermal hardware selection including heaters, multi-layer insulation (MLI) blankets, and thermal tapes.
Kendler, Kenneth S
2017-03-01
This review traces, through psychiatric textbooks, the history of the Kraepelinian concept of paranoia in the 20th century and then relates the common reported symptoms and signs to the diagnostic criteria for paranoia/delusional disorder in DSM-III through DSM-5. Clinical descriptions of paranoia appearing in 10 textbooks, published 1899 to 1970, revealed 11 prominent symptoms and signs reported by 5 or more authors. Three symptoms (systematized delusions, minimal hallucinations, and prominent ideas of reference) and 2 signs (chronic course and minimal affective deterioration) were reported by 8 or 9 of the authors. Four textbook authors rejected the Kraepelinian concept of paranoia. A weak relationship was seen between the frequency with which the clinical features were reported and the likelihood of their inclusion in modern DSM manuals. Indeed, the diagnostic criteria for paranoia/delusional disorder shifted substantially from DSM-III to DSM-5. The modern operationalized criteria for paranoia/delusional disorder do not well reflect the symptoms and signs frequently reported by historical experts. In contrast to results of similar reviews for depression, schizophrenia and mania, the clinical construct of paranoia/delusional disorder has been somewhat unstable in Western Psychiatry since the turn of the 20th century as reflected in both textbooks and the DSM editions. © The Author 2017. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.
Microbial community assembly patterns under incipient conditions in a basaltic soil system
NASA Astrophysics Data System (ADS)
Sengupta, A.; Stegen, J.; Alves Meira Neto, A.; Wang, Y.; Chorover, J.; Troch, P. A. A.; Maier, R. M.
2017-12-01
In sub-surface environments, biotic components are critically linked to abiotic processes. However, there is limited understanding of community establishment, functional associations, and community assembly processes of microbes in such environments. This study presents the first analysis of microbial signatures in an incipient terrestrial basalt soil system conducted under controlled conditions. Sub-meter scale sampling of a soil mesocosm revealed contrasting distribution patterns of simple soil parameters such as bulk density and electrical conductivity. Phylogenetic analysis of the 16S rRNA gene indicated the presence of 40 bacterial and archaeal phyla in total, with high relative abundance of Actinobacteria at the surface and the highest abundance of Proteobacteria throughout the system. Community diversity patterns were inferred to depend on the depth profile and average water content in the system. Predicted functional gene analysis suggested mixotrophic lifestyles with both autotrophic and heterotrophic metabolisms, the likelihood of a unique salt-tolerant methanogenic pathway with links to novel Euryarchaeota, signatures of an incomplete nitrogen cycle, and predicted enzymes for extracellular iron(II)-to-iron(III) conversion followed by intracellular uptake, transport and regulation. Null modeling revealed that microbial community assembly was predominantly governed by variable selection, but the influence of variable selection did not show systematic spatial structure. The presence of significant heterogeneity in predicted functions and ecologically deterministic shifts in community composition in a homogeneous incipient basalt highlights the complexity exhibited by microorganisms even in the simplest of environmental systems. This presents an opportunity to further develop our understanding of how microbial communities establish, evolve, impact, and respond in sub-surface environments.
Modeling Adversaries in Counterterrorism Decisions Using Prospect Theory.
Merrick, Jason R W; Leclerc, Philip
2016-04-01
Counterterrorism decisions have been an intense area of research in recent years. Both decision analysis and game theory have been used to model such decisions, and more recently approaches have been developed that combine the techniques of the two disciplines. However, each of these approaches assumes that the attacker is maximizing its utility. Experimental research shows that human beings do not make decisions by maximizing expected utility without aid, but instead deviate in specific ways such as loss aversion or likelihood insensitivity. In this article, we modify existing methods for counterterrorism decisions. We keep expected utility as the defender's paradigm to seek for the rational decision, but we use prospect theory to solve for the attacker's decision to descriptively model the attacker's loss aversion and likelihood insensitivity. We study the effects of this approach in a critical decision, whether to screen containers entering the United States for radioactive materials. We find that the defender's optimal decision is sensitive to the attacker's levels of loss aversion and likelihood insensitivity, meaning that understanding such descriptive decision effects is important in making such decisions. © 2014 Society for Risk Analysis.
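The two deviations named above, loss aversion and likelihood insensitivity, are commonly modeled with prospect theory's value and probability-weighting functions. The sketch below uses Tversky and Kahneman's well-known 1992 parameter estimates; it is a generic illustration of the descriptive model, not the authors' attacker formulation.

```python
def value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value function: concave for gains, convex and
    steeper for losses (lam > 1 encodes loss aversion)."""
    return x ** alpha if x >= 0 else -lam * ((-x) ** alpha)

def weight(p, gamma=0.61):
    """Inverse-S probability weighting; gamma < 1 produces likelihood
    insensitivity: small probabilities are overweighted and large
    probabilities underweighted."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

def prospect_value(gamble):
    """Value of a gamble given as (probability, outcome) pairs."""
    return sum(weight(p) * value(x) for p, x in gamble)

# A rare, large loss: the 1% probability is overweighted and the loss
# looms larger than an equal gain would, so the prospect value is
# strongly negative.
pv = prospect_value([(0.01, -100.0), (0.99, 0.0)])
```

In the defender-attacker setting, such a descriptive model replaces expected utility on the attacker's side while the defender still optimizes expected utility.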
Collinear Latent Variables in Multilevel Confirmatory Factor Analysis
van de Schoot, Rens; Hox, Joop
2014-01-01
Because variables may be correlated in the social and behavioral sciences, multicollinearity might be problematic. This study investigates the effect of collinearity, manipulated at the within and between levels of a two-level confirmatory factor analysis, by Monte Carlo simulation. Furthermore, the influence on the convergence rate of the size of the intraclass correlation coefficient (ICC) and of the estimation method (maximum likelihood estimation with robust chi-squares and standard errors, or Bayesian estimation) is investigated. The other variables of interest were the rate of inadmissible solutions and the relative parameter and standard error bias at the between level. The results showed that inadmissible solutions were obtained when there was between-level collinearity and the estimation method was maximum likelihood. In the within-level multicollinearity condition, all of the solutions were admissible, but the bias values were higher compared with the between-level collinearity condition. Bayesian estimation appeared to be robust in obtaining admissible parameters, but the relative bias was higher than for maximum likelihood estimation. Finally, as expected, high ICC produced less biased results compared with medium ICC conditions. PMID:29795827
NASA Astrophysics Data System (ADS)
Peng, Juan-juan; Wang, Jian-qiang; Yang, Wu-E.
2017-01-01
In this paper, multi-criteria decision-making (MCDM) problems based on the qualitative flexible multiple criteria method (QUALIFLEX), in which the criteria values are expressed by multi-valued neutrosophic information, are investigated. First, multi-valued neutrosophic sets (MVNSs), which allow the truth-membership function, indeterminacy-membership function and falsity-membership function to have a set of crisp values between zero and one, are introduced. Then the likelihood of multi-valued neutrosophic number (MVNN) preference relations is defined and the corresponding properties are also discussed. Finally, an extended QUALIFLEX approach based on likelihood is explored to solve MCDM problems where the assessments of alternatives are in the form of MVNNs; furthermore an example is provided to illustrate the application of the proposed method, together with a comparison analysis.
Physical constraints on the likelihood of life on exoplanets
NASA Astrophysics Data System (ADS)
Lingam, Manasvi; Loeb, Abraham
2018-04-01
One of the most fundamental questions in exoplanetology is to determine whether a given planet is habitable. We estimate the relative likelihood of a planet's propensity towards habitability by considering key physical characteristics such as the role of temperature on ecological and evolutionary processes, and atmospheric losses via hydrodynamic escape and stellar wind erosion. From our analysis, we demonstrate that Earth-sized exoplanets in the habitable zone around M-dwarfs seemingly display much lower prospects of being habitable relative to Earth, owing to the higher incident ultraviolet fluxes and closer distances to the host star. We illustrate our results by specifically computing the likelihood (of supporting life) for the recently discovered exoplanets, Proxima b and TRAPPIST-1e, which we find to be several orders of magnitude smaller than that of Earth.
A maximum pseudo-profile likelihood estimator for the Cox model under length-biased sampling
Huang, Chiung-Yu; Qin, Jing; Follmann, Dean A.
2012-01-01
This paper considers semiparametric estimation of the Cox proportional hazards model for right-censored and length-biased data arising from prevalent sampling. To exploit the special structure of length-biased sampling, we propose a maximum pseudo-profile likelihood estimator, which can handle time-dependent covariates and is consistent under covariate-dependent censoring. Simulation studies show that the proposed estimator is more efficient than its competitors. A data analysis illustrates the methods and theory. PMID:23843659
Liou, Kevin; Negishi, Kazuaki; Ho, Suyen; Russell, Elizabeth A; Cranney, Greg; Ooi, Sze-Yuan
2016-08-01
Global longitudinal strain (GLS) is well validated and has important applications in contemporary clinical practice. The aim of this analysis was to evaluate the accuracy of resting peak GLS in the diagnosis of obstructive coronary artery disease (CAD). A systematic literature search was performed through July 2015 using four databases. Data were extracted independently by two authors and correlated before analyses. Using a random-effect model, the pooled sensitivity, specificity, positive likelihood ratio, negative likelihood ratio, diagnostic odds ratio, and summary area under the curve for GLS were estimated with their respective 95% CIs. Screening of 1,669 articles yielded 10 studies with 1,385 patients appropriate for inclusion in the analysis. The mean age and left ventricular ejection fraction were 59.9 years and 61.1%. On the whole, 54.9% and 20.9% of the patients had hypertension and diabetes, respectively. Overall, abnormal GLS detected moderate to severe CAD with a pooled sensitivity, specificity, positive likelihood ratio, and negative likelihood ratio of 74.4%, 72.1%, 2.9, and 0.35 respectively. The area under the curve and diagnostic odds ratio were 0.81 and 8.5. The mean values of GLS for those with and without CAD were -16.5% (95% CI, -15.8% to -17.3%) and -19.7% (95% CI, -18.8% to -20.7%), respectively. Subgroup analyses for patients with severe CAD and normal left ventricular ejection fractions yielded similar results. Current evidence supports the use of GLS in the detection of moderate to severe obstructive CAD in symptomatic patients. GLS may complement existing diagnostic algorithms and act as an early adjunctive marker of cardiac ischemia. Crown Copyright © 2016. Published by Elsevier Inc. All rights reserved.
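For reference, positive and negative likelihood ratios follow directly from sensitivity and specificity: LR+ = sens / (1 - spec) and LR- = (1 - sens) / spec. In a random-effects meta-analysis the pooled LRs are estimated directly rather than computed from the pooled sensitivity and specificity, which is why the plug-in values below (about 2.67 and 0.36) differ slightly from the reported 2.9 and 0.35.

```python
def likelihood_ratios(sensitivity, specificity):
    """LR+ = sens / (1 - spec); LR- = (1 - sens) / spec."""
    lr_pos = sensitivity / (1.0 - specificity)
    lr_neg = (1.0 - sensitivity) / specificity
    return lr_pos, lr_neg

# Pooled estimates reported for GLS: sensitivity 74.4%, specificity 72.1%.
lr_pos, lr_neg = likelihood_ratios(0.744, 0.721)
```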
A Bootstrap Generalization of Modified Parallel Analysis for IRT Dimensionality Assessment
ERIC Educational Resources Information Center
Finch, Holmes; Monahan, Patrick
2008-01-01
This article introduces a bootstrap generalization to the Modified Parallel Analysis (MPA) method of test dimensionality assessment using factor analysis. This methodology, based on the use of Marginal Maximum Likelihood nonlinear factor analysis, provides for the calculation of a test statistic based on a parametric bootstrap using the MPA…
NASA Astrophysics Data System (ADS)
Wang, Z.
2015-12-01
For decades, distributed and lumped hydrological models have furthered our understanding of hydrological systems. The development of large-scale, high-precision hydrological simulation has refined spatial descriptions of hydrological behavior. However, this trend is accompanied by growing model complexity and numbers of parameters, which brings new challenges for uncertainty quantification. Generalized Likelihood Uncertainty Estimation (GLUE), which couples Monte Carlo sampling with Bayesian estimation, has been widely used for uncertainty analysis of hydrological models. However, the random sampling of prior parameters adopted by GLUE is inefficient, especially in high-dimensional parameter spaces. Heuristic optimization algorithms, which rely on iterative evolution, show better convergence speed and optimality-searching performance. In light of these features, this study adopted a genetic algorithm, differential evolution, and the shuffled complex evolution algorithm to search the parameter space and obtain parameter sets with high likelihoods. Based on this multi-algorithm sampling, hydrological model uncertainty analysis is conducted within the typical GLUE framework. To demonstrate the superiority of the new method, two hydrological models of different complexity are examined. The results show that the adaptive method tends to be efficient in sampling and effective in uncertainty analysis, providing an alternative path for uncertainty quantification.
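Independently of how candidate parameters are sampled, the GLUE step itself can be sketched as follows: score each candidate parameter set with an informal likelihood measure (Nash-Sutcliffe efficiency is a common choice), retain the "behavioural" sets above a threshold, and normalise their likelihoods into weights for prediction bounds. The toy rainfall-runoff model and all names below are illustrative assumptions, not the study's models.

```python
import random

def nse(sim, obs):
    """Nash-Sutcliffe efficiency, a common informal GLUE likelihood."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((s - o) ** 2 for s, o in zip(sim, obs))
    var = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / var

def glue(candidates, simulate, obs, threshold=0.5):
    """Keep behavioural parameter sets (NSE above the threshold) and
    return them with likelihoods normalised into weights."""
    behavioural = []
    for theta in candidates:
        l = nse(simulate(theta), obs)
        if l > threshold:
            behavioural.append((theta, l))
    total = sum(l for _, l in behavioural)
    return [(theta, l / total) for theta, l in behavioural]

# Toy model: runoff proportional to rainfall, one unknown coefficient
# whose true value is about 1.
rain = [1.0, 2.0, 3.0, 4.0, 5.0]
obs = [0.9, 2.1, 2.9, 4.2, 4.9]
simulate = lambda c: [c * r for r in rain]
rng = random.Random(1)
candidates = [rng.uniform(0.5, 1.5) for _ in range(200)]
weighted = glue(candidates, simulate, obs)
```

The weighted behavioural sets are then propagated through the model to form uncertainty bounds on the simulated output; swapping the uniform sampler for a heuristic optimizer is the change the study proposes.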
Variational Bayesian Parameter Estimation Techniques for the General Linear Model
Starke, Ludger; Ostwald, Dirk
2017-01-01
Variational Bayes (VB), variational maximum likelihood (VML), restricted maximum likelihood (ReML), and maximum likelihood (ML) are cornerstone parametric statistical estimation techniques in the analysis of functional neuroimaging data. However, the theoretical underpinnings of these model parameter estimation techniques are rarely covered in introductory statistical texts. Because of the widespread practical use of VB, VML, ReML, and ML in the neuroimaging community, we reasoned that a theoretical treatment of their relationships and their application in a basic modeling scenario may be helpful for both neuroimaging novices and practitioners alike. In this technical study, we thus revisit the conceptual and formal underpinnings of VB, VML, ReML, and ML and provide a detailed account of their mathematical relationships and implementational details. We further apply VB, VML, ReML, and ML to the general linear model (GLM) with non-spherical error covariance as commonly encountered in the first-level analysis of fMRI data. To this end, we explicitly derive the corresponding free energy objective functions and ensuing iterative algorithms. Finally, in the applied part of our study, we evaluate the parameter and model recovery properties of VB, VML, ReML, and ML, first in an exemplary setting and then in the analysis of experimental fMRI data acquired from a single participant under visual stimulation. PMID:28966572
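One concrete point of contrast among these estimators can be shown in miniature: for a GLM with spherical errors, ML estimates the error variance as RSS/n, while ReML corrects for the p estimated regression coefficients and uses RSS/(n - p). A minimal sketch with invented data (not from the study):

```python
def solve2(a, b):
    """Solve a 2x2 linear system a.x = b by Cramer's rule."""
    det = a[0][0] * a[1][1] - a[0][1] * a[1][0]
    return [(b[0] * a[1][1] - b[1] * a[0][1]) / det,
            (a[0][0] * b[1] - a[1][0] * b[0]) / det]

def glm_variance_estimates(x, y):
    """Fit y = b0 + b1*x by least squares via the normal equations and
    return (ML, ReML) error-variance estimates: ML divides the residual
    sum of squares by n, ReML by n - p."""
    n, p = len(x), 2
    xtx = [[n, sum(x)], [sum(x), sum(v * v for v in x)]]
    xty = [sum(y), sum(v * w for v, w in zip(x, y))]
    b0, b1 = solve2(xtx, xty)
    rss = sum((w - b0 - b1 * v) ** 2 for v, w in zip(x, y))
    return rss / n, rss / (n - p)

# Hypothetical data lying close to y = 2x.
x = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
y = [0.1, 1.9, 4.2, 5.8, 8.1, 9.9]
ml_var, reml_var = glm_variance_estimates(x, y)
```

The ReML estimate is always larger here by the factor n/(n - p), which is the downward bias of ML variance estimation that ReML removes.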
Ling, Cheng; Hamada, Tsuyoshi; Gao, Jingyang; Zhao, Guoguang; Sun, Donghong; Shi, Weifeng
2016-01-01
MrBayes is a widespread phylogenetic inference tool harnessing empirical evolutionary models and Bayesian statistics. However, the computational cost of likelihood estimation is very high, resulting in undesirably long execution times. Although a number of multi-threaded optimizations have been proposed to speed up MrBayes, there are bottlenecks that severely limit the GPU thread-level parallelism of likelihood estimations. This study proposes a high-performance and resource-efficient method for GPU-oriented parallelization of likelihood estimations. Instead of having to rely on empirical programming, the proposed novel decomposition storage model implements high-performance data transfers implicitly. In terms of performance improvement, a speedup factor of up to 178 can be achieved on the analysis of simulated datasets using four Tesla K40 cards. In comparison with other publicly available GPU-oriented versions of MrBayes, the tgMC3++ method (proposed herein) outperforms the tgMC3 (v1.0), nMC3 (v2.1.1) and oMC3 (v1.00) methods by speedup factors of up to 1.6, 1.9 and 2.9, respectively. Moreover, tgMC3++ supports more evolutionary models and gamma categories, which previous GPU-oriented methods fail to support.
Application of the Bootstrap Methods in Factor Analysis.
ERIC Educational Resources Information Center
Ichikawa, Masanori; Konishi, Sadanori
1995-01-01
A Monte Carlo experiment was conducted to investigate the performance of bootstrap methods in normal theory maximum likelihood factor analysis when the distributional assumption was satisfied or unsatisfied. Problems arising with the use of bootstrap methods are highlighted. (SLD)
Boeve, Koos; Schepman, Kees-Pieter; Vegt, Bert van der; Schuuring, Ed; Roodenburg, Jan L; Brouwers, Adrienne H; Witjes, Max J
2017-03-01
There is debate whether the lymphatic drainage pattern of oral maxillary cancer is to the retropharyngeal lymph nodes or to the cervical lymph nodes. Insight into drainage patterns is important for the indication for neck treatment. The purpose of this study was to identify the lymphatic drainage pattern of oral maxillary cancer via preoperative lymphoscintigraphy. Eleven consecutive patients with oral maxillary cancer treated in our center between December 1, 2012, and April 22, 2016, were studied. Sentinel lymph nodes, identified by preoperative lymphoscintigraphy after injection of 99mTc-nanocolloid and by intraoperative detection using a γ-probe, were surgically removed and histopathologically examined. In 10 patients, sentinel lymph nodes were detected and harvested at cervical levels I, II, or III in the neck. In 2 patients, a parapharyngeal sentinel lymph node was detected. One of the harvested sentinel lymph nodes (1/19) was tumor positive. This study suggests a 73% likelihood that sentinel lymph nodes in oral maxillary cancer are found exclusively at cervical levels I to III. © 2016 Wiley Periodicals, Inc. Head Neck 39: 486-491, 2017.
Lovelock, Paul K; Spurdle, Amanda B; Mok, Myth T S; Farrugia, Daniel J; Lakhani, Sunil R; Healey, Sue; Arnold, Stephen; Buchanan, Daniel; Couch, Fergus J; Henderson, Beric R; Goldgar, David E; Tavtigian, Sean V; Chenevix-Trench, Georgia; Brown, Melissa A
2007-01-01
Many of the DNA sequence variants identified in the breast cancer susceptibility gene BRCA1 remain unclassified in terms of their potential pathogenicity. Both multifactorial likelihood analysis and functional approaches have been proposed as a means to elucidate likely clinical significance of such variants, but analysis of the comparative value of these methods for classifying all sequence variants has been limited. We have compared the results from multifactorial likelihood analysis with those from several functional analyses for the four BRCA1 sequence variants A1708E, G1738R, R1699Q, and A1708V. Our results show that multifactorial likelihood analysis, which incorporates sequence conservation, co-inheritance, segregation, and tumour immunohistochemical analysis, may improve classification of variants. For A1708E, previously shown to be functionally compromised, analysis of oestrogen receptor, cytokeratin 5/6, and cytokeratin 14 tumour expression data significantly strengthened the prediction of pathogenicity, giving a posterior probability of pathogenicity of 99%. For G1738R, shown to be functionally defective in this study, immunohistochemistry analysis confirmed previous findings of inconsistent 'BRCA1-like' phenotypes for the two tumours studied, and the posterior probability for this variant was 96%. The posterior probabilities of R1699Q and A1708V were 54% and 69%, respectively, only moderately suggestive of increased risk. Interestingly, results from functional analyses suggest that both of these variants have only partial functional activity. R1699Q was defective in foci formation in response to DNA damage and displayed intermediate transcriptional transactivation activity but showed no evidence for centrosome amplification. In contrast, A1708V displayed an intermediate transcriptional transactivation activity and a normal foci formation response in response to DNA damage but induced centrosome amplification. 
These data highlight the need for a range of functional studies to be performed in order to identify variants with partially compromised function. The results also raise the possibility that A1708V and R1699Q may be associated with a low or moderate risk of cancer. While data pooling strategies may provide more information for multifactorial analysis to improve the interpretation of the clinical significance of these variants, it is likely that further development of current multifactorial likelihood approaches and the consideration of alternative statistical approaches will be needed to determine whether these individually rare variants confer a low or moderate risk of breast cancer.
Liu, Qiu-Ning; Lin, Kun-Zhang; Yang, Lin-Nan; Dai, Li-Shang; Wang, Lei; Sun, Yu; Qian, Cen; Wei, Guo-Qing; Liu, Dong-Ran; Zhu, Bao-Jian; Liu, Chao-Liang
2015-03-01
Apolipophorin-III (ApoLp-III) acts in lipid transport, lipoprotein metabolism, and innate immunity in insects. In this study, an ApoLp-III gene of Antheraea pernyi pupae (Ap-ApoLp-III) was isolated and characterized. The full-length cDNA of Ap-ApoLp-III is 687 bp, including a 5'-untranslated region (UTR) of 40 bp, a 3'-UTR of 86 bp, and an open reading frame of 561 bp encoding a polypeptide of 186 amino acids that contains an Apolipophorin-III precursor domain (PF07464). The deduced Ap-ApoLp-III protein sequence has 68, 59, and 23% identity with its orthologs in Manduca sexta, Bombyx mori, and Aedes aegypti, respectively. Phylogenetic analysis showed that Ap-ApoLp-III groups closely with its orthologs from the Bombycoidea. qPCR analysis revealed that Ap-ApoLp-III is expressed during all four developmental stages and in the integument, fat body, and ovaries. After infection with six types of microorganisms, expression of the Ap-ApoLp-III gene was significantly upregulated at different time points compared with controls. Following RNA interference (RNAi) against Ap-ApoLp-III, qPCR showed that Ap-ApoLp-III expression was significantly downregulated after injection of E. coli. We infer that the Ap-ApoLp-III gene acts in the innate immunity of A. pernyi. © 2014 Wiley Periodicals, Inc.
The Maximum Likelihood Solution for Inclination-only Data
NASA Astrophysics Data System (ADS)
Arason, P.; Levi, S.
2006-12-01
The arithmetic means of inclination-only data are known to introduce a shallowing bias. Several methods have been proposed to estimate unbiased means of the inclination along with measures of the precision. Most of the inclination-only methods were designed to maximize the likelihood function of the marginal Fisher distribution. However, the exact analytical form of the maximum likelihood function is fairly complicated, and all these methods require various assumptions and approximations that are inappropriate for many data sets. For some steep and dispersed data sets, the estimates provided by these methods are significantly displaced from the peak of the likelihood function to systematically shallower inclinations. The problem in locating the maximum of the likelihood function is partly due to difficulties in accurately evaluating the function for all values of interest. This is because some elements of the log-likelihood function increase exponentially as precision parameters increase, leading to numerical instabilities. In this study, we succeeded in analytically cancelling exponential elements from the likelihood function, and we are now able to calculate its value for any location in the parameter space and for any inclination-only data set, with full accuracy. Furthermore, we can now calculate the partial derivatives of the likelihood function with the desired accuracy. Locating the maximum likelihood without the assumptions required by previous methods is now straightforward. The information to separate the mean inclination from the precision parameter will be lost for very steep and dispersed data sets. It is worth noting that the likelihood function always has a maximum value. However, for some dispersed and steep data sets with few samples, the likelihood function takes its highest value on the boundary of the parameter space, i.e. at inclinations of +/- 90 degrees, but with relatively well-defined dispersion.
Our simulations indicate that this occurs quite frequently for certain data sets, and relatively small perturbations in the data will drive the maxima to the boundary. We interpret this to indicate that, for such data sets, the information needed to separate the mean inclination and the precision parameter is permanently lost. To assess the reliability and accuracy of our method we generated a large number of random Fisher-distributed data sets and used seven methods to estimate the mean inclination and precision parameter. These comparisons are described by Levi and Arason at the 2006 AGU Fall meeting. The results of the various methods are very favourable to our new robust maximum likelihood method, which, on average, is the most reliable, and its mean inclination estimates are the least biased toward shallow values. Further information on our inclination-only analysis can be obtained from: http://www.vedur.is/~arason/paleomag
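The analytical cancellation described above can be sketched numerically. In this illustrative Python reimplementation (not the authors' code; the data set is hypothetical), the exponentially growing terms are cancelled by writing log I0(x) = x + log i0e(x), with i0e the exponentially scaled Bessel function, which leaves the bounded exponent k(cos(I - I0) - 1):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import i0e

def neg_log_likelihood(params, inc):
    """Negative log-likelihood of inclination-only data under the marginal
    Fisher density f(I) = (k / 2 sinh k) cos I exp(k sin I sin I0) I0(k cos I cos I0),
    with the exponential terms cancelled analytically via
    log I0(x) = x + log i0e(x) and log 2 sinh k = k + log(1 - exp(-2k))."""
    inc0, k = params
    if k <= 0 or not (-np.pi / 2 < inc0 < np.pi / 2):
        return np.inf
    x = k * np.cos(inc) * np.cos(inc0)              # Bessel argument (>= 0 here)
    ll = (np.log(k) - np.log1p(-np.exp(-2.0 * k))
          + np.log(np.cos(inc))
          + k * (np.cos(inc - inc0) - 1.0)          # bounded exponent
          + np.log(i0e(x)))
    return -np.sum(ll)

# Hypothetical steep, dispersed inclination-only data set (degrees -> radians).
inc = np.radians([62.0, 68.0, 75.0, 71.0, 58.0, 80.0, 65.0, 73.0, 69.0, 77.0])

res = minimize(neg_log_likelihood, x0=[np.mean(inc), 10.0], args=(inc,),
               method="Nelder-Mead")
inc_ml, kappa_ml = res.x
print(f"ML inclination {np.degrees(inc_ml):.2f} deg (arithmetic mean "
      f"{np.degrees(np.mean(inc)):.2f} deg), kappa {kappa_ml:.1f}")
```

As the abstract predicts, the maximum likelihood inclination comes out steeper than the biased arithmetic mean.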
Implicit timing activates the left inferior parietal cortex.
Wiener, Martin; Turkeltaub, Peter E; Coslett, H Branch
2010-11-01
Coull and Nobre (2008) suggested that tasks that employ temporal cues might be divided on the basis of whether these cues are explicitly or implicitly processed. Furthermore, they suggested that implicit timing preferentially engages the left cerebral hemisphere. We tested this hypothesis by conducting a quantitative meta-analysis of eleven neuroimaging studies of implicit timing using the activation-likelihood estimation (ALE) algorithm (Turkeltaub, Eden, Jones, & Zeffiro, 2002). Our analysis revealed a single but robust cluster of activation-likelihood in the left inferior parietal cortex (supramarginal gyrus). This result is in accord with the hypothesis that the left hemisphere subserves implicit timing mechanisms. Furthermore, in conjunction with a previously reported meta-analysis of explicit timing tasks, our data support the claim that implicit and explicit timing are supported by at least partially distinct neural structures. Copyright © 2010 Elsevier Ltd. All rights reserved.
Lun, Aaron T L; Chen, Yunshun; Smyth, Gordon K
2016-01-01
RNA sequencing (RNA-seq) is widely used to profile transcriptional activity in biological systems. Here we present an analysis pipeline for differential expression analysis of RNA-seq experiments using the Rsubread and edgeR software packages. The basic pipeline includes read alignment and counting, filtering and normalization, modelling of biological variability and hypothesis testing. For hypothesis testing, we describe particularly the quasi-likelihood features of edgeR. Some more advanced downstream analysis steps are also covered, including complex comparisons, gene ontology enrichment analyses and gene set testing. The code required to run each step is described, along with an outline of the underlying theory. The chapter includes a case study in which the pipeline is used to study the expression profiles of mammary gland cells in virgin, pregnant and lactating mice.
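The filtering step of such a pipeline can be illustrated outside R. The sketch below uses a hypothetical count matrix and a counts-per-million cutoff merely in the spirit of edgeR's filtering (roughly ten reads at the mean library size), not its actual rule:

```python
import numpy as np

# Toy count matrix (genes x samples); all values are hypothetical.
counts = np.array([[  0,   1,    0,   2],   # lowly expressed everywhere
                   [ 50,  60,   45,  70],
                   [900, 850, 1000, 780],
                   [  5, 300,  280, 310]])

lib_sizes = counts.sum(axis=0)               # per-sample library sizes
cpm = counts / lib_sizes * 1e6               # counts per million

# Keep genes whose CPM clears a cutoff (about 10 reads at the mean
# library size) in at least two samples.
cpm_cutoff = 10 / lib_sizes.mean() * 1e6
keep = (cpm > cpm_cutoff).sum(axis=1) >= 2
print("genes kept:", np.flatnonzero(keep))
```

In the actual pipeline, the filtered matrix would then be normalized and passed to edgeR's quasi-likelihood dispersion estimation and F-tests.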
Robust Methods for Moderation Analysis with a Two-Level Regression Model.
Yang, Miao; Yuan, Ke-Hai
2016-01-01
Moderation analysis has many applications in social sciences. Most widely used estimation methods for moderation analysis assume that errors are normally distributed and homoscedastic. When these assumptions are not met, the results from a classical moderation analysis can be misleading. For more reliable moderation analysis, this article proposes two robust methods with a two-level regression model when the predictors do not contain measurement error. One method is based on maximum likelihood with Student's t distribution and the other is based on M-estimators with Huber-type weights. An algorithm for obtaining the robust estimators is developed. Consistent estimates of standard errors of the robust estimators are provided. The robust approaches are compared against normal-distribution-based maximum likelihood (NML) with respect to power and accuracy of parameter estimates through a simulation study. Results show that the robust approaches outperform NML under various distributional conditions. Application of the robust methods is illustrated through a real data example. An R program is developed and documented to facilitate the application of the robust methods.
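The second robust method, M-estimation with Huber-type weights, can be sketched as iteratively reweighted least squares. The code below is an illustrative implementation on simulated moderation data with heavy-tailed errors, not the authors' documented R program:

```python
import numpy as np

def huber_irls(X, y, c=1.345, tol=1e-8, max_iter=100):
    """M-estimation of regression coefficients with Huber-type weights,
    computed by iteratively reweighted least squares."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]           # OLS starting values
    for _ in range(max_iter):
        r = y - X @ beta
        s = np.median(np.abs(r - np.median(r))) / 0.6745  # MAD scale estimate
        u = np.abs(r / s)
        w = np.where(u <= c, 1.0, c / u)                  # Huber weights
        Xw = X * w[:, None]
        beta_new = np.linalg.solve(Xw.T @ X, Xw.T @ y)    # weighted normal eqns
        if np.max(np.abs(beta_new - beta)) < tol:
            break
        beta = beta_new
    return beta

# Simulated moderation model y = b0 + b1*x + b2*m + b3*x*m, heavy-tailed errors.
rng = np.random.default_rng(0)
n = 500
x, m = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + 0.5 * x + 0.3 * m + 0.4 * x * m + rng.standard_t(df=3, size=n)
X = np.column_stack([np.ones(n), x, m, x * m])
beta = huber_irls(X, y)
print(np.round(beta, 2))  # estimates should lie near (1.0, 0.5, 0.3, 0.4)
```

With t-distributed errors like these, the Huber weights downweight outlying residuals, which is what gives the robust estimator its advantage over normal-theory maximum likelihood in the simulations described.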
Ran, Li; Zhao, Wenli; Zhao, Ye; Bu, Huaien
2017-07-01
Contrast-enhanced ultrasound (CEUS) is considered a novel method for diagnosing pancreatic cancer, but there is currently no conclusive evidence of its accuracy. We aimed to evaluate the diagnostic accuracy of CEUS in discriminating pancreatic carcinoma from other pancreatic lesions. Relevant studies were selected from the PubMed, Cochrane Library, Elsevier, CNKI, VIP, and WANFANG databases dating from January 2006 to May 2017. The following terms were used as keywords: "pancreatic cancer" OR "pancreatic carcinoma," "contrast-enhanced ultrasonography" OR "contrast-enhanced ultrasound" OR "CEUS," and "diagnosis." The selection criteria were as follows: pancreatic carcinomas were diagnosed by CEUS while the main reference standard was surgical pathology or biopsy (if a clinical diagnosis was involved, the particular criteria used were emphasized); SonoVue or Levovist was the contrast agent; true positive, false positive, false negative, and true negative rates were obtained or calculated to construct the 2 × 2 contingency table; articles were in English or Chinese; and at least 20 patients were enrolled in each group. The Quality Assessment for Studies of Diagnostic Accuracy was employed to evaluate the quality of the articles. Pooled sensitivity, specificity, positive likelihood ratio, negative likelihood ratio, diagnostic odds ratio, summary receiver-operating characteristic (SROC) curves, and the area under the curve were evaluated to estimate the overall diagnostic efficiency. Pooled sensitivity, specificity, positive likelihood ratio, and negative likelihood ratio with 95% confidence intervals (CIs) were calculated with fixed-effect models. Eight of 184 records were eligible for meta-analysis after independent scrutiny by two reviewers.
The pooled sensitivity, specificity, positive likelihood ratio, negative likelihood ratio, and diagnostic odds ratios were 0.86 (95% CI 0.81-0.90), 0.75 (95% CI 0.68-0.82), 3.56 (95% CI 2.64-4.78), 0.19 (95% CI 0.13-0.27), and 22.260 (95% CI 8.980-55.177), respectively. The area under the SROC curve was 0.9088. CEUS has a satisfying pooled sensitivity and specificity for discriminating pancreatic cancer from other pancreatic lesions.
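The reported indices are linked by simple identities: LR+ = sensitivity / (1 - specificity), LR- = (1 - sensitivity) / specificity, and DOR = LR+ / LR-. Applying these to the pooled sensitivity and specificity gives values close to, but not identical with, the separately pooled ratios quoted above, since each index was pooled in its own model:

```python
def diagnostic_ratios(sens, spec):
    """Likelihood ratios and diagnostic odds ratio from sensitivity/specificity."""
    lr_pos = sens / (1.0 - spec)       # positive likelihood ratio
    lr_neg = (1.0 - sens) / spec       # negative likelihood ratio
    dor = lr_pos / lr_neg              # diagnostic odds ratio
    return lr_pos, lr_neg, dor

# Pooled sensitivity 0.86 and specificity 0.75 from the meta-analysis above.
lr_pos, lr_neg, dor = diagnostic_ratios(0.86, 0.75)
print(f"LR+ = {lr_pos:.2f}, LR- = {lr_neg:.2f}, DOR = {dor:.1f}")
```

This yields LR+ ≈ 3.44 and LR- ≈ 0.19, consistent in magnitude with the pooled values of 3.56 and 0.19.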
Wales, Joshua; Kurahashi, Allison M; Husain, Amna
2018-06-20
Home is a preferred place of death for many people; however, access to a home death may not be equitable. The impact of socioeconomic status (SES) on one's ability to die at home has been documented, yet there remains little literature exploring the mechanisms that contribute to this disparity. By exploring the experiences and insights of physicians who provide end-of-life care in the home, this study aims to identify the factors perceived to influence patients' likelihood of home death and to describe the mechanisms by which they interact with socioeconomic status. In this exploratory qualitative study, we conducted interviews with nine physicians who provide home-based care at a specialized palliative care centre. Participants were asked about their experiences caring for patients at the end of life, focusing on factors believed to impact the likelihood of home death, with an emphasis on socioeconomic status, and on opportunities for intervention. We relied on participants' perceptions of SES rather than objective measures. We used an inductive content analysis to identify and describe factors that physicians perceive to influence a patient's likelihood of dying at home. Factors identified by physicians were organized into three categories: patient characteristics, physical environment, and support network. Patient preference for home death was seen as a necessary factor. If this was established, participants suggested that having a strong support network to supplement professional care was critical to achieving home death. Finally, safe and sustainable housing was also felt to improve the likelihood of home death. Higher SES was perceived to increase the likelihood of a desired home death by affording access to more resources within each of the categories. This included better health and health care understanding, a higher capacity for advocacy, a more stable home environment, and more caregiver support.
SES was not perceived to be an isolated factor impacting likelihood of home death, but rather a means to address shortfalls in the three identified categories. Identifying the factors that influence ability is the first step in ensuring home death is accessible to all patients who desire it, regardless of socioeconomic status.
Likelihood ratio meta-analysis: New motivation and approach for an old method.
Dormuth, Colin R; Filion, Kristian B; Platt, Robert W
2016-03-01
A 95% confidence interval (CI) in an updated meta-analysis may not have the expected 95% coverage. If a meta-analysis is simply updated with additional data, then the resulting 95% CI will be wrong because it will not have accounted for the fact that the earlier meta-analysis failed or succeeded in excluding the null. This situation can be avoided by using the likelihood ratio (LR) as a measure of evidence that does not depend on type-1 error. We show how an LR-based approach, first advanced by Goodman, can be used in a meta-analysis to pool data from separate studies to quantitatively assess where the total evidence points. The method works by estimating the log-likelihood ratio (LogLR) function from each study. Those functions are then summed to obtain a combined function, which is then used to retrieve the total effect estimate, and a corresponding 'intrinsic' confidence interval. Using as illustrations the CAPRIE trial of clopidogrel versus aspirin in the prevention of ischemic events, and our own meta-analysis of higher potency statins and the risk of acute kidney injury, we show that the LR-based method yields the same point estimate as the traditional analysis, but with an intrinsic confidence interval that is appropriately wider than the traditional 95% CI. The LR-based method can be used to conduct both fixed effect and random effects meta-analyses, it can be applied to old and new meta-analyses alike, and results can be presented in a format that is familiar to a meta-analytic audience. Copyright © 2016 Elsevier Inc. All rights reserved.
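The pooling mechanics can be sketched with a normal approximation to each study's log-likelihood: the per-study LogLR curves are summed, the maximum of the combined curve gives the pooled estimate, and an interval is read off where the likelihood ratio against the maximum stays above a cutoff. The study values and the conventional 1/8 cutoff below are illustrative only; the paper's "intrinsic" interval is constructed differently:

```python
import numpy as np

# Hypothetical study results on a log-odds-ratio scale: (estimate, standard error).
studies = [(-0.22, 0.10), (-0.35, 0.15), (-0.10, 0.12)]

theta = np.linspace(-1.0, 0.5, 20001)

# Per-study log-likelihood (normal approximation), summed across studies.
log_lik = sum(-(theta - est) ** 2 / (2 * se ** 2) for est, se in studies)

mle = theta[np.argmax(log_lik)]   # equals the inverse-variance weighted mean

# Support interval: all theta whose likelihood ratio vs the maximum exceeds 1/8.
support = theta[log_lik - log_lik.max() > np.log(1 / 8)]
print(f"combined estimate {mle:.3f}, 1/8 support interval "
      f"[{support.min():.3f}, {support.max():.3f}]")
```

The combined maximum coincides with the familiar fixed-effect point estimate, which matches the paper's observation that the LR-based method reproduces the traditional point estimate while widening the interval.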
NASA Astrophysics Data System (ADS)
Brouwer, Derk H.; van Duuren-Stuurman, Birgit; Berges, Markus; Bard, Delphine; Jankowska, Elzbieta; Moehlmann, Carsten; Pelzer, Johannes; Mark, Dave
2013-11-01
Manufactured nano-objects, agglomerates, and aggregates (NOAA) may have adverse effects on human health, but little is known about occupational risks since actual estimates of exposure are lacking. In a large-scale workplace air-monitoring campaign, 19 enterprises were visited and 120 potential exposure scenarios were measured. A multi-metric exposure assessment approach was followed, and a decision logic was developed to afford analysis of all results in concert. The overall evaluation was classified by categories of likelihood of exposure. At the task level, about 53% of scenarios showed increased particle number or surface area concentrations compared to "background" levels, whereas 72% of the TEM samples revealed an indication that NOAA were present in the workplace. For 54 out of the 120 task-based exposure scenarios, an overall evaluation could be made based on all parameters of the decision logic. For only 1 exposure scenario (approximately 2%) was the highest level of potential likelihood assigned, whereas for 56% of the exposure scenarios in total, the overall evaluation revealed the lowest level of likelihood. However, for the remaining 42%, exposure to NOAA could not be excluded.
Maximum Likelihood Estimation with Emphasis on Aircraft Flight Data
NASA Technical Reports Server (NTRS)
Iliff, K. W.; Maine, R. E.
1985-01-01
Accurate modeling of flexible space structures is an important field that is currently under investigation. Parameter estimation, using methods such as maximum likelihood, is one of the ways that the model can be improved. The maximum likelihood estimator has been used to extract stability and control derivatives from flight data for many years. Most of the literature on aircraft estimation concentrates on new developments and applications, assuming familiarity with basic estimation concepts. Some of these basic concepts are presented. The maximum likelihood estimator and the aircraft equations of motion that the estimator uses are briefly discussed. The basic concepts of minimization and estimation are examined for a simple computed aircraft example. The cost functions that are to be minimized during estimation are defined and discussed. Graphic representations of the cost functions are given to help illustrate the minimization process. Finally, the basic concepts are generalized, and estimation from flight data is discussed. Specific examples of estimation of structural dynamics are included. Some of the major conclusions for the computed example are also developed for the analysis of flight data.
2013-01-01
Background: Falls among the elderly are a major public health concern. Therefore, a modeling technique that could better estimate fall probability is both timely and needed. Using biomedical, pharmacological, and demographic variables as predictors, latent class analysis (LCA) is demonstrated as a tool for the prediction of falls among community-dwelling elderly. Methods: Using a retrospective data-set, a two-step LCA modeling approach was employed. First, we looked for the optimal number of latent classes for the seven medical indicators, along with the patients' prescription medications and three covariates (age, gender, and number of medications). Second, the appropriate latent class structure, with the covariates, was modeled on the distal outcome (fall/no fall). The default estimator was maximum likelihood with robust standard errors. The Pearson chi-square, likelihood ratio chi-square, BIC, Lo-Mendell-Rubin adjusted likelihood ratio test, and the bootstrap likelihood ratio test were used for model comparisons. Results: A review of the model fit indices with covariates showed that a six-class solution was preferred. The predictive probability for latent classes ranged from 84% to 97%. Entropy, a measure of classification accuracy, was good at 90%. Specific prescription medications were found to strongly influence group membership. Conclusions: The LCA method was effective at finding relevant subgroups within a heterogeneous at-risk population for falling. This study demonstrated that LCA offers researchers a valuable tool to model medical data. PMID:23705639
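The information-criterion part of such a model comparison can be illustrated with BIC = -2 log L + p log n, where lower is better. The log-likelihoods and parameter counts below are invented for illustration and are not the study's values:

```python
import numpy as np

def bic(log_lik, n_params, n_obs):
    """Bayesian information criterion: lower is better."""
    return -2.0 * log_lik + n_params * np.log(n_obs)

# Hypothetical maximized log-likelihoods for LCA fits with 2..7 classes.
fits = {k: ll for k, ll in zip(range(2, 8),
                               [-5200.0, -5050.0, -4960.0, -4905.0,
                                -4850.0, -4840.0])}
n_obs = 1000
n_params = {k: 12 * k - 1 for k in fits}   # assumed parameter count per solution

bics = {k: bic(fits[k], n_params[k], n_obs) for k in fits}
best = min(bics, key=bics.get)
print(f"preferred solution: {best} classes (BIC {bics[best]:.1f})")
```

With these illustrative numbers the fit improvement from adding a seventh class no longer offsets the penalty term, so the six-class solution is preferred, echoing the pattern reported in the abstract.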
Schroeder, Thomas J; Rodgers, Gregory B
2013-10-01
While unintentional injuries and hazard patterns involving consumer products have been studied extensively in recent years, little attention has focused on the characteristics of those who are hospitalized after treatment in emergency departments, as opposed to those treated and released. This study quantifies the impact of the age and sex of the injury victims, and other factors, on the likelihood of hospitalization. The analysis focuses on consumer product injuries, and was based on approximately 400,000 injury cases reported through the U.S. Consumer Product Safety Commission's National Electronic Injury Surveillance System, a national probability sample of U.S. hospital emergency departments. Logistic regression was used to quantify the factors associated with the likelihood of hospitalization. The analysis suggests a smooth U-shaped relationship between the age of the victim and the likelihood of hospitalization, declining from about 3.4% for children under age 5 years to 1.9% for 15-24 year-olds, but then rising to more than 25% for those ages 75 years and older. The likelihood of hospitalization was also significantly affected by the victim's sex, as well as by the types of products involved, fire involvement, and the size and type of hospital at which the injury was treated. This study shows that the probability of hospitalization is strongly correlated with the characteristics of those who are injured, as well as other factors. Published by Elsevier Ltd.
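A quadratic-in-age logistic model reproduces the U-shape described. The coefficients below are hypothetical, back-solved only to match the quoted percentages (about 3.4% in early childhood, 1.9% for young adults, and 25% at the oldest ages); the study's actual model included many more covariates such as sex, product type, fire involvement, and hospital characteristics:

```python
import math

# Hypothetical coefficients for logit(p) = B0 + B1*age + B2*age^2, chosen only
# to reproduce the percentages quoted above; illustrative, not fitted values.
B0, B1, B2 = -3.2394, -0.05590, 0.0010331

def p_hospital(age):
    """Predicted probability of hospitalization at a given age."""
    logit = B0 + B1 * age + B2 * age ** 2
    return 1.0 / (1.0 + math.exp(-logit))

for age in (2, 20, 80):
    print(f"age {age:2d}: {100 * p_hospital(age):.1f}%")
```

The negative linear and positive quadratic terms give the decline through young adulthood followed by the steep rise at older ages.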
Harbert, Robert S; Nixon, Kevin C
2015-08-01
Plant distributions have long been understood to be correlated with the environmental conditions to which species are adapted. Climate is one of the major components driving species distributions. Therefore, it is expected that the plants coexisting in a community are reflective of the local environment, particularly climate. Presented here is a method for the estimation of climate from local plant species coexistence data. The method, Climate Reconstruction Analysis using Coexistence Likelihood Estimation (CRACLE), is a likelihood-based method that employs specimen collection data at a global scale for the inference of species climate tolerance. CRACLE calculates the maximum joint likelihood of coexistence given individual species climate tolerance characterizations to estimate the expected climate. Plant distribution data for more than 4000 species were used to show that this method accurately infers expected climate profiles for 165 sites with diverse climatic conditions. Estimates differ from the WorldClim global climate model by less than 1.5°C on average for mean annual temperature and less than ∼250 mm for mean annual precipitation. This is a significant improvement upon other plant-based climate-proxy methods. CRACLE validates long-hypothesized interactions between climate and local associations of plant species. Furthermore, CRACLE successfully estimates climate that is consistent with the widely used WorldClim model and therefore may be applied to the quantitative estimation of paleoclimate in future studies. © 2015 Botanical Society of America, Inc.
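The core CRACLE step, maximizing the joint likelihood of coexistence, can be sketched with normal climate-tolerance summaries, one per species; all numbers below are hypothetical stand-ins for tolerances estimated from specimen records. The joint log-likelihood is the sum of the species' log densities, and its maximizer is the inferred climate:

```python
import numpy as np

# Hypothetical mean-annual-temperature tolerances (deg C) for five co-occurring
# species, each summarized as a normal density: (mean, standard deviation).
tolerances = [(12.0, 4.0), (15.0, 3.0), (14.0, 5.0), (16.0, 2.5), (13.0, 3.5)]

t = np.linspace(-5, 35, 4001)   # candidate climate values

def log_norm_pdf(x, mu, sd):
    return -0.5 * ((x - mu) / sd) ** 2 - np.log(sd * np.sqrt(2 * np.pi))

# Joint (summed log) likelihood of all species co-occurring at climate t;
# the maximizer is the CRACLE-style climate estimate.
joint = sum(log_norm_pdf(t, mu, sd) for mu, sd in tolerances)
mat_estimate = t[np.argmax(joint)]
print(f"inferred mean annual temperature: {mat_estimate:.2f} deg C")
```

With normal tolerances the maximizer reduces to a precision-weighted mean of the species' optima, so narrow-tolerance species dominate the inference, which is the intuition behind coexistence-based climate reconstruction.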
Cross-validation to select Bayesian hierarchical models in phylogenetics.
Duchêne, Sebastián; Duchêne, David A; Di Giallonardo, Francesca; Eden, John-Sebastian; Geoghegan, Jemma L; Holt, Kathryn E; Ho, Simon Y W; Holmes, Edward C
2016-05-26
Recent developments in Bayesian phylogenetic models have increased the range of inferences that can be drawn from molecular sequence data. Accordingly, model selection has become an important component of phylogenetic analysis. Methods of model selection generally consider the likelihood of the data under the model in question. In the context of Bayesian phylogenetics, the most common approach involves estimating the marginal likelihood, which is typically done by integrating the likelihood across model parameters, weighted by the prior. Although this method is accurate, it is sensitive to the presence of improper priors. We explored an alternative approach based on cross-validation that is widely used in evolutionary analysis. This involves comparing models according to their predictive performance. We analysed simulated data and a range of viral and bacterial data sets using a cross-validation approach to compare a variety of molecular clock and demographic models. Our results show that cross-validation can be effective in distinguishing between strict- and relaxed-clock models and in identifying demographic models that allow growth in population size over time. In most of our empirical data analyses, the model selected using cross-validation was able to match that selected using marginal-likelihood estimation. The accuracy of cross-validation appears to improve with longer sequence data, particularly when distinguishing between relaxed-clock models. Cross-validation is a useful method for Bayesian phylogenetic model selection. This method can be readily implemented even when considering complex models where selecting an appropriate prior for all parameters may be difficult.
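The predictive-performance idea generalizes beyond phylogenetics. As a generic stand-in for the comparison described (not the authors' Bayesian implementation), the sketch below compares a "no change" model with a model allowing change over time by mean held-out log predictive density under k-fold cross-validation on simulated data:

```python
import numpy as np

rng = np.random.default_rng(7)
x = np.linspace(0, 10, 60)
y = 0.8 * x + rng.normal(0, 1.0, x.size)   # simulated data with a genuine trend

def heldout_log_density(x, y, degree, n_folds=5):
    """Mean held-out log predictive density of a polynomial model under
    k-fold cross-validation (Gaussian predictive density)."""
    idx = np.arange(x.size)
    total = []
    for fold in np.array_split(idx, n_folds):
        train = np.setdiff1d(idx, fold)
        coef = np.polyfit(x[train], y[train], degree)
        resid = y[train] - np.polyval(coef, x[train])
        sigma = resid.std(ddof=degree + 1)
        pred = np.polyval(coef, x[fold])
        total.extend(-0.5 * ((y[fold] - pred) / sigma) ** 2
                     - np.log(sigma * np.sqrt(2 * np.pi)))
    return np.mean(total)

flat = heldout_log_density(x, y, degree=0)   # constant ("no growth") model
trend = heldout_log_density(x, y, degree=1)  # model allowing change over time
print(f"held-out log density: flat {flat:.2f}, trend {trend:.2f}")
```

The model that captures the underlying change predicts held-out data better, mirroring how cross-validation identified demographic models allowing population growth in the study.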
Who are the patients that default tuberculosis treatment? - space matters!
Nunes, C; Duarte, R; Veiga, A M; Taylor, B
2017-04-01
The goals of this article are: (i) to understand how individual characteristics affect the likelihood of patients defaulting their pulmonary tuberculosis (PTB) treatment regimens; (ii) to quantify the predictive capacity of these risk factors; and (iii) to quantify and map spatial variation in the risk of defaulting. We used logistic regression models and generalized additive models with a spatial component to determine the odds of default across continental Portugal. We focused on new PTB cases, diagnosed between 2000 and 2013, and included some individual information (sex, age, residence area, alcohol abuse, intravenous drug use, homelessness, HIV, imprisonment status). We found that the global default rate was 4·88%, higher in individuals with well-known risk profiles (males, immigrants, HIV positive, homeless, prisoners, alcohol and drug users). Of specific epidemiological interest was that our geographical analysis found that Portugal's main urban areas (the two biggest cities) and one tourist region have higher default rates compared to the rest of the country, after adjusting for the previously mentioned risk factors. The challenge of treatment defaulting, either due to other individual non-measured characteristics, healthcare system failure or patient recalcitrance, requires further analysis in the spatio-temporal domain. Our findings suggest the presence of significant within-country variation in the risk of defaulting that cannot be explained by these classical individual risk factors alone. The methods we advocate are simple to implement and could easily be applied to other diseases.
Olsho, Lauren Ew; Payne, Gayle Holmes; Walker, Deborah Klein; Baronberg, Sabrina; Jernigan, Jan; Abrami, Alyson
2015-10-01
The present study examines the impact of Health Bucks, a farmers' market incentive programme, on awareness of and access to farmers' markets, and fruit and vegetable purchase and consumption in low-income New York City neighbourhoods. The evaluation used two primary data collection methods: (i) an on-site point-of-purchase survey of farmers' market shoppers; and (ii) a random-digit-dial telephone survey of residents in neighbourhoods where the programme operates. Additionally, we conducted a quasi-experimental analysis examining differential time trends in consumption before and after programme introduction using secondary Community Health Survey (CHS) data. New York City farmers' markets and communities. Farmers' market shoppers (n 2287) completing point-of-purchase surveys in a representative sample of New York City farmers' markets in 2010; residents (n 1025) completing random-digit-dial telephone survey interviews in 2010; and respondents (n 35 606) completing CHS interviews in 2002, 2004, 2008 and 2009. Greater Health Bucks exposure was associated with: (i) greater awareness of farmers' markets; (ii) increased frequency and amount of farmers' market purchases; and (iii) greater likelihood of a self-reported year-over-year increase in fruit and vegetable consumption. However, our CHS analysis did not detect impacts on consumption. While our study provides promising evidence that use of farmers' market incentives is associated with increased awareness and use of farmers' markets, additional research is needed to better understand impacts on fruit and vegetable consumption.
Oliva-Moreno, Juan; Peña-Longobardo, Luz María; Mar, Javier; Masjuan, Jaime; Soulard, Stéphane; Gonzalez-Rojas, Nuria; Becerra, Virginia; Casado, Miguel Ángel; Torres, Covadonga; Yebenes, María; Quintana, Manuel; Alvarez-Sabín, Jose
2018-01-01
The aim of this article was to analyze the likelihood of receiving informal care after a stroke and to study the burden and risk of burnout of primary caregivers in Spain. The CONOCES study is an epidemiological, observational, prospective, multicenter study of patients diagnosed with stroke and admitted to a Stroke Unit in the Spanish healthcare system. At 3 and 12 months post-event, we estimated the time spent caring for the patient and the burden borne by primary caregivers. Several multivariate models were applied to estimate the likelihood of receiving informal caregiving, the burden, and the likelihood of caregivers being at a high risk of burnout. Eighty percent of those still alive at 3 and 12 months poststroke were receiving informal care. More than 40% of those receiving care needed a secondary caregiver at 3 months poststroke. The likelihood of receiving informal care was associated with stroke severity and the individual's health-related quality of life. When informal care was provided, both the burden borne by caregivers and the likelihood of caregivers being at a high risk of burnout was associated with (1) caregiving hours; (2) the patient's health-related quality of life; (3) the severity of the stroke measured at discharge; (4) the patient having atrial fibrillation; and (5) the degree of dependence. This study reveals the heavy burden borne by the caregivers of stroke survivors. Our analysis also identifies explanatory and predictive variables for the likelihood of receiving informal care, caregiver burden, and high risk of burnout. © 2017 American Heart Association, Inc.
Hollander, Anna-Clara
2013-01-01
The aim of this PhD project was to increase knowledge, using population-based registers, of how pre- and post-migration factors and social determinants of health are associated with inequalities in poor mental health and mortality among refugees and other immigrants to Sweden. Studies I and II had cross-sectional designs and used logistic regression analysis to study differences in poor mental health (measured by purchases of prescribed psychotropic drugs) between refugee and non-refugee immigrants. In Study I, there was a significant difference in poor mental health between female refugees and non-refugees (OR=1.27; CI=1.15–1.40) when adjusted for socio-economic factors. In Study II, refugees of most origins had a higher likelihood of poor mental health than non-refugees of the same origin. Studies III and IV had cohort designs and used Cox regression analysis. Study III analysed mortality rates among non-labour immigrants. Male refugees had higher relative risks of mortality from cardiovascular disease (HR=1.53; CI=1.04–2.24) and external causes (HR=1.59; CI=1.01–2.50) than male non-refugees did, adjusted for socio-economic factors. Study IV included the population with a strong connection to the labour market in 1999 to analyse the relative risk of hospitalisation due to depressive disorder following unemployment. The lowest relative risk was found among employed Swedish-born men and the highest among foreign-born females who lost employment during follow-up (HR=3.47; CI=3.02–3.98). Immigrants, and particularly refugees, have poorer mental health than native Swedes. Refugee men have a higher relative mortality risk for cardiovascular disease and external causes of death than do non-refugees. The relative risk of hospitalisation due to depressive disorder following unemployment was highest among immigrant women.
To promote mental health and reduce mortality among immigrants, it is important to consider pre- and post-migration factors and the general social determinants of health. PMID:23810108
Harding, Richard; Marchetti, Stefano; Onwuteaka-Philipsen, Bregje D; Wilson, Donna M; Ruiz-Ramos, Miguel; Cardenas-Turanzas, Maria; Rhee, YongJoo; Morin, Lucas; Hunt, Katherine; Teno, Joan; Hakanson, Cecilia; Houttekier, Dirk; Deliens, Luc; Cohen, Joachim
2018-01-25
With over 1 million HIV-related deaths annually, quality end-of-life care remains a priority. Given strong public preference for home death, place of death is an important consideration for quality care. This 11-country study aimed to: i) describe the number, proportion of all deaths, and demographics of HIV-related deaths; ii) identify place of death; iii) compare place of death with that of cancer patients; and iv) determine patient/health-system factors associated with place of HIV-related death. In this retrospective analysis of death certification, data were extracted for the full population (ICD-10 codes B20-B24) for a 1-year period: deceased's demographic characteristics, place of death, and healthcare supply. i) 19,739 deaths were attributed to HIV. The highest proportion (per 1000 deaths) was for Mexico (9.8‰), and the lowest for Sweden (0.2‰). The majority of deaths were among men (75%) and those aged <50 (69.1%). ii) Hospital was the most common place of death in all countries: from 56.6% in the Netherlands to 90.9% in South Korea. The least common places were hospice facility (3.3%-5.7%), nursing home (0%-17.6%), and home (5.9%-26.3%). iii) Age-standardised relative risks showed that those with HIV were less likely to die at home and more likely to die in hospital compared with cancer patients, and in most countries more likely to die in a nursing home. iv) Multivariate analysis found that men were more likely to die at home in the UK, Canada, USA, and Mexico; a greater number of hospital beds reduced the likelihood of dying at home in Italy and Mexico; and a higher number of GPs was associated with home death in Italy and Mexico. With increasing comorbidity among people ageing with HIV, it is essential that end-of-life preferences are established and met. Differences in place of death according to country and diagnosis demonstrate the importance of ensuring a "good death" for people with HIV, alongside efforts to optimise treatment.
Dong, Xiaolin; Zhai, Yifan; Hu, Meiying; Zhong, Guohua; Huang, Wanjun; Zheng, Zhihua; Han, Pengfei
2013-01-01
Background Rhodojaponin III, a botanical insecticide, affects a wide variety of biological processes in insects, including reduction of feeding, suspension of development, and deterrence of adult oviposition, in a dose-dependent manner. However, the mode of these actions remains obscure. Principal Findings In this study, a comparative proteomic approach was adopted to examine the effect of rhodojaponin III on Plutella xylostella (L.). Forty-eight hours after treatment, newly emerged moths were collected and protein samples were prepared. The proteins were separated by 2-DE, and a total of 31 proteins significantly affected by rhodojaponin III relative to the control were identified by MALDI-TOF/TOF-MS/MS. These differentially expressed proteins act in nervous transduction, odorant degradation, and metabolic change pathways. Gene expression patterns in treated and untreated moths were further confirmed by qRT-PCR and western blot analysis. RNAi of the chemosensory protein (PxCSP) gene significantly increased oviposition on cabbage plants treated with rhodojaponin III. Conclusions Analysis of these rhodojaponin III-induced proteins and gene properties should contribute to a better understanding of the molecular mechanism underlying the response of P. xylostella moths to rhodojaponin III. PMID:23861792
NASA Technical Reports Server (NTRS)
Bueno, R. A.
1977-01-01
Results of the generalized likelihood ratio (GLR) technique for the detection of failures in aircraft applications are presented, and its relationship to the properties of the Kalman-Bucy filter is examined. Under the assumption that the system is perfectly modeled, the detectability and distinguishability of four failure types are investigated by means of analysis and simulations. Detection of failures is found to be satisfactory, but problems may arise in correctly identifying the mode of a failure. These issues are closely examined, as is the sensitivity of GLR to modeling errors. The advantages and disadvantages of this technique are discussed, and various modifications are suggested to reduce its limitations in performance and computational complexity.
A maximum likelihood analysis of the CoGeNT public dataset
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kelso, Chris, E-mail: ckelso@unf.edu
The CoGeNT detector, located in the Soudan Underground Laboratory in northern Minnesota, is a p-type point-contact germanium detector with a 475 g target mass (330 g fiducial) that measures the ionization charge created by nuclear recoils. This detector has searched for recoils created by dark matter since December 2009. We analyze the public dataset from the CoGeNT experiment to search for evidence of dark matter interactions with the detector. We perform an unbinned maximum likelihood fit to the data and compare the significance of different WIMP hypotheses relative to each other and to the null hypothesis of no WIMP interactions. This work presents the current status of the analysis.
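The unbinned maximum likelihood comparison described above can be illustrated with a toy model. The sketch below is not the CoGeNT analysis: the energy window, spectral shapes, and event counts are all invented for illustration. It fits a signal fraction in an exponential-plus-flat mixture and compares the best fit to the null (no-signal) hypothesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "recoil energy" data on [0, E_MAX]: flat background plus an
# exponentially falling signal (all numbers invented for illustration).
E_MAX, SLOPE = 10.0, 2.0
background = rng.uniform(0, E_MAX, size=800)
signal = rng.exponential(SLOPE, size=200)
signal = signal[signal < E_MAX]          # truncate to the analysis window
events = np.concatenate([background, signal])

def nll(f, energies):
    """Unbinned negative log-likelihood for signal fraction f."""
    # Signal pdf: truncated exponential on [0, E_MAX]; background: uniform.
    norm = 1.0 - np.exp(-E_MAX / SLOPE)
    sig_pdf = np.exp(-energies / SLOPE) / (SLOPE * norm)
    bkg_pdf = np.full_like(energies, 1.0 / E_MAX)
    return -np.sum(np.log(f * sig_pdf + (1.0 - f) * bkg_pdf))

# Scan the signal fraction and compare the best fit to the null (f = 0),
# mimicking a likelihood-ratio comparison of WIMP vs. no-WIMP hypotheses.
grid = np.linspace(0.0, 0.99, 100)
nlls = np.array([nll(f, events) for f in grid])
f_best = grid[np.argmin(nlls)]
delta_nll = nlls[0] - nlls.min()         # improvement over the null
print(f_best, delta_nll)
```

A larger `delta_nll` corresponds to stronger evidence against the null hypothesis.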
NASA Astrophysics Data System (ADS)
Aartsen, M. G.; Abraham, K.; Ackermann, M.; Adams, J.; Aguilar, J. A.; Ahlers, M.; Ahrens, M.; Altmann, D.; Anderson, T.; Ansseau, I.; Anton, G.; Archinger, M.; Arguelles, C.; Arlen, T. C.; Auffenberg, J.; Bai, X.; Barwick, S. W.; Baum, V.; Bay, R.; Beatty, J. J.; Becker Tjus, J.; Becker, K.-H.; Beiser, E.; BenZvi, S.; Berghaus, P.; Berley, D.; Bernardini, E.; Bernhard, A.; Besson, D. Z.; Binder, G.; Bindig, D.; Bissok, M.; Blaufuss, E.; Blumenthal, J.; Boersma, D. J.; Bohm, C.; Börner, M.; Bos, F.; Bose, D.; Böser, S.; Botner, O.; Braun, J.; Brayeur, L.; Bretz, H.-P.; Buzinsky, N.; Casey, J.; Casier, M.; Cheung, E.; Chirkin, D.; Christov, A.; Clark, K.; Classen, L.; Coenders, S.; Collin, G. H.; Conrad, J. M.; Cowen, D. F.; Cruz Silva, A. H.; Danninger, M.; Daughhetee, J.; Davis, J. C.; Day, M.; de André, J. P. A. M.; De Clercq, C.; del Pino Rosendo, E.; Dembinski, H.; De Ridder, S.; Desiati, P.; de Vries, K. D.; de Wasseige, G.; de With, M.; DeYoung, T.; Díaz-Vélez, J. C.; di Lorenzo, V.; Dumm, J. P.; Dunkman, M.; Eberhardt, B.; Edsjö, J.; Ehrhardt, T.; Eichmann, B.; Euler, S.; Evenson, P. A.; Fahey, S.; Fazely, A. R.; Feintzeig, J.; Felde, J.; Filimonov, K.; Finley, C.; Flis, S.; Fösig, C.-C.; Fuchs, T.; Gaisser, T. K.; Gaior, R.; Gallagher, J.; Gerhardt, L.; Ghorbani, K.; Gier, D.; Gladstone, L.; Glagla, M.; Glüsenkamp, T.; Goldschmidt, A.; Golup, G.; Gonzalez, J. G.; Góra, D.; Grant, D.; Griffith, Z.; Groß, A.; Ha, C.; Haack, C.; Haj Ismail, A.; Hallgren, A.; Halzen, F.; Hansen, E.; Hansmann, B.; Hanson, K.; Hebecker, D.; Heereman, D.; Helbing, K.; Hellauer, R.; Hickford, S.; Hignight, J.; Hill, G. C.; Hoffman, K. D.; Hoffmann, R.; Holzapfel, K.; Homeier, A.; Hoshina, K.; Huang, F.; Huber, M.; Huelsnitz, W.; Hulth, P. O.; Hultqvist, K.; In, S.; Ishihara, A.; Jacobi, E.; Japaridze, G. S.; Jeong, M.; Jero, K.; Jones, B. J. P.; Jurkovic, M.; Kappes, A.; Karg, T.; Karle, A.; Katz, U.; Kauer, M.; Keivani, A.; Kelley, J. 
L.; Kemp, J.; Kheirandish, A.; Kiryluk, J.; Klein, S. R.; Kohnen, G.; Koirala, R.; Kolanoski, H.; Konietz, R.; Köpke, L.; Kopper, C.; Kopper, S.; Koskinen, D. J.; Kowalski, M.; Krings, K.; Kroll, G.; Kroll, M.; Krückl, G.; Kunnen, J.; Kurahashi, N.; Kuwabara, T.; Labare, M.; Lanfranchi, J. L.; Larson, M. J.; Lesiak-Bzdak, M.; Leuermann, M.; Leuner, J.; Lu, L.; Lünemann, J.; Madsen, J.; Maggi, G.; Mahn, K. B. M.; Mandelartz, M.; Maruyama, R.; Mase, K.; Matis, H. S.; Maunu, R.; McNally, F.; Meagher, K.; Medici, M.; Meier, M.; Meli, A.; Menne, T.; Merino, G.; Meures, T.; Miarecki, S.; Middell, E.; Mohrmann, L.; Montaruli, T.; Morse, R.; Nahnhauer, R.; Naumann, U.; Neer, G.; Niederhausen, H.; Nowicki, S. C.; Nygren, D. R.; Obertacke Pollmann, A.; Olivas, A.; Omairat, A.; O'Murchadha, A.; Palczewski, T.; Pandya, H.; Pankova, D. V.; Paul, L.; Pepper, J. A.; Pérez de los Heros, C.; Pfendner, C.; Pieloth, D.; Pinat, E.; Posselt, J.; Price, P. B.; Przybylski, G. T.; Quinnan, M.; Raab, C.; Rädel, L.; Rameez, M.; Rawlins, K.; Reimann, R.; Relich, M.; Resconi, E.; Rhode, W.; Richman, M.; Richter, S.; Riedel, B.; Robertson, S.; Rongen, M.; Rott, C.; Ruhe, T.; Ryckbosch, D.; Sabbatini, L.; Sander, H.-G.; Sandrock, A.; Sandroos, J.; Sarkar, S.; Savage, C.; Schatto, K.; Schimp, M.; Schlunder, P.; Schmidt, T.; Schoenen, S.; Schöneberg, S.; Schönwald, A.; Schulte, L.; Schumacher, L.; Scott, P.; Seckel, D.; Seunarine, S.; Silverwood, H.; Soldin, D.; Song, M.; Spiczak, G. M.; Spiering, C.; Stahlberg, M.; Stamatikos, M.; Stanev, T.; Stasik, A.; Steuer, A.; Stezelberger, T.; Stokstad, R. G.; Stößl, A.; Ström, R.; Strotjohann, N. L.; Sullivan, G. W.; Sutherland, M.; Taavola, H.; Taboada, I.; Tatar, J.; Ter-Antonyan, S.; Terliuk, A.; Te{š}ić, G.; Tilav, S.; Toale, P. A.; Tobin, M. 
N.; Toscano, S.; Tosi, D.; Tselengidou, M.; Turcati, A.; Unger, E.; Usner, M.; Vallecorsa, S.; Vandenbroucke, J.; van Eijndhoven, N.; Vanheule, S.; van Santen, J.; Veenkamp, J.; Vehring, M.; Voge, M.; Vraeghe, M.; Walck, C.; Wallace, A.; Wallraff, M.; Wandkowsky, N.; Weaver, Ch.; Wendt, C.; Westerhoff, S.; Whelan, B. J.; Wiebe, K.; Wiebusch, C. H.; Wille, L.; Williams, D. R.; Wills, L.; Wissing, H.; Wolf, M.; Wood, T. R.; Woschnagg, K.; Xu, D. L.; Xu, X. W.; Xu, Y.; Yanez, J. P.; Yodh, G.; Yoshida, S.; Zoll, M.
2016-04-01
We present an improved event-level likelihood formalism for including neutrino telescope data in global fits to new physics. We derive limits on spin-dependent dark matter-proton scattering by employing the new formalism in a re-analysis of data from the 79-string IceCube search for dark matter annihilation in the Sun, including explicit energy information for each event. The new analysis excludes a number of models in the weak-scale minimal supersymmetric standard model (MSSM) for the first time. This work is accompanied by the public release of the 79-string IceCube data, as well as an associated computer code for applying the new likelihood to arbitrary dark matter models.
Tissue preservation with mass spectroscopic analysis: Implications for cancer diagnostics.
Hall, O Morgan; Peer, Cody J; Figg, William D
2018-05-17
Surgical intervention is a common treatment modality for localized cancer. Post-operative analysis involves evaluation of surgical margins to assess whether all malignant tissue has been resected, because positive surgical margins lead to a greater likelihood of recurrence. Secondary treatments are used to minimize the negative effects of positive surgical margins. Recently, in Science Translational Medicine, Zhang et al. described a new mass spectroscopic technique that could potentially decrease the likelihood of positive surgical margins. Their nondestructive in vivo tissue sampling yields a highly accurate and rapid cancer diagnosis, discriminating precisely between healthy and malignant tissue. This new tool has the potential to improve surgical margins and accelerate cancer diagnostics by analyzing biomolecular signatures of various tissues and diseases.
Meta-analysis: accuracy of rapid tests for malaria in travelers returning from endemic areas.
Marx, Arthur; Pewsner, Daniel; Egger, Matthias; Nüesch, Reto; Bucher, Heiner C; Genton, Blaise; Hatz, Christoph; Jüni, Peter
2005-05-17
Microscopic diagnosis of malaria is unreliable outside specialized centers. Rapid tests have become available in recent years, but their accuracy has not been assessed systematically. To determine the accuracy of rapid diagnostic tests for ruling out malaria in nonimmune travelers returning from malaria-endemic areas. The authors searched MEDLINE, EMBASE, CAB Health, and CINAHL (1988 to September 2004); hand-searched conference proceedings; checked reference lists; and contacted experts and manufacturers. Diagnostic accuracy studies in nonimmune individuals with suspected malaria were included if they compared rapid tests with expert microscopic examination or polymerase chain reaction tests. Data on study and patient characteristics and results were extracted in duplicate. The main outcome was the likelihood ratio for a negative test result (negative likelihood ratio) for Plasmodium falciparum malaria. Likelihood ratios were combined by using random-effects meta-analysis, stratified by the antigen targeted (histidine-rich protein-2 [HRP-2] or parasite lactate dehydrogenase [LDH]) and by test generation. Nomograms of post-test probabilities were constructed. The authors included 21 studies and 5747 individuals. For P. falciparum, HRP-2-based tests were more accurate than parasite LDH-based tests: Negative likelihood ratios were 0.08 and 0.13, respectively (P = 0.019 for difference). Three-band HRP-2 tests had similar negative likelihood ratios but higher positive likelihood ratios compared with 2-band tests (34.7 vs. 98.5; P = 0.003). For P. vivax, negative likelihood ratios tended to be closer to 1.0 for HRP-2-based tests than for parasite LDH-based tests (0.24 vs. 0.13; P = 0.22), but analyses were based on a few heterogeneous studies. Negative likelihood ratios for the diagnosis of P. malariae or P. ovale were close to 1.0 for both types of tests. In febrile travelers returning from sub-Saharan Africa, the typical probability of P. 
falciparum malaria is estimated at 1.1% (95% CI, 0.6% to 1.9%) after a negative 3-band HRP-2 test result and 97% (CI, 92% to 99%) after a positive test result. Few studies evaluated 3-band HRP-2 tests. The evidence is also limited for species other than P. falciparum because of the few available studies and their more heterogeneous results. Further studies are needed to determine whether the use of rapid diagnostic tests improves outcomes in returning travelers with suspected malaria. Rapid malaria tests may be a useful diagnostic adjunct to microscopy in centers without major expertise in tropical medicine. Initial decisions on treatment initiation and choice of antimalarial drugs can be based on travel history and post-test probabilities after rapid testing. Expert microscopy is still required for species identification and confirmation.
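The conversion from a likelihood ratio to a post-test probability used above is Bayes' rule on the odds scale. A minimal sketch, using the pooled HRP-2 negative likelihood ratio of 0.08 from the review; the 12% pretest probability is an illustrative assumption, not a figure from the study.

```python
# Convert a pretest probability and a likelihood ratio into a post-test
# probability: odds = p/(1-p), post-test odds = odds * LR, then back to p.
def post_test_probability(pretest_p, likelihood_ratio):
    pre_odds = pretest_p / (1.0 - pretest_p)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

# Negative HRP-2 result (pooled LR- of 0.08 for P. falciparum), assuming
# a hypothetical 12% pretest probability:
p_neg = post_test_probability(0.12, 0.08)
print(round(p_neg, 4))   # → 0.0108, i.e. roughly 1% residual probability
```

An LR of 1.0 leaves the probability unchanged, which is why tests with negative likelihood ratios close to 1.0 (as reported for P. malariae and P. ovale) are uninformative for ruling out disease.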
Lee, E Henry; Wickham, Charlotte; Beedlow, Peter A; Waschmann, Ronald S; Tingey, David T
2017-10-01
A time series intervention analysis (TSIA) of dendrochronological data to infer the tree growth-climate-disturbance relations and forest disturbance history is described. Maximum likelihood is used to estimate the parameters of a structural time series model with components for climate and forest disturbances (i.e., pests, diseases, fire). The statistical method is illustrated with a tree-ring width time series for a mature closed-canopy Douglas-fir stand on the west slopes of the Cascade Mountains of Oregon, USA that is impacted by Swiss needle cast disease caused by the foliar fungus, Phaeocryptopus gaeumannii (Rhode) Petrak. The likelihood-based TSIA method is proposed for the field of dendrochronology to understand the interaction of temperature, water, and forest disturbances that are important in forest ecology and climate change studies.
Exploring Neutrino Oscillation Parameter Space with a Monte Carlo Algorithm
NASA Astrophysics Data System (ADS)
Espejel, Hugo; Ernst, David; Cogswell, Bernadette; Latimer, David
2015-04-01
The χ2 (or likelihood) function for a global analysis of neutrino oscillation data is first calculated as a function of the neutrino mixing parameters. A computational challenge is to obtain the minima or the allowed regions for the mixing parameters. The conventional approach is to calculate the χ2 (or likelihood) function on a grid for a large number of points, and then marginalize over the likelihood function. As the number of parameters increases with the number of neutrinos, making the calculation numerically efficient becomes necessary. We implement a new Monte Carlo algorithm (D. Foreman-Mackey, D. W. Hogg, D. Lang and J. Goodman, Publications of the Astronomical Society of the Pacific, 125, 306 (2013)) to determine its computational efficiency at finding the minima and allowed regions. We examine a realistic example to compare the historical and the new methods.
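The cited algorithm is an affine-invariant ensemble MCMC sampler (emcee); a minimal sketch of the underlying idea is a random-walk Metropolis chain exploring a χ2 surface instead of a grid scan. Everything below is a toy: the two-parameter surface, its minimum at (0.3, 2.5), and the step sizes are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

def chi2(theta):
    """Toy chi-squared surface with a minimum at (0.3, 2.5); stands in
    for the chi-squared of a global oscillation fit (illustrative only)."""
    s2t, dm2 = theta
    return ((s2t - 0.3) / 0.05) ** 2 + ((dm2 - 2.5) / 0.2) ** 2

def metropolis(n_steps, start, step=(0.02, 0.08)):
    """Random-walk Metropolis chain targeting exp(-chi2 / 2)."""
    chain = [np.asarray(start, dtype=float)]
    for _ in range(n_steps):
        prop = chain[-1] + rng.normal(0.0, step)
        # Accept with probability min(1, exp(-(chi2_new - chi2_old)/2)).
        if rng.random() < np.exp(-0.5 * (chi2(prop) - chi2(chain[-1]))):
            chain.append(prop)
        else:
            chain.append(chain[-1])
    return np.array(chain)

chain = metropolis(20000, start=(0.5, 2.0))
burned = chain[5000:]                 # discard burn-in
print(burned.mean(axis=0))            # hovers near the minimum (0.3, 2.5)
```

The chain concentrates samples where the likelihood is high, so allowed regions fall out of the sample density itself, with no need to evaluate the function on a dense grid.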
Efficient estimators for likelihood ratio sensitivity indices of complex stochastic dynamics.
Arampatzis, Georgios; Katsoulakis, Markos A; Rey-Bellet, Luc
2016-03-14
We demonstrate that centered likelihood ratio estimators for the sensitivity indices of complex stochastic dynamics are highly efficient with low, constant in time variance and consequently they are suitable for sensitivity analysis in long-time and steady-state regimes. These estimators rely on a new covariance formulation of the likelihood ratio that includes as a submatrix a Fisher information matrix for stochastic dynamics and can also be used for fast screening of insensitive parameters and parameter combinations. The proposed methods are applicable to broad classes of stochastic dynamics such as chemical reaction networks, Langevin-type equations and stochastic models in finance, including systems with a high dimensional parameter space and/or disparate decorrelation times between different observables. Furthermore, they are simple to implement as a standard observable in any existing simulation algorithm without additional modifications.
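The estimators above build on the basic likelihood-ratio (score-function) sensitivity identity d/dθ E[f(X)] = E[f(X) · ∂θ log p(X; θ)]. The sketch below shows only that core identity for a single exponential random variable, not the paper's centered covariance construction; the rate and sample size are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

# Likelihood-ratio (score-function) sensitivity for X ~ Exponential(rate lam):
#   d/d lam E[f(X)] = E[ f(X) * d/d lam log p(X; lam) ],
# where the score is d/d lam log p(x; lam) = 1/lam - x.
lam = 2.0
n = 200_000
x = rng.exponential(1.0 / lam, size=n)   # numpy parameterizes by scale

score = 1.0 / lam - x
estimate = np.mean(x * score)            # sensitivity of E[X] w.r.t. lam
exact = -1.0 / lam**2                    # analytic: E[X] = 1/lam
print(estimate, exact)
```

Because the estimator only needs samples and the score, it applies to stochastic dynamics where the expectation itself has no closed form, which is the setting the paper addresses.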
He, Ye; Lin, Huazhen; Tu, Dongsheng
2018-06-04
In this paper, we introduce a single-index threshold Cox proportional hazard model to select and combine biomarkers to identify patients who may be sensitive to a specific treatment. A penalized smoothed partial likelihood is proposed to estimate the parameters in the model. A simple, efficient, and unified algorithm is presented to maximize this likelihood function. The estimators based on this likelihood function are shown to be consistent and asymptotically normal. Under mild conditions, the proposed estimators also achieve the oracle property. The proposed approach is evaluated through simulation analyses and application to the analysis of data from two clinical trials, one involving patients with locally advanced or metastatic pancreatic cancer and one involving patients with resectable lung cancer. Copyright © 2018 John Wiley & Sons, Ltd.
Quasar microlensing models with constraints on the Quasar light curves
NASA Astrophysics Data System (ADS)
Tie, S. S.; Kochanek, C. S.
2018-01-01
Quasar microlensing analyses implicitly generate a model of the variability of the source quasar. The implied source variability may be unrealistic, yet its likelihood is generally not evaluated. We used the damped random walk (DRW) model for quasar variability to evaluate the likelihood of the source variability and applied the revised algorithm to a microlensing analysis of the lensed quasar RX J1131-1231. We compared estimates of the size of the quasar disc and the average stellar mass of the lens galaxy with and without applying the DRW likelihoods for the source variability model and found no significant effect on the estimated physical parameters. The most likely explanation is that unrealistic source light-curve models are generally associated with poor microlensing fits that already make a negligible contribution to the probability distributions of the derived parameters.
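The DRW model is an Ornstein-Uhlenbeck process, so for an evenly sampled light curve its likelihood has an exact Gaussian form via the one-step transition density. A minimal sketch, with the timescale, amplitude, and cadence all invented; real analyses handle irregular sampling and measurement errors.

```python
import numpy as np

rng = np.random.default_rng(7)

def drw_loglike(flux, dt, tau, sigma):
    """Exact log-likelihood of an evenly sampled, zero-mean light curve
    under a damped random walk (Ornstein-Uhlenbeck) model with damping
    timescale tau and asymptotic standard deviation sigma."""
    a = np.exp(-dt / tau)                  # one-step damping factor
    var = sigma**2 * (1.0 - a**2)          # conditional variance
    resid = flux[1:] - a * flux[:-1]
    ll = -0.5 * np.sum(resid**2 / var + np.log(2 * np.pi * var))
    # Stationary density of the first point:
    ll += -0.5 * (flux[0]**2 / sigma**2 + np.log(2 * np.pi * sigma**2))
    return ll

# Simulate a DRW light curve and check the generating timescale is preferred.
tau_true, sigma_true, dt, n = 50.0, 1.0, 1.0, 2000
a = np.exp(-dt / tau_true)
f = np.empty(n)
f[0] = rng.normal(0.0, sigma_true)
for i in range(1, n):
    f[i] = a * f[i - 1] + rng.normal(0.0, sigma_true * np.sqrt(1 - a**2))

ll_true = drw_loglike(f, dt, tau_true, sigma_true)
ll_wrong = drw_loglike(f, dt, 0.5, sigma_true)
print(ll_true > ll_wrong)   # the true timescale fits far better
```

This is the kind of likelihood that can be attached to an implied source light curve to downweight microlensing solutions requiring unrealistic intrinsic variability.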
Efficient estimators for likelihood ratio sensitivity indices of complex stochastic dynamics
NASA Astrophysics Data System (ADS)
Arampatzis, Georgios; Katsoulakis, Markos A.; Rey-Bellet, Luc
2016-03-01
We demonstrate that centered likelihood ratio estimators for the sensitivity indices of complex stochastic dynamics are highly efficient with low, constant in time variance and consequently they are suitable for sensitivity analysis in long-time and steady-state regimes. These estimators rely on a new covariance formulation of the likelihood ratio that includes as a submatrix a Fisher information matrix for stochastic dynamics and can also be used for fast screening of insensitive parameters and parameter combinations. The proposed methods are applicable to broad classes of stochastic dynamics such as chemical reaction networks, Langevin-type equations and stochastic models in finance, including systems with a high dimensional parameter space and/or disparate decorrelation times between different observables. Furthermore, they are simple to implement as a standard observable in any existing simulation algorithm without additional modifications.
Fuzzy multinomial logistic regression analysis: A multi-objective programming approach
NASA Astrophysics Data System (ADS)
Abdalla, Hesham A.; El-Sayed, Amany A.; Hamed, Ramadan
2017-05-01
Parameter estimation for multinomial logistic regression is usually based on maximizing the likelihood function. For large, well-balanced datasets, maximum likelihood (ML) estimation is a satisfactory approach. Unfortunately, ML can fail completely, or at least produce poor results in terms of estimated probabilities and confidence intervals of parameters, especially for small datasets. In this study, a new approach based on fuzzy concepts is proposed to estimate the parameters of multinomial logistic regression. The study assumes that the parameters of multinomial logistic regression are fuzzy. Based on the extension principle stated by Zadeh and Bárdossy's proposition, a multi-objective programming approach is suggested to estimate these fuzzy parameters. A simulation study is used to evaluate the performance of the new approach versus the ML approach. Results show that the new proposed model outperforms ML for small datasets.
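The ML baseline being compared against can be sketched with a binary (rather than multinomial, for brevity) logistic regression fitted by Newton-Raphson. The tiny dataset below is invented; on separable or very small samples the same iterates can diverge, which is the instability motivating the fuzzy approach.

```python
import numpy as np

# Maximum likelihood for a simple binary logistic regression via
# Newton-Raphson on a small, non-separable toy dataset (invented).
x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y = np.array([0.0, 0.0, 1.0, 0.0, 1.0])
X = np.column_stack([np.ones_like(x), x])    # intercept + slope

beta = np.zeros(2)
for _ in range(25):
    p = 1.0 / (1.0 + np.exp(-X @ beta))      # fitted probabilities
    grad = X.T @ (y - p)                     # score vector
    W = p * (1.0 - p)
    hess = X.T @ (X * W[:, None])            # observed information
    beta = beta + np.linalg.solve(hess, grad)

# At the MLE the score vanishes:
grad_norm = np.linalg.norm(X.T @ (y - 1.0 / (1.0 + np.exp(-X @ beta))))
print(beta, grad_norm)
```

Because the log-likelihood is concave here, Newton-Raphson converges in a handful of iterations; the failure modes appear when the information matrix is near-singular or the data are separable.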
Efficient estimators for likelihood ratio sensitivity indices of complex stochastic dynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arampatzis, Georgios; Katsoulakis, Markos A.; Rey-Bellet, Luc
2016-03-14
We demonstrate that centered likelihood ratio estimators for the sensitivity indices of complex stochastic dynamics are highly efficient with low, constant in time variance and consequently they are suitable for sensitivity analysis in long-time and steady-state regimes. These estimators rely on a new covariance formulation of the likelihood ratio that includes as a submatrix a Fisher information matrix for stochastic dynamics and can also be used for fast screening of insensitive parameters and parameter combinations. The proposed methods are applicable to broad classes of stochastic dynamics such as chemical reaction networks, Langevin-type equations and stochastic models in finance, including systems with a high dimensional parameter space and/or disparate decorrelation times between different observables. Furthermore, they are simple to implement as a standard observable in any existing simulation algorithm without additional modifications.
An Evaluation of the Effects of Variable Sampling on Component, Image, and Factor Analysis.
ERIC Educational Resources Information Center
Velicer, Wayne F.; Fava, Joseph L.
1987-01-01
Principal component analysis, image component analysis, and maximum likelihood factor analysis were compared to assess the effects of variable sampling. Results with respect to degree of saturation and average number of variables per factor were clear and dramatic. Differential effects on boundary cases and nonconvergence problems were also found.…
Optimizing EDELWEISS detectors for low-mass WIMP searches
NASA Astrophysics Data System (ADS)
Arnaud, Q.; Armengaud, E.; Augier, C.; Benoît, A.; Bergé, L.; Billard, J.; Broniatowski, A.; Camus, P.; Cazes, A.; Chapellier, M.; Charlieux, F.; de Jésus, M.; Dumoulin, L.; Eitel, K.; Foerster, N.; Gascon, J.; Giuliani, A.; Gros, M.; Hehn, L.; Jin, Y.; Juillard, A.; Kleifges, M.; Kozlov, V.; Kraus, H.; Kudryavtsev, V. A.; Le-Sueur, H.; Maisonobe, R.; Marnieros, S.; Navick, X.-F.; Nones, C.; Olivieri, E.; Pari, P.; Paul, B.; Poda, D.; Queguiner, E.; Rozov, S.; Sanglard, V.; Scorza, S.; Siebenborn, B.; Vagneron, L.; Weber, M.; Yakushev, E.; EDELWEISS Collaboration
2018-01-01
The physics potential of EDELWEISS detectors for the search of low-mass weakly interacting massive particles (WIMPs) is studied. Using a data-driven background model, projected exclusion limits are computed using frequentist and multivariate analysis approaches, namely profile likelihood and boosted decision trees. Both current and achievable experimental performances are considered. The optimal strategy for detector optimization depends critically on whether the emphasis is put on WIMP masses below or above ~5 GeV/c2. The projected sensitivity for the next phase of the EDELWEISS-III experiment at the Modane Underground Laboratory (LSM) for low-mass WIMP search is presented. By 2018 an upper limit on the spin-independent WIMP-nucleon cross section of σSI = 7×10^-42 cm2 is expected for a WIMP mass in the range 2-5 GeV/c2. The requirements for a future hundred-kilogram-scale experiment designed to reach the bounds imposed by the coherent scattering of solar neutrinos are also described. By improving the ionization resolution down to 50 eVee, we show that such an experiment installed in an even lower background environment (e.g., at SNOLAB), together with an exposure of 1000 kg·yr, should allow us to observe about 80
Uncertainty estimation of Intensity-Duration-Frequency relationships: A regional analysis
NASA Astrophysics Data System (ADS)
Mélèse, Victor; Blanchet, Juliette; Molinié, Gilles
2018-03-01
We propose in this article a regional study of uncertainties in IDF curves derived from point-rainfall maxima. We develop two generalized extreme value models based on the simple scaling assumption, first in the frequentist framework and second in the Bayesian framework. Within the frequentist framework, uncertainties are obtained i) from the Gaussian density stemming from the asymptotic normality theorem for the maximum likelihood estimator and ii) with a bootstrap procedure. Within the Bayesian framework, uncertainties are obtained from the posterior densities. We confront these two frameworks on the same database, covering a large region of 100,000 km2 in southern France with contrasted rainfall regimes, in order to draw conclusions that are not specific to the data. The two frameworks are applied to 405 hourly stations with data back to the 1980s, accumulated in the range 3 h-120 h. We show that i) the Bayesian framework is more robust than the frequentist one to the starting point of the estimation procedure, ii) the posterior and bootstrap densities are able to better adjust uncertainty estimation to the data than the Gaussian density, and iii) the bootstrap density gives unreasonable confidence intervals, in particular for return levels associated with large return periods. Therefore our recommendation goes towards the use of the Bayesian framework to compute uncertainty.
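The two frequentist uncertainty routes compared above (asymptotic-normal Gaussian density vs. bootstrap) can be contrasted on a much simpler MLE. The sketch below uses an exponential rate parameter rather than the paper's GEV parameters, and the sample is simulated, purely to show the mechanics of the two interval constructions.

```python
import numpy as np

rng = np.random.default_rng(3)

# (i) Gaussian interval from asymptotic normality of the MLE, and
# (ii) a nonparametric bootstrap interval, for the rate of an
# exponential sample (illustrative stand-in for GEV rainfall maxima).
true_rate = 1.0
sample = rng.exponential(1.0 / true_rate, size=200)

mle = 1.0 / sample.mean()
# Fisher information is n / rate^2, so Var(mle) ~ rate^2 / n.
se = mle / np.sqrt(len(sample))
ci_asym = (mle - 1.96 * se, mle + 1.96 * se)

boot = np.array([
    1.0 / rng.choice(sample, size=len(sample), replace=True).mean()
    for _ in range(2000)
])
ci_boot = (np.percentile(boot, 2.5), np.percentile(boot, 97.5))
print(ci_asym, ci_boot)
```

With a well-behaved likelihood and a moderate sample the two intervals nearly coincide; the paper's finding is that for extreme return levels the bootstrap version can become unreliable while posterior densities remain usable.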
Attigala, Lakshmi; Wysocki, William P; Duvall, Melvin R; Clark, Lynn G
2016-08-01
We explored phylogenetic relationships among the twelve lineages of the temperate woody bamboo clade (tribe Arundinarieae) based on plastid genome (plastome) sequence data. A representative sample of 28 taxa was used, and maximum parsimony, maximum likelihood and Bayesian inference analyses were conducted to estimate the Arundinarieae phylogeny. All the previously recognized clades of Arundinarieae were supported, with Ampelocalamus calcareus (Clade XI) as sister to the rest of the temperate woody bamboos. Well-supported sister relationships between Bergbambos tessellata (Clade I) and Thamnocalamus spathiflorus (Clade VII) and between Kuruna (Clade XII) and Chimonocalamus (Clade III) were revealed by the current study. The plastome topology was tested by taxon removal experiments and alternative hypothesis testing, and the results supported the current plastome phylogeny as robust. Neighbor-net analyses showed few phylogenetic signal conflicts, but suggested some potentially complex relationships among these taxa. Analyses of morphological character evolution of rhizomes and reproductive structures revealed that pachymorph rhizomes were most likely the ancestral state in Arundinarieae. In contrast, leptomorph rhizomes either evolved once with reversions to the pachymorph condition or multiple times in Arundinarieae. Further, pseudospikelets evolved independently at least twice in the Arundinarieae, but the ancestral state is ambiguous. Copyright © 2016 Elsevier Inc. All rights reserved.
Tracking of childhood overweight into adulthood: a systematic review of the literature.
Singh, A S; Mulder, C; Twisk, J W R; van Mechelen, W; Chinapaw, M J M
2008-09-01
Overweight and obesity in youth are important public health concerns and are of particular interest because of possible long-term associations with adult weight status and morbidity. The aim of this study was to systematically review the literature and update evidence concerning persistence of childhood overweight. A computerized bibliographical search, restricted to studies with a prospective or retrospective longitudinal design, was conducted. Two authors independently extracted data and assessed the methodological quality of the included studies in four dimensions: (i) study population and participation rate; (ii) study attrition; (iii) data collection; and (iv) data analysis. Conclusions were based on a rating system of three levels of evidence. A total of 25 publications were selected for inclusion in this review. According to a methodological quality assessment, 13 studies were considered to be of high quality. The majority of these high-quality studies were published after 2001, indicating that recently published data, in particular, provide us with reliable information. All included studies consistently report an increased risk of overweight and obese youth becoming overweight adults, suggesting that the likelihood of persistence of overweight into adulthood is moderate for overweight and obese youth. However, predictive values varied considerably. Limiting aspects with respect to generalizability and methodological issues are discussed.
Solar neutrino measurements in Super-Kamiokande-IV
NASA Astrophysics Data System (ADS)
Abe, K.; Haga, Y.; Hayato, Y.; Ikeda, M.; Iyogi, K.; Kameda, J.; Kishimoto, Y.; Marti, Ll.; Miura, M.; Moriyama, S.; Nakahata, M.; Nakajima, T.; Nakayama, S.; Orii, A.; Sekiya, H.; Shiozawa, M.; Sonoda, Y.; Takeda, A.; Tanaka, H.; Takenaga, Y.; Tasaka, S.; Tomura, T.; Ueno, K.; Yokozawa, T.; Akutsu, R.; Irvine, T.; Kaji, H.; Kajita, T.; Kametani, I.; Kaneyuki, K.; Lee, K. P.; Nishimura, Y.; McLachlan, T.; Okumura, K.; Richard, E.; Labarga, L.; Fernandez, P.; Blaszczyk, F. d. M.; Gustafson, J.; Kachulis, C.; Kearns, E.; Raaf, J. L.; Stone, J. L.; Sulak, L. R.; Berkman, S.; Tobayama, S.; Goldhaber, M.; Bays, K.; Carminati, G.; Griskevich, N. J.; Kropp, W. R.; Mine, S.; Renshaw, A.; Smy, M. B.; Sobel, H. W.; Takhistov, V.; Weatherly, P.; Ganezer, K. S.; Hartfiel, B. L.; Hill, J.; Keig, W. E.; Hong, N.; Kim, J. Y.; Lim, I. T.; Park, R. G.; Akiri, T.; Albert, J. B.; Himmel, A.; Li, Z.; O'Sullivan, E.; Scholberg, K.; Walter, C. W.; Wongjirad, T.; Ishizuka, T.; Nakamura, T.; Jang, J. S.; Choi, K.; Learned, J. G.; Matsuno, S.; Smith, S. N.; Friend, M.; Hasegawa, T.; Ishida, T.; Ishii, T.; Kobayashi, T.; Nakadaira, T.; Nakamura, K.; Nishikawa, K.; Oyama, Y.; Sakashita, K.; Sekiguchi, T.; Tsukamoto, T.; Nakano, Y.; Suzuki, A. T.; Takeuchi, Y.; Yano, T.; Cao, S. V.; Hayashino, T.; Hiraki, T.; Hirota, S.; Huang, K.; Ieki, K.; Jiang, M.; Kikawa, T.; Minamino, A.; Murakami, A.; Nakaya, T.; Patel, N. D.; Suzuki, K.; Takahashi, S.; Wendell, R. A.; Fukuda, Y.; Itow, Y.; Mitsuka, G.; Muto, F.; Suzuki, T.; Mijakowski, P.; Frankiewicz, K.; Hignight, J.; Imber, J.; Jung, C. K.; Li, X.; Palomino, J. L.; Santucci, G.; Taylor, I.; Vilela, C.; Wilking, M. J.; Yanagisawa, C.; Fukuda, D.; Ishino, H.; Kayano, T.; Kibayashi, A.; Koshio, Y.; Mori, T.; Sakuda, M.; Takeuchi, J.; Yamaguchi, R.; Kuno, Y.; Tacik, R.; Kim, S. B.; Okazawa, H.; Choi, Y.; Ito, K.; Nishijima, K.; Koshiba, M.; Totsuka, Y.; Suda, Y.; Yokoyama, M.; Bronner, C.; Calland, R. 
G.; Hartz, M.; Martens, K.; Obayashi, Y.; Suzuki, Y.; Vagins, M. R.; Nantais, C. M.; Martin, J. F.; de Perio, P.; Tanaka, H. A.; Konaka, A.; Chen, S.; Sui, H.; Wan, L.; Yang, Z.; Zhang, H.; Zhang, Y.; Connolly, K.; Dziomba, M.; Wilkes, R. J.; Super-Kamiokande Collaboration
2016-09-01
Upgraded electronics, improved water system dynamics, and better calibration and analysis techniques allowed Super-Kamiokande-IV to clearly observe very low-energy 8B solar neutrino interactions, with recoil electron kinetic energies as low as ~3.5 MeV. Super-Kamiokande-IV data-taking began in September 2008; this paper includes data until February 2014, a total livetime of 1664 days. The measured solar neutrino flux is (2.308 ± 0.020 (stat) +0.039/-0.040 (syst)) × 10^6 /(cm^2 sec), assuming no oscillations. The observed recoil electron energy spectrum is consistent with no distortions due to neutrino oscillations. An extended maximum likelihood fit to the amplitude of the expected solar zenith angle variation of the neutrino-electron elastic scattering rate in SK-IV results in a day/night asymmetry of (-3.6 ± 1.6 (stat) ± 0.6 (syst))%. The SK-IV solar neutrino data determine the solar mixing angle as sin^2 θ12 = 0.327 +0.026/-0.031; all SK solar data (SK-I, SK-II, SK-III and SK-IV) measure this angle to be sin^2 θ12 = 0.334 +0.027/-0.023, and the determined mass-squared splitting is Δm^2_21 = 4.8 +1.5/-0.8 × 10^-5 eV^2.
Building integral projection models: a user's guide
Rees, Mark; Childs, Dylan Z; Ellner, Stephen P; Coulson, Tim
2014-01-01
In order to understand how changes in individual performance (growth, survival or reproduction) influence population dynamics and evolution, ecologists are increasingly using parameterized mathematical models. For continuously structured populations, where some continuous measure of individual state influences growth, survival or reproduction, integral projection models (IPMs) are commonly used. We provide a detailed description of the steps involved in constructing an IPM, explaining how to: (i) translate your study system into an IPM; (ii) implement your IPM; and (iii) diagnose potential problems with your IPM. We emphasize how the study organism's life cycle, and the timing of censuses, together determine the structure of the IPM kernel and important aspects of the statistical analysis used to parameterize an IPM using data on marked individuals. An IPM based on population studies of Soay sheep is used to illustrate the complete process of constructing, implementing and evaluating an IPM fitted to sample data. We then look at very general approaches to parameterizing an IPM, using a wide range of statistical techniques (e.g. maximum likelihood methods, generalized additive models, nonparametric kernel density estimators). Methods for selecting models for parameterizing IPMs are briefly discussed. We conclude with key recommendations and a brief overview of applications that extend the basic model. The online Supporting Information provides commented R code for all our analyses. PMID:24219157
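The kernel-construction steps described above can be sketched in miniature: discretize the kernel K(z', z) = s(z)·G(z'|z) + f(z)·D(z') on a size mesh (midpoint rule) and take the dominant eigenvalue as the population growth rate λ. All vital-rate functions below are invented for illustration; the paper's worked example uses vital rates fitted to Soay sheep data, with R code in its Supporting Information.

```python
import numpy as np

# Minimal size-structured IPM sketch (all vital-rate functions invented).
n, lo, hi = 100, 0.0, 10.0
h = (hi - lo) / n
z = lo + (np.arange(n) + 0.5) * h            # mesh midpoints

surv = 1.0 / (1.0 + np.exp(-(z - 3.0)))      # survival rises with size
grow_mu = 0.8 * z + 1.0                      # expected size next census
fec = np.where(z > 5.0, 0.5, 0.0)            # only large individuals reproduce

def norm_pdf(x, mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

G = norm_pdf(z[:, None], grow_mu[None, :], 0.8)    # growth kernel G(z'|z)
D = norm_pdf(z, 1.0, 0.5)                          # offspring size density
K = G * surv[None, :] + D[:, None] * fec[None, :]  # full projection kernel

# Midpoint rule: the discretized operator is h*K; its dominant
# eigenvalue approximates the asymptotic population growth rate.
lam = np.max(np.abs(np.linalg.eigvals(h * K)))
print(lam)
```

Diagnosing the "eviction" problem the guide warns about amounts to checking how much probability mass of G falls outside the mesh at the boundaries.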
Latent Class Analysis of Differential Item Functioning on the Peabody Picture Vocabulary Test-III
ERIC Educational Resources Information Center
Webb, Mi-young Lee; Cohen, Allan S.; Schwanenflugel, Paula J.
2008-01-01
This study investigated the use of latent class analysis for the detection of differences in item functioning on the Peabody Picture Vocabulary Test-Third Edition (PPVT-III). A two-class solution for a latent class model appeared to be defined in part by ability because Class 1 was lower in ability than Class 2 on both the PPVT-III and the…
14 CFR 437.55 - Hazard analysis.
Code of Federal Regulations, 2014 CFR
2014-01-01
... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must...) Design inadequacies; or (vi) Procedural deficiencies. (2) Determine the likelihood of occurrence and... include one or more of the following: (i) Designing for minimum risk, (ii) Incorporating safety devices...
Fault Tree Analysis as a Planning and Management Tool: A Case Study
ERIC Educational Resources Information Center
Witkin, Belle Ruth
1977-01-01
Fault Tree Analysis is an operations research technique used to analyse the most probable modes of failure in a system, so that the system can be redesigned or monitored more closely to increase its likelihood of success. (Author)
2012-01-01
Background The Nymphaeales (water lilies and relatives) lineage diverged as the second branch of basal angiosperms and comprises two families: Cabombaceae and Nymphaeaceae. The classification of Nymphaeales and its phylogeny within the flowering plants are quite intriguing, as several systems (the Thorne, Dahlgren, Cronquist, Takhtajan and APG III (Angiosperm Phylogeny Group III) systems) have attempted to redefine Nymphaeales taxonomy. There are also fossil records, consisting especially of seeds, pollen, stems, leaves and flowers, from as early as the lower Cretaceous. Here we present an in silico study of the order Nymphaeales, taking maturase K (matK) and the internal transcribed spacer (ITS2) as biomarkers for phylogeny reconstruction (using character-based methods and a Bayesian approach) and identification of motifs for DNA barcoding. Results The Maximum Likelihood (ML) and Bayesian approaches yielded congruent, fully resolved and well-supported trees using a concatenated (ITS2 + matK) supermatrix aligned dataset. The taxon sampling corroborates the monophyly of Cabombaceae. Nuphar emerges as a monophyletic clade in the family Nymphaeaceae, while there are slight discrepancies in the monophyletic nature of the genus Nymphaea owing to Victoria-Euryale and Ondinea grouping in the same node of Nymphaeaceae. ITS2 secondary structure alignments corroborate the primary sequence analysis. Hydatellaceae emerged as a sister clade to Nymphaeaceae and had a basal lineage amongst the water lily clades. Species from Cycas and Ginkgo were taken as outgroups for rooting the overall tree topology from the various methods. Conclusions MatK genes are fast-evolving, highly variant regions of plant chloroplast DNA that can serve as potential biomarkers for DNA barcoding and for generating primers for angiosperms, with identification of unique motif regions.
We have reported unique genus-specific motif regions in the order Nymphaeales from the matK dataset, which can be further validated for barcoding and the design of PCR primers. Our analysis, using a novel approach of sequence-structure alignment and phylogenetic reconstruction with molecular morphometrics, is congruent with the current placement of Hydatellaceae within the early-divergent angiosperm order Nymphaeales. The results underscore the fact that more diverse genera, if not fully resolved to be monophyletic, should be represented by all major lineages. PMID:23282079
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guo, Lei; Xiao, Yongsheng; Wang, Yinsheng, E-mail: yinsheng.wang@ucr.edu
Human exposure to arsenic in drinking water is a widespread public health concern, and such exposure is known to be associated with many human diseases. The detailed molecular mechanisms by which arsenic species contribute to adverse human health effects, however, remain incompletely understood. Monomethylarsonous acid [MMA(III)] is a highly toxic and stable metabolite of inorganic arsenic. To explore the mechanisms through which MMA(III) exerts its cytotoxic effect, we adopted a quantitative proteomic approach, coupling stable isotope labeling by amino acids in cell culture (SILAC) with LC-MS/MS analysis, to examine the variation in the entire proteome of GM00637 human skin fibroblasts following acute MMA(III) exposure. Among the ∼6500 unique proteins quantified, ∼300 displayed significant changes in expression after exposure to 2 μM MMA(III) for 24 h. Subsequent analysis revealed the perturbation of de novo cholesterol biosynthesis, selenoprotein synthesis and Nrf2 pathways evoked by MMA(III) exposure. In particular, MMA(III) treatment resulted in considerable down-regulation of several enzymes involved in cholesterol biosynthesis. In addition, real-time PCR analysis showed reduced mRNA levels of select genes in this pathway. Furthermore, MMA(III) exposure contributed to a distinct decline in cellular cholesterol content and significant growth inhibition of multiple cell lines, both of which could be restored by supplementation of cholesterol to the culture media. Collectively, the present study demonstrated that the cytotoxicity of MMA(III) may arise, at least in part, from the down-regulation of cholesterol biosynthesis enzymes and the resultant decrease of cellular cholesterol content. - Highlights: • MMA(III)-induced perturbation of the entire proteome of GM00637 cells is studied. • Quantitative proteomic approach revealed alterations of multiple cellular pathways. • MMA(III) inhibits de novo cholesterol biosynthesis.
• MMA(III) perturbs the Nrf2 pathway and selenoprotein synthesis.
NASA Astrophysics Data System (ADS)
Anyika, Chinedum; Asri, Nur Asilayana Mohd; Majid, Zaiton Abdul; Jaafar, Jafariah; Yahya, Adibah
2017-12-01
In this study, we converted activated carbon (AC) into magnetic activated carbon (MAC), which was shown to remove arsenic (III) from wastewater. Arsenic (III) is a toxic heavy metal which is readily soluble in water and can be detrimental to human health. The MAC was prepared by incorporating Fe3O4, extracted from a ferrous sulfate solution, into the AC; the product is designated magnetic palm kernel shell from iron suspension (MPKSF). Batch experiments were conducted using two methods: (1) one-factor-at-a-time and (2) Box-Behnken statistical analysis. Results showed that the optimum conditions resulted in 95% removal of As(III) from the wastewater sample. The adsorption data were best fitted by the Langmuir isotherm. The adsorption of As(III) onto the MPKSF was confirmed by energy-dispersive X-ray spectrometry analysis, which detected As(III) at 0.52% on the surface of the MPKSF. Fourier transform infrared spectroscopy analysis of the MPKSF-As presented a peak at 573 cm-1, assigned to M-O (metal-oxygen) bending and indicating coordination of As(III) with oxygen through the formation of inner-sphere complexation, i.e. covalent bonding between the MPKSF functional groups and As(III). The findings suggest that the MPKSF has a strong capacity to efficiently remove As(III) from wastewater, while desorption studies showed that the As(III) was rigidly bound to the MPKSF, eliminating the possibility of secondary pollution.
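The Langmuir isotherm the data were fitted to is q_e = q_max K_L C_e / (1 + K_L C_e). A sketch of such a fit on synthetic equilibrium data (not the study's measurements):

```python
import numpy as np
from scipy.optimize import curve_fit

# Langmuir isotherm: q_e = q_max * K_L * C_e / (1 + K_L * C_e).
# The equilibrium data below are synthetic, only to illustrate the fit.
def langmuir(c, q_max, k_l):
    return q_max * k_l * c / (1.0 + k_l * c)

c_e = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])              # mg/L
q_e = langmuir(c_e, 8.0, 0.15) + np.array([0.1, -0.05, 0.08, -0.1, 0.05, -0.02])

# Nonlinear least-squares fit of the two Langmuir parameters
(q_max_fit, k_l_fit), _ = curve_fit(langmuir, c_e, q_e, p0=[5.0, 0.1])
```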
Singh, G D; McNamara, J A; Lozanoff, S
1998-01-01
While the dynamics of maxillo-mandibular allometry associated with treatment modalities available for the management of Class III malocclusions are currently under investigation, developmental aberration of the soft tissues in untreated Class III malocclusions requires specification. In this study, lateral cephalographs of 124 prepubertal European-American children (71 with untreated Class III malocclusion; 53 with Class I occlusion) were traced, and 12 soft-tissue landmarks digitized. Resultant geometries were scaled to an equivalent size and mean Class III and Class I configurations compared. Procrustes analysis established a statistical difference (P < 0.001) between the mean configurations. Comparing the overall untreated Class III and Class I configurations, thin-plate spline (TPS) analysis indicated that both affine and non-affine transformations contribute towards the deformation (total spline) of the averaged Class III soft tissue configuration. For non-affine transformations, partial warp 8 had the highest magnitude, indicating large-scale deformations visualized as a combination of columellar retrusion and lower labial protrusion. In addition, partial warp 5 also had a high magnitude, demonstrating upper labial vertical compression with antero-inferior elongation of the lower labio-mental soft tissue complex. Thus, children with Class III malocclusions demonstrate antero-posterior and vertical deformations of the maxillary soft tissue complex in combination with antero-inferior mandibular soft tissue elongation. This pattern of deformations may represent gene-environment interactions, resulting in Class III malocclusions with characteristic phenotypes that are amenable to orthodontic and dentofacial orthopedic manipulations.
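Procrustes analysis of the kind used here removes translation, scale and rotation before comparing configurations. A minimal sketch with invented landmark coordinates (not cephalometric data), using SciPy's implementation:

```python
import numpy as np
from scipy.spatial import procrustes

# Procrustes superimposition of two landmark configurations: translation,
# scaling and rotation are removed, leaving a shape-difference measure
# (the disparity). The 12 landmark coordinates below are illustrative.
rng = np.random.default_rng(0)
ref = rng.normal(size=(12, 2))           # 12 soft-tissue landmarks in 2-D
target = 1.5 * ref + 0.3                 # same shape, different size/position
target[0] += 0.2                         # plus a small local deformation

mtx1, mtx2, disparity = procrustes(ref, target)
```

A disparity near zero indicates configurations identical up to a similarity transform; larger values quantify residual shape difference.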
A maximum likelihood convolutional decoder model vs experimental data comparison
NASA Technical Reports Server (NTRS)
Chen, R. Y.
1979-01-01
This article describes the comparison of a maximum likelihood convolutional decoder (MCD) prediction model and the actual performance of the MCD at the Madrid Deep Space Station. The MCD prediction model is used to develop a subroutine that has been utilized by the Telemetry Analysis Program (TAP) to compute the MCD bit error rate for a given signal-to-noise ratio. The results indicate that the TAP predictions agree quite well with the experimental measurements. An optimal modulation index can also be found through the TAP.
Health insurance coverage among disabled Medicare enrollees
Rubin, Jeffrey I.; Wilcox-Gök, Virginia
1991-01-01
In this article, we use the Survey of Income and Program Participation to identify patterns of non-Medicare insurance coverage among disabled Medicare enrollees. Compared with the aged, the disabled are less likely to have private insurance coverage and more likely to have Medicaid. Probit analysis of the determinants of private insurance for disabled Medicare enrollees shows that income, education, marital status, sex, and having an employed family member are positively related to the likelihood of having private health insurance, whereas age and the probability of Medicaid enrollment are negatively related to this likelihood. PMID:10170806
Salje, Ekhard K H; Planes, Antoni; Vives, Eduard
2017-10-01
Crackling noise can be initiated by competing or coexisting mechanisms. These mechanisms can combine to generate an approximately scale-invariant distribution that contains two or more contributions. The overall distribution function can be analyzed, to a good approximation, using maximum-likelihood methods under the assumption that it follows a power law, although with nonuniversal exponents depending on a varying lower cutoff. We propose that such distributions are rather common and originate from a simple superposition of crackling noise distributions or exponential damping.
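The maximum-likelihood power-law analysis mentioned above typically uses the continuous (Hill-type) estimator with a lower cutoff, alpha_hat = 1 + n / sum(ln(x_i / x_min)). A sketch on synthetic data:

```python
import numpy as np

# Continuous maximum-likelihood estimator of a power-law exponent with a
# lower cutoff x_min: alpha_hat = 1 + n / sum(ln(x_i / x_min)).
def powerlaw_mle(x, x_min):
    x = np.asarray(x, dtype=float)
    tail = x[x >= x_min]
    return 1.0 + tail.size / np.sum(np.log(tail / x_min))

# Synthetic power-law sample via inverse-transform sampling:
# for pdf ~ x^(-alpha) above x_min, x = x_min * u^(-1/(alpha - 1)).
rng = np.random.default_rng(1)
alpha_true, x_min = 2.5, 1.0
u = rng.uniform(size=50000)
x = x_min * u ** (-1.0 / (alpha_true - 1.0))

alpha_hat = powerlaw_mle(x, x_min)
```

Varying `x_min` and re-estimating is how cutoff dependence of the exponent, as discussed in the abstract, is usually probed.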
Using phase II data for the analysis of phase III studies: An application in rare diseases.
Wandel, Simon; Neuenschwander, Beat; Röver, Christian; Friede, Tim
2017-06-01
Clinical research and drug development in orphan diseases are challenging, since large-scale randomized studies are difficult to conduct. Formally synthesizing the evidence is therefore of great value, yet this is rarely done in the drug-approval process. Phase III designs that make better use of phase II data can facilitate drug development in orphan diseases. A Bayesian meta-analytic approach is used to inform the phase III study with phase II data. It is particularly attractive, since uncertainty of between-trial heterogeneity can be dealt with probabilistically, which is critical if the number of studies is small. Furthermore, it allows quantifying and discounting the phase II data through the predictive distribution relevant for phase III. A phase III design is proposed which uses the phase II data and considers approval based on a phase III interim analysis. The design is illustrated with a non-inferiority case study from a Food and Drug Administration approval in herpetic keratitis (an orphan disease). Design operating characteristics are compared to those of a traditional design, which ignores the phase II data. An analysis of the phase II data reveals good but insufficient evidence for non-inferiority, highlighting the need for a phase III study. For the phase III study supported by phase II data, the interim analysis is based on half of the patients. For this design, the meta-analytic interim results are conclusive and would justify approval. In contrast, based on the phase III data only, interim results are inconclusive and require further evidence. To accelerate drug development for orphan diseases, innovative study designs and appropriate methodology are needed. Taking advantage of randomized phase II data when analyzing phase III studies looks promising because the evidence from phase II supports informed decision-making. The implementation of the Bayesian design is straightforward with public software such as R.
On meeting capital requirements with a chance-constrained optimization model.
Atta Mills, Ebenezer Fiifi Emire; Yu, Bo; Gu, Lanlan
2016-01-01
This paper deals with a capital to risk asset ratio chance-constrained optimization model in the presence of loans, treasury bills, fixed assets and non-interest-earning assets. To model the dynamics of loans, we introduce a modified CreditMetrics approach. This leads to the development of a deterministic convex counterpart of the capital to risk asset ratio chance constraint. We analyze our model under the worst-case scenario, i.e. loan default. The theoretical model is analyzed by applying numerical procedures, in order to derive valuable insights from a financial outlook. Our results suggest that our capital to risk asset ratio chance-constrained optimization model guarantees that banks meet the capital requirements of Basel III with a likelihood of 95% irrespective of changes in the future market value of assets.
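As a simplified illustration of how a chance constraint acquires a deterministic counterpart (the paper's CreditMetrics-based construction is more involved), assume the capital to risk asset ratio is normally distributed; the 95% requirement then reduces to a quantile inequality:

```python
from statistics import NormalDist

# Deterministic counterpart of a chance constraint under a normal model:
#   P(ratio >= t) >= 0.95   <=>   mu - z_0.95 * sigma >= t,
# where ratio ~ N(mu, sigma^2). This is a simplification of the paper's
# CreditMetrics-based convex counterpart; the numbers are illustrative.
def meets_chance_constraint(mu, sigma, threshold, level=0.95):
    z = NormalDist().inv_cdf(level)
    return mu - z * sigma >= threshold

# Basel-III-style 10.5% total requirement, illustrative mean ratio of 12%
ok = meets_chance_constraint(mu=0.12, sigma=0.008, threshold=0.105)
```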
Likelihood-Based Random-Effect Meta-Analysis of Binary Events.
Amatya, Anup; Bhaumik, Dulal K; Normand, Sharon-Lise; Greenhouse, Joel; Kaizar, Eloise; Neelon, Brian; Gibbons, Robert D
2015-01-01
Meta-analysis has been used extensively for evaluation of efficacy and safety of medical interventions. Its advantages and utilities are well known. However, recent studies have raised questions about the accuracy of the commonly used moment-based meta-analytic methods in general and for rare binary outcomes in particular. The issue is further complicated for studies with heterogeneous effect sizes. Likelihood-based mixed-effects modeling provides an alternative to moment-based methods such as inverse-variance weighted fixed- and random-effects estimators. In this article, we compare and contrast different mixed-effect modeling strategies in the context of meta-analysis. Their performance in estimation and testing of overall effect and heterogeneity are evaluated when combining results from studies with a binary outcome. Models that allow heterogeneity in both baseline rate and treatment effect across studies have low type I and type II error rates, and their estimates are the least biased among the models considered.
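For contrast with the likelihood-based models discussed above, the moment-based inverse-variance random-effects estimator (DerSimonian-Laird) can be sketched in a few lines; the effect sizes and within-study variances below are illustrative:

```python
import numpy as np

# DerSimonian-Laird moment-based random-effects meta-analysis: the kind of
# inverse-variance weighted estimator the article contrasts with
# likelihood-based mixed-effects models.
def dersimonian_laird(y, v):
    y, v = np.asarray(y, float), np.asarray(v, float)
    w = 1.0 / v
    fixed = np.sum(w * y) / np.sum(w)              # fixed-effect estimate
    q = np.sum(w * (y - fixed) ** 2)               # Cochran's Q statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)        # between-study variance
    w_star = 1.0 / (v + tau2)
    mu = np.sum(w_star * y) / np.sum(w_star)       # random-effects estimate
    se = np.sqrt(1.0 / np.sum(w_star))
    return mu, se, tau2

# Illustrative log odds ratios and their within-study variances
mu, se, tau2 = dersimonian_laird([0.2, 0.5, -0.1, 0.4], [0.04, 0.06, 0.05, 0.08])
```

For rare binary events this moment-based route is exactly where the accuracy concerns raised in the abstract arise, since the normal approximation to per-study effects breaks down.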
Haker, Steven; Wells, William M; Warfield, Simon K; Talos, Ion-Florin; Bhagwat, Jui G; Goldberg-Zimring, Daniel; Mian, Asim; Ohno-Machado, Lucila; Zou, Kelly H
2005-01-01
In any medical domain, it is common to have more than one test (classifier) to diagnose a disease. In image analysis, for example, there is often more than one reader or more than one algorithm applied to a certain data set. Combining classifiers is often helpful, but determining the way in which classifiers should be combined is not trivial. Standard strategies are based on learning classifier combination functions from data. We describe a simple strategy to combine results from classifiers that have not been applied to a common data set, and therefore cannot undergo this type of joint training. The strategy, which assumes conditional independence of classifiers, is based on the calculation of a combined Receiver Operating Characteristic (ROC) curve, using maximum likelihood analysis to determine a combination rule for each ROC operating point. We offer some insights into the use of ROC analysis in the field of medical imaging.
Clinical predictors of resectability of pancreatic adenocarcinoma.
Almadi, Majid A; Alharbi, Othman; Azzam, Nahla; Altayeb, Mohannad; Javed, Moammed; Alsaif, Faisal; Hassanain, Mazen; Alsharabi, Abdulsalam; Al-Saleh, Khalid; Aljebreen, Abdulrahman M
2013-01-01
To identify patient-related factors, symptoms and signs that can predict pancreatic cancer at a resectable stage, so that patients at an early stage of pancreatic cancer can be offered surgical resection and those at an unresectable stage can be spared unnecessary surgery. A retrospective chart review was conducted at a major tertiary care, university hospital in Riyadh, Saudi Arabia. The study population included individuals who underwent computed tomography in which a pancreatic mass was reported, as well as the endoscopic reporting database of endoscopic procedures where the indication was a pancreatic mass, between April 1996 and April 2012. Any patient with a histologically confirmed diagnosis of adenocarcinoma of the pancreas was included in the analysis. We included patients' demographic information (age, gender), height, weight, body mass index, historical data (smoking, comorbidities), symptoms (abdominal pain and its duration, anorexia and its duration, weight loss, its amount and over what duration, vomiting, abdominal distention, itching and its duration, change in bowel movements, change in urine color), and jaundice and its duration. Other variables were also collected, including laboratory values, location of the mass, the investigation undertaken, and the stage of the tumor. A total of 61 patients were included; the mean age was 61.2 ± 1.51 years, and 25 (41%) were females. The tumors were located in the head (83.6%), body (10.9%), tail (1.8%), and multiple locations (3.6%) of the pancreas. Half of the patients (50%) had stage IV disease, 16.7% stages IIB and III, and only 8.3% stages IB and IIA. On univariable analysis a lower hemoglobin level predicted resectability (odds ratio 0.65; 95% confidence interval, 0.42-0.98), whereas on multivariable regression none of the variables included in the model could predict resectability of pancreatic cancer.
A CA 19-9 cutoff level of 166 ng/mL had a sensitivity of 89%, specificity of 75%, positive likelihood ratio of 3.6, and a negative likelihood ratio of 0.15 for resectability of pancreatic adenocarcinoma. This study describes the clinical characteristics of patients with pancreatic adenocarcinoma in Saudi Arabia. None of the clinical or laboratory variables that were included in our study could independently predict resectability of pancreatic adenocarcinoma. Further studies are warranted to validate these results.
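The likelihood ratios quoted for the CA 19-9 cutoff follow directly from sensitivity and specificity (LR+ = sens / (1 - spec), LR- = (1 - sens) / spec), which can be checked in a few lines:

```python
# Likelihood ratios from sensitivity and specificity:
#   LR+ = sensitivity / (1 - specificity)
#   LR- = (1 - sensitivity) / specificity
def likelihood_ratios(sensitivity, specificity):
    lr_pos = sensitivity / (1.0 - specificity)
    lr_neg = (1.0 - sensitivity) / specificity
    return lr_pos, lr_neg

# The study's CA 19-9 cutoff of 166 ng/mL: sensitivity 89%, specificity 75%
lr_pos, lr_neg = likelihood_ratios(0.89, 0.75)
```

These reproduce the reported values of 3.6 and 0.15 after rounding.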
Vermunt, Neeltje P C A; Westert, Gert P; Olde Rikkert, Marcel G M; Faber, Marjan J
2018-03-01
To assess the impact of patient characteristics, patient-professional engagement, communication and context on the probability that healthcare professionals will discuss goals or priorities with older patients. Secondary analysis of cross-sectional data from the 2014 Commonwealth Fund International Health Policy Survey of Older Adults. 11 western countries. Community-dwelling adults, aged 55 or older. Assessment of goals and priorities. The final sample size consisted of 17,222 respondents, 54% of whom reported an assessment of their goals and priorities (AGP) by healthcare professionals. In logistic regression model 1, which was used to analyse the entire population, the determinants found to have moderate to large effects on the likelihood of AGP were information exchange on stress, diet or exercise, or both. Country (living in Sweden) and continuity of care (no regular professional or organisation) had moderate to large negative effects on the likelihood of AGP. In model 2, which focussed on respondents who experienced continuity of care, country and information exchange on stress and lifestyle were the main determinants of AGP, with comparable odds ratios to model 1. Furthermore, a professional asking questions also increased the likelihood of AGP. Continuity of care and information exchange is associated with a higher probability of AGP, while people living in Sweden are less likely to experience these assessments. Further study is required to determine whether increasing information exchange and professionals asking more questions may improve goal setting with older patients. Key points A patient goal-oriented approach can be beneficial for older patients with chronic conditions or multimorbidity; however, discussing goals with these patients is not a common practice. The likelihood of discussing goals varies by country, occurring most commonly in the USA, and least often in Sweden. 
Country-level differences in continuity of care and questions asked by a regularly visited professional affect the goal discussion probability. Patient characteristics, including age, have less impact than expected on the likelihood of sharing goals.
Incorrect likelihood methods were used to infer scaling laws of marine predator search behaviour.
Edwards, Andrew M; Freeman, Mervyn P; Breed, Greg A; Jonsen, Ian D
2012-01-01
Ecologists are collecting extensive data concerning movements of animals in marine ecosystems. Such data need to be analysed with valid statistical methods to yield meaningful conclusions. We demonstrate methodological issues in two recent studies that reached similar conclusions concerning movements of marine animals (Nature 451:1098; Science 332:1551). The first study analysed vertical movement data to conclude that diverse marine predators (Atlantic cod, basking sharks, bigeye tuna, leatherback turtles and Magellanic penguins) exhibited "Lévy-walk-like behaviour", close to a hypothesised optimal foraging strategy. By reproducing the original results for the bigeye tuna data, we show that the likelihood of tested models was calculated from residuals of regression fits (an incorrect method), rather than from the likelihood equations of the actual probability distributions being tested. This resulted in erroneous Akaike Information Criteria, and the testing of models that do not correspond to valid probability distributions. We demonstrate how this led to overwhelming support for a model that has no biological justification and that is statistically spurious because its probability density function goes negative. Re-analysis of the bigeye tuna data, using standard likelihood methods, overturns the original result and conclusion for that data set. The second study observed Lévy walk movement patterns by mussels. We demonstrate several issues concerning the likelihood calculations (including the aforementioned residuals issue). Re-analysis of the data rejects the original Lévy walk conclusion. We consequently question the claimed existence of scaling laws of the search behaviour of marine predators and mussels, since such conclusions were reached using incorrect methods. 
We discourage the suggested potential use of "Lévy-like walks" when modelling consequences of fishing and climate change, and caution that any resulting advice to managers of marine ecosystems would be problematic. For reproducibility and future work we provide R source code for all calculations.
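The correct procedure whose absence is criticized above is to compute the likelihood, and hence the AIC, from the probability distribution itself rather than from regression residuals. A minimal sketch for an exponential step-length model with simulated data:

```python
import numpy as np

# AIC must come from the log-likelihood of an actual probability
# distribution, not regression residuals. Sketch: exponential model fitted
# by maximum likelihood to illustrative (simulated) step-length data.
def exponential_aic(x):
    x = np.asarray(x, float)
    rate = 1.0 / x.mean()                     # MLE of the exponential rate
    loglik = np.sum(np.log(rate) - rate * x)  # sum of log pdf values
    k = 1                                     # number of fitted parameters
    return 2 * k - 2 * loglik

rng = np.random.default_rng(2)
steps = rng.exponential(scale=3.0, size=1000)
aic = exponential_aic(steps)
```

Computing the analogous quantity for a (properly normalized) power-law model and comparing AICs is the valid version of the model comparison at issue.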
Anodic Stripping Voltammetry with Pencil Graphite Electrode for Determination of Chromium (III)
NASA Astrophysics Data System (ADS)
Wyantuti, S.; Hafidza, R. A.; Ishmayana, S.; Hartati, Y. W.
2017-02-01
Chromium is required as a micronutrient that has roles in insulin metabolism and blood glucose regulation. Chromium (III) deficiency can cause hyperglycemia and glycosuria. However, a high amount of chromium in the body can cause allergic reactions, organ damage, and even death because of its toxicity. Chromium is commonly used in steel industries. Alongside industrial development, waste disposal that can endanger the environment has also increased. Therefore, a sensitive and specific analysis method for chromium detection is required. Stripping voltammetry is one of the voltammetric methods commonly used for heavy metal analysis due to its very low limit of detection (sub-ppb). The present study was conducted to develop an analysis method for chromium (III) determination using a pencil graphite electrode. Quantitative determination was performed for chromium (III), measured from -0.8 to +1.0 V with a deposition time of 60 s and a 50 mV/s scan rate. Stripping voltammetric analysis of chromium (III) using the pencil graphite electrode gave a linear range of 12.5 to 75 ppm with a limit of detection of 0.31 ppm.
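A conventional way to obtain a limit of detection from a linear calibration curve is LOD = 3.3 s_y / slope, with s_y the residual standard deviation of the fit; whether the study used exactly this convention is not stated, and the calibration points below are illustrative:

```python
import numpy as np

# Limit of detection from a linear calibration curve using the common
# LOD = 3.3 * s_y / slope convention (s_y = residual standard deviation).
# These calibration points are illustrative, not the study's data.
conc = np.array([12.5, 25.0, 37.5, 50.0, 62.5, 75.0])          # ppm
signal = 0.8 * conc + 1.5 + np.array([0.05, -0.08, 0.03, 0.06, -0.04, 0.02])

slope, intercept = np.polyfit(conc, signal, 1)                  # linear fit
residuals = signal - (slope * conc + intercept)
s_y = np.sqrt(np.sum(residuals ** 2) / (len(conc) - 2))         # n - 2 dof
lod = 3.3 * s_y / slope
```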
Phylogenetic evidence for cladogenetic polyploidization in land plants.
Zhan, Shing H; Drori, Michal; Goldberg, Emma E; Otto, Sarah P; Mayrose, Itay
2016-07-01
Polyploidization is a common and recurring phenomenon in plants and is often thought to be a mechanism of "instant speciation". Whether polyploidization is associated with the formation of new species (cladogenesis) or simply occurs over time within a lineage (anagenesis), however, has never been assessed systematically. We tested this hypothesis using phylogenetic and karyotypic information from 235 plant genera (mostly angiosperms). We first constructed a large database of combined sequence and chromosome number data sets using an automated procedure. We then applied likelihood models (ClaSSE) that estimate the degree of synchronization between polyploidization and speciation events in maximum likelihood and Bayesian frameworks. Our maximum likelihood analysis indicated that 35 genera supported a model that includes cladogenetic transitions over a model with only anagenetic transitions, whereas three genera supported a model that incorporates anagenetic transitions over one with only cladogenetic transitions. Furthermore, the Bayesian analysis supported a preponderance of cladogenetic change in four genera but did not support a preponderance of anagenetic change in any genus. Overall, these phylogenetic analyses provide the first broad confirmation that polyploidization is temporally associated with speciation events, suggesting that it is indeed a major speciation mechanism in plants, at least in some genera. © 2016 Botanical Society of America.
PHYSICAL EXAMINATIONS FOR DIAGNOSING MENISCAL INJURIES: CORRELATION WITH SURGICAL FINDINGS
Gobbo, Ricardo da Rocha; Rangel, Victor de Oliveira; Karam, Francisco Consoli; Pires, Luiz Antônio Simões
2015-01-01
Objective: A set of five maneuvers for meniscal injuries (McMurray, Apley, Childress and Steinmann 1 and 2) was evaluated, and their sensitivity, specificity, accuracy and likelihood ratios were calculated. The same methods were applied to each test individually. Methods: One hundred and fifty-two patients of both sexes who were going to undergo videoarthroscopy on the knee were examined blindly by one of five residents at this hospital, without knowledge of the clinical data and why the patient was going to undergo an operation. This examination was conducted immediately before the videoarthroscopy and its results were recorded in an electronic spreadsheet. The set of maneuvers was considered positive when at least one maneuver was positive. In the individual analysis, it was enough for the single test to be positive. Results: The analysis showed that the set of five meniscal tests presented sensitivity of 89%, specificity of 42%, accuracy of 75%, a positive likelihood ratio of 1.53 and a negative likelihood ratio of 0.26. Individually, the tests presented accuracy of between 48% and 53%. Conclusion: The set of maneuvers for meniscal injuries presented good accuracy and significant value, especially for ruling out injury. Individually, the tests had less diagnostic value, although the Apley test had better specificity. PMID:27047833
Gender bias among children in India in their diet and immunisation against disease.
Borooah, Vani K
2004-05-01
This paper conducts an econometric analysis of data for a sample of over 4000 children in India, between the ages of 1 and 2 years, with a view to studying two aspects of the neglect of children: their likelihood of being immunised against disease and their likelihood of receiving a nutritious diet. The starting hypothesis, consistent with a universal interest in gender issues, was that girls were more likely to be neglected than boys. The analysis confirmed this hypothesis. In respect of vaccinations, the likelihood of girls being fully vaccinated, after controlling for other variables, was 5 percentage points lower than that for boys. In respect of receiving a nutritious diet, the treatment of girls depended very much on whether or not their mothers were literate: there was no gender discrimination between children of literate mothers; on the other hand, when the mother was illiterate, girls were 5 percentage points less likely to be well-fed relative to their brothers, and the presence of a literate father did little to dent this gender gap. But the analysis also pointed to a broader conclusion, which was that all children in India suffered from sharper, but less publicised, forms of disadvantage than that engendered solely by gender. These were the consequences which stemmed from children being born to illiterate mothers and being brought up in the more impoverished parts of India.
Estimation of rank correlation for clustered data.
Rosner, Bernard; Glynn, Robert J
2017-06-30
It is well known that the sample correlation coefficient (R_xy) is the maximum likelihood estimator of the Pearson correlation (ρ_xy) for independent and identically distributed (i.i.d.) bivariate normal data. However, this is not true for ophthalmologic data where X (e.g., visual acuity) and Y (e.g., visual field) are available for each eye and there is positive intraclass correlation for both X and Y in fellow eyes. In this paper, we provide a regression-based approach for obtaining the maximum likelihood estimator of ρ_xy for clustered data, which can be implemented using standard mixed effects model software. This method is also extended to allow for estimation of partial correlation by controlling both X and Y for a vector U of other covariates. In addition, these methods can be extended to allow for estimation of rank correlation for clustered data by (i) converting ranks of both X and Y to the probit scale, (ii) estimating the Pearson correlation between probit scores for X and Y, and (iii) using the relationship between Pearson and rank correlation for bivariate normally distributed data. The validity of the methods in finite-sized samples is supported by simulation studies. Finally, two examples from ophthalmology and analgesic abuse are used to illustrate the methods. Copyright © 2017 John Wiley & Sons, Ltd.
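Steps (i)-(iii) can be sketched as follows, assuming i.i.d. bivariate normal data for illustration (the paper's contribution is extending this to clustered data, which this sketch does not attempt). The mapping in step (iii) is the bivariate-normal identity rho_s = (6/pi) arcsin(rho/2):

```python
import numpy as np
from scipy.stats import norm, rankdata

rng = np.random.default_rng(0)
rho, n = 0.6, 20000
# Simulated bivariate normal data with true Pearson correlation 0.6
z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
x, y = z[:, 0], z[:, 1]

# (i) convert ranks of X and Y to the probit scale (normal scores)
px = norm.ppf(rankdata(x) / (n + 1))
py = norm.ppf(rankdata(y) / (n + 1))

# (ii) Pearson correlation between the probit scores
r_probit = np.corrcoef(px, py)[0, 1]

# (iii) map the Pearson correlation to the rank (Spearman) correlation
# using rho_s = (6/pi) * arcsin(rho/2) for bivariate normal data
rho_rank = (6 / np.pi) * np.arcsin(r_probit / 2)
```

For rho = 0.6 the theoretical Spearman correlation is (6/pi) arcsin(0.3) ≈ 0.582, which the estimate approaches as n grows.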
Levin, A; Rahman, M A; Quayyum, Z; Routh, S; Barkat-e-Khuda
2001-01-01
This paper seeks to investigate the determinants of child health care seeking behaviours in rural Bangladesh. In particular, the effects of income, women's access to income, and the prices of obtaining child health care are examined. Data on the use of child curative care were collected in two rural areas of Bangladesh--Abhoynagar Thana of Jessore District and Mirsarai Thana of Chittagong District--in March 1997. In estimating the use of child curative care, the nested multinomial logit specification was used. The results of the analysis indicate that a woman's involvement in a credit union or income generation affected the likelihood that curative child care was used. Household wealth decreased the likelihood that the child had an illness episode and affected the likelihood that curative child care was sought. Among facility characteristics, travel time was statistically significant and was negatively associated with the use of a provider.
Harrell-Williams, Leigh; Wolfe, Edward W
2014-01-01
Previous research has investigated the influence of sample size, model misspecification, test length, ability distribution offset, and generating model on the likelihood ratio difference test in applications of item response models. This study extended that research to the evaluation of dimensionality using the multidimensional random coefficients multinomial logit model (MRCMLM). Logistic regression analysis of simulated data reveals that sample size and test length have a large effect on the capacity of the LR difference test to correctly identify unidimensionality, with shorter tests and smaller sample sizes leading to smaller Type I error rates. Higher levels of simulated misfit resulted in fewer incorrect decisions than data with no or little misfit. However, Type I error rates indicate that the likelihood ratio difference test is not suitable under any of the simulated conditions for evaluating dimensionality in applications of the MRCMLM.
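The likelihood ratio difference test referenced here compares nested models by referring twice the log-likelihood difference to a chi-square distribution with degrees of freedom equal to the difference in free parameters. A minimal sketch with hypothetical log-likelihood values (not taken from the study):

```python
from scipy.stats import chi2

def lr_difference_test(loglik_restricted, loglik_full, df_diff):
    """Likelihood ratio difference test for nested models.

    stat = 2 * (llfull - llrestricted), referred to chi-square(df_diff)."""
    stat = 2.0 * (loglik_full - loglik_restricted)
    p = chi2.sf(stat, df_diff)
    return stat, p

# Hypothetical fit: unidimensional vs. two-dimensional model, 2 extra parameters
stat, p = lr_difference_test(-10450.0, -10441.2, df_diff=2)
```

Here the statistic is 17.6 on 2 df, so the restricted (unidimensional) model would be rejected at conventional levels; the study's point is that the test's Type I error behaviour makes such decisions unreliable for the MRCMLM.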
Perception of risk from automobile safety defects.
Slovic, P; MacGregor, D; Kraus, N N
1987-10-01
Descriptions of safety engineering defects of the kind that compel automobile manufacturers to initiate a recall campaign were evaluated by individuals on a set of risk characteristic scales that included overall vehicle riskiness, manufacturer's ability to anticipate the defect, importance for vehicle operation, severity of consequences and likelihood of compliance with a recall notice. A factor analysis of the risk characteristics indicated that judgments could be summarized in terms of two composite scales, one representing the uncontrollability of the damage the safety defect might cause and the other representing the foreseeability of the defect by the manufacturer. Motor vehicle defects were found to be highly diverse in terms of the perceived qualities of their risks. Location of individual defects within the factor space was closely associated with perceived riskiness, perceived likelihood of purchasing another car from the same manufacturer, perceived likelihood of compliance with a recall notice, and actual compliance rates.
Dixon, Helen G; Warne, Charles D; Scully, Maree L; Wakefield, Melanie A; Dobbinson, Suzanne J
2011-04-01
Content analysis data on the tans of 4,422 female Caucasian models sampled from spring and summer magazine issues were combined with readership data to generate indices of potential exposure to social modeling of tanning via popular women's magazines over a 15-year period (1987 to 2002). Associations between these indices and cross-sectional telephone survey data from the same period on 5,675 female teenagers' and adults' tanning attitudes, beliefs, and behavior were examined using logistic regression models. Among young women, greater exposure to tanning in young women's magazines was associated with increased likelihood of endorsing pro-tan attitudes and beliefs. Among women of all ages, greater exposure to tanned models via the most popular women's magazines was associated with increased likelihood of attempting to get a tan but lower likelihood of endorsing pro-tan attitudes. Popular women's magazines may promote and reflect real women's tanning beliefs and behavior.
Master teachers' responses to twenty literacy and science/mathematics practices in deaf education.
Easterbrooks, Susan R; Stephenson, Brenda; Mertens, Donna
2006-01-01
Under a grant to improve outcomes for students who are deaf or hard of hearing awarded to the Association of College Educators--Deaf/Hard of Hearing, a team identified content that all teachers of students who are deaf and hard of hearing must understand and be able to teach. Also identified were 20 practices associated with content standards (10 each, literacy and science/mathematics). Thirty-seven master teachers identified by grant agents rated the practices on a Likert-type scale indicating the maximum benefit of each practice and maximum likelihood that they would use the practice, yielding a likelihood-impact analysis. The teachers showed strong agreement on the benefits and likelihood of use of the rated practices. Concerns about implementation of many of the practices related to time constraints and mixed-ability classrooms were themes of the reviews. Actions for teacher preparation programs were recommended.
Influence of weather, rank, and home advantage on football outcomes in the Gulf region.
Brocherie, Franck; Girard, Olivier; Farooq, Abdulaziz; Millet, Grégoire P
2015-02-01
The objective of this study was to investigate the effects of weather, rank, and home advantage on international football match results and scores in the Gulf Cooperation Council (GCC) region. Football matches (n = 2008) in six GCC countries were analyzed. To determine the influence of weather on the likelihood of a favorable outcome and on goal difference, a generalized linear model with a logit link function and multiple regression analysis were performed. In the GCC region, home teams tend to have a greater likelihood of a favorable outcome (P < 0.001) and a higher goal difference (P < 0.001). Temperature difference was identified as a significant explanatory variable when used independently (P < 0.001) or after adjustment for home advantage and team ranking (P < 0.001). The likelihood of a favorable outcome for GCC teams increases by 3% for every 1-unit increase in temperature difference. After inclusion of an interaction with opposition, this advantage remains significant only when playing against non-GCC opponents. While home advantage increased the odds of a favorable outcome (P < 0.001) and goal difference (P < 0.001) after inclusion of the interaction term, the likelihood of a favorable outcome for a GCC team decreased (P < 0.001) when playing against a stronger opponent. Finally, temperature and the wet bulb globe temperature approximation were found to be better indicators of the effect of environmental conditions on match outcomes than absolute and relative humidity or the heat index. In the GCC region, higher temperature increased the likelihood of a favorable outcome when playing against non-GCC teams. However, international ranking should be considered because an opponent with a higher rank reduced, but did not eliminate, the likelihood of a favorable outcome.
Su, Jingjun; Du, Xinzhong; Li, Xuyong
2018-05-16
Uncertainty analysis is an important prerequisite for model application. However, existing phosphorus (P) loss indexes or indicators have rarely been evaluated in this respect. This study applied the generalized likelihood uncertainty estimation (GLUE) method to assess the uncertainty of parameters and modeling outputs of a non-point source (NPS) P indicator constructed in R language, and also examined how the subjective choices of likelihood formulation and acceptability threshold in GLUE influence model outputs. The results indicated the following. (1) Parameters RegR2, RegSDR2, PlossDPfer, PlossDPman, DPDR, and DPR were highly sensitive to overall TP simulation and their value ranges could be reduced by GLUE. (2) The Nash efficiency likelihood (L1) appeared better at accentuating high-likelihood simulations than the exponential function (L2). (3) The combined likelihood, integrating the criteria of multiple outputs, performed better than a single likelihood in model uncertainty assessment, in terms of reducing the uncertainty band widths while assuring the goodness of fit of the whole set of model outputs. (4) A value of 0.55 appeared to be a modest choice of threshold to balance the interests of high modeling efficiency and high bracketing efficiency. Results of this study could provide (1) an option to conduct NPS modeling under one single computer platform, (2) important references for parameter setting in NPS model development in similar regions, (3) useful suggestions for the application of the GLUE method in studies with different emphases according to research interests, and (4) important insights into watershed P management in similar regions.
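The generic GLUE workflow described above (Monte Carlo sampling of parameters, an informal likelihood such as Nash-Sutcliffe efficiency, and an acceptability threshold separating "behavioural" runs) can be sketched as follows. The toy exponential simulator is a stand-in for illustration only, not the paper's P-loss indicator:

```python
import numpy as np

rng = np.random.default_rng(1)

def model(theta, t):
    # Stand-in simulator (illustrative only; not the NPS P indicator)
    return theta[0] * np.exp(-theta[1] * t)

t = np.linspace(0, 10, 50)
observed = model(np.array([2.0, 0.3]), t) + rng.normal(0, 0.05, t.size)

# 1. Monte Carlo sampling of parameters from broad uniform priors
thetas = rng.uniform([0.5, 0.05], [5.0, 1.0], size=(5000, 2))

# 2. Nash-Sutcliffe efficiency as the informal likelihood (the L1 above)
def nse(sim, obs):
    return 1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

likelihoods = np.array([nse(model(th, t), observed) for th in thetas])

# 3. Keep "behavioural" runs above the acceptability threshold
# (0.55 here, echoing the value the study settled on)
behavioural = likelihoods > 0.55

# 4. 5%/95% prediction bounds from the behavioural ensemble
sims = np.array([model(th, t) for th in thetas[behavioural]])
lower, upper = np.percentile(sims, [5, 95], axis=0)
```

In a fuller implementation the bounds would be likelihood-weighted quantiles of the behavioural ensemble; the width of the (lower, upper) band is the uncertainty estimate the study compares across likelihood formulations.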
Delaney, W; Grube, J W; Ames, G M
1998-03-01
This research investigated belief, social support and background predictors of employee likelihood to use an Employee Assistance Program (EAP) for a drinking problem. An anonymous cross-sectional survey was administered in the home. Bivariate analyses and simultaneous equations path analysis were used to explore a model of EAP use. Survey and ethnographic research were conducted in a unionized heavy machinery manufacturing plant in the central states of the United States. A random sample of 852 hourly and salaried employees was selected. In addition to background variables, measures included: likelihood of going to an EAP for a drinking problem, belief the EAP can help, social support for the EAP from co-workers/others, belief that EAP use will harm employment, and supervisor encourages the EAP for potential drinking problems. Belief in EAP efficacy directly increased the likelihood of going to an EAP. Greater perceived social support and supervisor encouragement increased the likelihood of going to an EAP both directly and indirectly through perceived EAP efficacy. Black and union hourly employees were more likely to say they would use an EAP. Males and those who reported drinking during working hours were less likely to say they would use an EAP for a drinking problem. EAP beliefs and social support have significant effects on likelihood to go to an EAP for a drinking problem. EAPs may wish to focus their efforts on creating an environment where there is social support from coworkers and encouragement from supervisors for using EAP services. Union networks and team members have an important role to play in addition to conventional supervisor intervention.
Regression estimators for generic health-related quality of life and quality-adjusted life years.
Basu, Anirban; Manca, Andrea
2012-01-01
To develop regression models for outcomes with truncated supports, such as health-related quality of life (HRQoL) data, and account for features typical of such data, including a skewed distribution, spikes at 1 or 0, and heteroskedasticity. Regression estimators based on features of the Beta distribution. First, both a single-equation and a two-part model are presented, along with estimation algorithms based on maximum likelihood, quasi-likelihood, and Bayesian Markov chain Monte Carlo methods. A novel Bayesian quasi-likelihood estimator is proposed. Second, a simulation exercise is presented to assess the performance of the proposed estimators against ordinary least squares (OLS) regression for a variety of HRQoL distributions that are encountered in practice. Finally, the performance of the proposed estimators is assessed by using them to quantify the treatment effect on QALYs in the EVALUATE hysterectomy trial. Overall model fit is studied using several goodness-of-fit tests, such as Pearson's correlation test, link and reset tests, and a modified Hosmer-Lemeshow test. The simulation results indicate that the proposed methods are more robust in estimating covariate effects than OLS, especially when the effects are large or the HRQoL distribution has a large spike at 1. Quasi-likelihood techniques are more robust than maximum likelihood estimators. When applied to the EVALUATE trial, all but the maximum likelihood estimators produce unbiased estimates of the treatment effect. One- and two-part Beta regression models provide flexible approaches to regressing outcomes with truncated supports, such as HRQoL, on covariates, after accounting for many idiosyncratic features of the outcome distribution. This work will provide applied researchers with a practical set of tools to model outcomes in cost-effectiveness analysis.
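A single-equation beta regression of the kind described can be sketched by maximum likelihood, parameterising the Beta distribution by a logit-linked mean mu and a precision phi, so that a = mu*phi and b = (1-mu)*phi. This is a generic sketch on simulated data, not the paper's estimator or the EVALUATE trial data:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln, expit

rng = np.random.default_rng(2)

# Simulated HRQoL-like outcome on (0, 1) whose mean depends on a covariate
n = 2000
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])        # design matrix with intercept
true_beta, true_phi = np.array([0.5, 0.8]), 20.0
mu = expit(X @ true_beta)
y = rng.beta(mu * true_phi, (1 - mu) * true_phi)
y = np.clip(y, 1e-6, 1 - 1e-6)              # guard the log-likelihood

def negloglik(params):
    beta, phi = params[:2], np.exp(params[2])   # phi kept positive via log scale
    m = expit(X @ beta)
    a, b = m * phi, (1 - m) * phi
    # Beta log-density: log Gamma(phi) - log Gamma(a) - log Gamma(b)
    #                   + (a-1) log y + (b-1) log(1-y)
    return -np.sum(gammaln(phi) - gammaln(a) - gammaln(b)
                   + (a - 1) * np.log(y) + (b - 1) * np.log(1 - y))

fit = minimize(negloglik, x0=np.zeros(3), method="BFGS")
beta_hat = fit.x[:2]
```

A two-part variant would add a logistic model for the probability of the spike at 1 (or 0) and fit the beta regression on the interior observations only.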
Marcu, Afrodita; Lyratzopoulos, Georgios; Black, Georgia; Vedsted, Peter; Whitaker, Katriina L
2016-10-01
Stage at diagnosis of breast cancer varies by socio-economic status (SES), with lower SES associated with poorer survival. We investigated associations between SES (indexed by education), and the likelihood of attributing breast symptoms to breast cancer. We conducted an online survey with 961 women (47-92 years) with variable educational levels. Two vignettes depicted familiar and unfamiliar breast changes (axillary lump and nipple rash). Without making breast cancer explicit, women were asked 'What do you think this […..] could be?' After the attribution question, women were asked to indicate their level of agreement with a cancer avoidance statement ('I would not want to know if I have breast cancer'). Women were more likely to mention cancer as a possible cause of an axillary lump (64%) compared with nipple rash (30%). In multivariable analysis, low and mid education were independently associated with being less likely to attribute a nipple rash to cancer (OR 0.51, 0.36-0.73 and OR 0.55, 0.40-0.77, respectively). For axillary lump, low education was associated with lower likelihood of mentioning cancer as a possible cause (OR 0.58, 0.41-0.83). Although cancer avoidance was also associated with lower education, the association between education and lower likelihood of making a cancer attribution was independent. Lower education was associated with lower likelihood of making cancer attributions for both symptoms, also after adjustment for cancer avoidance. Lower likelihood of considering cancer may delay symptomatic presentation and contribute to educational differences in stage at diagnosis. Copyright © 2016 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Samulski, Maurice; Karssemeijer, Nico
2008-03-01
Most of the current CAD systems detect suspicious mass regions independently in single views. In this paper we present a method to match corresponding regions in mediolateral oblique (MLO) and craniocaudal (CC) mammographic views of the breast. For every possible combination of mass regions in the MLO view and CC view, a number of features are computed, such as the difference in distance of a region to the nipple, a texture similarity measure, the gray scale correlation and the likelihood of malignancy of both regions computed by single-view analysis. In previous research, Linear Discriminant Analysis was used to discriminate between correct and incorrect links. In this paper we investigate if the performance can be improved by employing a statistical method in which four classes are distinguished. These four classes are defined by the combinations of view (MLO/CC) and pathology (TP/FP) labels. We use distance-weighted k-Nearest Neighbor density estimation to estimate the likelihood of a region combination. Next, a correspondence score is calculated as the likelihood that the region combination is a TP-TP link. The method was tested on 412 cases with a malignant lesion visible in at least one of the views. In 82.4% of the cases a correct link could be established between the TP detections in both views. In future work, we will use the framework presented here to develop a context dependent region matching scheme, which takes the number and likelihood of possible alternatives into account. It is expected that more accurate determination of matching probabilities will lead to improved CAD performance.
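Distance-weighted k-NN class-probability estimation of the sort used for the correspondence score can be sketched as follows. This is a toy two-class example with 2-D features (the study distinguished four MLO/CC x TP/FP classes and used richer region features):

```python
import numpy as np

def knn_class_likelihood(query, train_X, train_y, k=5, eps=1e-12):
    """Distance-weighted k-NN estimate of class probabilities.

    Each of the k nearest neighbours votes with weight 1/distance;
    the returned probabilities sum to 1 over the class labels."""
    d = np.linalg.norm(train_X - query, axis=1)
    idx = np.argsort(d)[:k]
    w = 1.0 / (d[idx] + eps)
    classes = np.unique(train_y)
    probs = np.array([w[train_y[idx] == c].sum() for c in classes])
    return classes, probs / probs.sum()

# Toy feature vectors for region combinations (illustrative clusters)
rng = np.random.default_rng(3)
tp_links = rng.normal([0, 0], 0.5, size=(50, 2))
fp_links = rng.normal([2, 2], 0.5, size=(50, 2))
X = np.vstack([tp_links, fp_links])
y = np.array([1] * 50 + [0] * 50)        # 1 = TP-TP link, 0 = otherwise

classes, probs = knn_class_likelihood(np.array([0.1, -0.1]), X, y, k=7)
score = probs[classes == 1][0]           # correspondence score: P(TP-TP link)
```

A query landing in the TP-TP cluster receives a correspondence score near 1, mirroring how the study ranks candidate region links between views.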
Martyna, Agnieszka; Zadora, Grzegorz; Neocleous, Tereza; Michalska, Aleksandra; Dean, Nema
2016-08-10
Many chemometric tools are invaluable and have proven effective in data mining and in substantial dimensionality reduction of highly multivariate data. This becomes vital for interpreting various physicochemical data due to the rapid development of advanced analytical techniques delivering much information in a single measurement run. This concerns especially spectra, which are frequently the subject of comparative analysis in, e.g., forensic science. In the present study, microtraces collected from hit-and-run accident scenarios were analysed. Plastic containers and automotive plastics (e.g. bumpers, headlamp lenses) were subjected to Fourier transform infrared spectrometry and car paints were analysed using Raman spectroscopy. In the forensic context, analytical results must be interpreted and reported according to the standards of the interpretation schemes acknowledged in forensic science, using the likelihood ratio (LR) approach. However, for proper construction of LR models for highly multivariate data, such as spectra, chemometric tools must be employed for substantial data compression. Conversion from the classical feature representation to a distance representation was proposed for revealing hidden data peculiarities, and linear discriminant analysis was further applied to minimise the within-sample variability while maximising the between-sample variability. Both techniques enabled substantial reduction of data dimensionality. Univariate and multivariate likelihood ratio models were proposed for such data. It was shown that the combination of chemometric tools and the likelihood ratio approach is capable of solving the comparison problem for highly multivariate and correlated data after proper extraction of the most relevant features and variance information hidden in the data structure. Copyright © 2016 Elsevier B.V. All rights reserved.
Langholz, Bryan; Thomas, Duncan C.; Stovall, Marilyn; Smith, Susan A.; Boice, John D.; Shore, Roy E.; Bernstein, Leslie; Lynch, Charles F.; Zhang, Xinbo; Bernstein, Jonine L.
2009-01-01
Methods for the analysis of individually matched case-control studies with location-specific radiation dose and tumor location information are described. These include likelihood methods for analyses that use only cases with precise tumor location information and methods that also include cases with imprecise tumor location information. The theory establishes that each of these likelihood-based methods estimates the same radiation rate ratio parameters, within the context of the appropriate model for location and subject-level covariate effects. The underlying assumptions are characterized and the potential strengths and limitations of each method are described. The methods are illustrated and compared using the WECARE study of radiation and asynchronous contralateral breast cancer. PMID:18647297
Jackson, Dan; White, Ian R; Riley, Richard D
2013-01-01
Multivariate meta-analysis is becoming more commonly used. Methods for fitting the multivariate random effects model include maximum likelihood, restricted maximum likelihood, Bayesian estimation and multivariate generalisations of the standard univariate method of moments. Here, we provide a new multivariate method of moments for estimating the between-study covariance matrix with the properties that (1) it allows for either complete or incomplete outcomes and (2) it allows for covariates through meta-regression. Further, for complete data, it is invariant to linear transformations. Our method reduces to the usual univariate method of moments, proposed by DerSimonian and Laird, in a single dimension. We illustrate our method and compare it with some of the alternatives using a simulation study and a real example. PMID:23401213
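In a single dimension the proposed estimator reduces to the DerSimonian-Laird method of moments, which can be sketched as follows (the effect sizes and within-study variances below are illustrative):

```python
import numpy as np

def dersimonian_laird_tau2(effects, variances):
    """Univariate DerSimonian-Laird method-of-moments estimate of the
    between-study variance tau^2 (the one-dimensional special case of
    the multivariate method of moments described above)."""
    y = np.asarray(effects, dtype=float)
    w = 1.0 / np.asarray(variances, dtype=float)   # fixed-effect weights
    mu_fixed = np.sum(w * y) / np.sum(w)           # fixed-effect pooled mean
    q = np.sum(w * (y - mu_fixed) ** 2)            # Cochran's Q statistic
    k = len(y)
    denom = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    return max(0.0, (q - (k - 1)) / denom)         # truncate at zero

# Four hypothetical studies: effect estimates and within-study variances
tau2 = dersimonian_laird_tau2([0.2, 0.5, -0.1, 0.4], [0.04, 0.05, 0.03, 0.06])
```

The truncation at zero reflects that a moment estimate of a variance can be negative; the multivariate generalisation faces the analogous constraint that the between-study covariance matrix be positive semi-definite.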
Pourfaraj, Majid; Mohammadi, Nourallah; Taghavi, Mohammadreza
2008-12-01
The purpose of this study is to examine the psychometric properties of the Thought-Action Fusion revised scale (TAF-R; Amir, N., Freshman, M., Ramsey, B., Neary, E., & Brigidi, B. (2001). Thought-action fusion in individuals with OCD symptoms. Behaviour Research and Therapy, 39, 765-776) in a sample of 565 (321 female) students of Shiraz University. Factor analysis using varimax rotation yielded eight factors that explained 80% of the total scale variance. These factors are labeled: moral TAF, responsibility for positive thoughts, likelihood of negative events, likelihood of positive events, responsibility for negative thoughts, responsibility for harm avoidance, likelihood of harm avoidance and likelihood for self, respectively. The reliability coefficients of the total scale were calculated by two methods, internal consistency and test-retest, and were 0.81 and 0.61, respectively. Concurrent validity showed that TAF-R scores correlate positively and significantly with responsibility, guilt and obsessive-compulsive symptoms. Confirming expectations, people with high obsessive-compulsive symptoms had higher TAF-R scores than those with low symptoms. Moreover, subscale-total correlations showed that the correlations between subscales were low, but the correlations of the subscales with the total TAF-R score were moderate.
Empirical likelihood method for non-ignorable missing data problems.
Guan, Zhong; Qin, Jing
2017-01-01
The missing response problem is ubiquitous in survey sampling, medical, social science and epidemiology studies. It is well known that non-ignorable missingness is the most difficult missing data problem, where the missingness of a response depends on its own value. In the statistical literature, unlike the ignorable missing data problem, few papers on non-ignorable missing data are available apart from fully parametric model-based approaches. In this paper we study a semiparametric model for non-ignorable missing data in which the missing probability is known up to some parameters, but the underlying distributions are not specified. By employing Owen's (1988) empirical likelihood method we can obtain constrained maximum empirical likelihood estimators of the parameters in the missing probability and of the mean response, which are shown to be asymptotically normal. Moreover, the likelihood ratio statistic can be used to test whether the missingness of the responses is non-ignorable or completely at random. The theoretical results are confirmed by a simulation study. As an illustration, the analysis of real AIDS trial data shows that the missingness of CD4 counts at around two years is non-ignorable and that the sample mean based on observed data only is biased.
Hui, Siu-Kuen Azor; Grandner, Michael A
2015-01-01
Using the Transtheoretical Model of behavioral change, this study evaluates the relationship between sleep quality and the motivation and maintenance processes of healthy behavior change. The current study is an analysis of data collected in 2008 from an online health risk assessment (HRA) survey completed by participants of the Kansas State employee wellness program (N=13,322). Using multinomial logistic regression, associations between self-reported sleep quality and stages of change (i.e. precontemplation, contemplation, preparation, action, maintenance) in five health behaviors (stress management, weight management, physical activities, alcohol use, and smoking) were analyzed. Adjusted for covariates, poor sleep quality was associated with an increased likelihood of contemplation, preparation, and in some cases action stage when engaging in the health behavior change process, but generally a lower likelihood of maintenance of the healthy behavior. The present study demonstrated that poor sleep quality was associated with an elevated likelihood of contemplating or initiating behavior change, but a decreased likelihood of maintaining healthy behavior change. It is important to include sleep improvement as one of the lifestyle management interventions offered in EWP to comprehensively reduce health risks and promote the health of a large employee population.
Lovelock, Paul K; Spurdle, Amanda B; Mok, Myth TS; Farrugia, Daniel J; Lakhani, Sunil R; Healey, Sue; Arnold, Stephen; Buchanan, Daniel; kConFab Investigators; Couch, Fergus J; Henderson, Beric R; Goldgar, David E; Tavtigian, Sean V; Chenevix-Trench, Georgia; Brown, Melissa A
2007-01-01
Introduction: Many of the DNA sequence variants identified in the breast cancer susceptibility gene BRCA1 remain unclassified in terms of their potential pathogenicity. Both multifactorial likelihood analysis and functional approaches have been proposed as a means to elucidate the likely clinical significance of such variants, but analysis of the comparative value of these methods for classifying all sequence variants has been limited. Methods: We have compared the results from multifactorial likelihood analysis with those from several functional analyses for the four BRCA1 sequence variants A1708E, G1738R, R1699Q, and A1708V. Results: Our results show that multifactorial likelihood analysis, which incorporates sequence conservation, co-inheritance, segregation, and tumour immunohistochemical analysis, may improve classification of variants. For A1708E, previously shown to be functionally compromised, analysis of oestrogen receptor, cytokeratin 5/6, and cytokeratin 14 tumour expression data significantly strengthened the prediction of pathogenicity, giving a posterior probability of pathogenicity of 99%. For G1738R, shown to be functionally defective in this study, immunohistochemistry analysis confirmed previous findings of inconsistent 'BRCA1-like' phenotypes for the two tumours studied, and the posterior probability for this variant was 96%. The posterior probabilities of R1699Q and A1708V were 54% and 69%, respectively, only moderately suggestive of increased risk. Interestingly, results from functional analyses suggest that both of these variants have only partial functional activity. R1699Q was defective in foci formation in response to DNA damage and displayed intermediate transcriptional transactivation activity but showed no evidence for centrosome amplification. In contrast, A1708V displayed intermediate transcriptional transactivation activity and a normal foci formation response to DNA damage but induced centrosome amplification.
Conclusion: These data highlight the need for a range of functional studies to be performed in order to identify variants with partially compromised function. The results also raise the possibility that A1708V and R1699Q may be associated with a low or moderate risk of cancer. While data pooling strategies may provide more information for multifactorial analysis to improve the interpretation of the clinical significance of these variants, it is likely that the development of current multifactorial likelihood approaches and the consideration of alternative statistical approaches will be needed to determine whether these individually rare variants confer a low or moderate risk of breast cancer. PMID:18036263
NASA Astrophysics Data System (ADS)
Refat, Moamen S.; Al-Azab, Fathi M.; Al-Maydama, Hussein M. A.; Amin, Ragab R.; Jamil, Yasmin M. S.
2014-06-01
Metal complexes of pyridoxine monohydrochloride (vitamin B6) were prepared using La(III), Ce(III), Sm(III) and Y(III), and the resulting complexes were investigated. Some physical properties, conductivity, analytical data and the composition of the four pyridoxine complexes are discussed. Elemental analysis shows that the complexes formed by La(III), Ce(III), Sm(III) and Y(III) with pyridoxine have a 1:2 (metal:PN) molar ratio. All the synthesized complexes are brown in color and possess high melting points. These complexes are partially soluble in hot methanol, dimethylsulfoxide and dimethylformamide, and insoluble in water and some other organic solvents. Elemental analysis data, spectroscopic (IR, UV-Vis and fluorescence) data, effective magnetic moments in Bohr magnetons and proton NMR suggest the structures. The particle size was determined by X-ray powder diffraction and scanning electron microscopy. The results obtained suggest that pyridoxine reacts with the metal ions as a bidentate ligand through its phenolate oxygen and the oxygen of the adjacent group at the 4′-position. Molar conductance measurements proved that the pyridoxine complexes are electrolytic in nature. Kinetic and thermodynamic parameters such as Ea, ΔH*, ΔS* and ΔG* were estimated from the DTG curves. Antibacterial evaluation of pyridoxine and its complexes was also performed against some Gram-positive and Gram-negative bacteria as well as fungi.
ERIC Educational Resources Information Center
Wall, Melanie M.; Guo, Jia; Amemiya, Yasuo
2012-01-01
Mixture factor analysis is examined as a means of flexibly estimating nonnormally distributed continuous latent factors in the presence of both continuous and dichotomous observed variables. A simulation study compares mixture factor analysis with normal maximum likelihood (ML) latent factor modeling. Different results emerge for continuous versus…
Developing Multidimensional Likert Scales Using Item Factor Analysis: The Case of Four-Point Items
ERIC Educational Resources Information Center
Asún, Rodrigo A.; Rdz-Navarro, Karina; Alvarado, Jesús M.
2016-01-01
This study compares the performance of two approaches in analysing four-point Likert rating scales with a factorial model: the classical factor analysis (FA) and the item factor analysis (IFA). For FA, maximum likelihood and weighted least squares estimations using Pearson correlation matrices among items are compared. For IFA, diagonally weighted…
Johal, Ama; Chaggar, Amrit; Zou, Li Fong
2018-03-01
The present study used the optical surface laser scanning technique to compare the facial features of patients aged 8-18 years presenting with Class I and Class III incisor relationships in a case-control design. Subjects with a Class III incisor relationship, aged 8-18 years, were age- and gender-matched with Class I controls and underwent a 3-dimensional (3-D) optical surface scan of the facial soft tissues. Landmark analysis revealed that Class III subjects displayed greater mean dimensions than the control group, most notably between the ages of 8-10 and 17-18 years in both males and females, in respect of antero-posterior (P = 0.01) and vertical (P = 0.006) facial dimensions. Surface-based analysis revealed the greatest difference in the lower facial region, followed by the mid-face, whilst the upper face remained fairly consistent. Significant detectable differences were found in the surface facial features of developing Class III subjects.
Brozek, Wolfgang; Manhardt, Teresa; Kállay, Enikö; Peterlik, Meinrad; Cross, Heide S
2012-07-26
Previous studies on the significance of vitamin D insufficiency and chronic inflammation in colorectal cancer development clearly indicated that maintenance of cellular homeostasis in the large intestinal epithelium requires balanced interaction of 1,25-(OH)2D3 and prostaglandin cellular signaling networks. The present study addresses the question how colorectal cancer pathogenesis depends on alterations of activities of vitamin D hydroxylases, i.e., CYP27B1-encoded 25-hydroxyvitamin D-1α-hydroxylase and CYP24A1-encoded 25-hydroxyvitamin D-24-hydroxylase, and inflammation-induced cyclooxygenase-2 (COX-2). Data from 105 cancer patients on CYP27B1, VDR, CYP24A1, and COX-2 mRNA expression in relation to tumor grade, anatomical location, gender and age were fit into a multivariate model of exploratory factor analysis. Nearly identical results were obtained by the principal factor and the maximum likelihood method, and these were confirmed by hierarchical cluster analysis: Within the eight mutually dependent variables studied four independent constellations were found that identify different features of colorectal cancer pathogenesis: (i) Escape of COX-2 activity from restraints by the CYP27B1/VDR system can initiate cancer growth anywhere in the colorectum regardless of age and gender; (ii) variations in COX-2 expression are mainly responsible for differences in cancer incidence in relation to tumor location; (iii) advancing age has a strong gender-specific influence on cancer incidence; (iv) progression from well differentiated to undifferentiated cancer is solely associated with a rise in CYP24A1 expression.
NASA Astrophysics Data System (ADS)
Demianski, Marek; Piedipalumbo, Ester; Sawant, Disha; Amati, Lorenzo
2017-02-01
Context. Explaining the accelerated expansion of the Universe is one of the fundamental challenges in physics today. Cosmography provides information about the evolution of the universe derived from measured distances, assuming only that the space-time geometry is described by the Friedmann-Lemaître-Robertson-Walker metric, and adopting an approach that effectively uses only Taylor expansions of basic observables. Aims: We perform a high-redshift analysis to constrain the cosmographic expansion up to the fifth order. It is based on the Union2 type Ia supernovae data set, the gamma-ray burst Hubble diagram, a data set of 28 independent measurements of the Hubble parameter, baryon acoustic oscillations measurements from galaxy clustering and the Lyman-α forest in the SDSS-III Baryon Oscillation Spectroscopic Survey (BOSS), and some Gaussian priors on h and ΩM. Methods: We performed a statistical analysis and explored the probability distributions of the cosmographic parameters. By building up their regions of confidence, we maximized our likelihood function using the Markov chain Monte Carlo method. Results: Our high-redshift analysis confirms that the expansion of the Universe currently accelerates; the estimation of the jerk parameter indicates a possible deviation from the standard ΛCDM cosmological model. Moreover, we investigate implications of our results for the reconstruction of the dark energy equation of state (EOS) by comparing the standard technique of cosmography with an alternative approach based on generalized Padé approximations of the same observables. Because these expansions converge better, it is possible to improve the constraints on the cosmographic parameters and also on the dark energy EOS. Conclusions: The estimation of the jerk and the DE parameters indicates at 1σ a possible deviation from the ΛCDM cosmological model.
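The Taylor-expansion approach mentioned here can be made explicit; for a spatially flat universe, the standard cosmographic series for the luminosity distance, in terms of the deceleration parameter q_0 and the jerk j_0, reads (a textbook expansion, not a result of this paper):

```latex
d_L(z) = \frac{c\,z}{H_0}\left[ 1 + \frac{1}{2}\left(1 - q_0\right) z
  - \frac{1}{6}\left(1 - q_0 - 3 q_0^{2} + j_0\right) z^{2}
  + \mathcal{O}\!\left(z^{3}\right) \right]
```

Constraining the successively higher-order terms in this series is what requires high-redshift probes such as the gamma-ray burst Hubble diagram used in the analysis.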
Kong, Xiangxing; Li, Jun; Cai, Yibo; Tian, Yu; Chi, Shengqiang; Tong, Danyang; Hu, Yeting; Yang, Qi; Li, Jingsong; Poston, Graeme; Yuan, Ying; Ding, Kefeng
2018-01-08
To revise the American Joint Committee on Cancer TNM staging system for colorectal cancer (CRC) based on a nomogram analysis of the Surveillance, Epidemiology, and End Results (SEER) database, and to prove the rationality of enhancing the T stage's weighting in our previously proposed T-plus staging system. A total of 115,377 non-metastatic CRC patients from SEER were randomly split 1:1 into training and testing sets. The Nomo-staging system was established via three nomograms based on 1-year, 2-year and 3-year disease-specific survival (DSS) logistic regression analyses of the training set. The predictive value of the Nomo-staging system for the testing set was evaluated by concordance index (c-index), likelihood ratio (L.R.) and Akaike information criterion (AIC) for 1-year, 2-year, 3-year overall survival (OS) and DSS. Kaplan-Meier survival curves were used to evaluate discrimination and gradient monotonicity. An external validation was performed on the database from the Second Affiliated Hospital of Zhejiang University (SAHZU). Patients with T1-2 N1 and T1N2a were classified into stage II while T4 N0 patients were classified into stage III in the Nomo-staging system. Kaplan-Meier survival curves of OS and DSS in the testing set showed the Nomo-staging system performed better in discrimination and gradient monotonicity, and the external validation in the SAHZU database also showed distinctly better discrimination. The Nomo-staging system showed higher values of L.R. and c-index, and lower values of AIC when predicting OS and DSS in the testing set. The Nomo-staging system showed better performance in prognosis prediction, and the weight of lymph node status in prognosis prediction should be cautiously reconsidered.
Datta, Niloy R; Rogers, Susanne; Ordóñez, Silvia Gómez; Puric, Emsad; Bodis, Stephan
2016-01-01
A systematic review and meta-analysis was conducted to evaluate the outcome of controlled clinical trials in head and neck cancers (HNCs) using hyperthermia and radiotherapy versus radiotherapy alone. A total of 498 abstracts were screened from four databases and hand searched as per the PRISMA guidelines. Only two-arm studies treating HNCs with either radiotherapy alone, or hyperthermia and radiotherapy without concurrent chemotherapy or surgery were considered. The evaluated end point was complete response (CR). Following a detailed screening of the titles, abstracts and full text papers, six articles fulfilling the above eligibility criteria were considered. In total 451 clinical cases from six studies were included in the meta-analysis. Five of six trials were randomised. The overall CR with radiotherapy alone was 39.6% (92/232) and varied between 31.3% and 46.9% across the six trials. With thermoradiotherapy, the overall CR reported was 62.5% (137/219), (range 33.9-83.3%). The odds ratio was 2.92 (95% CI: 1.58-5.42, p = 0.001); the risk ratio was 1.61 (95% CI: 1.32-1.97, p < 0.0001) and the risk difference was 0.25 (95% CI: 0.12-0.39, p < 0.0001), all in favour of combined treatment with hyperthermia and radiotherapy over radiotherapy alone. Acute and late grade III/IV toxicities were reported to be similar in both the groups. Hyperthermia along with radiotherapy enhances the likelihood of CR in HNCs by around 25% compared to radiotherapy alone with no significant additional acute and late morbidities. This level I evidence should justify the integration of hyperthermia into the multimodality therapy of HNCs.
Analysis of Unplanned Intensive Care Unit Admissions in Postoperative Pediatric Patients.
Landry, Elizabeth K; Gabriel, Rodney A; Beutler, Sascha; Dutton, Richard P; Urman, Richard D
2017-03-01
Currently, there are only a few retrospective, single-institution studies that have addressed the prevalence and risk factors associated with unplanned admissions to the pediatric intensive care unit (ICU) after surgery. Based on this limited number of studies, it appears that airway and respiratory complications put a child at increased risk for unplanned ICU admission. A more extensive and diverse analysis of unplanned postoperative admissions to the ICU is needed to address risk factors that have yet to be revealed by the current literature. To establish a rate of unplanned postoperative ICU admissions in pediatric patients using a large, multi-institution data set and to further characterize the associated risk factors. Data from the National Anesthesia Clinical Outcomes Registry were analyzed. We recorded the overall risk of unplanned postoperative ICU admission in patients younger than 18 years and performed univariate and multivariate logistic regression analyses to identify the associated patient, surgical, and anesthetic-related characteristics. Of the 324,818 cases analyzed, 211 reported an unexpected ICU admission. There was an increased likelihood of unplanned postoperative ICU admission in infants (age <1 year) and children classified as American Society of Anesthesiologists physical status III or IV. Likewise, longer case duration and cases requiring general anesthesia were also associated with unplanned ICU admissions. This study establishes a rate of unplanned ICU admission following surgery in the heterogeneous pediatric population. This is the first study to utilize such a large data set encompassing a wide range of practice environments to identify risk factors leading to unplanned postoperative ICU admissions. Our study revealed that patient, surgical, and anesthetic complexity each contributed to an increased number of unplanned ICU admissions in the pediatric population.
Cavanagh, Colin R; Jonas, Elisabeth; Hobbs, Matthew; Thomson, Peter C; Tammen, Imke; Raadsma, Herman W
2010-09-16
An (Awassi × Merino) × Merino single-sire backcross family with 165 male offspring was used to map quantitative trait loci (QTL) for body composition traits on a framework map of 189 microsatellite loci across all autosomes. Two cohorts were created from the experimental progeny to represent alternative maturity classes for body composition assessment. Animals were raised under paddock conditions prior to entering the feedlot for a 90-day fattening phase. Body composition traits were derived in vivo at the end of the experiment prior to slaughter at 2 (cohort 1) and 3.5 (cohort 2) years of age, using computed tomography. Image analysis was used to gain accurate predictions for 13 traits describing major fat depots, lean muscle, bone, body proportions and body weight which were used for single- and two-QTL mapping analysis. Using a maximum-likelihood approach, three highly significant (LOD ≥ 3), 15 significant (LOD ≥ 2), and 11 suggestive QTL (1.7 ≤ LOD < 2) were detected on eleven chromosomes. Regression analysis confirmed 28 of these QTL and an additional 17 suggestive (P < 0.1) and two significant (P < 0.05) QTL were identified using this method. QTL with pleiotropic effects for two or more tissues were identified on chromosomes 1, 6, 10, 14, 16 and 23. No tissue-specific QTL were identified. A meta-assembly of ovine QTL for carcass traits from this study and public domain sources was performed and compared with a corresponding bovine meta-assembly. The assembly demonstrated QTL with effects on carcass composition in homologous regions on OAR1, 2, 6 and 21.
Sarre, Aili; Ökvist, Mats; Klar, Tobias; Hall, David R; Smalås, Arne O; McSweeney, Sean; Timmins, Joanna; Moe, Elin
2015-08-01
While most bacteria possess a single gene encoding the bifunctional DNA glycosylase Endonuclease III (EndoIII) in their genomes, Deinococcus radiodurans possesses three: DR2438 (DrEndoIII1), DR0289 (DrEndoIII2) and DR0982 (DrEndoIII3). Here we have determined the crystal structures of DrEndoIII1 and an N-terminally truncated form of DrEndoIII3 (DrEndoIII3Δ76). We have also generated a homology model of DrEndoIII2 and measured activity of the three enzymes. All three structures consist of two all α-helical domains, one of which exhibits a [4Fe-4S] cluster and the other a HhH-motif, separated by a DNA binding cleft, similar to previously determined structures of endonuclease III from Escherichia coli and Geobacillus stearothermophilus. However, both DrEndoIII1 and DrEndoIII3 possess an extended HhH motif with extra helical features and an altered electrostatic surface potential. In addition, the DNA binding cleft of DrEndoIII3 seems to be less accessible for DNA interactions, while in DrEndoIII1 it seems to be more open. Analysis of the enzyme activities shows that DrEndoIII2 is most similar to the previously studied enzymes, while DrEndoIII1 seems to be more distant with a weaker activity towards substrate DNA containing either thymine glycol or an abasic site. DrEndoIII3 is the most distantly related enzyme and displays no detectable activity towards these substrates even though the suggested catalytic residues are conserved. Based on a comparative structural analysis, we suggest that the altered surface potential, shape of the substrate-binding pockets and specific amino acid substitutions close to the active site and in the DNA interacting loops may underlie the unexpected differences in activity. Copyright © 2015 Elsevier Inc. All rights reserved.
Waller, Martha W; Iritani, Bonita J; Christ, Sharon L; Clark, Heddy Kovach; Moracco, Kathryn E; Halpern, Carolyn Tucker; Flewelling, Robert L
2012-07-01
Greater access to alcohol has been widely found to be associated with many negative outcomes including violence perpetration. This study examines the relationship between alcohol outlet density, alcohol use, and intimate partner violence (IPV) victimization among young women in the United States. A direct association between alcohol outlet density in one's neighborhood and the likelihood of IPV victimization was examined. Data were from Wave III of the National Longitudinal Study of Adolescent Health (Add Health), which followed a nationally representative sample of adolescents into adulthood. Participants were young adult females age 18 to 26 at Wave III. Of the 4,571 female respondents who reported a current heterosexual relationship and had IPV data, 13.2% reported having been the victim of physical violence only and 6.5% experienced sexual only or physical and sexual violence in the relationship during the past year. In the regression models tested, there was no significant direct association between neighborhood alcohol outlet density and IPV victimization nor was there an association between outlet density and drinking behaviors, thus eliminating the possibility of an indirect association. Results of fully adjusted models indicate females who drank heavily, whether infrequently or frequently, were at significant risk for experiencing sexual only IPV or sexual and physical IPV. Asians and Native Americans were at significantly greater odds of experiencing sexual only or sexual and physical IPV compared with non-Hispanic Whites, while non-Hispanic Blacks were at significantly greater odds for physical only IPV. We conclude that a continuous measure of alcohol outlet density was not associated with IPV in models controlling for individual and other neighborhood characteristics. Young women who drink heavily, whether infrequently or frequently, have greater odds of experiencing sexual only or sexual and physical IPV compared to abstainers.
Similar to previous study findings, young women living with or married to their partner were at far greater risk of experiencing physical only and/or sexual only or sexual and physical IPV. The study adds to the growing body of literature that examines how community characteristics such as outlet density influence the likelihood of IPV.
Computer-based guidelines for concrete pavements : HIPERPAV III : user manual
DOT National Transportation Integrated Search
2009-10-01
This user manual provides guidance on how to use the new High PERformance PAVing (HIPERPAV) III software program for the analysis of early-age Portland cement concrete pavement (PCCP) behavior. HIPERPAV III includes several improvements over prev...
RELATIONSHIP OF PRESEASON MOVEMENT SCREENS WITH OVERUSE SYMPTOMS IN COLLEGIATE BASEBALL PLAYERS
Clifton, Daniel R.; Onate, James A.; Ramsey, Vincent K.; Cromartie, Fred
2017-01-01
Background: The shoulder mobility screen of the Functional Movement Screen™ (FMS™) and the upper extremity patterns of the Selective Functional Movement Assessment (SFMA) assess global, multi-joint movement capabilities in the upper-extremities. Identifying which assessment can most accurately determine if baseball players are at an increased risk of experiencing overuse symptoms in the shoulder or elbow throughout a competitive season may reduce throwing-related injuries requiring medical attention. Purpose: The purpose of this study was to determine if preseason FMS™ or SFMA scores were related to overuse severity scores in the shoulder or elbow during the preseason and competitive season. Study design: Cohort study. Methods: Sixty healthy, male, Division III collegiate baseball players (mean age = 20.1 ± 2.0 years) underwent preseason testing using the FMS™ shoulder mobility screen, and SFMA upper extremity patterns. Their scores were dichotomized into good and bad movement scores, and were compared to weekly questionnaires registering overuse symptoms and pain severity in the shoulder or elbow during the season. Results: Poor FMS™ performance was associated with an increased likelihood of experiencing at least one overuse symptom during the preseason independent of grade and position (adjusted odds ratio [OR] = 5.14, p = 0.03). Poor SFMA performance was associated with an increased likelihood of experiencing at least one overuse symptom during the preseason (adjusted OR = 6.10, p = 0.03) and during the competitive season (adjusted OR = 17.07, p = 0.03) independent of grade and position. Conclusion: FMS™ shoulder mobility and SFMA upper extremity pattern performance were related to the likelihood of experiencing overuse symptoms during a baseball season. Participants with poor FMS™ performances may be more likely to experience at least one overuse symptom in their shoulder or elbow during the preseason.
Additionally, individuals with poor SFMA performances may be more likely to report overuse symptoms during the preseason or competitive season. Level of evidence: Level 3 PMID:29158957
Sand, Andreas; Kristiansen, Martin; Pedersen, Christian N S; Mailund, Thomas
2013-11-22
Hidden Markov models are widely used for genome analysis as they combine ease of modelling with efficient analysis algorithms. Calculating the likelihood of a model using the forward algorithm has worst case time complexity linear in the length of the sequence and quadratic in the number of states in the model. For genome analysis, however, the length runs to millions or billions of observations, and when maximising the likelihood, hundreds of evaluations are often needed. A time efficient forward algorithm is therefore a key ingredient in an efficient hidden Markov model library. We have built a software library for efficiently computing the likelihood of a hidden Markov model. The library exploits commonly occurring substrings in the input to reuse computations in the forward algorithm. In a pre-processing step our library identifies common substrings and builds a structure over the computations in the forward algorithm which can be reused. This analysis can be saved between uses of the library and is independent of concrete hidden Markov models so one preprocessing can be used to run a number of different models. Using this library, we achieve up to 78 times shorter wall-clock time for realistic whole-genome analyses with a real and reasonably complex hidden Markov model. In one particular case the analysis was performed in less than 8 minutes compared to 9.6 hours for the previously fastest library. We have implemented the preprocessing procedure and forward algorithm as a C++ library, zipHMM, with Python bindings for use in scripts. The library is available at http://birc.au.dk/software/ziphmm/.
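For context on what the library accelerates, the likelihood computed by the forward algorithm can be sketched in plain Python; this is the generic textbook recursion with per-step scaling, not zipHMM's optimized implementation, and the toy model parameters below are hypothetical:

```python
import math

def forward_log_likelihood(obs, init, trans, emit):
    """Plain forward algorithm: O(T * n^2) time for T observations, n states.

    init[i]     -- initial probability of state i
    trans[i][j] -- transition probability from state i to state j
    emit[i][o]  -- probability of emitting symbol o from state i
    Rescaling the alpha vector at each step keeps the recursion numerically
    stable; the logs of the scale factors sum to the log-likelihood.
    """
    n = len(init)
    alpha = [init[i] * emit[i][obs[0]] for i in range(n)]
    scale = sum(alpha)
    log_lik = math.log(scale)
    alpha = [a / scale for a in alpha]
    for o in obs[1:]:
        alpha = [emit[j][o] * sum(alpha[i] * trans[i][j] for i in range(n))
                 for j in range(n)]
        scale = sum(alpha)
        log_lik += math.log(scale)
        alpha = [a / scale for a in alpha]
    return log_lik

# Toy 2-state, 2-symbol model (illustrative numbers only)
init = [0.5, 0.5]
trans = [[0.9, 0.1],
         [0.2, 0.8]]
emit = [[0.7, 0.3],
        [0.1, 0.9]]
print(forward_log_likelihood([0, 1, 0, 0], init, trans, emit))
```

zipHMM's contribution is to avoid recomputing the inner loop for repeated substrings of `obs`; the recursion itself is unchanged.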
Risk prediction with procalcitonin and clinical rules in community-acquired pneumonia
Huang, David T.; Weissfeld, Lisa A.; Kellum, John A.; Yealy, Donald M.; Kong, Lan; Martino, Michael; Angus, Derek C.
2009-01-01
Objective The Pneumonia Severity Index (PSI) and CURB-65 predict outcomes in community acquired pneumonia (CAP), but have limitations. Procalcitonin, a biomarker of bacterial infection, may provide prognostic information in CAP. Our objective was to describe the pattern of procalcitonin in CAP, and determine if procalcitonin provides prognostic information beyond PSI and CURB-65. Methods We conducted a multi-center prospective cohort study in 28 community and teaching emergency departments. Patients presenting with a clinical and radiographic diagnosis of CAP were enrolled. We stratified procalcitonin levels a priori into four tiers – I: < 0.1; II: ≥ 0.1 to <0.25; III: ≥ 0.25 to < 0.5; and IV: ≥ 0.5 ng/ml. Primary outcome was 30d mortality. Results 1651 patients formed the study cohort. Procalcitonin levels were broadly spread across tiers: 32.8% (I), 21.6% (II), 10.2% (III), 35.4% (IV). Used alone, procalcitonin had modest test characteristics: specificity (35%), sensitivity (92%), positive likelihood ratio (LR) (1.41), and negative LR (0.22). Adding procalcitonin to PSI in all subjects minimally improved performance. Adding procalcitonin to low risk PSI subjects (Class I–III) provided no additional information. However, subjects in procalcitonin tier I had low 30d mortality regardless of clinical risk, including those in higher risk classes (1.5% vs. 1.6% for those in PSI Class I–III vs. Class IV/V). Among high risk PSI subjects (Class IV/V), one quarter (126/546) were in procalcitonin tier I, and the negative LR of procalcitonin tier I was 0.09. Procalcitonin tier I was also associated with lower burden of other adverse outcomes. Similar results were seen with CURB-65 stratification. Conclusions Selective use of procalcitonin as an adjunct to existing rules may offer additional prognostic information in high risk patients. PMID:18342993
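The test characteristics quoted for procalcitonin follow from the standard definitions of likelihood ratios; a minimal sketch using the sensitivity (92%) and specificity (35%) reported above (small discrepancies from the published LR values of 1.41 and 0.22 are rounding):

```python
def likelihood_ratios(sensitivity, specificity):
    """LR+ = sens / (1 - spec); LR- = (1 - sens) / spec."""
    lr_pos = sensitivity / (1.0 - specificity)
    lr_neg = (1.0 - sensitivity) / specificity
    return lr_pos, lr_neg

def post_test_probability(pretest, lr):
    """Update a pre-test probability with a likelihood ratio via odds."""
    odds = pretest / (1.0 - pretest)
    post_odds = odds * lr
    return post_odds / (1.0 + post_odds)

# Procalcitonin used alone (values from the abstract)
lr_pos, lr_neg = likelihood_ratios(0.92, 0.35)
print(round(lr_pos, 2), round(lr_neg, 2))
```

The strong negative LR of 0.09 for procalcitonin tier I among high-risk PSI subjects is what drives its potential as a rule-out adjunct: applying `post_test_probability` with a modest pre-test mortality risk and an LR of 0.09 yields a very low post-test probability.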
NASA Technical Reports Server (NTRS)
2005-01-01
Under funding from this proposal three in situ profile measurements of stratospheric sulfate aerosol and ozone were completed from balloon-borne platforms. The measured quantities are aerosol size resolved number concentration and ozone. The one derived product is aerosol size distribution, from which aerosol moments, such as surface area, volume, and extinction can be calculated for comparison with SAGE III measurements and SAGE III derived products, such as surface area. The analysis of these profiles and comparison with SAGE III extinction measurements and SAGE III derived surface areas are provided in Yongxiao (2005), which comprised the research thesis component of Mr. Jian Yongxiao's M.S. degree in Atmospheric Science at the University of Wyoming. In addition, analysis continues on using principal component analysis (PCA) to derive aerosol surface area from the 9 wavelength extinction measurements available from SAGE III. This paper will present PCA components to calculate surface area from SAGE III measurements and compare these derived surface areas with those available directly from in situ size distribution measurements, as well as surface areas which would be derived from PCA and Thomason's algorithm applied to the four wavelength SAGE II extinction measurements.
Feng, Yan-Ru; Zhu, Yuan; Liu, Lu-Ying; Wang, Wei-Hu; Wang, Shu-Lian; Song, Yong-Wen; Wang, Xin; Tang, Yuan; Liu, Yue-Ping; Ren, Hua; Fang, Hui; Zhang, Shi-Ping; Liu, Xin-Fan; Yu, Zi-Hao; Li, Ye-Xiong; Jin, Jing
2016-05-03
The aim of this study is to present an interim analysis of a phase III trial (NCT00714077) of postoperative concurrent capecitabine and radiotherapy with or without oxaliplatin for pathological stage II and III rectal cancer. Patients with pathologically confirmed stage II and III rectal cancer were randomized to either radiotherapy with concurrent capecitabine (Cap-RT group) or with capecitabine and oxaliplatin (Capox-RT group). The primary endpoint was 3-year disease-free survival rate (DFS). The 3-year DFS rate was 73.9% in the Capox-RT group and 71.6% in the Cap-RT group (HR 0.92, p = 0.647), respectively. No significant difference was observed in overall survival, cumulative incidence of local recurrence and distant metastasis between the two groups (p > 0.05). More grade 3-4 acute toxicity was observed in the Capox-RT group than in the Cap-RT group (38.1% vs. 29.2%, p = 0.041). Inclusion of oxaliplatin in the capecitabine-based postoperative regimen did not improve DFS but increased toxicities for pathological stage II and III rectal cancer in this interim analysis.
The impact of irritable bowel syndrome on health-related quality of life: a Singapore perspective.
Wang, Yu Tien; Lim, Hwee Yong; Tai, David; Krishnamoorthy, Thinesh L; Tan, Tira; Barbier, Sylvaine; Thumboo, Julian
2012-08-09
Irritable bowel syndrome (IBS) is a common gastrointestinal disorder. The prevalence of IBS in Asian countries varies from 2.9% to 15.6%. IBS does not result in increased mortality, but is associated with psychological distress and disruption of work and sleep. Consequently, the evaluation of health-related quality of life (HRQoL) is an important outcome measure for patients with IBS since it provides a holistic assessment of the patient's emotional, social and physical function. However, some HRQoL tools can be time-consuming to apply. EQ-5D is a brief HRQoL tool which has been validated in the Western IBS population but has thus far not been used in Asia. This study was conducted to determine whether persons with self-reported symptoms that met the Rome III criteria for IBS had a poorer quality of life than those without these symptoms. We also aimed to determine which specific aspects of quality of life were most affected and whether any risk factors distinguished those with and without IBS. Self-administered questionnaires which included the Rome III diagnostic questionnaire modules for IBS and the EQ-5D questionnaire were obtained from participants of a health symposium in Singapore on 31 October 2010. IBS was diagnosed based on the Rome III Criteria. The main outcome measure was the EQ-5D index score. The relationship between the presence of IBS and the EQ-5D index score, individual dimensions of EQ-5D and demographic risk factors were examined. 449 completed questionnaires were analyzed. The mean EQ-5D index score for IBS was 0.739 which was a significant reduction compared to non-IBS participants [-0.11 (95% CI: -0.15 to -0.07), p<0.001]. Multivariate analysis showed that IBS was significantly associated with younger age and higher education level. Of the five EQ-5D dimensions, IBS sufferers were significantly affected in mobility, anxiety or depression, usual activity and pain.
There was a "dose-related" increase in likelihood of having IBS with increased severity of pain and anxiety or depression. IBS sufferers have significantly poorer quality of life. Assessment of HRQoL in IBS using the EQ-5D should be considered in further studies and routine clinical practice.
Sun, Changling; Zhang, Yayun; Han, Xue; Du, Xiaodong
2018-03-01
Objective The purpose of this study was to verify the effectiveness of the narrow band imaging (NBI) system in diagnosing nasopharyngeal cancer (NPC) as compared with white light endoscopy. Data Sources PubMed, Cochrane Library, EMBASE, CNKI, and Wan Fang databases. Review Methods Data analyses were performed with Meta-Disc. The updated Quality Assessment of Diagnostic Accuracy Studies-2 tool was used to assess study quality and potential bias. Publication bias was assessed with a Deeks asymmetry test. The registry number of the protocol published on PROSPERO is CRD42015026244. Results This meta-analysis included 10 studies of 1337 lesions. For NBI diagnosis of NPC, the pooled values were as follows: sensitivity, 0.83 (95% CI, 0.80-0.86); specificity, 0.91 (95% CI, 0.89-0.93); positive likelihood ratio, 8.82 (95% CI, 5.12-15.21); negative likelihood ratio, 0.18 (95% CI, 0.12-0.27); and diagnostic odds ratio, 65.73 (95% CI, 36.74-117.60). The area under the curve was 0.9549. For white light endoscopy in diagnosing NPC, the pooled values were as follows: sensitivity, 0.79 (95% CI, 0.75-0.83); specificity, 0.87 (95% CI, 0.84-0.90); positive likelihood ratio, 5.02 (95% CI, 1.99-12.65); negative likelihood ratio, 0.34 (95% CI, 0.24-0.49); and diagnostic odds ratio, 16.89 (95% CI, 5.98-47.66). The area under the curve was 0.8627. The evaluation of heterogeneity, calculated per the diagnostic odds ratio, gave an I² of 0.326. No marked publication bias (P = .68) existed in this meta-analysis. Conclusion The sensitivity and specificity of NBI for the diagnosis of NPC are similar to those of white light endoscopy, and the potential value of NBI for the diagnosis of NPC needs to be validated further.
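The pooled measures reported here are linked by the identity DOR = LR+/LR-. A hedged sketch computing all of them from a single hypothetical 2×2 table (the published values come from pooling across 10 studies, so no single table reproduces them exactly; the counts below are illustrative only):

```python
def diagnostic_measures(tp, fp, fn, tn):
    """Sensitivity, specificity, likelihood ratios and diagnostic odds
    ratio (DOR) from one 2x2 table of test results vs. reference standard."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    lr_pos = sens / (1 - spec)
    lr_neg = (1 - sens) / spec
    dor = lr_pos / lr_neg  # algebraically equal to (tp * tn) / (fp * fn)
    return sens, spec, lr_pos, lr_neg, dor

# Hypothetical counts chosen to mimic the pooled NBI sensitivity/specificity
sens, spec, lr_pos, lr_neg, dor = diagnostic_measures(tp=83, fp=9, fn=17, tn=91)
print(round(sens, 2), round(spec, 2), round(dor, 1))
```

Note that a DOR computed this way from point estimates of sensitivity and specificity differs from the bivariate pooled DOR of 65.73 above, which weights studies individually.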
NASA Astrophysics Data System (ADS)
Elshall, A. S.; Ye, M.; Niu, G. Y.; Barron-Gafford, G.
2015-12-01
Models in biogeoscience involve uncertainties in observation data, model inputs, model structure, model processes and modeling scenarios. To accommodate different sources of uncertainty, multimodel analyses such as model combination, model selection, model elimination or model discrimination are becoming more popular. To illustrate theoretical and practical challenges of multimodel analysis, we use an example from microbial soil respiration modeling. Global soil respiration releases more than ten times more carbon dioxide to the atmosphere than all anthropogenic emissions. Thus, improving our understanding of microbial soil respiration is essential for improving climate change models. This study focuses on a poorly understood phenomenon: the pulses of soil microbial respiration in response to episodic rainfall (the "Birch effect"). We hypothesize that the "Birch effect" is generated by three mechanisms. To test our hypothesis, we developed and assessed five evolving microbial-enzyme models against field measurements from a semiarid savannah that is characterized by pulsed precipitation. These five models evolve step-wise such that the first model includes none of the three mechanisms, while the fifth model includes all three. The basic component of Bayesian multimodel analysis is the estimation of the marginal likelihood, used to rank the candidate models based on their overall likelihood with respect to observation data. The first part of the study focuses on using this Bayesian scheme to discriminate between the five candidate models. The second part discusses some theoretical and practical challenges, mainly the effect of likelihood function selection and of the marginal likelihood estimation method on both model ranking and Bayesian model averaging.
The study shows that making valid inference from scientific data is not a trivial task, since we are not only uncertain about the candidate scientific models, but also about the statistical methods that are used to discriminate between these models.
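Under equal prior model probabilities, the marginal-likelihood ranking described above reduces to normalizing the marginal likelihoods into posterior model probabilities. A minimal sketch (my own illustration, not the authors' code), using the log-sum-exp trick since marginal likelihoods are usually available only on the log scale:

```python
import math

def model_weights(log_marginal_likelihoods):
    """Posterior model probabilities from log marginal likelihoods,
    assuming equal prior probability for every candidate model.
    Shifts by the maximum (log-sum-exp) for numerical stability."""
    m = max(log_marginal_likelihoods)
    unnormalized = [math.exp(l - m) for l in log_marginal_likelihoods]
    total = sum(unnormalized)
    return [w / total for w in unnormalized]
```

Under equal model priors, these same normalized weights are also the averaging weights used in Bayesian model averaging.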
A Bayesian Alternative for Multi-objective Ecohydrological Model Specification
NASA Astrophysics Data System (ADS)
Tang, Y.; Marshall, L. A.; Sharma, A.; Ajami, H.
2015-12-01
Process-based ecohydrological models combine the study of hydrological, physical, biogeochemical, and ecological processes of catchments, and are usually more complex and more heavily parameterized than conceptual hydrological models. Appropriate calibration objectives and model uncertainty analysis are therefore essential for ecohydrological modeling. In recent years, with the development of Markov chain Monte Carlo (MCMC) techniques, Bayesian inference has become one of the most popular tools for quantifying uncertainty in hydrological modeling. Our study aims to develop appropriate prior distributions and likelihood functions that minimize model uncertainty and bias within a Bayesian ecohydrological framework. A formal Bayesian approach is implemented in an ecohydrological model that couples a hydrological model (HyMOD) with a dynamic vegetation model (DVM). Simulations based on a single-objective likelihood (streamflow or LAI) are compared with multi-objective likelihoods (streamflow and LAI) under different weights. Uniform, weakly informative, and strongly informative prior distributions are used in different simulations. The Kullback-Leibler divergence (KLD) is used to measure the (dis)similarity between priors and the corresponding posterior distributions in order to examine parameter sensitivity. Results show that different prior distributions can strongly influence the posterior distributions of parameters, especially when the available data are limited or the parameters are insensitive to them. We demonstrate differences in optimized parameters and uncertainty limits between multi-objective and single-objective likelihoods, and the importance of appropriately weighting objectives in multi-objective calibration according to data type.
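The prior-versus-posterior KLD diagnostic described above can be sketched for binned (discretized) densities; a hypothetical helper of my own, not the authors' implementation:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p||q) between two discretized
    (binned) probability distributions over the same bins.
    Bins where p is zero contribute nothing; assumes q > 0 wherever p > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
```

In the study's usage, a KLD near zero between a parameter's prior and posterior indicates the data barely updated that parameter, i.e., the parameter is insensitive to the available observations.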
Acoustic Analysis of Voice in Dysarthria following Stroke
ERIC Educational Resources Information Center
Wang, Yu-Tsai; Kent, Ray D.; Kent, Jane Finley; Duffy, Joseph R.; Thomas, Jack E.
2009-01-01
Although perceptual studies indicate the likelihood of voice disorders in persons with stroke, there have been few objective instrumental studies of voice dysfunction in dysarthria following stroke. This study reports automatic analysis of sustained vowel phonation for 61 speakers with stroke. The results show: (1) men with stroke and healthy…
A Comparison of Missing-Data Procedures for Arima Time-Series Analysis
ERIC Educational Resources Information Center
Velicer, Wayne F.; Colby, Suzanne M.
2005-01-01
Missing data are a common practical problem for longitudinal designs. Time-series analysis is a longitudinal method that involves a large number of observations on a single unit. Four different missing-data methods (deletion, mean substitution, mean of adjacent observations, and maximum likelihood estimation) were evaluated. Computer-generated…
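Of the four missing-data methods compared, the mean-of-adjacent-observations approach is the easiest to sketch. A minimal illustration (my own, not the study's code), assuming every missing value has at least one observed neighbour:

```python
def impute_adjacent_mean(series):
    """Replace each None in a time series with the mean of its nearest
    observed neighbours (one of the four methods compared in the article).
    Only originally observed values are used as neighbours."""
    out = list(series)
    for i, v in enumerate(series):
        if v is None:
            left = next((series[j] for j in range(i - 1, -1, -1)
                         if series[j] is not None), None)
            right = next((series[j] for j in range(i + 1, len(series))
                          if series[j] is not None), None)
            vals = [x for x in (left, right) if x is not None]
            out[i] = sum(vals) / len(vals)  # assumes >= 1 observed neighbour
    return out
```

Deletion and mean substitution are even simpler to implement, while maximum likelihood estimation requires fitting the ARIMA model itself, which is why the methods trade off so differently in the study's comparison.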
ERIC Educational Resources Information Center
Prevost, A. Toby; Mason, Dan; Griffin, Simon; Kinmonth, Ann-Louise; Sutton, Stephen; Spiegelhalter, David
2007-01-01
Practical meta-analysis of correlation matrices generally ignores covariances (and hence correlations) between correlation estimates. The authors consider various methods for allowing for covariances, including generalized least squares, maximum marginal likelihood, and Bayesian approaches, illustrated using a 6-dimensional response in a series of…
Establishing Factor Validity Using Variable Reduction in Confirmatory Factor Analysis.
ERIC Educational Resources Information Center
Hofmann, Rich
1995-01-01
Using a 21-statement attitude-type instrument, an iterative procedure for improving confirmatory model fit is demonstrated within the context of the EQS program of P. M. Bentler and maximum likelihood factor analysis. Each iteration systematically eliminates the poorest fitting statement as identified by a variable fit index. (SLD)
Determinants of Standard Errors of MLEs in Confirmatory Factor Analysis
ERIC Educational Resources Information Center
Yuan, Ke-Hai; Cheng, Ying; Zhang, Wei
2010-01-01
This paper studies changes of standard errors (SE) of the normal-distribution-based maximum likelihood estimates (MLE) for confirmatory factor models as model parameters vary. Using logical analysis, simplified formulas and numerical verification, monotonic relationships between SEs and factor loadings as well as unique variances are found.…
COS Views of Local Galaxies Approaching Primeval Conditions
NASA Astrophysics Data System (ADS)
Wofford, Aida
2014-10-01
We will use COS G160M+G185M to observe the cosmologically important lines C IV 1548+1551 A, He II 1640 A, O III] 1661+1666 A, and C III] 1907+1909 A in the three closest, most metal-poor blue compact dwarf galaxies known. These galaxies approach primeval interstellar and stellar conditions. One of the galaxies has no existing spectroscopic coverage in the UV. Available spectroscopy of the most metal-poor galaxies in the local universe is scarce, inhomogeneous, mostly of low spectral resolution, and either noisy in the main UV lines or lacking their coverage. The proposed spectral resolution of about 20 km/s represents an order-of-magnitude improvement over existing HST data and allows us to disentangle stellar, nebular, and/or shock components of the lines. The high-quality constraints obtained in the framework of this proposal will make it possible to assess the relative likelihood of new spectral models of star-forming galaxies from different groups, in the best way achievable with current instrumentation. This will ensure that the best possible studies of early chemical enrichment of the universe can be achieved. The proposed observations are necessary to minimize the large existing systematic uncertainties in the determination of high-redshift galaxy properties that JWST was in large part designed to measure.
A unifying framework for marginalized random intercept models of correlated binary outcomes
Swihart, Bruce J.; Caffo, Brian S.; Crainiceanu, Ciprian M.
2013-01-01
We demonstrate that many current approaches for marginal modeling of correlated binary outcomes produce likelihoods that are equivalent to the copula-based models herein. These general copula models of underlying latent threshold random variables yield likelihood-based models for marginal fixed effects estimation and interpretation in the analysis of correlated binary data with exchangeable correlation structures. Moreover, we propose a nomenclature and set of model relationships that substantially elucidates the complex area of marginalized random intercept models for binary data. A diverse collection of didactic mathematical and numerical examples are given to illustrate concepts. PMID:25342871
Voluntary sterilisation among Canadian women.
De Wit, M; Rajulton, F
1991-07-01
Using data from the 1984 Canadian Fertility Survey, proportional hazards modelling was employed to determine factors associated with the likelihood of voluntary sterilisation among 5315 women of childbearing age, as well as trends in timing and differences in likelihood across age cohorts. Multivariate analysis suggests that educational attainment, parity and duration since last birth at the time of sterilisation, religious commitment, province of residence, and marital status at the time of sterilisation are all important predictors. Educational attainment and parity emerged as the best predictors of the timing of voluntary sterilisation in all age cohorts, but the contribution of other covariates varies between cohorts.
Quantum state estimation when qubits are lost: a no-data-left-behind approach
Williams, Brian P.; Lougovski, Pavel
2017-04-06
We present an approach to Bayesian mean estimation of quantum states using hyperspherical parametrization and an experiment-specific likelihood which allows utilization of all available data, even when qubits are lost. With this method, we report the first closed-form Bayesian mean and maximum likelihood estimates for the ideal single qubit. Due to computational constraints, we utilize numerical sampling to determine the Bayesian mean estimate for a photonic two-qubit experiment in which our novel analysis reduces burdens associated with experimental asymmetries and inefficiencies. This method can be applied to quantum states of any dimension and experimental complexity.
PERIODIC AUTOREGRESSIVE-MOVING AVERAGE (PARMA) MODELING WITH APPLICATIONS TO WATER RESOURCES.
Vecchia, A.V.
1985-01-01
Results involving correlation properties and parameter estimation for autoregressive-moving average models with periodic parameters are presented. A multivariate representation of the PARMA model is used to derive parameter-space restrictions and difference equations for the periodic autocorrelations. Close approximation to the likelihood function for Gaussian PARMA processes results in efficient maximum-likelihood estimation procedures. Terms in the Fourier expansion of the parameters are included sequentially, and a selection criterion is given for determining the optimal number of harmonics to include. Application of the techniques is demonstrated through analysis of a monthly streamflow time series.
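The simplest special case of a PARMA model is a periodic AR(1), in which the autoregressive coefficient and innovation scale vary with the season (e.g., the month for streamflow data). A minimal simulation sketch, my own illustration rather than anything from the paper:

```python
import random

def simulate_par1(phi, sigma, n_cycles, seed=0):
    """Simulate a periodic AR(1) process, a minimal special case of PARMA:
    x[t] = phi[m] * x[t-1] + sigma[m] * e[t], with season m = t mod period
    and e[t] standard Gaussian noise. phi and sigma have one entry per season."""
    random.seed(seed)
    period = len(phi)
    x, xs = 0.0, []
    for t in range(n_cycles * period):
        m = t % period
        x = phi[m] * x + sigma[m] * random.gauss(0.0, 1.0)
        xs.append(x)
    return xs
```

With twelve seasonal coefficients this mimics a monthly series like the streamflow application; constant `phi` and `sigma` recover an ordinary AR(1).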
Psychiatric DRGs: more risk for hospitals?
Ehrman, C M; Funk, G; Cavanaugh, J
1989-09-01
The diagnosis related group (DRG) system, which replaced the cost-plus system of reimbursement, was implemented in 1983 by Medicare to cover medical expenses on a prospective basis. To date, the DRG system has not been applied to psychiatric illness. The authors compare the likelihood of cost overruns in psychiatric illness with that of cost overruns in medical illness. The data analysis demonstrates that a prospective payment system would have a high likelihood of failure in psychiatric illness. Possible reasons for failure include wide variations in treatments, diagnostics, and other related costs. Also, the number of DRG classifications for psychiatric illness is inadequate.
NASA Astrophysics Data System (ADS)
Abbasi, R. U.; Abu-Zayyad, T.; Amann, J. F.; Archbold, G.; Atkins, R.; Bellido, J. A.; Belov, K.; Belz, J. W.; Ben-Zvi, S. Y.; Bergman, D. R.; Boyer, J. H.; Burt, G. W.; Cao, Z.; Clay, R. W.; Connolly, B. M.; Dawson, B. R.; Deng, W.; Farrar, G. R.; Fedorova, Y.; Findlay, J.; Finley, C. B.; Hanlon, W. F.; Hoffman, C. M.; Holzscheiter, M. H.; Hughes, G. A.; Hüntemeyer, P.; Jui, C. C. H.; Kim, K.; Kirn, M. A.; Knapp, B. C.; Loh, E. C.; Maestas, M. M.; Manago, N.; Mannel, E. J.; Marek, L. J.; Martens, K.; Matthews, J. A. J.; Matthews, J. N.; O'Neill, A.; Painter, C. A.; Perera, L.; Reil, K.; Riehle, R.; Roberts, M. D.; Sasaki, M.; Schnetzer, S. R.; Seman, M.; Simpson, K. M.; Sinnis, G.; Smith, J. D.; Snow, R.; Sokolsky, P.; Song, C.; Springer, R. W.; Stokes, B. T.; Thomas, J. R.; Thomas, S. B.; Thomson, G. B.; Tupa, D.; Westerhoff, S.; Wiencke, L. R.; Zech, A.
2005-04-01
We present the results of a search for cosmic-ray point sources at energies in excess of 4.0×1019 eV in the combined data sets recorded by the Akeno Giant Air Shower Array and High Resolution Fly's Eye stereo experiments. The analysis is based on a maximum likelihood ratio test using the probability density function for each event rather than requiring an a priori choice of a fixed angular bin size. No statistically significant clustering of events consistent with a point source is found.
Chai, Liyuan; Yang, Jinqin; Zhang, Ning; Wu, Pin-Jiun; Li, Qingzhu; Wang, Qingwei; Liu, Hui; Yi, Haibo
2017-09-01
Aqueous complexes between ferric iron (Fe(III)) and arsenate (As(V)) are indispensable for understanding the mobility of arsenic (As) in Fe(III)-As(V)-rich systems. In this study, the aqueous Fe(III)-As(V) complexes FeH2AsO4^2+ and FeHAsO4^+ were postulated based on qualitative analysis of UV-Vis spectra in both Fe(III)-As(V)-HClO4 and Fe(III)-As(V)-H2SO4 systems. Subsequently, monodentate structures were evidenced by Fe K-edge EXAFS and modeled as [FeH2AsO4(H2O)5]^2+ and [FeHAsO4(H2O)5]^+ by DFT. The feature band at ∼280 nm was verified as electron excitation chiefly from Fe-As-bridged O atoms to the d-orbitals of Fe in [FeH2AsO4(H2O)5]^2+ and [FeHAsO4(H2O)5]^+. The structural and spectral information on Fe(III)-As(V) complexes will enable future speciation analysis in Fe(III)-As(V)-rich systems. Copyright © 2017. Published by Elsevier Ltd.
Gao, Xionghou; Geng, Wei; Zhang, Haitao; Zhao, Xuefei; Yao, Xiaojun
2013-11-01
We have theoretically investigated the adsorption of thiophene, benzothiophene, and dibenzothiophene on Na(I)Y and rare-earth-exchanged La(III)Y, Ce(III)Y, Pr(III)Y, and Nd(III)Y zeolites by density functional theory calculations. The calculated results show that, except for benzothiophene adsorbed on Na(I)Y in a standing configuration, the stable adsorption structures of the thiophenic compounds on the zeolites exhibit lying configurations. Adsorption energies of thiophenic compounds on Na(I)Y are very low and decrease as the number of benzene rings in the thiophenic compound increases. All rare-earth-exchanged zeolites exhibit strong interaction with thiophene. La(III)Y and Nd(III)Y zeolites show enhanced adsorption energies for benzothiophene, and Pr(III)Y zeolites are favorable for dibenzothiophene adsorption. Analysis of the total electronic charge density and electron orbital overlaps shows that the thiophenic compounds interact with the zeolites through the π-electrons of the thiophene ring and the exchanged metal atom. Mulliken charge population analysis reveals that the adsorption energies depend strongly on charge transfer between the thiophenic molecule and the exchanged metal atom.
NASA Technical Reports Server (NTRS)
Liles, Kaitlin; Amundsen, Ruth; Davis, Warren; Scola, Salvatore; Tobin, Steven; McLeod, Shawn; Mannu, Sergio; Guglielmo, Corrado; Moeller, Timothy
2013-01-01
The Stratospheric Aerosol and Gas Experiment III (SAGE III) instrument is the fifth in a series of instruments developed for monitoring aerosols and gaseous constituents in the stratosphere and troposphere. SAGE III will be delivered to the International Space Station (ISS) via the SpaceX Dragon vehicle in 2015. A detailed thermal model of the SAGE III payload has been developed in Thermal Desktop (TD). Several novel methods have been implemented to facilitate efficient payload-level thermal analysis, including the use of a design of experiments (DOE) methodology to determine the worst-case orbits for SAGE III while on ISS, use of TD assemblies to move payloads from the Dragon trunk to the Enhanced Operational Transfer Platform (EOTP) to its final home on the Expedite the Processing of Experiments to Space Station (ExPRESS) Logistics Carrier (ELC)-4, incorporation of older models in varying unit sets, ability to change units easily (including hardcoded logic blocks), case-based logic to facilitate activating heaters and active elements for varying scenarios within a single model, incorporation of several coordinate frames to easily map to structural models with differing geometries and locations, and streamlined results processing using an Excel-based text file plotter developed in-house at LaRC. This document presents an overview of the SAGE III thermal model and describes the development and implementation of these efficiency-improving analysis methods.
Reliable and More Powerful Methods for Power Analysis in Structural Equation Modeling
ERIC Educational Resources Information Center
Yuan, Ke-Hai; Zhang, Zhiyong; Zhao, Yanyun
2017-01-01
The normal-distribution-based likelihood ratio statistic T[subscript ml] = nF[subscript ml] is widely used for power analysis in structural equation modeling (SEM). In such an analysis, power and sample size are computed by assuming that T[subscript ml] follows a central chi-square distribution under H[subscript 0] and a noncentral chi-square…
Fan, Timothy M; Kitchell, Barbara E; Dhaliwal, Ravinder S; Jones, Pamela D; Hintermeister, John G; Paria, Biman C
2002-01-01
Twenty cats with spontaneously arising tumors received oral lomustine at a dose range of 32 to 59 mg/m2 every 21 days. Due to biohazard concerns associated with lomustine capsule reformulation, a standardized 10-mg capsule dosage was used for all cats regardless of body weight. Severe hematological toxicity was infrequent, with the incidence of either grade III or IV neutropenia and thrombocytopenia being 4.1% and 1.0%, respectively. Cats receiving higher cumulative doses of lomustine trended toward a greater likelihood for progressive neutropenia (P=0.07). Two cats with lymphoma, two cats with fibrosarcoma, and one cat with multiple myeloma achieved a measurable partial response to lomustine therapy. Cats treated with higher dosages of lomustine trended toward statistically significant higher response rates (P=0.07).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Yun; Kukkadapu, Ravi K.; Livi, Kenneth J. T.
The redox state and speciation of the metalloid arsenic (As) determine its toxicity and mobility. Knowledge of the biogeochemical processes influencing the As redox state is therefore important for understanding and predicting its environmental behavior. Many previous studies examined As(III) oxidation by various Mn-oxides, but little is known about environmental influences (e.g., co-existing ions) on this process. In this study, we investigated the mechanisms of As(III) oxidation by a poorly crystalline hexagonal birnessite (δ-MnO2) in the presence of Fe(II) using X-ray absorption spectroscopy (XAS), Mössbauer spectroscopy, and transmission electron microscopy (TEM) coupled with energy-dispersive X-ray spectroscopy (EDS). As K-edge X-ray absorption near-edge spectroscopy (XANES) analysis revealed that, at low Fe(II) concentration (100 μM), As(V) was the predominant As species on the solid phase, while at higher Fe(II) concentrations (200-1000 μM), both As(III) and As(V) were sorbed on the solid phase. As K-edge extended X-ray absorption fine structure spectroscopy (EXAFS) analysis showed an increasing As-Mn/Fe distance over time, indicating that As prefers to bind with the newly formed Fe(III)-(hydr)oxides. As adsorbed on the Fe(III)-(hydr)oxides as a bidentate binuclear corner-sharing complex. Both Mössbauer and TEM-EDS investigations demonstrated that the oxidized Fe(III) products formed during Fe(II) oxidation by δ-MnO2 were predominantly ferrihydrite, goethite, and ferric arsenate-like compounds. However, Fe EXAFS analysis also suggested the formation of a small amount of lepidocrocite. The Mn K-edge XANES data indicated that As(III) and Fe(II) oxidation occurs as a two-electron transfer with δ-MnO2, and that the observed Mn(III) is due to comproportionation of surface-sorbed Mn(II) with Mn(IV) in the δ-MnO2 structure.
This study reveals that the mechanisms of As(III) oxidation by δ-MnO2 in the presence of Fe(II) are very complex, involving many simultaneous reactions, and the formation of Fe(III)-(hydr)oxides plays a very important role in reducing As mobility.
Learning from Ethical Dilemmas.
ERIC Educational Resources Information Center
Havens, Mark D.
1987-01-01
Reports analysis of 60 case studies of ethical dilemmas faced by experiential educators. Identifies issues which enhance likelihood of moral dilemmas: funding, residential programming, and risk-taking. Exposes need for a professional "code of ethics." (NEC)
Conrad, Martina; Engelmann, Dorit; Friedrich, Michael; Scheffold, Katharina; Philipp, Rebecca; Schulz-Kindermann, Frank; Härter, Martin; Mehnert, Anja; Koranyi, Susan
2018-04-13
There are only a few valid instruments for German-speaking countries that measure couples' communication in patients with cancer. The Couple Communication Scale (CCS) is an established instrument for assessing couples' communication. However, there is no evidence to date on the psychometric properties of the German version of the CCS, and its assumed one-factor structure has not yet been verified in patients with advanced cancer. The CCS was validated as part of the study "Managing cancer and living meaningfully" (CALM) on N=136 patients with advanced cancer (≥18 years, UICC stage III/IV). The psychometric properties of the scale were calculated (factor reliability, item reliability, average variance extracted [DEV]) and a confirmatory factor analysis was conducted (maximum likelihood estimation). Concurrent validity was tested against symptoms of anxiety (GAD-7), depression (BDI-II), and attachment insecurity (ECR-M16). In the confirmatory factor analysis, the one-factor structure showed a low but acceptable model fit and explained on average 49% of each item's variance (DEV). The CCS has excellent internal consistency (Cronbach's α=0.91) and was negatively associated with attachment insecurity (ECR-M16: anxiety: r=-0.55, p<0.01; avoidance: r=-0.42, p<0.01) as well as with anxiety (GAD-7: r=-0.20, p<0.05) and depression (BDI-II: r=-0.27, p<0.01). The CCS is a reliable and valid instrument for measuring couples' communication in patients with advanced cancer. © Georg Thieme Verlag KG Stuttgart · New York.
Syed, Mudasir Ahmad; Bhat, Farooz Ahmad; Balkhi, Masood-ul Hassan; Bhat, Bilal Ahmad
2016-01-01
Schizothoracine fishes, commonly called snow trouts, inhabit the entire network of snow- and spring-fed cool waters of Kashmir, India. Of over 10 species reported earlier, only five have been found: Schizothorax niger, Schizothorax esocinus, Schizothorax plagiostomus, Schizothorax curvifrons, and Schizothorax labiatus. Reported relationships among these species are contradictory. To understand their evolutionary relationships, we examined the sequence information of the mitochondrial D-loop of 25 individuals representing the five species. Sequence alignment showed the D-loop region to be highly variable, and length variation was observed in a di-nucleotide (TA)n microsatellite between and within species. Interestingly, in all these species the (TA)n microsatellite is not associated with longer tandem repeats at the 3' end of the mitochondrial control region and does not show heteroplasmy. Our analysis also indicates the presence of four conserved sequence blocks (CSB-D, CSB-1, CSB-II, and CSB-III), four termination-associated sequence (TAS) motifs, and a 15-bp pyrimidine block within the mitochondrial control region that are highly conserved within the genus Schizothorax when compared with other species. Phylogenetic analyses by maximum likelihood (ML), neighbor joining (NJ), and Bayesian inference (BI) generated almost identical results. The resulting BI tree showed a close genetic relationship among all five species and supports two distinct groupings of S. esocinus. Beyond the species relationships, the length variation in tandem repeats is attributed to differences in the predicted stability of secondary structures. The role of CSBs and TASs, reported so far as the main regulatory signals, would explain the conservation of these elements in evolution.
Sarcopenia and post-hospital outcomes in older adults: A longitudinal study.
Pérez-Zepeda, Mario Ulises; Sgaravatti, Aldo; Dent, Elsa
Sarcopenia poses a significant problem for older adults, yet very little is known about this medical condition in the hospital setting. The aims of this hospital-based study were to determine: (i) the prevalence of sarcopenia; (ii) factors associated with sarcopenia; and (iii) the association of sarcopenia with adverse clinical outcomes post-hospitalisation. This is a longitudinal analysis of consecutive patients aged ≥70 years admitted to a Geriatric Management and Evaluation Unit (GEMU) ward. Sarcopenia was classified using the European Working Group on Sarcopenia in Older People (EWGSOP) algorithm, which included handgrip strength, gait speed, and muscle mass measured by bioelectrical impedance analysis (BIA). Outcomes were assessed at 12 months post-hospital discharge and included both mortality and admission to a hospital emergency department (ED). Kaplan-Meier methods were used to estimate survival, with Cox proportional hazards models then applied. All regression analyses controlled for age, sex, and co-morbidity. 172 patients (72% female) with a mean (SD) age of 85.2 (6.4) years were included. Sarcopenia was present in 69 (40.1%) of patients. Patients with sarcopenia were twice as likely to die in the 12 months post-hospitalisation (HR 2.23, 95% CI 1.15-4.34), but did not have an increased likelihood of ED admission. Sarcopenia showed an independent association with 12-month post-hospital mortality in older adults. With the new recognition of sarcopenia as a medical condition with its own unique ICD-10-CM code, awareness and diagnosis of sarcopenia in clinical settings are paramount. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
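The Kaplan-Meier survival estimates used above can be sketched from first principles in product-limit form; a minimal illustration of my own, not the study's actual analysis code:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier (product-limit) survival estimates.
    `events[i]` is 1 for a death at times[i], 0 for censoring.
    Returns (time, S(t)) pairs at each distinct event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, out = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(e for tt, e in data if tt == t)   # events at time t
        n_t = sum(1 for tt, _ in data if tt == t)      # subjects leaving at t
        if deaths:
            surv *= 1 - deaths / n_at_risk
            out.append((t, surv))
        n_at_risk -= n_t
        i += n_t
    return out
```

Censored subjects (event flag 0) drop out of the risk set without contributing a step, which is exactly how the estimator handles patients still alive at the end of the 12-month follow-up window.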
2013-01-01
Background Whereas the prognosis of second kidney transplant recipients (STR) compared with first recipients has been frequently analyzed, no study has addressed the comparison of risk-factor effects on graft failure between the two groups. Methods Here, we propose two alternative strategies to study the heterogeneity of risk factors between two groups of patients: (i) a multiplicative-regression model for relative survival (MRS) and (ii) a stratified Cox model (SCM) specifying the graft rank as strata and assuming subvectors of the explanatory variables. These developments were motivated by the analysis of factors associated with time to graft failure (return to dialysis or patient death) in STR compared with first recipients. Estimation of the parameters was based on partial likelihood maximization. Monte Carlo simulations combined with bootstrap re-sampling were performed to calculate the standard deviations for the MRS. Results We demonstrate, for the first time in renal transplantation, that: (i) male donor gender is a specific risk factor for STR, (ii) the adverse effect of recipient age is enhanced for STR, and (iii) the graft failure risk related to donor age is attenuated for STR. Conclusion While the traditional Cox model did not provide findings beyond those in the renal transplantation literature, the proposed relative and stratified models revealed new findings that are useful for clinicians. These methodologies may be of interest in other medical fields where the principal objective is the comparison of risk factors between two populations. PMID:23915191
Lehmann-Che, Jacqueline; André, Fabrice; Desmedt, Christine; Mazouni, Chafika; Giacchetti, Sylvie; Turpin, Elisabeth; Espié, Marc; Plassa, Louis-François; Marty, Michel; Bertheau, Philippe; Sotiriou, Christos; Piccart, Martine; Symmans, W Fraser; Pusztai, Lajos; de Thé, Hugues
2010-01-01
The predictive value of p53 for the efficacy of front-line anthracycline-based chemotherapy regimens has been a matter of significant controversy. Anthracyclines are usually combined with widely different doses of alkylating agents, which may significantly modulate tumor response to these combinations. We analyzed three series of de novo stage II-III breast cancer patients treated front line with anthracycline-based regimens of various cyclophosphamide dose intensities: 65 patients with estrogen receptor (ER)(-) tumors treated with anthracyclines alone (Institut Jules Bordet, Brussels), 51 unselected breast cancer patients treated with intermediate doses of cyclophosphamide (MD Anderson Cancer Center, Houston, TX), and 128 others treated with a dose-dense anthracycline-cyclophosphamide combination (St. Louis, Paris). After chemotherapy and surgery, pathologic complete response (pCR) was evaluated. p53 status was determined by a yeast functional assay on the pretreatment tumor sample. In a multivariate analysis of the pooled results, a lack of ER expression and high-dose cyclophosphamide administration were associated with a higher likelihood of pCR. A sharp statistical interaction was detected between p53 status and cyclophosphamide dose intensity. Indeed, when restricting our analysis to patients with ER(-) tumors, we confirmed that a mutant p53 status was associated with anthracycline resistance, but found that p53 inactivation was required for response to the dose-intense alkylating regimen. The latter allowed very high levels of pCR in triple-negative tumors. Thus, our data strongly suggest that cyclophosphamide dose intensification in ER(-) p53-mutated breast cancer patients could significantly improve their response.
Parallel implementation of D-Phylo algorithm for maximum likelihood clusters.
Malik, Shamita; Sharma, Dolly; Khatri, Sunil Kumar
2017-03-01
This study explains a newly developed parallel algorithm for phylogenetic analysis of DNA sequences. The newly designed D-Phylo is a more advanced algorithm for phylogenetic analysis using the maximum likelihood approach. D-Phylo, while exploiting the searching capacity of k-means, avoids its main limitation of getting stuck at locally conserved motifs. The authors have tested the behaviour of D-Phylo on an Amazon Linux Amazon Machine Image (hardware virtual machine) i2.4xlarge instance with six central processing units, 122 GiB of memory, and 8 × 800 solid-state-drive Elastic Block Store volumes with high network performance, using up to 15 processors for several real-life datasets. Distributing the clusters evenly across all processors makes a near-linear speedup achievable when a large number of processors is available.
Human factors process failure modes and effects analysis (HF PFMEA) software tool
NASA Technical Reports Server (NTRS)
Chandler, Faith T. (Inventor); Relvini, Kristine M. (Inventor); Shedd, Nathaneal P. (Inventor); Valentino, William D. (Inventor); Philippart, Monica F. (Inventor); Bessette, Colette I. (Inventor)
2011-01-01
Methods, computer-readable media, and systems for automatically performing Human Factors Process Failure Modes and Effects Analysis (HF PFMEA) for a process are provided. At least one task involved in a process is identified, where the task includes at least one human activity. The human activity is described using at least one verb. A human error potentially resulting from the human activity is automatically identified; the human error is related to the verb used in describing the task. The likelihood of occurrence, detection, and correction of the human error is identified, along with the severity of its effect. From the likelihood of occurrence and the severity, the risk of potential harm is identified. The risk of potential harm is compared with a risk threshold to determine whether corrective measures are appropriate.
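In the spirit of the risk-threshold comparison described above, a hypothetical scoring sketch: the 1-10 rating scales, the multiplicative risk priority number, and the threshold value are my own assumptions for illustration, not values specified in the patent.

```python
def risk_priority(likelihood, severity, detection, threshold=100):
    """Illustrative FMEA-style risk scoring (assumed 1-10 scales):
    the product of ratings for occurrence likelihood, severity, and
    difficulty of detection/correction gives a risk priority number (RPN),
    which is compared against an assumed threshold to flag the need
    for corrective measures."""
    rpn = likelihood * severity * detection
    return rpn, rpn >= threshold
```

A task flagged `True` would be routed to corrective-measure planning; the multiplicative RPN is the conventional FMEA aggregation, though additive or weighted schemes are also used in practice.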
He, Qiqi; Wang, Hanzhang; Kenyon, Jonathan; Liu, Guiming; Yang, Li; Tian, Junqiang; Yue, Zhongjin; Wang, Zhiping
2015-01-01
To use meta-analysis to determine the accuracy of percutaneous core needle biopsy in the diagnosis of small renal masses (SRMs ≤ 4.0 cm). Studies were identified by searching PubMed, Embase, and the Cochrane Library database up to March 2013. Two of the authors independently assessed study quality using the QUADAS-2 tool and extracted data that met the inclusion criteria. The sensitivity, specificity, likelihood ratios, and diagnostic odds ratio (DOR) were computed, and a summary receiver operating characteristic (SROC) curve was drawn. Deeks' funnel plot was used to evaluate publication bias. A total of 9 studies with 788 patients (803 biopsies) were included. Failed biopsies that were not repeated and lacked follow-up/surgery results were excluded (232 patients and 353 biopsies). For all cases, the pooled sensitivity was 94.0% (95% CI: 91.0%-95.0%), the pooled positive likelihood ratio was 22.57 (95% CI: 9.20-55.34), the pooled negative likelihood ratio was 0.09 (95% CI: 0.06-0.13), and the pooled DOR was 296.52 (95% CI: 99.42-884.38). The area under the SROC curve was 0.959 ± 0.0254. Imaging-guided percutaneous core needle biopsy of small renal masses (SRMs ≤ 4.0 cm) is highly accurate for diagnosing malignant tumors of unknown metastatic status and could be offered to selected patients, after clinical judgment, before surgical intervention is considered.
Fast automated analysis of strong gravitational lenses with convolutional neural networks.
Hezaveh, Yashar D; Levasseur, Laurence Perreault; Marshall, Philip J
2017-08-30
Quantifying image distortions caused by strong gravitational lensing-the formation of multiple images of distant sources due to the deflection of their light by the gravity of intervening structures-and estimating the corresponding matter distribution of these structures (the 'gravitational lens') has primarily been performed using maximum likelihood modelling of observations. This procedure is typically time- and resource-consuming, requiring sophisticated lensing codes, several data preparation steps, and finding the maximum likelihood model parameters in a computationally expensive process with downhill optimizers. Accurate analysis of a single gravitational lens can take up to a few weeks and requires expert knowledge of the physical processes and methods involved. Tens of thousands of new lenses are expected to be discovered with the upcoming generation of ground and space surveys. Here we report the use of deep convolutional neural networks to estimate lensing parameters in an extremely fast and automated way, circumventing the difficulties that are faced by maximum likelihood methods. We also show that the removal of lens light can be made fast and automated using independent component analysis of multi-filter imaging data. Our networks can recover the parameters of the 'singular isothermal ellipsoid' density profile, which is commonly used to model strong lensing systems, with an accuracy comparable to the uncertainties of sophisticated models but about ten million times faster: 100 systems in approximately one second on a single graphics processing unit. These networks can provide a way for non-experts to obtain estimates of lensing parameters for large samples of data.
A Simple Qualitative Analysis Scheme for Several Environmentally Important Elements
ERIC Educational Resources Information Center
Lambert, Jack L.; Meloan, Clifton E.
1977-01-01
Describes a scheme that uses precipitation, gas evolution, complex ion formation, and flame tests to analyze for the following ions: Hg(I), Hg(II), Sb(III), Cr(III), Pb(II), Sr(II), Cu(II), Cd(II), As(III), chloride, nitrate, and sulfate. (MLH)
2012-01-01
Background Mycobacterium avium subspecies paratuberculosis (Map) is the aetiological agent of Johne’s disease or paratuberculosis and is included within the Mycobacterium avium complex (MAC). Map strains are of two major types often referred to as ‘Sheep’ or ‘S-type’ and ‘Cattle’ or ‘C-type’. With the advent of more discriminatory typing techniques it has been possible to further classify the S-type strains into two groups referred to as Type I and Type III. This study was undertaken to genotype a large panel of S-type small ruminant isolates from different hosts and geographical origins and to compare them with a large panel of well documented C-type isolates to assess the genetic diversity of these strain types. Methods used included Mycobacterial Interspersed Repetitive Units–Variable-Number Tandem Repeat (MIRU-VNTR) analysis, analysis of Large Sequence Polymorphisms by PCR (LSP analysis), Single Nucleotide Polymorphism (SNP) analysis of gyr genes, Pulsed-Field Gel Electrophoresis (PFGE), and Restriction Fragment Length Polymorphism analysis coupled with hybridization to IS900 (IS900-RFLP). Results The presence of LSPA4 and the absence of LSPA20 were confirmed in all 24 Map S-type strains analysed. SNPs within the gyr genes divided the S-type strains into types I and III. Twenty-four PFGE multiplex profiles and eleven different IS900-RFLP profiles were identified among the S-type isolates, some of them not previously published. Both PFGE and IS900-RFLP segregated the S-type strains into types I and III, and the results concurred with those of the gyr SNP analysis. Nine MIRU-VNTR genotypes were identified in these isolates. MIRU-VNTR analysis differentiated Map strains from other members of the Mycobacterium avium complex, and Map S-type from C-type, but not type I from III. Pigmented Map isolates were found to be of type I or III. Conclusion This is the largest panel of S-type strains investigated to date. 
The S-type strains could be further divided into two subtypes, I and III by some of the typing techniques (IS900-RFLP, PFGE and SNP analysis of the gyr genes). MIRU-VNTR did not divide the strains into the subtypes I and III but did detect genetic differences between isolates within each of the subtypes. Pigmentation is not exclusively associated with type I strains. PMID:23164429
Henry, Stephen G.; Jerant, Anthony; Iosif, Ana-Maria; Feldman, Mitchell D.; Cipri, Camille; Kravitz, Richard L.
2015-01-01
Objective To identify factors associated with participant consent to record visits; to estimate effects of recording on patient-clinician interactions Methods Secondary analysis of data from a randomized trial studying communication about depression; participants were asked for optional consent to audio record study visits. Multiple logistic regression was used to model likelihood of patient and clinician consent. Multivariable regression and propensity score analyses were used to estimate effects of audio recording on 6 dependent variables: discussion of depressive symptoms, preventive health, and depression diagnosis; depression treatment recommendations; visit length; visit difficulty. Results Of 867 visits involving 135 primary care clinicians, 39% were recorded. For clinicians, only working in academic settings (P=0.003) and having worked longer at their current practice (P=0.02) were associated with increased likelihood of consent. For patients, white race (P=0.002) and diabetes (P=0.03) were associated with increased likelihood of consent. Neither multivariable regression nor propensity score analyses revealed any significant effects of recording on the variables examined. Conclusion Few clinician or patient characteristics were significantly associated with consent. Audio recording had no significant effect on any dependent variables. Practice Implications Benefits of recording clinic visits likely outweigh the risks of bias in this setting. PMID:25837372
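The abstract above estimates recording effects with multivariable regression and propensity score analyses. A minimal sketch of one standard propensity-score estimator (inverse-propensity weighting), with hypothetical toy data; the study's actual models adjust for many covariates:

```python
def ipw_means(outcomes, treated, propensity):
    """Inverse-propensity-weighted outcome means for the 'treated'
    (here: recorded) vs. untreated (unrecorded) groups."""
    t_num = sum(y * t / p for y, t, p in zip(outcomes, treated, propensity))
    t_den = sum(t / p for t, p in zip(treated, propensity))
    c_num = sum(y * (1 - t) / (1 - p) for y, t, p in zip(outcomes, treated, propensity))
    c_den = sum((1 - t) / (1 - p) for t, p in zip(treated, propensity))
    return t_num / t_den, c_num / c_den

# Hypothetical data: binary outcome, treatment indicator, estimated propensities.
recorded_mean, unrecorded_mean = ipw_means(
    outcomes=[1, 0, 1, 0], treated=[1, 1, 0, 0], propensity=[0.5, 0.5, 0.5, 0.5])
```

A near-zero difference between the two weighted means would correspond to the study's finding of no significant recording effect.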
Multi-Sample Cluster Analysis Using Akaike’s Information Criterion.
1982-12-20
of Likelihood Criteria for Different Hypotheses," in P. R. Krishnaiah (Ed.), Multivariate Analysis-II, New York: Academic Press. [5] Fisher, R. A...Methods of Simultaneous Inference in MANOVA," in P. R. Krishnaiah (Ed.), Multivariate Analysis-II, New York: Academic Press. [8] Kendall, M. G. (1966...1982), Applied Multivariate Statistical Analysis, Englewood Cliffs: Prentice-Hall, Inc. [10] Krishnaiah, P. R. (1969), "Simultaneous Test
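The report above scores competing clusterings of several samples with Akaike's Information Criterion. A minimal sketch of the criterion itself, applied to a toy choice between pooling two samples under one Gaussian mean versus fitting separate means (illustrative only, not the report's procedure):

```python
import math

def gaussian_loglik(xs, mu, sigma=1.0):
    """Log-likelihood of data under a normal model with known sigma."""
    return sum(-0.5 * math.log(2 * math.pi * sigma**2)
               - (x - mu)**2 / (2 * sigma**2) for x in xs)

def aic(log_likelihood, n_params):
    """Akaike's Information Criterion, 2k - 2 ln L; lower is better."""
    return 2 * n_params - 2 * log_likelihood

# Two toy samples: model A pools them (one mean); model B fits separate means.
a, b = [0.1, -0.1, 0.2], [2.0, 1.9, 2.1]
pooled_mu = sum(a + b) / len(a + b)
aic_pooled = aic(gaussian_loglik(a + b, pooled_mu), n_params=1)
aic_split = aic(gaussian_loglik(a, sum(a) / 3) + gaussian_loglik(b, sum(b) / 3),
                n_params=2)
# Here the samples clearly differ, so the split model wins despite its extra parameter.
```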
Phylogenetic analysis of dissimilatory Fe(III)-reducing bacteria
Lonergan, D.J.; Jenter, H.L.; Coates, J.D.; Phillips, E.J.P.; Schmidt, T.M.; Lovley, D.R.
1996-01-01
Evolutionary relationships among strictly anaerobic dissimilatory Fe(III)-reducing bacteria obtained from a diversity of sedimentary environments were examined by phylogenetic analysis of 16S rRNA gene sequences. Members of the genera Geobacter, Desulfuromonas, Pelobacter, and Desulfuromusa formed a monophyletic group within the delta subdivision of the class Proteobacteria. On the basis of their common ancestry and the shared ability to reduce Fe(III) and/or S(0), we propose that this group be considered a single family, Geobacteraceae. Bootstrap analysis, characteristic nucleotides, and higher-order secondary structures support the division of Geobacteraceae into two subgroups, designated the Geobacter and Desulfuromonas clusters. The genus Desulfuromusa and Pelobacter acidigallici make up a distinct branch within the Desulfuromonas cluster. Several members of the family Geobacteraceae, none of which reduce sulfate, were found to contain the target sequences of probes that have been previously used to define the distribution of sulfate-reducing bacteria and sulfate-reducing bacterium-like microorganisms. The recent isolations of Fe(III)-reducing microorganisms distributed throughout the domain Bacteria suggest that development of 16S rRNA probes that would specifically target all Fe(III) reducers may not be feasible. However, all of the evidence suggests that if a 16S rRNA sequence falls within the family Geobacteraceae, then the organism has the capacity for Fe(III) reduction. The suggestion, based on geological evidence, that Fe(III) reduction was the first globally significant process for oxidizing organic matter back to carbon dioxide is consistent with the finding that acetate-oxidizing Fe(III) reducers are phylogenetically diverse.
Overweight and obesity in India: policy issues from an exploratory multi-level analysis.
Siddiqui, Md Zakaria; Donato, Ronald
2016-06-01
This article analyses a nationally representative household dataset-the National Family Health Survey (NFHS-3) conducted in 2005 to 2006-to examine factors influencing the prevalence of overweight/obesity in India. The dataset was disaggregated into four sub-population groups-urban and rural females and males-and multi-level logit regression models were used to estimate the impact of particular covariates on the likelihood of overweight/obesity. The multi-level modelling approach aimed to identify individual and macro-level contextual factors influencing this health outcome. In contrast to most studies on low-income developing countries, the findings reveal that education for females beyond a particular level of educational attainment exhibits a negative relationship with the likelihood of overweight/obesity. This relationship was not observed for males. Muslim females and all Sikh sub-populations have a higher likelihood of overweight/obesity suggesting the importance of socio-cultural influences. The results also show that the relationship between wealth and the probability of overweight/obesity is stronger for males than females highlighting the differential impact of increasing socio-economic status on gender. Multi-level analysis reveals that states exerted an independent influence on the likelihood of overweight/obesity beyond individual-level covariates, reflecting the importance of spatially related contextual factors on overweight/obesity. While this study does not disentangle macro-level 'obesogenic' environmental factors from socio-cultural network influences, the results highlight the need to refrain from adopting a 'one size fits all' policy approach in addressing the overweight/obesity epidemic facing India. Instead, policy implementation requires a more nuanced and targeted approach to incorporate the growing recognition of socio-cultural and spatial contextual factors impacting on healthy behaviours. © The Author 2015. 
Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.
Marcum, Zachary A; Perera, Subashan; Thorpe, Joshua M; Switzer, Galen E; Castle, Nicholas G; Strotmeyer, Elsa S; Simonsick, Eleanor M; Ayonayon, Hilsa N; Phillips, Caroline L; Rubin, Susan; Zucker-Levin, Audrey R; Bauer, Douglas C; Shorr, Ronald I; Kang, Yihuang; Gray, Shelly L; Hanlon, Joseph T
2016-07-01
Few studies have compared the risk of recurrent falls across various antidepressant agents-using detailed dosage and duration data-among community-dwelling older adults, including those who have a history of a fall/fracture. To examine the association of antidepressant use with recurrent falls, including among those with a history of falls/fractures, in community-dwelling elders. This was a longitudinal analysis of 2948 participants with data collected via interview at year 1 from the Health, Aging and Body Composition study and followed through year 7 (1997-2004). Any antidepressant medication use was self-reported at years 1, 2, 3, 5, and 6 and further categorized as (1) selective serotonin reuptake inhibitors (SSRIs), (2) tricyclic antidepressants, and (3) others. Dosage and duration were examined. The outcome was recurrent falls (≥2) in the ensuing 12-month period following each medication data collection. Using multivariable generalized estimating equations models, we observed a 48% greater likelihood of recurrent falls in antidepressant users compared with nonusers (adjusted odds ratio [AOR] = 1.48; 95% CI = 1.12-1.96). Increased likelihood was also found among those taking SSRIs (AOR = 1.62; 95% CI = 1.15-2.28), with short duration of use (AOR = 1.47; 95% CI = 1.04-2.00), and taking moderate dosages (AOR = 1.59; 95% CI = 1.15-2.18), all compared with no antidepressant use. Stratified analysis revealed an increased likelihood among users with a baseline history of falls/fractures compared with nonusers (AOR = 1.83; 95% CI = 1.28-2.63). Antidepressant use overall, SSRI use, short duration of use, and moderate dosage were associated with recurrent falls. Those with a history of falls/fractures also had an increased likelihood of recurrent falls. © The Author(s) 2016.
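The adjusted odds ratios above come from generalized estimating equations models. As a much simpler sketch of the underlying quantity, an unadjusted odds ratio with a Wald confidence interval can be computed from a 2x2 table; the counts here are hypothetical, and the study's AORs additionally adjust for covariates and repeated measures:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Wald 95% CI from a 2x2 table:
    a = exposed with outcome, b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical: 30/100 users vs 20/100 nonusers with recurrent falls.
or_, lo, hi = odds_ratio_ci(30, 70, 20, 80)
```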
Two new methods to fit models for network meta-analysis with random inconsistency effects.
Law, Martin; Jackson, Dan; Turner, Rebecca; Rhodes, Kirsty; Viechtbauer, Wolfgang
2016-07-28
Meta-analysis is a valuable tool for combining evidence from multiple studies. Network meta-analysis is becoming more widely used as a means to compare multiple treatments in the same analysis. However, a network meta-analysis may exhibit inconsistency, whereby the treatment effect estimates do not agree across all trial designs, even after taking between-study heterogeneity into account. We propose two new estimation methods for network meta-analysis models with random inconsistency effects. The model we consider is an extension of the conventional random-effects model for meta-analysis to the network meta-analysis setting and allows for potential inconsistency using random inconsistency effects. Our first new estimation method uses a Bayesian framework with empirically-based prior distributions for both the heterogeneity and the inconsistency variances. We fit the model using importance sampling and thereby avoid some of the difficulties that might be associated with using Markov Chain Monte Carlo (MCMC). However, we confirm the accuracy of our importance sampling method by comparing the results to those obtained using MCMC as the gold standard. The second new estimation method we describe uses a likelihood-based approach, implemented in the metafor package, which can be used to obtain (restricted) maximum-likelihood estimates of the model parameters and profile likelihood confidence intervals of the variance components. We illustrate the application of the methods using two contrasting examples. The first uses all-cause mortality as an outcome, and shows little evidence of between-study heterogeneity or inconsistency. The second uses "ear discharge" as an outcome, and exhibits substantial between-study heterogeneity and inconsistency. Both new estimation methods give results similar to those obtained using MCMC. The extent of heterogeneity and inconsistency should be assessed and reported in any network meta-analysis. 
Our two new methods can be used to fit models for network meta-analysis with random inconsistency effects. They are easily implemented using the accompanying R code in the Additional file 1. Using these estimation methods, the extent of inconsistency can be assessed and reported.
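The model in the abstract extends the conventional random-effects meta-analysis model. As background only, a minimal sketch of conventional random-effects pooling with the DerSimonian-Laird moment estimator of the between-study variance; this is not the paper's Bayesian importance-sampling or REML machinery:

```python
def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate using the DerSimonian-Laird
    moment estimator of the between-study variance tau^2."""
    w = [1.0 / v for v in variances]
    sw = sum(w)
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sw
    q = sum(wi * (yi - fixed)**2 for wi, yi in zip(w, effects))  # Cochran's Q
    c = sw - sum(wi * wi for wi in w) / sw
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)                # truncated at 0
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    return pooled, tau2
```

With no observed heterogeneity (identical effects), tau^2 is estimated as zero and the pooled estimate reduces to the fixed-effect one; network models add design-level inconsistency terms on top of this structure.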
Pourcain, Beate St.; Smith, George Davey; York, Timothy P.; Evans, David M.
2014-01-01
Genome wide complex trait analysis (GCTA) is extended to include environmental effects of the maternal genotype on offspring phenotype (“maternal effects”, M-GCTA). The model includes parameters for the direct effects of the offspring genotype, maternal effects and the covariance between direct and maternal effects. Analysis of simulated data, conducted in OpenMx, confirmed that model parameters could be recovered by full information maximum likelihood (FIML) and evaluated the biases that arise in conventional GCTA when indirect genetic effects are ignored. Estimates derived from FIML in OpenMx showed very close agreement to those obtained by restricted maximum likelihood using the published algorithm for GCTA. The method was also applied to illustrative perinatal phenotypes from ∼4,000 mother-offspring pairs from the Avon Longitudinal Study of Parents and Children. The relative merits of extended GCTA in contrast to quantitative genetic approaches based on analyzing the phenotypic covariance structure of kinships are considered. PMID:25060210
NASA Astrophysics Data System (ADS)
Coelho, Carlos A.; Marques, Filipe J.
2013-09-01
In this paper the authors combine the equicorrelation and equivariance test introduced by Wilks [13] with the likelihood ratio test (l.r.t.) for independence of groups of variables to obtain the l.r.t. of block equicorrelation and equivariance. This test or its single block version may find applications in many areas as in psychology, education, medicine, genetics and they are important "in many tests of multivariate analysis, e.g. in MANOVA, Profile Analysis, Growth Curve analysis, etc" [12, 9]. By decomposing the overall hypothesis into the hypotheses of independence of groups of variables and the hypothesis of equicorrelation and equivariance we are able to obtain the expressions for the overall l.r.t. statistic and its moments. From these we obtain a suitable factorization of the characteristic function (c.f.) of the logarithm of the l.r.t. statistic, which enables us to develop highly manageable and precise near-exact distributions for the test statistic.
Cramer-Rao bound analysis of wideband source localization and DOA estimation
NASA Astrophysics Data System (ADS)
Yip, Lean; Chen, Joe C.; Hudson, Ralph E.; Yao, Kung
2002-12-01
In this paper, we derive the Cramér-Rao Bound (CRB) for wideband source localization and DOA estimation. The resulting CRB formula can be decomposed into two terms: one that depends on the signal characteristic and one that depends on the array geometry. For a uniformly spaced circular array (UCA), a concise analytical form of the CRB can be given by using some algebraic approximation. We further define a DOA beamwidth based on the resulting CRB formula. The DOA beamwidth can be used to design the sampling angular spacing for the Maximum-likelihood (ML) algorithm. For a randomly distributed array, we use an elliptical model to determine the largest and smallest effective beamwidth. The effective beamwidth and the CRB analysis of source localization allow us to design an efficient algorithm for the ML estimator. Finally, our simulation results of the Approximated Maximum Likelihood (AML) algorithm are demonstrated to match well to the CRB analysis at high SNR.
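As a single-parameter analogue of the bound used above: the CRB says the variance of any unbiased estimator is at least the inverse Fisher information. For the mean of n iid Gaussian samples this is sigma^2 / n. A toy sketch (not the paper's wideband-array formula), showing the bound tightening as data accumulate, analogous to the beamwidth narrowing at high SNR:

```python
def crb_gaussian_mean(sigma, n):
    """CRB for the mean of n iid Gaussian samples: Fisher information
    is I = n / sigma^2, so the bound on the variance is sigma^2 / n."""
    return sigma**2 / n

# More samples -> smaller bound on estimator variance.
bounds = [crb_gaussian_mean(2.0, n) for n in (10, 100, 1000)]
```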
On equivalent parameter learning in simplified feature space based on Bayesian asymptotic analysis.
Yamazaki, Keisuke
2012-07-01
Parametric models for sequential data, such as hidden Markov models, stochastic context-free grammars, and linear dynamical systems, are widely used in time-series analysis and structural data analysis. Computation of the likelihood function is one of the primary considerations in many learning methods. Iterative calculation of the likelihood, as required in model selection, is still time-consuming even though effective algorithms based on dynamic programming exist. The present paper studies parameter learning in a simplified feature space to reduce the computational cost. Simplifying data is a common technique in feature selection and dimension reduction, though an oversimplified space causes adverse learning results. Therefore, we mathematically investigate a condition on the feature map under which the estimated parameters have an asymptotically equivalent convergence point; such a map is referred to as a vicarious map. As a demonstration of finding vicarious maps, we consider a feature space that limits the length of data and derive the length necessary for parameter learning in hidden Markov models. Copyright © 2012 Elsevier Ltd. All rights reserved.
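For HMMs, the dynamic-programming likelihood computation the abstract refers to is the forward algorithm. A minimal sketch for a discrete HMM; note that limiting the length of the observation sequence, as in the paper's feature space, directly shortens the recursion:

```python
def hmm_forward(obs, pi, A, B):
    """Forward algorithm: total likelihood of a discrete observation
    sequence under an HMM (pi: initial, A: transition, B: emission probs)."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]        # initialisation
    for o in obs[1:]:                                       # recursion over time
        alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][o]
                 for j in range(n)]
    return sum(alpha)                                       # termination
```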
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Hao; Mey, Antonia S. J. S.; Noé, Frank
2014-12-07
We propose a discrete transition-based reweighting analysis method (dTRAM) for analyzing configuration-space-discretized simulation trajectories produced at different thermodynamic states (temperatures, Hamiltonians, etc.). dTRAM provides maximum-likelihood estimates of stationary quantities (probabilities, free energies, expectation values) at any thermodynamic state. In contrast to the weighted histogram analysis method (WHAM), dTRAM does not require data to be sampled from global equilibrium, and can thus produce superior estimates for enhanced sampling data such as parallel/simulated tempering, replica exchange, umbrella sampling, or metadynamics. In addition, dTRAM provides optimal estimates of Markov state models (MSMs) from the discretized state-space trajectories at all thermodynamic states. Under suitable conditions, these MSMs can be used to calculate kinetic quantities (e.g., rates, timescales). In the limit of a single thermodynamic state, dTRAM estimates a maximum likelihood reversible MSM, while in the limit of uncorrelated sampling data, dTRAM is identical to WHAM. dTRAM is thus a generalization of both estimators.
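The idea shared by dTRAM and WHAM is reweighting samples between thermodynamic states. A deliberately minimal sketch of plain Boltzmann reweighting between two temperatures (this is the underlying intuition only, not dTRAM or WHAM themselves):

```python
import math

def reweight(energies, beta_sim, beta_target):
    """Normalized weights that re-express samples drawn at inverse
    temperature beta_sim as an ensemble at inverse temperature beta_target."""
    w = [math.exp(-(beta_target - beta_sim) * e) for e in energies]
    z = sum(w)
    return [wi / z for wi in w]

def reweighted_average(values, energies, beta_sim, beta_target):
    """Expectation of an observable at the target thermodynamic state."""
    p = reweight(energies, beta_sim, beta_target)
    return sum(pi * v for pi, v in zip(p, values))
```

Cooling the target ensemble (larger beta) shifts weight toward low-energy samples, as expected; dTRAM generalizes this by also exploiting the observed transition counts between discretized states.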
Delgado, João; Pollard, Simon; Snary, Emma; Black, Edgar; Prpich, George; Longhurst, Phil
2013-08-01
Exotic animal diseases (EADs) are characterized by their capacity to spread global distances, causing impacts on animal health and welfare with significant economic consequences. We offer a critique of current import risk analysis approaches employed in the EAD field, focusing on their capacity to assess complex systems at a policy level. To address the shortcomings identified, we propose a novel method providing a systematic analysis of the likelihood of a disease incursion, developed by reference to the multibarrier system employed for the United Kingdom. We apply the network model to a policy-level risk assessment of classical swine fever (CSF), a notifiable animal disease caused by the CSF virus. In doing so, we document and discuss a sequence of analyses that describe system vulnerabilities and reveal the critical control points (CCPs) for intervention, reducing the likelihood of U.K. pig herds being exposed to the CSF virus. © 2012 Society for Risk Analysis.
Maximum Likelihood Analysis of Low Energy CDMS II Germanium Data
Agnese, R.
2015-03-30
We report on the results of a search for a Weakly Interacting Massive Particle (WIMP) signal in low-energy data of the Cryogenic Dark Matter Search experiment using a maximum likelihood analysis. A background model is constructed using GEANT4 to simulate the surface-event background from Pb-210 decay-chain events, while using independent calibration data to model the gamma background. Fitting this background model to the data results in no statistically significant WIMP component. In addition, we also perform fits using an analytic ad hoc background model proposed by Collar and Fields, who claimed to find a large excess of signal-like events in our data. Finally, we confirm the strong preference for a signal hypothesis in their analysis under these assumptions, but excesses are observed in both single- and multiple-scatter events, which implies the signal is not caused by WIMPs, but rather reflects the inadequacy of their background model.
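At its core, such a search maximizes a Poisson likelihood for a signal component on top of a background expectation. A toy grid-scan sketch of that step, with made-up numbers (illustrative only, not the CDMS analysis, which fits full spectral models):

```python
import math

def poisson_loglik(n_obs, mu):
    """Log-likelihood of observing n_obs counts given expectation mu > 0."""
    return n_obs * math.log(mu) - mu - math.lgamma(n_obs + 1)

def best_signal(n_obs, background, grid):
    """ML estimate of a non-negative signal rate by scanning a grid of
    candidate values, holding the expected background fixed."""
    return max(grid, key=lambda s: poisson_loglik(n_obs, background + s))

# Hypothetical: 10 observed events over an expected background of 4.
s_hat = best_signal(10, 4.0, [i * 0.5 for i in range(21)])
```

When the observed count matches the background expectation, the fitted signal goes to zero, the analogue of "no statistically significant WIMP component."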
IVHS Countermeasures for Rear-End Collisions, Task 1 Vol. III: 1991 NASS CDS Case Analysis
DOT National Transportation Integrated Search
1994-02-15
This report is from the NHTSA sponsored program, "IVHS Countermeasures for Rear-End Collisions". The Task 1 Interim Report consists of six volumes. This Volume, Volume III, 1991 NASS CDS Clinical Case Analysis presents the results of a clinical case ...
DOT National Transportation Integrated Search
1980-12-01
This source document on motor vehicle market analysis and consumer impacts consists of three parts. Part III consists of studies and reviews on: consumer awareness of fuel efficiency issues; consumer acceptance of fuel efficient vehicles; car size ch...
Determinants of success in Shared Savings Programs: An analysis of ACO and market characteristics.
Ouayogodé, Mariétou H; Colla, Carrie H; Lewis, Valerie A
2017-03-01
Medicare's Accountable Care Organization (ACO) programs introduced shared savings to traditional Medicare, which allow providers who reduce health care costs for their patients to retain a percentage of the savings they generate. To examine ACO and market factors associated with superior financial performance in Medicare ACO programs. We obtained financial performance data from the Centers for Medicare and Medicaid Services (CMS); we derived market-level characteristics from Medicare claims; and we collected ACO characteristics from the National Survey of ACOs for 215 ACOs. We examined the association between ACO financial performance and ACO provider composition, leadership structure, beneficiary characteristics, risk bearing experience, quality and process improvement capabilities, physician performance management, market competition, CMS-assigned financial benchmark, and ACO contract start date. We examined two outcomes from Medicare ACOs' first performance year: savings per Medicare beneficiary and earning shared savings payments (a dichotomous variable). When modeling the ACO ability to save and earn shared savings payments, we estimated positive regression coefficients for a greater proportion of primary care providers in the ACO, more practicing physicians on the governing board, physician leadership, active engagement in reducing hospital re-admissions, a greater proportion of disabled Medicare beneficiaries assigned to the ACO, financial incentives offered to physicians, a larger financial benchmark, and greater ACO market penetration. No characteristic of organizational structure was significantly associated with both outcomes of savings per beneficiary and likelihood of achieving shared savings. ACO prior experience with risk-bearing contracts was positively correlated with savings and significantly increased the likelihood of receiving shared savings payments. 
In the first year, performance is quite heterogeneous, yet organizational structure does not consistently predict performance. Organizations with large financial benchmarks at baseline have greater opportunities to achieve savings. Findings on prior risk bearing suggest that ACOs learn over time under risk-bearing contracts. Given the lack of predictive power for organizational characteristics, CMS should continue to encourage diversity in organizational structures for ACO participants, and provide alternative funding and risk bearing mechanisms to continue to allow a diverse group of organizations to participate. Level of evidence: III. Copyright © 2016 Elsevier Inc. All rights reserved.
Randomized controlled trial of atorvastatin in clinically isolated syndrome
Waubant, E.; Pelletier, D.; Mass, M.; Cohen, J.A.; Kita, M.; Cross, A.; Bar-Or, A.; Vollmer, T.; Racke, M.; Stüve, O.; Schwid, S.; Goodman, A.; Kachuck, N.; Preiningerova, J.; Weinstock-Guttman, B.; Calabresi, P.A.; Miller, A.; Mokhtarani, M.; Iklé, D.; Murphy, S.; Kopetskie, H.; Ding, L.; Rosenberg, E.; Spencer, C.; Zamvil, S.S.; Waubant, E.; Pelletier, D.; Mass, M.; Bourdette, D.; Egan, R.; Cohen, J.; Stone, L.; Kita, M.; Elliott, M.; Cross, A.; Parks, B.J.; Bar-Or, A.; Vollmer, T.; Campagnolo, D.; Racke, M.; Stüve, O.; Frohman, E.; Schwid, S.; Goodman, A.; Segal, B.; Kachuck, N.; Weiner, L.; Preiningerova, J.; Carrithers, M.; Weinstock-Guttman, B.; Calabresi, P.; Kerr, D.; Miller, A.; Lublin, F.; Sayre, Peter; Hayes, Deborah; Rosenberg, Ellen; Gao, Wendy; Ding, Linna; Adah, Steven; Mokhtarani, Masoud; Neuenburg, Jutta; Bromstead, Carolyn; Olinger, Lynn; Mullen, Blair; Jamison, Ross; Speth, Kelly; Saljooqi, Kerensa; Phan, Peter; Phippard, Deborah; Seyfert-Margolis, Vicki; Bourcier, Katarzyna; Debnam, Tracia; Romaine, Jennifer; Wolin, Stephanie; O'Dale, Brittany; Iklé, David; Murphy, Stacey; Kopetskie, Heather
2012-01-01
Objective: To test efficacy and safety of atorvastatin in subjects with clinically isolated syndrome (CIS). Methods: Subjects with CIS were enrolled in a phase II, double-blind, placebo-controlled, 14-center randomized trial testing 80 mg atorvastatin on clinical and brain MRI activity. Brain MRIs were performed quarterly. The primary endpoint (PEP) was development of ≥3 new T2 lesions, or one clinical relapse within 12 months. Subjects meeting the PEP were offered additional weekly interferon β-1a (IFNβ-1a). Results: Due to slow recruitment, enrollment was discontinued after 81 of 152 planned subjects with CIS were randomized and initiated study drug. Median (interquartile range) numbers of T2 and gadolinium-enhancing (Gd) lesions were 15.0 (22.0) and 0.0 (0.0) at baseline. A total of 53.1% of atorvastatin recipients (n = 26/49) met PEP compared to 56.3% of placebo recipients (n = 18/32) (p = 0.82). Eleven atorvastatin subjects (22.4%) and 7 placebo subjects (21.9%) met the PEP by clinical criteria. Proportion of subjects who did not develop new T2 lesions up to month 12 or to starting IFNβ-1a was 55.3% in the atorvastatin and 27.6% in the placebo group (p = 0.03). Likelihood of remaining free of new T2 lesions was significantly greater in the atorvastatin group compared with placebo (odds ratio [OR] = 4.34, p = 0.01). Likelihood of remaining free of Gd lesions tended to be higher in the atorvastatin group (OR = 2.72, p = 0.11). Overall, atorvastatin was well tolerated. No clear antagonistic effect of atorvastatin plus IFNβ-1a was observed on MRI measures. Conclusion: Atorvastatin treatment significantly decreased development of new brain MRI T2 lesion activity, although it did not achieve the composite clinical and imaging PEP. 
Classification of Evidence: This study provided Class II evidence that atorvastatin did not reduce the proportion of patients with CIS meeting imaging and clinical criteria for starting immunomodulating therapy after 12 months, compared to placebo. In an analysis of a secondary endpoint (Class III), atorvastatin was associated with a reduced risk for developing new T2 lesions. PMID:22459680
Determinants of Success in Shared Savings Programs: An Analysis of ACO and Market Characteristics
Colla, Carrie H.; Lewis, Valerie A.
2016-01-01
Background Medicare’s Accountable Care Organization (ACO) programs introduced shared savings to traditional Medicare, which allow providers who reduce health care costs for their patients to retain a percentage of the savings they generate. Objective To examine ACO and market factors associated with superior financial performance in Medicare ACO programs. Methods We obtained financial performance data from the Centers for Medicare and Medicaid Services (CMS); we derived market-level characteristics from Medicare claims; and we collected ACO characteristics from the National Survey of ACOs for 215 ACOs. We examined the association between ACO financial performance and ACO provider composition, leadership structure, beneficiary characteristics, risk bearing experience, quality and process improvement capabilities, physician performance management, market competition, CMS-assigned financial benchmark, and ACO contract start date. We examined two outcomes from Medicare ACOs’ first performance year: savings per Medicare beneficiary and earning shared savings payments (a dichotomous variable). Results When modeling the ACO ability to save and earn shared savings payments, we estimated positive regression coefficients for a greater proportion of primary care providers in the ACO, more practicing physicians on the governing board, physician leadership, active engagement in reducing hospital re-admissions, a greater proportion of disabled Medicare beneficiaries assigned to the ACO, financial incentives offered to physicians, a larger financial benchmark, and greater ACO market penetration. No characteristic of organizational structure was significantly associated with both outcomes of savings per beneficiary and likelihood of achieving shared savings. ACO prior experience with risk-bearing contracts was positively correlated with savings and significantly increased the likelihood of receiving shared savings payments. 
Conclusions In the first year, performance is quite heterogeneous, yet organizational structure does not consistently predict performance. Organizations with large financial benchmarks at baseline have greater opportunities to achieve savings. Findings on prior risk bearing suggest that ACOs learn over time under risk-bearing contracts. Implications Given the lack of predictive power for organizational characteristics, CMS should continue to encourage diversity in organizational structures for ACO participants, and provide alternative funding and risk bearing mechanisms to continue to allow a diverse group of organizations to participate. Level of evidence III PMID:27687917
ERIC Educational Resources Information Center
Burton, D. Bradley; And Others
1994-01-01
A maximum-likelihood confirmatory factor analysis was performed by applying LISREL VII to the Wechsler Adult Intelligence Scale-Revised results of a normal elderly sample of 225 adults. Results indicate that a three-factor model fits best across all sample combinations. A mild gender effect is discussed. (SLD)
Brief Experimental Analysis of Written Letter Formation: Single-Case Demonstration
ERIC Educational Resources Information Center
Burns, Matthew K.; Ganuza, Zoila M.; London, Rachel M.
2009-01-01
Many students experience difficulty in acquiring basic writing skills and educators need to efficiently address those deficits by implementing an intervention with a high likelihood for success. The current article demonstrates the utility of using a brief experimental analysis (BEA) to identify a letter-formation intervention for a second-grade…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jason L. Wright
Finding and identifying cryptography is a growing concern in the malware analysis community. In this paper, a heuristic method for determining the likelihood that a given function contains a cryptographic algorithm is discussed, and the results of applying this method in various environments are shown. The algorithm is based on frequency analysis of the opcodes that make up each function within a binary.
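The opcode-frequency idea can be sketched roughly as follows. The opcode set and the 0.4 threshold below are illustrative assumptions, not values taken from the paper: symmetric-cipher inner loops tend to be dense in bitwise and rotate instructions, so a high frequency of such opcodes raises the cryptographic likelihood score.

```python
from collections import Counter

# Opcodes common in many cipher inner loops; the exact set and the
# 0.4 threshold are illustrative assumptions, not the paper's values.
CRYPTO_OPCODES = {"xor", "rol", "ror", "shl", "shr", "and", "or", "not", "add"}

def crypto_likelihood(opcodes):
    """Score a disassembled function by the fraction of bitwise/rotate
    opcodes, a rough proxy for cryptographic code."""
    if not opcodes:
        return 0.0
    counts = Counter(op.lower() for op in opcodes)
    hits = sum(counts[op] for op in CRYPTO_OPCODES)
    return hits / len(opcodes)

def looks_cryptographic(opcodes, threshold=0.4):
    return crypto_likelihood(opcodes) >= threshold

# A tight xor/rotate loop scores high; ordinary control-flow code scores low.
cipher_loop = ["mov", "xor", "rol", "xor", "shr", "add", "xor", "rol"]
plain_code  = ["push", "mov", "call", "test", "jz", "mov", "pop", "ret"]
```

In practice the scores would be computed per function over a whole binary, and functions above the threshold flagged for manual inspection.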
A conditional probability analysis (CPA) approach has been developed for identifying biological thresholds of impact for use in the development of geographic-specific water quality criteria for protection of aquatic life. This approach expresses the threshold as the likelihood ...
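The conditional-probability framing above can be sketched as follows. The decision rule, the target likelihood, and the paired stressor/impact observations are all illustrative assumptions; the abstract does not specify them.

```python
def conditional_impact_prob(stressor, impacted, x):
    """P(impact | stressor >= x), estimated from paired field observations."""
    exceed = [i for s, i in zip(stressor, impacted) if s >= x]
    return sum(exceed) / len(exceed) if exceed else float("nan")

def find_threshold(stressor, impacted, candidates, target=0.5):
    """Smallest candidate threshold whose conditional impact probability
    reaches the target likelihood (an assumed decision rule)."""
    for x in sorted(candidates):
        p = conditional_impact_prob(stressor, impacted, x)
        if p == p and p >= target:  # p == p filters out NaN
            return x
    return None

# Hypothetical paired observations: stressor level vs. observed impact (1/0).
stressor = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8]
impacted = [0,   0,   0,   1,   0,   1,   1,   1]
```

Here `find_threshold(stressor, impacted, [0.2, 0.4, 0.6], target=0.6)` would express a geographic-specific criterion as the stressor level above which impact becomes more likely than the target.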
Generalized Full-Information Item Bifactor Analysis
Cai, Li; Yang, Ji Seung; Hansen, Mark
2011-01-01
Full-information item bifactor analysis is an important statistical method in psychological and educational measurement. Current methods are limited to single group analysis and inflexible in the types of item response models supported. We propose a flexible multiple-group item bifactor analysis framework that supports a variety of multidimensional item response theory models for an arbitrary mixing of dichotomous, ordinal, and nominal items. The extended item bifactor model also enables the estimation of latent variable means and variances when data from more than one group are present. Generalized user-defined parameter restrictions are permitted within or across groups. We derive an efficient full-information maximum marginal likelihood estimator. Our estimation method achieves substantial computational savings by extending Gibbons and Hedeker’s (1992) bifactor dimension reduction method so that the optimization of the marginal log-likelihood only requires two-dimensional integration regardless of the dimensionality of the latent variables. We use simulation studies to demonstrate the flexibility and accuracy of the proposed methods. We apply the model to study cross-country differences, including differential item functioning, using data from a large international education survey on mathematics literacy. PMID:21534682
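The dimension-reduction result described above can be illustrated with a minimal sketch: because each item loads on the general factor plus at most one specific factor, the marginal likelihood of an item cluster requires only two-dimensional integration, regardless of how many specific factors the full model has. The item parameters below are hypothetical, and the sketch uses a plain 2PL response function rather than the paper's general framework.

```python
import numpy as np

def marginal_pattern_prob(responses, a_g, a_s, b, n_quad=21):
    """Marginal probability of a response pattern for one item cluster,
    integrating over the general factor g and the cluster's specific
    factor s with Gauss-Hermite quadrature (2-D only, per the bifactor
    dimension-reduction argument)."""
    x, w = np.polynomial.hermite.hermgauss(n_quad)
    nodes = x * np.sqrt(2.0)        # rescale nodes for a N(0, 1) density
    weights = w / np.sqrt(np.pi)    # weights now sum to 1
    total = 0.0
    for g, wg in zip(nodes, weights):
        for s, ws in zip(nodes, weights):
            p = 1.0 / (1.0 + np.exp(-(a_g * g + a_s * s - b)))  # 2PL
            lik = np.prod(np.where(responses == 1, p, 1.0 - p))
            total += wg * ws * lik
    return total

# Hypothetical parameters for a 3-item cluster.
a_g = np.array([1.2, 0.8, 1.0])   # general-factor slopes
a_s = np.array([0.5, 0.7, 0.6])   # specific-factor slopes
b   = np.array([0.0, -0.5, 0.5])  # intercepts
u   = np.array([1, 1, 0])         # observed responses
prob = marginal_pattern_prob(u, a_g, a_s, b)
```

A useful sanity check is that the marginal probabilities over all 2^3 response patterns sum to one.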
Khakzad, Nima; Khan, Faisal; Amyotte, Paul
2015-07-01
Compared to the remarkable progress in risk analysis of normal accidents, the risk analysis of major accidents has not been so well-established, partly due to the complexity of such accidents and partly due to the low probabilities involved. The issue of low probabilities normally arises from the scarcity of relevant data, since major accidents are few and far between. In this work, knowing that major accidents are frequently preceded by accident precursors, a novel precursor-based methodology has been developed for likelihood modeling of major accidents in critical infrastructures based on a unique combination of accident precursor data, information theory, and approximate reasoning. For this purpose, we have introduced an innovative application of information analysis to identify the most informative near accident of a major accident. The observed data of the near accident were then used to establish predictive scenarios to foresee the occurrence of the major accident. We verified the methodology using offshore blowouts in the Gulf of Mexico, and then demonstrated its application to dam breaches in the United States. © 2015 Society for Risk Analysis.
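One simple way to operationalize "most informative near accident" is mutual information between precursor observations and the accident indicator; this is an illustrative stand-in for the paper's information analysis, and the precursor names and yearly records below are hypothetical.

```python
from math import log2

def mutual_information(xs, ys):
    """I(X; Y) in bits for two paired binary sequences."""
    n = len(xs)
    def p(pred):
        return sum(1 for t in zip(xs, ys) if pred(t)) / n
    mi = 0.0
    for x in (0, 1):
        for y in (0, 1):
            pxy = p(lambda t, x=x, y=y: t == (x, y))
            px = p(lambda t, x=x: t[0] == x)
            py = p(lambda t, y=y: t[1] == y)
            if pxy > 0:
                mi += pxy * log2(pxy / (px * py))
    return mi

def most_informative_precursor(precursors, accidents):
    """Pick the precursor whose observations carry the most information
    about the major-accident indicator."""
    return max(precursors,
               key=lambda name: mutual_information(precursors[name], accidents))

# Hypothetical yearly records: 1 = precursor observed / accident occurred.
accidents  = [0, 0, 1, 0, 1, 0, 0, 1]
precursors = {
    "kick":      [0, 0, 1, 0, 1, 0, 0, 1],   # tracks the accidents closely
    "gas_alarm": [1, 0, 1, 1, 0, 0, 1, 0],   # weakly related
}
```

The selected precursor's data would then feed the predictive scenarios the abstract describes.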
Faponle, Abayomi S; Quesne, Matthew G; Sastri, Chivukula V; Banse, Frédéric; de Visser, Sam P
2015-01-01
Heme and nonheme monooxygenases and dioxygenases catalyze important oxygen atom transfer reactions to substrates in the body. It is now well established that the cytochrome P450 enzymes react through the formation of a high-valent iron(IV)–oxo heme cation radical. Its precursor in the catalytic cycle, the iron(III)–hydroperoxo complex, was tested for catalytic activity and found to be a sluggish oxidant in hydroxylation, epoxidation and sulfoxidation reactions. In a recent twist of events, evidence has emerged of several nonheme iron(III)–hydroperoxo complexes that appear to react with substrates via oxygen atom transfer processes, although it was not clear from these studies whether the iron(III)–hydroperoxo reacted directly with substrates or whether an initial O–O bond cleavage preceded the reaction. Clearly, the catalytic activity of heme and nonheme iron(III)–hydroperoxo complexes is substantially different, but the origins of this are still poorly understood and warrant a detailed analysis. In this work, an extensive computational analysis of aromatic hydroxylation by biomimetic nonheme and heme iron systems is presented, starting from an iron(III)–hydroperoxo complex with a pentadentate ligand system (L52). Direct C–O bond formation by the iron(III)–hydroperoxo complex is investigated, as well as the initial heterolytic and homolytic bond cleavage of the hydroperoxo group. The calculations show that [(L52)FeIII(OOH)]2+ should be able to initiate an aromatic hydroxylation process, although a low-energy homolytic cleavage pathway is only slightly higher in energy. A detailed valence bond and thermochemical analysis rationalizes the differences in chemical reactivity of heme and nonheme iron(III)–hydroperoxo complexes and shows that the main reason this particular nonheme complex is reactive is that it splits the O–O bond homolytically, whereas heterolytic O–O bond breaking is found in heme iron(III)–hydroperoxo. PMID:25399782
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holmes, Jordan A.; Wang, Andrew Z.; University of North Carolina-Lineberger Comprehensive Cancer Center, Chapel Hill, NC
2012-09-01
Purpose: To examine the patterns of primary treatment in a recent population-based cohort of prostate cancer patients, stratified by the likelihood of extraprostatic cancer as predicted by disease characteristics available at diagnosis. Methods and Materials: A total of 157,371 patients diagnosed from 2004 to 2008 with clinically localized and potentially curable (node-negative, nonmetastatic) prostate cancer, who have complete information on prostate-specific antigen, Gleason score, and clinical stage, were included. Patients with clinical T1/T2 disease were grouped into categories of <25%, 25%-50%, and >50% likelihood of having extraprostatic disease using the Partin nomogram. Clinical T3/T4 patients were examined separately as the highest-risk group. Logistic regression was used to examine the association between patient group and receipt of each primary treatment, adjusting for age, race, year of diagnosis, marital status, Surveillance, Epidemiology and End Results database region, and county-level education. Separate models were constructed for primary surgery, external-beam radiotherapy (RT), and conservative management. Results: On multivariable analysis, increasing likelihood of extraprostatic disease was significantly associated with increasing use of RT and decreased conservative management. Use of surgery also increased. Patients with >50% likelihood of extraprostatic cancer had almost twice the odds of receiving prostatectomy as those with <25% likelihood, and T3-T4 patients had 18% higher odds. Prostatectomy use increased in recent years. Patients aged 76-80 years were likely to be managed conservatively, even those with a >50% likelihood of extraprostatic cancer (34%) and clinical T3-T4 disease (24%). The proportion of patients who received prostatectomy or conservative management was approximately 50% or slightly higher in all groups.
Conclusions: There may be underutilization of RT in older prostate cancer patients and those with likely extraprostatic disease. Because more than half of prostate cancer patients do not consult with a radiation oncologist, a multidisciplinary consultation may affect the treatment decision-making process.
The retention of health human resources in primary healthcare centers in Lebanon: a national survey.
Alameddine, Mohamad; Saleh, Shadi; El-Jardali, Fadi; Dimassi, Hani; Mourad, Yara
2012-11-22
Critical shortages of health human resources (HHR), associated with high turnover rates, have been a concern in many countries around the globe. Of particular interest is the effect of such a trend on the primary healthcare (PHC) sector; considered a cornerstone in any effective healthcare system. This study is a rare attempt to investigate PHC HHR work characteristics, level of burnout and likelihood to quit as well as the factors significantly associated with staff retention at PHC centers in Lebanon. A cross-sectional design was utilized to survey all health providers at 81 PHC centers dispersed in all districts of Lebanon. The questionnaire consisted of four sections: socio-demographic/ professional background, organizational/institutional characteristics, likelihood to quit and level of professional burnout (using the Maslach-Burnout Inventory). A total of 755 providers completed the questionnaire (60.5% response rate). Bivariate analyses and multinomial logistic regression were used to determine factors associated with likelihood to quit. Two out of five respondents indicated likelihood to quit their jobs within the next 1-3 years and an additional 13.4% were not sure about quitting. The top three reasons behind likelihood to quit were poor salary (54.4%), better job opportunities outside the country (35.1%) and lack of professional development (33.7%). A U-shaped relationship was observed between age and likelihood to quit. Regression analysis revealed that high levels of burnout, lower level of education and low tenure were all associated with increased likelihood to quit. The study findings reflect an unstable workforce and are not conducive to supporting an expanded role for PHC in the Lebanese healthcare system. While strategies aiming at improving staff retention would be important to develop and implement for all PHC HHR; targeted retention initiatives should focus on the young-new recruits and allied health professionals. 
Particular attention should be dedicated to enhancing providers' role satisfaction and sense of job security. Such initiatives are of pivotal importance to stabilize the workforce and ensure its longevity.
The retention of health human resources in primary healthcare centers in Lebanon: a national survey
2012-01-01
Background Critical shortages of health human resources (HHR), associated with high turnover rates, have been a concern in many countries around the globe. Of particular interest is the effect of such a trend on the primary healthcare (PHC) sector; considered a cornerstone in any effective healthcare system. This study is a rare attempt to investigate PHC HHR work characteristics, level of burnout and likelihood to quit as well as the factors significantly associated with staff retention at PHC centers in Lebanon. Methods A cross-sectional design was utilized to survey all health providers at 81 PHC centers dispersed in all districts of Lebanon. The questionnaire consisted of four sections: socio-demographic/ professional background, organizational/institutional characteristics, likelihood to quit and level of professional burnout (using the Maslach-Burnout Inventory). A total of 755 providers completed the questionnaire (60.5% response rate). Bivariate analyses and multinomial logistic regression were used to determine factors associated with likelihood to quit. Results Two out of five respondents indicated likelihood to quit their jobs within the next 1–3 years and an additional 13.4% were not sure about quitting. The top three reasons behind likelihood to quit were poor salary (54.4%), better job opportunities outside the country (35.1%) and lack of professional development (33.7%). A U-shaped relationship was observed between age and likelihood to quit. Regression analysis revealed that high levels of burnout, lower level of education and low tenure were all associated with increased likelihood to quit. Conclusions The study findings reflect an unstable workforce and are not conducive to supporting an expanded role for PHC in the Lebanese healthcare system. 
While strategies aiming at improving staff retention would be important to develop and implement for all PHC HHR; targeted retention initiatives should focus on the young-new recruits and allied health professionals. Particular attention should be dedicated to enhancing providers’ role satisfaction and sense of job security. Such initiatives are of pivotal importance to stabilize the workforce and ensure its longevity. PMID:23173905
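The multinomial logistic regression used in this study can be sketched minimally as follows. This is a plain softmax fit by gradient descent on synthetic data, not the authors' model: the single predictor (burnout score), the three-class outcome coding, and all values are illustrative assumptions.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def fit_multinomial(X, y, n_classes, lr=0.05, n_iter=5000):
    """Multinomial logistic regression by batch gradient descent
    (a minimal stand-in for the paper's regression analysis)."""
    X1 = np.hstack([np.ones((len(X), 1)), X])   # add intercept column
    W = np.zeros((X1.shape[1], n_classes))
    Y = np.eye(n_classes)[y]                     # one-hot outcomes
    for _ in range(n_iter):
        P = softmax(X1 @ W)
        W -= lr * X1.T @ (P - Y) / len(X)        # cross-entropy gradient
    return W

def predict(W, X):
    X1 = np.hstack([np.ones((len(X), 1)), X])
    return softmax(X1 @ W).argmax(axis=1)

# Hypothetical data: burnout score -> 0 = stay, 1 = unsure, 2 = quit.
rng = np.random.default_rng(0)
burnout = rng.uniform(0, 10, size=300).reshape(-1, 1)
labels = np.digitize(burnout.ravel(), [4.0, 7.0])  # higher burnout, higher class
W = fit_multinomial(burnout, labels, n_classes=3)
accuracy = (predict(W, burnout) == labels).mean()
```

In the study itself, the predictors would be the survey covariates (burnout level, education, tenure) and the outcome the three-level likelihood-to-quit response.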
Searching for a neurologic injury's Wechsler Adult Intelligence Scale-Third Edition profile.
Gonçalves, Marta A; Moura, Octávio; Castro-Caldas, Alexandre; Simões, Mário R
2017-01-01
This study aimed to investigate the presence of a Wechsler Adult Intelligence Scale-Third Edition (WAIS-III) cognitive profile in a Portuguese neurologic injured sample. The Portuguese WAIS-III was administered to 81 mixed neurologic patients and 81 healthy matched controls selected from the Portuguese standardization sample. Although the mixed neurologic injury group performed significantly lower than the healthy controls for the majority of the WAIS-III scores (i.e., composite measures, discrepancies, and subtests), the mean scores were within the normal range and, therefore, at risk of going unnoticed in a clinical evaluation. ROC curve analysis showed poor to acceptable diagnostic accuracy for the WAIS-III composite measures and subtests (the Working Memory Index and the Digit Span subtest showed the highest accuracy for discriminating between participants, respectively). Multiple regression analysis showed that both literacy and the presence of brain injury were significant predictors for all of the composite measures. In addition, multiple regression analysis also showed that literacy, age of injury onset, and years of survival predicted all seven composite measures for the mixed neurologic injured group. Despite the failure to find a WAIS-III cognitive profile for mixed neurologic patients, the results showed a significant influence of brain lesion and literacy in the performance of the WAIS-III.
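The ROC analysis above rests on a simple quantity: the AUC equals the Mann-Whitney probability that a randomly chosen case outranks a randomly chosen control. A minimal sketch, with hypothetical score lists (not the WAIS-III data):

```python
def roc_auc(case_scores, control_scores):
    """AUC as the Mann-Whitney probability that a randomly chosen case
    outranks a randomly chosen control (ties count half)."""
    wins = 0.0
    for c in case_scores:
        for k in control_scores:
            if c > k:
                wins += 1.0
            elif c == k:
                wins += 0.5
    return wins / (len(case_scores) * len(control_scores))

# Hypothetical "impairment" scores: cases tend to score higher than controls.
cases    = [7, 6, 5, 5, 3]
controls = [4, 3, 2, 2, 1]
auc = roc_auc(cases, controls)
```

An AUC near 0.5 corresponds to chance-level discrimination, which is why measures with mean scores inside the normal range can still yield only "poor to acceptable" accuracy.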
Peciak, Joanna; Stec, Wojciech J; Treda, Cezary; Ksiazkiewicz, Magdalena; Janik, Karolina; Popeda, Marta; Smolarz, Maciej; Rosiak, Kamila; Hulas-Bigoszewska, Krystyna; Och, Waldemar; Rieske, Piotr; Stoczynska-Fidelus, Ewelina
2017-01-01
Background: The presence as well as the potential role of EGFRvIII in tumors other than glioblastoma still remains a controversial subject with many contradictory data published. Previous analyses, however, did not consider the level of EGFRvIII mRNA expression in different tumor types. Methods: Appropriately designed protocol for Real-time quantitative reverse-transcription PCR (Real-time qRT-PCR) was applied to analyze EGFRvIII and EGFRWT mRNA expression in 155 tumor specimens. Additionally, Western Blot (WB) analysis was performed for selected samples. Stable cell lines showing EGFRvIII expression (CAS-1 and DK-MG) were analyzed by means of WB, immunocytochemistry (ICC) and fluorescence in situ hybridization (FISH). Results: Our analyses revealed EGFRvIII expression in 27.59% of glioblastomas (8/29), 8.11% of colorectal cancers (3/37), 6.52% of prostate cancers (3/46) and none of breast cancers (0/43). Despite the average relative expression of EGFRvIII varying greatly among tumors of different tissues (approximately 800-fold) or even within the same tissue group (up to 8000-fold for GB), even the marginal expression of EGFRvIII mRNA can be detrimental to cancer progression, as determined by the analysis of stable cell lines endogenously expressing the oncogene. Conclusion: EGFRvIII plays an unquestionable role in glioblastomas with high expression of this oncogene. Our data suggests that EGFRvIII importance should not be underestimated even in tumors with relatively low expression of this oncogene. PMID:28123609
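Relative mRNA expression from Real-time qRT-PCR is commonly quantified with the 2^-ddCt (Livak) method. The sketch below uses hypothetical Ct values, not the study's measurements, and assumes roughly 100% amplification efficiency:

```python
def relative_expression(ct_target_sample, ct_ref_sample,
                        ct_target_calib, ct_ref_calib):
    """Relative expression by the 2^-ddCt method: normalize the target
    gene to a reference gene, then compare sample to calibrator."""
    d_sample = ct_target_sample - ct_ref_sample   # dCt in the sample
    d_calib = ct_target_calib - ct_ref_calib      # dCt in the calibrator
    return 2.0 ** -(d_sample - d_calib)

# Hypothetical Ct values: target (e.g. EGFRvIII) vs. a housekeeping gene,
# in a tumor sample and a calibrator sample.
fold = relative_expression(22.0, 18.0, 30.0, 18.0)   # ddCt = 4 - 12 = -8
```

Because Ct is logarithmic, small Ct differences translate into large fold changes, which is consistent with the several-hundred-fold spread in relative EGFRvIII expression reported across tumors.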
Hui, Siu-kuen Azor; Grandner, Michael A.
2015-01-01
Objective Using the Transtheoretical Model of behavioral change, this study evaluates the relationship between sleep quality and the motivation and maintenance processes of healthy behavior change. Methods The current study is an analysis of data collected in 2008 from an online health risk assessment (HRA) survey completed by participants of the Kansas State employee wellness program (N = 13,322). Using multinomial logistic regression, associations between self-reported sleep quality and stages of change (i.e. precontemplation, contemplation, preparation, action, maintenance) in five health behaviors (stress management, weight management, physical activities, alcohol use, and smoking) were analyzed. Results Adjusted for covariates, poor sleep quality was associated with an increased likelihood of contemplation, preparation, and in some cases action stage when engaging in the health behavior change process, but generally a lower likelihood of maintenance of the healthy behavior. Conclusions The present study demonstrated that poor sleep quality was associated with an elevated likelihood of contemplating or initiating behavior change, but a decreased likelihood of maintaining healthy behavior change. It is important to include sleep improvement as one of the lifestyle management interventions offered in EWP to comprehensively reduce health risks and promote the health of a large employee population. PMID:26046013
NASA Technical Reports Server (NTRS)
Watson, Clifford
2010-01-01
Traditional hazard analysis techniques utilize a two-dimensional representation of the results determined by relative likelihood and severity of the residual risk. These matrices present a quick-look at the Likelihood (Y-axis) and Severity (X-axis) of the probable outcome of a hazardous event. A three-dimensional method, described herein, utilizes the traditional X and Y axes, while adding a new, third dimension, shown as the Z-axis, and referred to as the Level of Control. The elements of the Z-axis are modifications of the Hazard Elimination and Control steps (also known as the Hazard Reduction Precedence Sequence). These steps are: 1. Eliminate risk through design. 2. Substitute less risky materials for more hazardous materials. 3. Install safety devices. 4. Install caution and warning devices. 5. Develop administrative controls (to include special procedures and training.) 6. Provide protective clothing and equipment. When added to the two-dimensional models, the level of control adds a visual representation of the risk associated with the hazardous condition, creating a tall-pole for the least-well-controlled failure while establishing the relative likelihood and severity of all causes and effects for an identified hazard. Computer modeling of the analytical results, using spreadsheets and three-dimensional charting, gives a visual confirmation of the relationship between causes and their controls.
NASA Technical Reports Server (NTRS)
Watson, Clifford C.
2011-01-01
Traditional hazard analysis techniques utilize a two-dimensional representation of the results determined by relative likelihood and severity of the residual risk. These matrices present a quick-look at the Likelihood (Y-axis) and Severity (X-axis) of the probable outcome of a hazardous event. A three-dimensional method, described herein, utilizes the traditional X and Y axes, while adding a new, third dimension, shown as the Z-axis, and referred to as the Level of Control. The elements of the Z-axis are modifications of the Hazard Elimination and Control steps (also known as the Hazard Reduction Precedence Sequence). These steps are: 1. Eliminate risk through design. 2. Substitute less risky materials for more hazardous materials. 3. Install safety devices. 4. Install caution and warning devices. 5. Develop administrative controls (to include special procedures and training.) 6. Provide protective clothing and equipment. When added to the two-dimensional models, the level of control adds a visual representation of the risk associated with the hazardous condition, creating a tall-pole for the least-well-controlled failure while establishing the relative likelihood and severity of all causes and effects for an identified hazard. Computer modeling of the analytical results, using spreadsheets and three-dimensional charting gives a visual confirmation of the relationship between causes and their controls.
Religiosity profiles of American youth in relation to substance use, violence, and delinquency.
Salas-Wright, Christopher P; Vaughn, Michael G; Hodge, David R; Perron, Brian E
2012-12-01
Relatively little is known in terms of the relationship between religiosity profiles and adolescents' involvement in substance use, violence, and delinquency. Using a diverse sample of 17,705 (49% female) adolescents from the 2008 National Survey on Drug Use and Health, latent profile analysis and multinomial regression are employed to examine the relationships between latent religiosity classes and substance use, violence, and delinquency. Results revealed a five-class solution. Classes were identified as religiously disengaged (10.76 %), religiously infrequent (23.59 %), privately religious (6.55 %), religious regulars (40.85 %), and religiously devoted (18.25 %). Membership in the religiously devoted class was associated with the decreased likelihood of participation in a variety of substance use behaviors as well as decreases in the likelihood of fighting and theft. To a lesser extent, membership in the religious regulars class was also associated with the decreased likelihood of substance use and fighting. However, membership in the religiously infrequent and privately religious classes was only associated with the decreased likelihood of marijuana use. Findings suggest that private religiosity alone does not serve to buffer youth effectively against involvement in problem behavior, but rather that it is the combination of intrinsic and extrinsic adolescent religiosity factors that is associated with participation in fewer problem behaviors.
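Latent profile analysis models observed indicators as a finite mixture of class-specific distributions and assigns respondents to classes by posterior probability. The simplest instance, a two-component univariate Gaussian mixture fit by EM, can be sketched as follows; the class count, the single indicator, and the simulated scores are illustrative assumptions (the study fits five classes to multiple survey indicators).

```python
import math
import random

def em_mixture(data, n_iter=200):
    """EM for a two-component univariate Gaussian mixture, the simplest
    instance of a latent profile model."""
    mu = [min(data), max(data)]   # spread the initial means apart
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(n_iter):
        # E-step: posterior class responsibilities for each observation.
        resp = []
        for x in data:
            dens = [pi[k] / math.sqrt(2 * math.pi * var[k])
                    * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                    for k in (0, 1)]
            s = sum(dens)
            resp.append([d / s for d in dens])
        # M-step: update mixing weights, means, and variances.
        for k in (0, 1):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, data)) / nk, 1e-6)
    return pi, mu, var

# Hypothetical religiosity scores drawn from two well-separated profiles.
random.seed(1)
data = ([random.gauss(2.0, 0.5) for _ in range(100)] +
        [random.gauss(8.0, 0.5) for _ in range(100)])
pi, mu, var = em_mixture(data)
```

With more indicators and five classes, the same E/M alternation recovers profiles like "religiously disengaged" versus "religiously devoted"; class membership then enters the multinomial regression as a predictor.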
Risk Presentation Using the Three Dimensions of Likelihood, Severity, and Level of Control
NASA Technical Reports Server (NTRS)
Watson, Clifford
2010-01-01
Traditional hazard analysis techniques utilize a two-dimensional representation of the results determined by relative likelihood and severity of the residual risk. These matrices present a quick-look at the Likelihood (Y-axis) and Severity (X-axis) of the probable outcome of a hazardous event. A three-dimensional method, described herein, utilizes the traditional X and Y axes, while adding a new, third dimension, shown as the Z-axis, and referred to as the Level of Control. The elements of the Z-axis are modifications of the Hazard Elimination and Control steps (also known as the Hazard Reduction Precedence Sequence). These steps are: 1. Eliminate risk through design. 2. Substitute less risky materials for more hazardous materials. 3. Install safety devices. 4. Install caution and warning devices. 5. Develop administrative controls (to include special procedures and training.) 6. Provide protective clothing and equipment. When added to the two-dimensional models, the level of control adds a visual representation of the risk associated with the hazardous condition, creating a tall-pole for the least-well-controlled failure while establishing the relative likelihood and severity of all causes and effects for an identified hazard. Computer modeling of the analytical results, using spreadsheets and three-dimensional charting gives a visual confirmation of the relationship between causes and their controls.
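The three-dimensional representation can be sketched in code as follows. The six Z-axis rungs come from the Hazard Reduction Precedence Sequence listed above; the composite scoring rule, the 1-5 likelihood/severity scales, and the example hazards are illustrative assumptions, not values from the report.

```python
from dataclasses import dataclass

# Level-of-control rungs from the Hazard Reduction Precedence Sequence
# (1 = eliminated by design ... 6 = protective equipment only).
LEVELS_OF_CONTROL = {
    1: "eliminate risk through design",
    2: "substitute less hazardous materials",
    3: "safety devices",
    4: "caution and warning devices",
    5: "administrative controls",
    6: "protective clothing and equipment",
}

@dataclass
class Hazard:
    name: str
    likelihood: int        # 1 (remote) .. 5 (frequent)
    severity: int          # 1 (negligible) .. 5 (catastrophic)
    level_of_control: int  # 1 (best controlled) .. 6 (least-well controlled)

    def tall_pole_score(self):
        """Illustrative composite: the product grows with likelihood,
        severity, and weaker control, so the least-well-controlled
        hazard stands out as the 'tall pole' (scoring rule assumed,
        not taken from the report)."""
        return self.likelihood * self.severity * self.level_of_control

hazards = [
    Hazard("hydrazine leak", 2, 5, 5),
    Hazard("dropped tool", 4, 2, 3),
    Hazard("pressure-vessel burst", 1, 5, 1),
]
tall_pole = max(hazards, key=Hazard.tall_pole_score)
```

Plotting each hazard at its (X, Y, Z) coordinate in a spreadsheet's 3-D chart reproduces the visual described in the abstract.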
Fischer, Sebastian; Wiemer, Anita; Diedrich, Laura; Moock, Jörn; Rössler, Wulf
2014-01-01
We suggest that interactions with strangers at work influence the likelihood of depressive disorders, as they serve as an environmental stressor, a necessary condition for the onset of depression according to diathesis-stress models of depression. We examined a large dataset (N = 76,563 in K = 196 occupations) from the German pension insurance program and the Occupational Information Network dataset on occupational characteristics. We used a multilevel framework with individuals and occupations as levels of analysis. We found that occupational environments influence employees’ risks of depression. In line with the quotation that ‘hell is other people’, frequent conflictual contacts were related to greater likelihoods of depression in both males and females (OR = 1.14, p<.05). However, interactions with the public were related to greater likelihoods of depression for males but lower likelihoods of depression for females (ORinteraction = 1.21, p<.01). We theorize that some occupations may involve interpersonal experiences with negative emotional tones that make functional coping difficult and increase the risk of depression. In other occupations, these experiences have neutral tones and allow for functional coping strategies. Functional strategies are more often found in women than in men. PMID:25075855
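The odds ratios reported above come from logistic models, where an OR is simply the exponentiated coefficient; with raw counts it is the cross-product ratio of a 2x2 table. A small sketch, with the table counts being hypothetical (the 0.131 coefficient is just the logit value that corresponds to the reported OR of 1.14):

```python
from math import exp

def odds_ratio_from_logit(beta):
    """Convert a logistic-regression coefficient to an odds ratio."""
    return exp(beta)

def odds_ratio_2x2(exposed_cases, exposed_noncases,
                   unexposed_cases, unexposed_noncases):
    """Cross-product odds ratio from a 2x2 exposure-by-outcome table."""
    return (exposed_cases * unexposed_noncases) / \
           (exposed_noncases * unexposed_cases)

# exp(0.131) ~= 1.14, the reported OR for frequent conflictual contact.
or_conflict = odds_ratio_from_logit(0.131)

# Hypothetical counts: (exposed, depressed) etc.
or_table = odds_ratio_2x2(40, 160, 30, 170)   # (40*170)/(160*30)
```

An OR above 1 means the exposure is associated with higher odds of depression; in a multilevel model the coefficient additionally accounts for the occupation-level grouping.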