DOE Office of Scientific and Technical Information (OSTI.GOV)
Conover, W.J.; Cox, D.D.; Martz, H.F.
1997-12-01
When using parametric empirical Bayes estimation methods for estimating the binomial or Poisson parameter, the validity of the assumed beta or gamma conjugate prior distribution is an important diagnostic consideration. Chi-square goodness-of-fit tests of the beta or gamma prior hypothesis are developed for use when the binomial sample sizes or Poisson exposure times vary. Nine examples illustrate the application of the methods, using real data from such diverse applications as the loss of feedwater flow rates in nuclear power plants, the probability of failure to run on demand and the failure rates of the high pressure coolant injection systems at US commercial boiling water reactors, the probability of failure to run on demand of emergency diesel generators in US commercial nuclear power plants, the rate of failure of aircraft air conditioners, baseball batting averages, the probability of testing positive for toxoplasmosis, and the probability of tumors in rats. The tests are easily applied in practice by means of corresponding Mathematica® computer programs which are provided.
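The core of such a diagnostic can be sketched in a few lines: simulate binomial counts with varying sample sizes under a beta prior, estimate the prior crudely by moments, and apply a chi-square test to binned beta-binomial probability-integral transforms. This is only a rough stand-in for the tests Conover et al. actually derive; all parameter values below are illustrative.

```python
import numpy as np
from scipy.stats import betabinom, chi2

rng = np.random.default_rng(7)
m, a_true, b_true = 200, 2.0, 8.0
n = rng.integers(20, 60, m)                  # varying binomial sample sizes
s = rng.binomial(n, rng.beta(a_true, b_true, m))

# crude moment-style estimate of the beta prior (ignores binomial noise)
p_hat = s / n
mu, v = p_hat.mean(), p_hat.var()
k = mu * (1 - mu) / v - 1
a_hat, b_hat = mu * k, (1 - mu) * k

# bin each observation by its fitted beta-binomial CDF (approx. uniform under H0)
u = betabinom.cdf(s, n, a_hat, b_hat)
obs, _ = np.histogram(u, bins=5, range=(0, 1))
exp = np.full(5, m / 5)
stat = ((obs - exp) ** 2 / exp).sum()
print(stat, chi2.sf(stat, df=5 - 1 - 2))     # 2 df lost to the fitted parameters
```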
A Monte Carlo Risk Analysis of Life Cycle Cost Prediction.
1975-09-01
…process which occurs with each FLU failure. With this in mind there is no alternative other than the binomial distribution. … Weibull distribution of failures as selected by user. For each failure of the ith FLU, the model then samples from the binomial distribution to determine … which is sampled from the binomial. Neither of the two conditions for normality is met, i.e., that RTS be close to .5 and the number of samples close…
Rogers, Jennifer K; Pocock, Stuart J; McMurray, John J V; Granger, Christopher B; Michelson, Eric L; Östergren, Jan; Pfeffer, Marc A; Solomon, Scott D; Swedberg, Karl; Yusuf, Salim
2014-01-01
Heart failure is characterized by recurrent hospitalizations, but often only the first event is considered in clinical trial reports. In chronic diseases, such as heart failure, analysing all events gives a more complete picture of treatment benefit. We describe methods of analysing repeat hospitalizations, and illustrate their value in one major trial. The Candesartan in Heart failure Assessment of Reduction in Mortality and morbidity (CHARM)-Preserved study compared candesartan with placebo in 3023 patients with heart failure and preserved systolic function. The heart failure hospitalization rates were 12.5 and 8.9 per 100 patient-years in the placebo and candesartan groups, respectively. The repeat hospitalizations were analysed using the Andersen-Gill, Poisson, and negative binomial methods. Death was incorporated into analyses by treating it as an additional event. The win ratio method and a method that jointly models hospitalizations and mortality were also considered. Using repeat events gave larger treatment benefits than time to first event analysis. The negative binomial method for the composite of recurrent heart failure hospitalizations and cardiovascular death gave a rate ratio of 0.75 [95% confidence interval (CI) 0.62-0.91, P = 0.003], whereas the hazard ratio for time to first heart failure hospitalization or cardiovascular death was 0.86 (95% CI 0.74-1.00, P = 0.050). In patients with preserved ejection fraction, candesartan reduces the rate of admissions for worsening heart failure, to a greater extent than is apparent from analysing only first hospitalizations. Recurrent events should be routinely incorporated into the analysis of future clinical trials in heart failure.
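A minimal sketch of the negative binomial approach to recurrent events, with follow-up time entering as an offset so the treatment coefficient is a log rate ratio. The data are simulated (a gamma frailty supplies the overdispersion), and the fixed dispersion alpha is an assumption for illustration, not the CHARM analysis itself.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 2000
treat = rng.integers(0, 2, n)
follow = rng.uniform(1.0, 3.0, n)               # person-years of follow-up
mu = 0.125 * 0.75**treat * follow               # placebo rate 0.125/yr, true RR 0.75
y = rng.poisson(mu * rng.gamma(2.0, 0.5, n))    # gamma frailty -> overdispersed counts

X = sm.add_constant(treat)
fit = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=0.5),
             offset=np.log(follow)).fit()
print(np.exp(fit.params[1]))                    # estimated rate ratio, near 0.75
```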
A Methodology for Quantifying Certain Design Requirements During the Design Phase
NASA Technical Reports Server (NTRS)
Adams, Timothy; Rhodes, Russel
2005-01-01
A methodology for developing and balancing quantitative design requirements for safety, reliability, and maintainability has been proposed. Conceived as the basis of a more rational approach to the design of spacecraft, the methodology would also be applicable to the design of automobiles, washing machines, television receivers, or almost any other commercial product. Heretofore, it has been common practice to start by determining the requirements for reliability of elements of a spacecraft or other system to ensure a given design life for the system. Next, safety requirements are determined by assessing the total reliability of the system and adding redundant components and subsystems necessary to attain safety goals. As thus described, common practice leaves the maintainability burden to fall to chance; therefore, there is no control of recurring costs or of the responsiveness of the system. The means that have been used in assessing maintainability have been oriented toward determining the logistical sparing of components so that the components are available when needed. The process established for developing and balancing quantitative requirements for safety (S), reliability (R), and maintainability (M) derives and integrates NASA's top-level safety requirements and the controls needed to obtain program key objectives for safety and recurring cost. Being quantitative, the process conveniently uses common mathematical models. Even though the process is shown as being worked from the top down, it can also be worked from the bottom up. This process uses three math models: (1) the binomial distribution (greater-than-or-equal-to case), (2) reliability for a series system, and (3) the Poisson distribution (less-than-or-equal-to case). The zero-fail case for the binomial distribution approximates the commonly known exponential distribution or "constant failure rate" distribution. Either model can be used. The binomial distribution was selected for modeling flexibility because it conveniently addresses both the zero-fail and failure cases. The failure case is typically used for unmanned spacecraft as with missiles.
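The zero-fail binomial case mentioned here reduces to a one-line calculation: after n failure-free tests, the confidence that reliability is at least R is C = 1 - R^n. A sketch:

```python
import math

def tests_required(reliability, confidence):
    """Zero-fail binomial case: confidence = 1 - reliability**n, solved for n."""
    return math.ceil(math.log(1 - confidence) / math.log(reliability))

print(tests_required(0.99, 0.90))   # 230 failure-free tests
print(tests_required(0.999, 0.90))  # 2302
```

Solving for n shows why high-reliability requirements drive test counts up quickly: demonstrating 0.99 reliability at 90% confidence already takes 230 failure-free tests.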
Performance and structure of single-mode bosonic codes
NASA Astrophysics Data System (ADS)
Albert, Victor V.; Noh, Kyungjoo; Duivenvoorden, Kasper; Young, Dylan J.; Brierley, R. T.; Reinhold, Philip; Vuillot, Christophe; Li, Linshu; Shen, Chao; Girvin, S. M.; Terhal, Barbara M.; Jiang, Liang
2018-03-01
The early Gottesman, Kitaev, and Preskill (GKP) proposal for encoding a qubit in an oscillator has recently been followed by cat- and binomial-code proposals. Numerically optimized codes have also been proposed, and we introduce codes of this type here. These codes have yet to be compared using the same error model; we provide such a comparison by determining the entanglement fidelity of all codes with respect to the bosonic pure-loss channel (i.e., photon loss) after the optimal recovery operation. We then compare achievable communication rates of the combined encoding-error-recovery channel by calculating the channel's hashing bound for each code. Cat and binomial codes perform similarly, with binomial codes outperforming cat codes at small loss rates. Despite not being designed to protect against the pure-loss channel, GKP codes significantly outperform all other codes for most values of the loss rate. We show that the performance of GKP and some binomial codes increases monotonically with increasing average photon number of the codes. In order to corroborate our numerical evidence of the cat-binomial-GKP order of performance occurring at small loss rates, we analytically evaluate the quantum error-correction conditions of those codes. For GKP codes, we find an essential singularity in the entanglement fidelity in the limit of vanishing loss rate. In addition to comparing the codes, we draw parallels between binomial codes and discrete-variable systems. First, we characterize one- and two-mode binomial as well as multiqubit permutation-invariant codes in terms of spin-coherent states. Such a characterization allows us to introduce check operators and error-correction procedures for binomial codes. Second, we introduce a generalization of spin-coherent states, extending our characterization to qudit binomial codes and yielding a multiqudit code.
A comparison of methods for the analysis of binomial clustered outcomes in behavioral research.
Ferrari, Alberto; Comelli, Mario
2016-12-01
In behavioral research, data consisting of a per-subject proportion of "successes" and "failures" over a finite number of trials often arise. These clustered binary data are usually non-normally distributed, which can distort inference if the usual general linear model is applied and sample size is small. A number of more advanced methods are available, but they are often technically challenging and a comparative assessment of their performances in behavioral setups has not been performed. We studied the performances of some methods applicable to the analysis of proportions; namely linear regression, Poisson regression, beta-binomial regression and Generalized Linear Mixed Models (GLMMs). We report on a simulation study evaluating power and Type I error rate of these models in hypothetical scenarios met by behavioral researchers; plus, we describe results from the application of these methods on data from real experiments. Our results show that, while GLMMs are powerful instruments for the analysis of clustered binary outcomes, beta-binomial regression can outperform them in a range of scenarios. Linear regression gave results consistent with the nominal level of significance, but was overall less powerful. Poisson regression, instead, mostly led to anticonservative inference. GLMMs and beta-binomial regression are generally more powerful than linear regression; yet linear regression is robust to model misspecification in some conditions, whereas Poisson regression suffers heavily from violations of the assumptions when used to model proportion data. We conclude by providing directions to behavioral scientists dealing with clustered binary data and small sample sizes.
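Beta-binomial regression of the kind compared here can be fitted by direct maximum likelihood with scipy; the sketch below uses an intercept-only model with a mean/precision parameterization on simulated clustered proportions. Sample sizes and parameter values are arbitrary.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit
from scipy.stats import betabinom

rng = np.random.default_rng(0)
n = np.full(30, 20)                       # 20 trials per subject
s = rng.binomial(n, rng.beta(4, 6, 30))   # subject-level heterogeneity in p

def nll(theta):
    mu, phi = expit(theta[0]), np.exp(theta[1])   # mean and precision
    a, b = mu * phi, (1 - mu) * phi
    return -betabinom.logpmf(s, n, a, b).sum()

fit = minimize(nll, x0=[0.0, 1.0], method="Nelder-Mead")
print(expit(fit.x[0]), np.exp(fit.x[1]))  # mean near 0.4, precision near 10
```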
Statistical inference involving binomial and negative binomial parameters.
García-Pérez, Miguel A; Núñez-Antón, Vicente
2009-05-01
Statistical inference about two binomial parameters implies that they are both estimated by binomial sampling. There are occasions in which one aims at testing the equality of two binomial parameters before and after the occurrence of the first success along a sequence of Bernoulli trials. In these cases, the binomial parameter before the first success is estimated by negative binomial sampling whereas that after the first success is estimated by binomial sampling, and both estimates are related. This paper derives statistical tools to test two hypotheses, namely, that both binomial parameters equal some specified value and that both parameters are equal though unknown. Simulation studies are used to show that in small samples both tests are accurate in keeping the nominal Type-I error rates, and also to determine sample size requirements to detect large, medium, and small effects with adequate power. Additional simulations also show that the tests are sufficiently robust to certain violations of their assumptions.
On Models for Binomial Data with Random Numbers of Trials
Comulada, W. Scott; Weiss, Robert E.
2010-01-01
A binomial outcome is a count s of the number of successes out of the total number of independent trials n = s + f, where f is a count of the failures. The n are random variables not fixed by design in many studies. Joint modeling of (s, f) can provide additional insight into the science and into the probability π of success that cannot be directly incorporated by the logistic regression model. Observations where n = 0 are excluded from the binomial analysis yet may be important to understanding how π is influenced by covariates. Correlation between s and f may exist and be of direct interest. We propose Bayesian multivariate Poisson models for the bivariate response (s, f), correlated through random effects. We extend our models to the analysis of longitudinal and multivariate longitudinal binomial outcomes. Our methodology was motivated by two disparate examples, one from teratology and one from an HIV tertiary intervention study. PMID:17688514
Geary, T W; Smith, M F; MacNeil, M D; Day, M L; Bridges, G A; Perry, G A; Abreu, F M; Atkins, J A; Pohler, K G; Jinks, E M; Madsen, C A
2013-07-01
Reproductive failure in livestock can result from failure to fertilize the oocyte or embryonic loss during gestation. Although fertilization failure occurs, embryonic mortality represents a greater contribution to reproductive failure. Reproductive success varies among species and production goals but is measured as a binomial trait (i.e., pregnancy), derived by the success or failure of multiple biological steps. This review focuses primarily on follicular characteristics affecting oocyte quality, fertilization, and embryonic health that lead to pregnancy establishment in beef cattle. When estrous cycles are manipulated with assisted reproductive technologies and ovulation is induced, duration of proestrus (i.e., interval from induced luteolysis to induced ovulation), ovulatory follicle growth rate, and ovulatory follicle size are factors that affect the maturation of the follicle and oocyte at induced ovulation. The most critical maturational component of the ovulatory follicle is the production of sufficient estradiol to prepare follicular cells for luteinization and progesterone synthesis and prepare the uterus for pregnancy. The exact roles of estradiol in oocyte maturation remain unclear, but cows that have lesser serum concentrations of estradiol have decreased fertilization rates and decreased embryo survival on d 7 after induced ovulation. When length of proestrus is held constant, perhaps the most practical follicular measure of fertility is ovulatory follicle size because it is an easily measured attribute of the follicle that is highly associated with its ability to produce estradiol.
Phase transition and information cascade in a voting model
NASA Astrophysics Data System (ADS)
Hisakado, M.; Mori, S.
2010-08-01
In this paper, we introduce a voting model that is similar to a Keynesian beauty contest and analyse it from a mathematical point of view. There are two types of voters—copycat and independent—and two candidates. Our voting model is a binomial distribution (independent voters) doped in a beta binomial distribution (copycat voters). We find that the phase transition in this system is at the upper limit of t, where t is the time (or the number of the votes). Our model contains three phases. If copycats constitute a majority or even half of the total voters, the voting rate converges more slowly than it would in a binomial distribution. If independents constitute the majority of voters, the voting rate converges at the same rate as it would in a binomial distribution. We also study why it is difficult to estimate the conclusion of a Keynesian beauty contest when there is an information cascade.
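The model's two voter types can be captured in a toy simulation: independents vote for candidate A with fixed probability q, while copycats vote according to the current vote share (a Pólya urn, whose counts follow a beta-binomial distribution). The widening spread of final vote shares as the copycat fraction grows mirrors the slower-than-binomial convergence described above; all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def final_vote_share(n_votes, frac_copycat, q=0.6, alpha=1.0):
    a = b = alpha                      # Polya-urn pseudo-counts
    cast_a = 0
    for _ in range(n_votes):
        if rng.random() < frac_copycat:
            p = a / (a + b)            # copycat follows the current share
        else:
            p = q                      # independent votes A with probability q
        if rng.random() < p:
            a += 1; cast_a += 1
        else:
            b += 1
    return cast_a / n_votes

for f in (0.2, 0.5, 0.8):
    shares = [final_vote_share(2000, f) for _ in range(50)]
    print(f, round(np.mean(shares), 3), round(np.std(shares), 3))
```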
Covering Resilience: A Recent Development for Binomial Checkpointing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walther, Andrea; Narayanan, Sri Hari Krishna
In terms of computing time, adjoint methods offer a very attractive alternative to compute gradient information, required, e.g., for optimization purposes. However, together with this very favorable temporal complexity result comes a memory requirement that is in essence proportional with the operation count of the underlying function, e.g., if algorithmic differentiation is used to provide the adjoints. For this reason, checkpointing approaches in many variants have become popular. This paper analyzes an extension of the so-called binomial approach to cover also possible failures of the computing systems. Such a measure of precaution is of special interest for massively parallel simulations and adjoint calculations where the mean time between failure of the large scale computing system is smaller than the time needed to complete the calculation of the adjoint information. We describe the extensions of standard checkpointing approaches required for such resilience, provide a corresponding implementation and discuss first numerical results.
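The binomial result underlying these checkpointing schemes is that c checkpoints and at most r forward sweeps suffice to reverse beta(c, r) = C(c + r, c) time steps (Griewank and Walther's revolve bound). A small sketch of the arithmetic, with illustrative values:

```python
from math import comb

def max_steps(c, r):
    """Maximal time steps reversible with c checkpoints and r forward sweeps."""
    return comb(c + r, c)

def sweeps_needed(steps, c):
    r = 0
    while max_steps(c, r) < steps:
        r += 1
    return r

print(max_steps(10, 3))        # 286 steps fit in 10 checkpoints, 3 sweeps
print(sweeps_needed(1000, 10)) # sweeps required for a 1000-step adjoint
```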
Solar San Diego: The Impact of Binomial Rate Structures on Real PV Systems; Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
VanGeet, O.; Brown, E.; Blair, T.
2008-05-01
There is confusion in the marketplace regarding the impact of solar photovoltaics (PV) on the user's actual electricity bill under California Net Energy Metering, particularly with binomial tariffs (those that include both demand and energy charges) and time-of-use (TOU) rate structures. The City of San Diego has extensive real-time electrical metering on most of its buildings and PV systems, with interval data for overall consumption and PV electrical production available for multiple years. This paper uses 2007 PV-system data from two city facilities to illustrate the impacts of binomial rate designs. The analysis will determine the energy and demand savings that the PV systems are achieving relative to the absence of systems. A financial analysis of PV-system performance under various rate structures is presented. The data revealed that actual demand and energy use benefits of binomial tariffs increase in summer months, when solar resources allow for maximized electricity production. In a binomial tariff system, varying on- and semi-peak times can result in approximately $1,100 change in demand charges per month over not having a PV system in place, an approximate 30% cost savings. The PV systems are also shown to have a 30%-50% reduction in facility energy charges in 2007.
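The billing arithmetic behind a binomial tariff is simple to sketch: the monthly bill is a demand charge on peak kW plus an energy charge on kWh, and PV savings show up in both terms. The rates and loads below are hypothetical, chosen so the demand-charge difference lands near the $1,100/month figure quoted above.

```python
# hypothetical tariff, for illustration only
DEMAND_RATE = 20.0      # $ per kW of monthly peak demand
ENERGY_RATE = 0.12      # $ per kWh

def monthly_bill(peak_kw, kwh):
    return DEMAND_RATE * peak_kw + ENERGY_RATE * kwh

no_pv   = monthly_bill(peak_kw=250, kwh=90_000)
with_pv = monthly_bill(peak_kw=195, kwh=70_000)   # PV trims on-peak demand and energy
print("demand savings:", DEMAND_RATE * (250 - 195))   # $1,100/month
print("total savings:", no_pv - with_pv)
```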
Wu, Jia-Rong; DeWalt, Darren A; Baker, David W; Schillinger, Dean; Ruo, Bernice; Bibbins-Domingo, Kristen; Macabasco-O'Connell, Aurelia; Holmes, George M; Broucksou, Kimberly A; Erman, Brian; Hawk, Victoria; Cene, Crystal W; Jones, Christine DeLong; Pignone, Michael
2014-09-01
To determine whether a single-item self-report medication adherence question predicts hospitalisation and death in patients with heart failure. Poor medication adherence is associated with increased morbidity and mortality. Having a simple means of identifying suboptimal medication adherence could help identify at-risk patients for interventions. We performed a prospective cohort study in 592 participants with heart failure within a four-site randomised trial. Self-report medication adherence was assessed at baseline using a single-item question: 'Over the past seven days, how many times did you miss a dose of any of your heart medication?' Participants who reported no missing doses were defined as fully adherent, and those missing more than one dose were considered less than fully adherent. The primary outcome was combined all-cause hospitalisation or death over one year and the secondary endpoint was heart failure hospitalisation. Outcomes were assessed with blinded chart reviews, and heart failure outcomes were determined by a blinded adjudication committee. We used negative binomial regression to examine the relationship between medication adherence and outcomes. Fifty-two percent of participants were male, mean age was 61 years, and 31% were New York Heart Association class III/IV at enrolment; 72% of participants reported full adherence to their heart medicine at baseline. Participants with full medication adherence had a lower rate of all-cause hospitalisation and death (0.71 events/year) compared with those with any nonadherence (0.86 events/year): adjusted-for-site incidence rate ratio was 0.83, fully adjusted incidence rate ratio 0.68. Incidence rate ratios were similar for heart failure hospitalisations. A single medication adherence question at baseline predicts hospitalisation and death over one year in heart failure patients. Medication adherence is associated with all-cause and heart failure-related hospitalisation and death in heart failure. It is important for clinicians to assess patients' medication adherence on a regular basis at their clinical follow-ups.
Drain Failure in Intra-Abdominal Abscesses Associated with Appendicitis.
Horn, Christopher B; Coleoglou Centeno, Adrian A; Guerra, Jarot J; Mazuski, John E; Bochicchio, Grant V; Turnbull, Isaiah R
2018-04-01
Previous studies have suggested that percutaneous drainage and interval appendectomy is an effective treatment for appendicitis with associated abscess. Few studies to date have analyzed risk factors for failed drain management. We hypothesized that older patients with more co-morbidities would be at higher risk for failing conservative treatment. The 2010-2014 editions of the National Inpatient Sample (NIS) were queried for patients with diagnoses of peri-appendiceal abscesses. Minors and elective admissions were excluded. We identified patients who underwent percutaneous drainage and defined drain failure as undergoing a surgical operation after drainage but during the same inpatient visit to assess for factors associated with failure of drainage alone as a treatment. After univariable analysis, binomial logistic regression was used to assess for independent risk factors. Frequencies were analyzed by χ2 and continuous variables by Student's t-test. A total of 2,209 patients with appendiceal abscesses received drains; 561 patients (25.4%) failed conservative management and underwent operative intervention. On univariable analysis, patients who failed conservative management were younger, more likely to be Hispanic, have more inpatient diagnoses, and to have undergone drainage earlier in the hospital course. Multivariable regression demonstrated that the number of diagnoses, female sex, and Hispanic race were predictive of failure of drainage alone. Older age, West and Midwest census regions, and later drain placement were predictive of successful treatment with drainage alone. Failure was associated with more charges and longer hospital stay but not with a higher mortality rate. Approximately a quarter of patients will fail management of appendiceal abscess with percutaneous drain placement alone. Risk factors for failure are patient complexity, female sex, earlier drainage, and Hispanic race. Failure of drainage is associated with higher total charges and longer hospital stay; however, no change in the mortality rate was noted.
Extending the Binomial Checkpointing Technique for Resilience
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walther, Andrea; Narayanan, Sri Hari Krishna
In terms of computing time, adjoint methods offer a very attractive alternative to compute gradient information, required, e.g., for optimization purposes. However, together with this very favorable temporal complexity result comes a memory requirement that is in essence proportional with the operation count of the underlying function, e.g., if algorithmic differentiation is used to provide the adjoints. For this reason, checkpointing approaches in many variants have become popular. This paper analyzes an extension of the so-called binomial approach to cover also possible failures of the computing systems. Such a measure of precaution is of special interest for massively parallel simulations and adjoint calculations where the mean time between failure of the large scale computing system is smaller than the time needed to complete the calculation of the adjoint information. We describe the extensions of standard checkpointing approaches required for such resilience, provide a corresponding implementation and discuss numerical results.
NASA Technical Reports Server (NTRS)
Vesely, William E.; Colon, Alfredo E.
2010-01-01
Design safety/reliability is associated with the probability that no failure-causing faults exist in a design. Confidence in the non-existence of failure-causing faults is increased by performing tests with no failures. Reliability-growth testing requirements are based on initial assurance and fault-detection probability. Using binomial tables generally gives too many required tests compared to reliability-growth requirements. Reliability-growth testing requirements are based on reliability principles and factors and should be used.
Hansson, Mari; Pemberton, John; Engkvist, Ola; Feierberg, Isabella; Brive, Lars; Jarvis, Philip; Zander-Balderud, Linda; Chen, Hongming
2014-06-01
High-throughput screening (HTS) is widely used in the pharmaceutical industry to identify novel chemical starting points for drug discovery projects. The current study focuses on the relationship between molecular hit rate in recent in-house HTS and four common molecular descriptors: lipophilicity (ClogP), size (heavy atom count, HEV), fraction of sp3-hybridized carbons (Fsp3), and fraction of molecular framework (fMF). The molecular hit rate is defined as the fraction of times the molecule has been assigned as active in the HTS campaigns where it has been screened. Beta-binomial statistical models were built to model the molecular hit rate as a function of these descriptors. The advantage of the beta-binomial statistical models is that the correlation between the descriptors is taken into account. Higher degree polynomial terms of the descriptors were also added into the beta-binomial statistical model to improve the model quality. The relative influence of different molecular descriptors on molecular hit rate has been estimated, taking into account that the descriptors are correlated to each other through applying beta-binomial statistical modeling. The results show that ClogP has the largest influence on the molecular hit rate, followed by Fsp3 and HEV. fMF has only a minor influence besides its correlation with the other molecular descriptors.
Stober, Thomas; Rammelsberg, P
2005-01-01
The purpose of this study was to evaluate the clinical performance of two adhesively retained composite core materials and compare them with a metal-added glass ionomer. The main outcome evaluated was total or partial loss of build-ups during the treatment prior to crown cementation. In 187 patients, 315 vital and non-vital teeth were built up after randomisation with either Rebilda D (RD), Rebilda SC (RSC) or Ketac Silver Aplicap (KSA). The composites were applied in the total-etch technique with the corresponding dentin bonding agent. The metal-added glass ionomer was used with a conditioner. One group of patients was treated by experienced dentists, the other by dental students, in order to evaluate the effects of different levels of experience. Data were analysed using the Mann-Whitney U-test and binomial logistic regression. The early failure rate (partial or total loss) of core build-ups before crown cementation was significantly higher for KSA (28.8%), as compared to RSC (15.3%, p=0.037) and RD (15%, p=0.025). Most failures were observed during the removal of the temporary crowns. The rate of replacements was between 3.0 (RD/dentists) and 20.4% (KSA/students). Furthermore, we found that build-ups made by students had a significantly higher risk of loss than those made by dentists (p=0.028). Adhesively retained self-curing composites show a better clinical short-term performance and can be recommended as core build-up materials.
Huang, B Emma; Clifford, David; Lê Cao, Kim-Anh
2014-12-11
To test the effects of technique and attitude in pulling Christmas crackers. A binomial trial conducted at a Christmas-in-July dinner party involving five anonymous dinner guests, including two of the authors. Number of wins achieved by different strategies, with a win defined as securing the larger portion of the cracker. The previously "guaranteed" strategy for victory, employing a downwards angle towards the puller, failed to differentiate itself from random chance (win rate, 6/15; probability of winning, 0.40; 95% CI, 0.15-0.65). A novel passive-aggressive strategy, in which one individual just holds on without pulling, provided a significant advantage (win rate, 11/12; probability of winning, 0.92; 95% CI, 0.76-1.00). The passive-aggressive strategy of failing to pull has a high rate of success at winning Christmas crackers; however, excessive adoption of this approach will result in a complete failure, with no winners at all.
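The reported intervals can be reproduced, at least approximately, with an exact (Clopper-Pearson) binomial confidence interval; the paper does not state which interval it used, so treat this as one plausible choice.

```python
from scipy.stats import beta

def clopper_pearson(wins, n, level=0.95):
    """Exact binomial confidence interval via the beta distribution."""
    lo = beta.ppf((1 - level) / 2, wins, n - wins + 1) if wins > 0 else 0.0
    hi = beta.ppf(1 - (1 - level) / 2, wins + 1, n - wins) if wins < n else 1.0
    return lo, hi

print(clopper_pearson(6, 15))    # "guaranteed" strategy: interval straddles 0.5
print(clopper_pearson(11, 12))   # passive-aggressive strategy: clearly above 0.5
```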
Solar San Diego: The Impact of Binomial Rate Structures on Real PV-Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Geet, O.; Brown, E.; Blair, T.
2008-01-01
There is confusion in the marketplace regarding the impact of solar photovoltaics (PV) on the user's actual electricity bill under California Net Energy Metering, particularly with binomial tariffs (those that include both demand and energy charges) and time-of-use (TOU) rate structures. The City of San Diego has extensive real-time electrical metering on most of its buildings and PV systems, with interval data for overall consumption and PV electrical production available for multiple years. This paper uses 2007 PV-system data from two city facilities to illustrate the impacts of binomial rate designs. The analysis will determine the energy and demand savings that the PV systems are achieving relative to the absence of systems. A financial analysis of PV-system performance under various rate structures is presented. The data revealed that actual demand and energy use benefits of binomial tariffs increase in summer months, when solar resources allow for maximized electricity production. In a binomial tariff system, varying on- and semi-peak times can result in approximately $1,100 change in demand charges per month over not having a PV system in place, an approximate 30% cost savings. The PV systems are also shown to have a 30%-50% reduction in facility energy charges in 2007. Future work will include combining demand and electricity charges and increasing the breadth of rate structures tested, including the impacts of non-coincident demand charges.
Pedroza, Claudia; Truong, Van Thi Thanh
2017-11-02
Analyses of multicenter studies often need to account for center clustering to ensure valid inference. For binary outcomes, it is particularly challenging to properly adjust for center when the number of centers or total sample size is small, or when there are few events per center. Our objective was to evaluate the performance of generalized estimating equation (GEE) log-binomial and Poisson models, generalized linear mixed models (GLMMs) assuming binomial and Poisson distributions, and a Bayesian binomial GLMM to account for center effect in these scenarios. We conducted a simulation study with few centers (≤30) and 50 or fewer subjects per center, using both a randomized controlled trial and an observational study design to estimate relative risk. We compared the GEE and GLMM models with a log-binomial model without adjustment for clustering in terms of bias, root mean square error (RMSE), and coverage. For the Bayesian GLMM, we used informative neutral priors that are skeptical of large treatment effects that are almost never observed in studies of medical interventions. All frequentist methods exhibited little bias, and the RMSE was very similar across the models. The binomial GLMM had poor convergence rates, ranging from 27% to 85%, but performed well otherwise. The results show that both GEE models need to use small sample corrections for robust SEs to achieve proper coverage of 95% CIs. The Bayesian GLMM had similar convergence rates but resulted in slightly more biased estimates for the smallest sample sizes. However, it had the smallest RMSE and good coverage across all scenarios. These results were very similar for both study designs. For the analyses of multicenter studies with a binary outcome and few centers, we recommend adjustment for center with either a GEE log-binomial or Poisson model with appropriate small sample corrections or a Bayesian binomial GLMM with informative priors.
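One of the compared approaches, the GEE Poisson ("modified Poisson") model for relative risk with clustering by center, can be sketched with statsmodels; the simulated data and effect size are illustrative, and the small-sample SE corrections discussed above are not applied here.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
centers = np.repeat(np.arange(15), 30)            # 15 centers, 30 subjects each
center_eff = rng.normal(0, 0.3, 15)[centers]
treat = rng.integers(0, 2, centers.size)
risk = np.clip(0.15 * np.exp(np.log(0.7) * treat + center_eff), 0, 1)
y = rng.binomial(1, risk)                         # binary outcome, true RR 0.7

X = sm.add_constant(treat)
gee = sm.GEE(y, X, groups=centers, family=sm.families.Poisson()).fit()
print(np.exp(gee.params[1]))                      # relative risk estimate, robust SEs
```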
Tran, Phoebe; Waller, Lance
2015-01-01
Lyme disease has been the subject of many studies due to increasing incidence rates year after year and the severe complications that can arise in later stages of the disease. Negative binomial models have been used to model Lyme disease in the past with some success. However, there has been little focus on the reliability and consistency of these models when they are used to study Lyme disease at multiple spatial scales. This study seeks to explore how sensitive/consistent negative binomial models are when they are used to study Lyme disease at different spatial scales (at the regional and sub-regional levels). The study area includes the thirteen states in the Northeastern United States with the highest Lyme disease incidence during the 2002-2006 period. Lyme disease incidence at county level for the period of 2002-2006 was linked with several previously identified key landscape and climatic variables in a negative binomial regression model for the Northeastern region and two smaller sub-regions (the New England sub-region and the Mid-Atlantic sub-region). This study found that negative binomial models, indeed, were sensitive/inconsistent when used at different spatial scales. We discuss various plausible explanations for such behavior of negative binomial models. Further investigation of the inconsistency and sensitivity of negative binomial models when used at different spatial scales is important for not only future Lyme disease studies and Lyme disease risk assessment/management but any study that requires use of this model type in a spatial context. Copyright © 2014 Elsevier Inc. All rights reserved.
Richards, Jennifer L; Chapple-McGruder, Theresa; Williams, Bryan L; Kramer, Michael R
2015-05-01
Children's cognitive development and academic performance are linked to both fetal and early childhood factors, including preterm birth and family socioeconomic status. We evaluated whether the relationship between preterm birth (PTB) and first grade standardized test performance among Georgia public school students was modified by neighborhood deprivation in early childhood. The Georgia Birth to School cohort followed 327,698 children born in Georgia from 1998 to 2002 through to end-of-year first grade standardized tests. Binomial and log-binomial generalized estimating equations were used to estimate risk differences and risk ratios for the associations of both PTB and the Neighborhood Deprivation Index for the census tract in which each child's mother resided at the time of birth with test failure (versus passing). The presence of additive and multiplicative interaction was assessed. PTB was strongly associated with test failure, with increasing risk for earlier gestational ages. There was positive additive interaction between PTB and neighborhood deprivation. The main effect of PTB versus term birth increased risk of mathematics failure: 15.9% (95%CI: 13.3-18.5%) for early, 5.0% (95% CI: 4.1-5.9%) for moderate, and 1.3% (95%CI: 0.9-1.7%) for late preterm. Each 1 standard deviation increase in neighborhood deprivation was associated with 0.6% increased risk of mathematics failure. For children exposed to both PTB and higher neighborhood deprivation, test failure was 4.8%, 1.5%, and 0.8% greater than the sum of two main effects for early, moderate, and late PTB, respectively. Results were similar, but slightly attenuated, for reading and English/language arts. Our results suggest that PTB and neighborhood deprivation additively interact to produce greater risk among doubly exposed children than would be predicted from the sum of the effects of the two exposures. Understanding socioeconomic disparities in the effect of PTB on academic outcomes at school entry is important for targeting of early childhood interventions. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Generazio, Edward R.
2014-01-01
Unknown risks are introduced into failure critical systems when probability of detection (POD) capabilities are accepted without a complete understanding of the statistical method applied and the interpretation of the statistical results. The presence of this risk in the nondestructive evaluation (NDE) community is revealed in common statements about POD. These statements are often interpreted in a variety of ways and therefore, the very existence of the statements identifies the need for a more comprehensive understanding of POD methodologies. Statistical methodologies have data requirements to be met, procedures to be followed, and requirements for validation or demonstration of adequacy of the POD estimates. Risks are further enhanced due to the wide range of statistical methodologies used for determining the POD capability. Receiver/Relative Operating Characteristics (ROC) Display, simple binomial, logistic regression, and Bayes' rule POD methodologies are widely used in determining POD capability. This work focuses on Hit-Miss data to reveal the framework of the interrelationships between Receiver/Relative Operating Characteristics Display, simple binomial, logistic regression, and Bayes' rule methodologies as they are applied to POD. Knowledge of these interrelationships leads to an intuitive and global understanding of the statistical data, procedural and validation requirements for establishing credible POD estimates.
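A minimal hit-miss POD sketch using the logistic-regression branch of the methodologies listed: fit a logit of hit/miss against log flaw size and invert it for the size detected with 90% probability (a90). The parameters and the log-size link are assumptions for illustration.

```python
import numpy as np
import statsmodels.api as sm
from scipy.special import expit, logit

rng = np.random.default_rng(5)
size = rng.uniform(0.5, 5.0, 300)                    # flaw size, e.g. mm
true_pod = expit((np.log(size) - np.log(2.0)) / 0.25)
hit = rng.binomial(1, true_pod)                      # hit/miss outcomes

X = sm.add_constant(np.log(size))
fit = sm.GLM(hit, X, family=sm.families.Binomial()).fit()
b0, b1 = fit.params
a90 = np.exp((logit(0.9) - b0) / b1)                 # size detected with 90% POD
print(round(a90, 2))                                 # ~3.5 for these parameters
```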
New Class of Quantum Error-Correcting Codes for a Bosonic Mode
NASA Astrophysics Data System (ADS)
Michael, Marios H.; Silveri, Matti; Brierley, R. T.; Albert, Victor V.; Salmilehto, Juha; Jiang, Liang; Girvin, S. M.
2016-07-01
We construct a new class of quantum error-correcting codes for a bosonic mode, which are advantageous for applications in quantum memories, communication, and scalable computation. These "binomial quantum codes" are formed from a finite superposition of Fock states weighted with binomial coefficients. The binomial codes can exactly correct errors that are polynomial up to a specific degree in bosonic creation and annihilation operators, including amplitude damping and displacement noise as well as boson addition and dephasing errors. For realistic continuous-time dissipative evolution, the codes can perform approximate quantum error correction to any given order in the time step between error detection measurements. We present an explicit approximate quantum error recovery operation based on projective measurements and unitary operations. The binomial codes are tailored for detecting boson loss and gain errors by means of measurements of the generalized number parity. We discuss optimization of the binomial codes and demonstrate that by relaxing the parity structure, codes with even lower unrecoverable error rates can be achieved. The binomial codes are related to existing two-mode bosonic codes, but offer the advantage of requiring only a single bosonic mode to correct amplitude damping as well as the ability to correct other errors. Our codes are similar in spirit to "cat codes" based on superpositions of the coherent states but offer several advantages such as smaller mean boson number, exact rather than approximate orthonormality of the code words, and an explicit unitary operation for repumping energy into the bosonic mode. The binomial quantum codes are realizable with current superconducting circuit technology, and they should prove useful in other quantum technologies, including bosonic quantum memories, photonic quantum communication, and optical-to-microwave up- and down-conversion.
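The smallest binomial code can be checked numerically: with code words |0L> = (|0> + |4>)/sqrt(2) and |1L> = |2>, both logical states have mean photon number 2, and a single photon loss maps them to orthogonal states, so a loss event is detectable by generalized number parity. A numpy sketch in a truncated Fock space:

```python
import numpy as np

dim = 8                                        # Fock-space truncation
a = np.diag(np.sqrt(np.arange(1, dim)), k=1)   # annihilation operator
n_op = a.T @ a                                 # number operator

zero = np.zeros(dim); zero[0] = zero[4] = 1 / np.sqrt(2)   # |0L>
one = np.zeros(dim);  one[2] = 1.0                          # |1L>

print(zero @ n_op @ zero, one @ n_op @ one)   # equal mean photon number: 2.0, 2.0
print(zero @ n_op @ one)                      # 0: Knill-Laflamme off-diagonal term
e0, e1 = a @ zero, a @ one                    # states after one photon loss
print(e0 @ e1)                                # 0: loss errors remain orthogonal
```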
Estimation of the cure rate in Iranian breast cancer patients.
Rahimzadeh, Mitra; Baghestani, Ahmad Reza; Gohari, Mahmood Reza; Pourhoseingholi, Mohamad Amin
2014-01-01
Although the Cox proportional hazards model is the popular approach for survival analysis to investigate significant risk factors of cancer patient survival, it is not appropriate in the case of long-term disease-free survival. Recently, cure rate models have been introduced to distinguish between clinical determinants of cure and variables associated with the time to event of interest. The aim of this study was to use a cure rate model to determine the clinical associated factors for cure rates of patients with breast cancer (BC). This prospective cohort study covered 305 patients with BC, admitted at Shahid Faiazbakhsh Hospital, Tehran, during 2006 to 2008 and followed until April 2012. Cases of patient death were confirmed by telephone contact. For data analysis, a non-mixed cure rate model with Poisson distribution and negative binomial distribution were employed. All analyses were carried out using a developed Macro in WinBugs. Deviance information criteria (DIC) were employed to find the best model. The overall 1-year, 3-year and 5-year relative survival rates were 97%, 89% and 74%. Metastasis and stage of BC were the significant factors, but age was significant only in the negative binomial model. The DIC also showed that the negative binomial model had a better fit. This study indicated that metastasis and stage of BC were identified as the clinical criteria for cure rates. There are limited studies on BC survival which employed these cure rate models to identify the clinical factors associated with cure. These models are better than the Cox model in the case of long-term survival.
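In the non-mixture (promotion-time) formulation used here, the population survival is S_pop(t) = exp(-theta * F(t)) under a Poisson latent-cause count, so the cure fraction is exp(-theta); the negative binomial version adds a dispersion parameter and recovers the Poisson case in the limit. A sketch of the cure-fraction arithmetic, with illustrative parameter values:

```python
import numpy as np

def cure_fraction_poisson(theta):
    """Promotion-time model with Poisson latent causes: S_pop(inf) = exp(-theta)."""
    return np.exp(-theta)

def cure_fraction_negbin(theta, eta):
    """Negative binomial version; eta -> 0 recovers the Poisson case."""
    return (1 + eta * theta) ** (-1 / eta)

print(cure_fraction_poisson(1.2))        # ~0.30
print(cure_fraction_negbin(1.2, 0.5))    # ~0.39
```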
A binomial stochastic kinetic approach to the Michaelis-Menten mechanism
NASA Astrophysics Data System (ADS)
Lente, Gábor
2013-05-01
This Letter presents a new method that gives an analytical approximation of the exact solution of the stochastic Michaelis-Menten mechanism without computationally demanding matrix operations. The method is based on solving the deterministic rate equations and then using the results as guiding variables of calculating probability values using binomial distributions. This principle can be generalized to a number of different kinetic schemes and is expected to be very useful in the evaluation of measurements focusing on the catalytic activity of one or a few individual enzyme molecules.
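The guiding-variable idea can be sketched directly: solve the deterministic Michaelis-Menten rate equation, then treat the deterministic conversion fraction as the success probability of a binomial over the initial substrate copies. All rate constants below are illustrative, and this is the approximation scheme in outline, not the Letter's full derivation.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.stats import binom

kcat, Km, E0, S0 = 1.0, 5.0, 1.0, 10      # illustrative constants, 10 substrate copies

def rhs(t, y):                            # deterministic Michaelis-Menten rate equation
    s = S0 - y[0]
    return [kcat * E0 * s / (Km + s)]

sol = solve_ivp(rhs, (0, 10), [0.0], dense_output=True)
x = sol.sol(4.0)[0] / S0                  # deterministic conversion fraction at t = 4
# guiding-variable approximation: P(k products at time t) ~ Binomial(S0, x)
print([round(binom.pmf(k, S0, x), 3) for k in range(S0 + 1)])
```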
Hosseinipour, Mina; Nelson, Julie A E; Trapence, Clement; Rutstein, Sarah E; Kasende, Florence; Kayoyo, Virginia; Kaunda-Khangamwa, Blessings; Compliment, Kara; Stanley, Christopher; Cataldo, Fabian; van Lettow, Monique; Rosenberg, Nora E; Tweya, Hannock; Gugsa, Salem; Sampathkumar, Veena; Schouten, Erik; Eliya, Michael; Chimbwandira, Frank; Chiwaula, Levison; Kapito-Tembo, Atupele; Phiri, Sam
2017-06-01
In 2011, Malawi launched Option B+, a program of universal antiretroviral therapy (ART) treatment for pregnant and lactating women to optimize maternal health and prevent pediatric HIV infection. For optimal outcomes, women need to achieve HIVRNA suppression. We report 6-month HIVRNA suppression and HIV drug resistance in the PURE study. PURE study was a cluster-randomized controlled trial evaluating 3 strategies for promoting uptake and retention; arm 1: Standard of Care, arm 2: Facility Peer Support, and arm 3: Community Peer support. Pregnant and breastfeeding mothers were enrolled and followed according to Malawi ART guidelines. Dried blood spots for HIVRNA testing were collected at 6 months. Samples with ART failure (HIVRNA ≥1000 copies/ml) had resistance testing. We calculated odds ratios for ART failure using generalized estimating equations with a logit link and binomial distribution. We enrolled 1269 women across 21 sites in Southern and Central Malawi. Most enrolled while pregnant (86%) and were WHO stage 1 (95%). At 6 months, 950/1269 (75%) were retained; 833/950 (88%) had HIVRNA testing conducted, and 699/833 (84%) were suppressed. Among those with HIVRNA ≥1000 copies/ml with successful amplification (N = 55, 41% of all viral loads > 1000 copies/ml), confirmed HIV resistance was found in 35% (19/55), primarily to the nonnucleoside reverse transcriptase inhibitor class of drugs. ART failure was associated with treatment default but not study arm, age, WHO stage, or breastfeeding status. Virologic suppression at 6 months was <90% target, but the observed confirmed resistance rates suggest that adherence support should be the primary approach for early failure in option B+.
Binomial tree method for pricing a regime-switching volatility stock loans
NASA Astrophysics Data System (ADS)
Putri, Endah R. M.; Zamani, Muhammad S.; Utomo, Daryono B.
2018-03-01
A binomial model with regime switching can represent the price of a stock loan, which follows a stochastic process. A stock loan is one alternative that appeals to investors seeking liquidity without selling their stock. The stock loan mechanism resembles that of an American call option, in that the holder can exercise at any time during the contract period. Because of this resemblance, the price of a stock loan can be derived from the model of an American call option. The simulation results show the behavior of the stock loan price under regime-switching volatility with respect to various interest rates and maturities.
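Setting regime switching aside, the single-regime building block is a standard CRR binomial tree for an American call, which the stock-loan valuation reinterprets (with the loan's accruing interest folded into an effective strike). A minimal sketch; the regime-switching version would run one such lattice per regime with transition-probability coupling.

```python
import numpy as np

def american_call_binomial(S0, K, r, sigma, T, steps):
    """CRR binomial tree with an early-exercise check at every node."""
    dt = T / steps
    u = np.exp(sigma * np.sqrt(dt))
    d = 1.0 / u
    p = (np.exp(r * dt) - d) / (u - d)        # risk-neutral up probability
    disc = np.exp(-r * dt)
    j = np.arange(steps + 1)
    values = np.maximum(S0 * u**j * d**(steps - j) - K, 0.0)   # terminal payoffs
    for m in range(steps - 1, -1, -1):
        j = np.arange(m + 1)
        cont = disc * (p * values[1:] + (1 - p) * values[:-1])
        values = np.maximum(cont, S0 * u**j * d**(m - j) - K)  # exercise vs continue
    return values[0]

print(american_call_binomial(S0=100, K=105, r=0.05, sigma=0.3, T=1.0, steps=500))
```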
Zero-truncated negative binomial - Erlang distribution
NASA Astrophysics Data System (ADS)
Bodhisuwan, Winai; Pudprommarat, Chookait; Bodhisuwan, Rujira; Saothayanun, Luckhana
2017-11-01
The zero-truncated negative binomial-Erlang distribution is introduced. It is developed from the negative binomial-Erlang distribution. In this work, the probability mass function is derived and some properties are included. The parameters of the zero-truncated negative binomial-Erlang distribution are estimated by using maximum likelihood estimation. Finally, the proposed distribution is applied to real data, the number of methamphetamine cases in Bangkok, Thailand. Based on the results, the zero-truncated negative binomial-Erlang distribution provides a better fit than the zero-truncated Poisson, zero-truncated negative binomial, zero-truncated generalized negative binomial and zero-truncated Poisson-Lindley distributions for these data.
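For the simpler zero-truncated negative binomial (without the Erlang mixing used in the paper), the pmf is just the NB pmf renormalized by 1 - P(X = 0), and maximum likelihood takes a few lines of scipy; the parameters below are arbitrary.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import nbinom

rng = np.random.default_rng(11)
raw = nbinom.rvs(3, 0.4, size=2000, random_state=rng)
data = raw[raw > 0]                        # keep strictly positive counts

def nll(theta):
    r, p = np.exp(theta[0]), 1 / (1 + np.exp(-theta[1]))
    # zero-truncated pmf: NB pmf renormalized by 1 - P(X = 0)
    logpmf = nbinom.logpmf(data, r, p) - np.log1p(-nbinom.pmf(0, r, p))
    return -logpmf.sum()

fit = minimize(nll, x0=[0.0, 0.0], method="Nelder-Mead")
print(np.exp(fit.x[0]), 1 / (1 + np.exp(-fit.x[1])))   # near the true (3, 0.4)
```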
Reliability studies of Integrated Modular Engine system designs
NASA Technical Reports Server (NTRS)
Hardy, Terry L.; Rapp, Douglas C.
1993-01-01
A study was performed to evaluate the reliability of Integrated Modular Engine (IME) concepts. Comparisons were made between networked IME systems and non-networked discrete systems using expander cycle configurations. Both redundant and non-redundant systems were analyzed. Binomial approximation and Markov analysis techniques were employed to evaluate total system reliability. In addition, Failure Modes and Effects Analyses (FMEA), Preliminary Hazard Analyses (PHA), and Fault Tree Analysis (FTA) were performed to allow detailed evaluation of the IME concept. A discussion of these system reliability concepts is also presented.
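The binomial approximation for a redundant, networked system reduces to k-of-n reliability: the system works when at least k of n identical modules work. A one-function sketch, with module counts and reliability chosen purely for illustration:

```python
from scipy.stats import binom

def k_of_n_reliability(k, n, r):
    """System survives if at least k of n identical modules survive."""
    return binom.sf(k - 1, n, r)

# hypothetical networked engine: 6 thrust modules, mission succeeds with any 5
print(k_of_n_reliability(5, 6, 0.99))   # ~0.9985
print(k_of_n_reliability(6, 6, 0.99))   # ~0.9415, the non-redundant series case
```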
Yin, Honglei; Xu, Lin; Shao, Yechang; Li, Liping; Wan, Chengsong
2016-01-01
The objective of this study was to estimate the features of suicide rate and its association with economic development and stock market during the past decade in the People's Republic of China. Official data were gathered and analyzed in the People's Republic of China during the period 2004-2013. Nationwide suicide rate was stratified by four-year age-groups, sex, urban/rural areas, and regions (East, Central, and West). Annual economic indexes including gross domestic product (GDP) per capita and rural and urban income per capita were all adjusted for inflation. The variation coefficient of the market index (VCMI) was also included as an economic index to measure the fluctuation of the stock market. Negative binomial regression was performed to examine the time trend of region-level suicide rates and the effects of sex, age, urban/rural area, region, and economic index on the suicide rates. Suicide rates for each age-group, sex, urban/rural area, and region generally decreased from 2004 to 2013, while annual GDP per capita and rural and urban income per capita generally increased by year. VCMI fluctuated widely, peaking around 2009 and decreasing after that time. Negative binomial regression showed that the decreased suicide rate in East and Central rural areas was the main cause of the decrease in suicide rate in the People's Republic of China. Suicide rate in the People's Republic of China for the study period increased with age and was higher in rural than in urban areas, higher in males than in females, and highest in the Central region. When GDP per capita increased by 2,787 RMB, the suicide rate decreased to 0.498 times its previous level. VCMI showed no significant relationship with suicide rate in the negative binomial regression. Suicide rate decreased in 2004-2013; varied among age-groups, sexes, urban/rural areas, and regions; and was negatively associated with economic growth in the People's Republic of China. The stock market showed no relationship with suicide rate, but this finding needs to be verified in a future study.
Camp, Christopher L; Ryan, Claire B; Degen, Ryan M; Dines, Joshua S; Altchek, David W; Werner, Brian C
2017-04-01
The literature investigating risk factors for failure after decompression of the ulnar nerve at the elbow (cubital tunnel release [CuTR]) is limited. The purpose of this study was to identify risk factors for failure of isolated CuTR, defined as progression to subsequent ipsilateral revision surgery. The 100% Medicare Standard Analytic Files from 2005 to 2012 were queried for patients undergoing CuTR. Patients undergoing any concomitant procedures were excluded. A multivariate binomial logistic regression analysis was used to evaluate patient-related risk factors for ipsilateral revision surgery. Adjusted odds ratios (ORs) and 95% confidence intervals were calculated for each risk factor. A total of 25,977 patients underwent primary CuTR, and 304 (1.4%) of those with ≥2 years of follow-up required revision surgery. Although the rate of primary procedures is on the rise (P = .002), the revision rate remains steady (P = .148). Significant, independent risk factors for revision surgery included age <65 years (OR, 1.5; P < .001), obesity (OR, 1.3; P = .022), morbid obesity (OR, 1.3; P = .044), tobacco use (OR, 2.0; P < .001), diabetes (OR, 1.3; P = .011), hyperlipidemia (OR, 1.2; P = .015), chronic liver disease (OR, 1.6; P = .001), chronic anemia (OR, 1.6; P = .001), and hypercoagulable disorder (OR, 2.1; P = .001). The incidence of failure requiring ipsilateral revision surgery after CuTR remained steadily low (1.4%) during the study period. There are numerous patient-related risk factors that are independently associated with an increased risk for revision surgery, the most significant of which are tobacco use, younger age, hypercoagulable disorder, liver disease, and anemia.
Dorazio, R.M.; Royle, J. Andrew
2003-01-01
We develop a parameterization of the beta-binomial mixture that provides sensible inferences about the size of a closed population when probabilities of capture or detection vary among individuals. Three classes of mixture models (beta-binomial, logistic-normal, and latent-class) are fitted to recaptures of snowshoe hares for estimating abundance and to counts of bird species for estimating species richness. In both sets of data, rates of detection appear to vary more among individuals (animals or species) than among sampling occasions or locations. The estimates of population size and species richness are sensitive to model-specific assumptions about the latent distribution of individual rates of detection. We demonstrate using simulation experiments that conventional diagnostics for assessing model adequacy, such as deviance, cannot be relied on for selecting classes of mixture models that produce valid inferences about population size. Prior knowledge about sources of individual heterogeneity in detection rates, if available, should be used to help select among classes of mixture models that are to be used for inference.
Distinguishing between Binomial, Hypergeometric and Negative Binomial Distributions
ERIC Educational Resources Information Center
Wroughton, Jacqueline; Cole, Tarah
2013-01-01
Recognizing the differences between three discrete distributions (Binomial, Hypergeometric and Negative Binomial) can be challenging for students. We present an activity designed to help students differentiate among these distributions. In addition, we present assessment results in the form of pre- and post-tests that were designed to assess the…
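For readers wanting a concrete handle on the distinction the activity teaches, the three pmfs can be compared directly in Python with scipy: binomial for draws with replacement, hypergeometric for draws without replacement, and negative binomial for failures before a fixed number of successes (parameter values below are arbitrary):

```python
from scipy import stats

# Binomial: number of successes in n=10 draws WITH replacement, p=0.3
print(stats.binom.pmf(3, n=10, p=0.3))

# Hypergeometric: successes in 10 draws WITHOUT replacement from a
# population of M=50 containing n=15 successes (scipy's M, n, N convention)
print(stats.hypergeom.pmf(3, M=50, n=15, N=10))

# Negative binomial: number of failures observed before the 5th success, p=0.3
print(stats.nbinom.pmf(3, n=5, p=0.3))
```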
Library Book Circulation and the Beta-Binomial Distribution.
ERIC Educational Resources Information Center
Gelman, E.; Sichel, H. S.
1987-01-01
Argues that library book circulation is a binomial rather than a Poisson process, and that individual book popularities are continuous beta distributions. Three examples demonstrate the superiority of beta over negative binomial distribution, and it is suggested that a bivariate-binomial process would be helpful in predicting future book…
Chen, Connie; Gribble, Matthew O; Bartroff, Jay; Bay, Steven M; Goldstein, Larry
2017-05-01
The United States' Clean Water Act stipulates in section 303(d) that states must identify impaired water bodies for which total maximum daily loads (TMDLs) of pollution inputs are developed. Decision-making procedures about how to list, or delist, water bodies as impaired, or not, per Clean Water Act 303(d) differ across states. In states such as California, whether or not a particular monitoring sample suggests that water quality is impaired can be regarded as a binary outcome variable, and California's current regulatory framework invokes a version of the exact binomial test to consolidate evidence across samples and assess whether the overall water body complies with the Clean Water Act. Here, we contrast the performance of California's exact binomial test with one potential alternative, the Sequential Probability Ratio Test (SPRT). The SPRT uses a sequential testing framework, testing samples as they become available and evaluating evidence as it emerges, rather than measuring all the samples and calculating a test statistic at the end of the data collection process. Through simulations and theoretical derivations, we demonstrate that the SPRT on average requires fewer samples to be measured to achieve Type I and Type II error rates comparable to the current fixed-sample binomial test. Policymakers might consider efficient alternatives such as the SPRT to the current procedure. Copyright © 2017 Elsevier Ltd. All rights reserved.
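Wald's SPRT for a binomial proportion, the general construction the paper evaluates, accumulates a log-likelihood ratio sample by sample and stops once it crosses thresholds set by the target error rates. A minimal sketch (the hypotheses p0, p1 and error rates are illustrative, not California's regulatory values):

```python
import numpy as np

def binomial_sprt(samples, p0=0.10, p1=0.25, alpha=0.05, beta=0.05):
    """Wald's SPRT for H0: p = p0 vs H1: p = p1 on a stream of 0/1 samples."""
    upper = np.log((1 - beta) / alpha)   # accept H1 when LLR exceeds this
    lower = np.log(beta / (1 - alpha))   # accept H0 when LLR falls below this
    llr = 0.0
    for i, x in enumerate(samples, start=1):
        llr += x * np.log(p1 / p0) + (1 - x) * np.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "impaired", i
        if llr <= lower:
            return "not impaired", i
    return "inconclusive", len(samples)

rng = np.random.default_rng(2)
print(binomial_sprt(rng.binomial(1, 0.3, size=100)))  # typically stops early
```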
Martin, Julien; Royle, J. Andrew; MacKenzie, Darryl I.; Edwards, Holly H.; Kery, Marc; Gardner, Beth
2011-01-01
Summary 1. Binomial mixture models use repeated count data to estimate abundance. They are becoming increasingly popular because they provide a simple and cost-effective way to account for imperfect detection. However, these models assume that individuals are detected independently of each other. This assumption may often be violated in the field. For instance, manatees (Trichechus manatus latirostris) may surface in turbid water (i.e. become available for detection during aerial surveys) in a correlated manner (i.e. in groups). However, correlated behaviour, affecting the non-independence of individual detections, may also be relevant in other systems (e.g. correlated patterns of singing in birds and amphibians). 2. We extend binomial mixture models to account for correlated behaviour and therefore to account for non-independent detection of individuals. We simulated correlated behaviour using beta-binomial random variables. Our approach can be used to simultaneously estimate abundance, detection probability and a correlation parameter. 3. Fitting binomial mixture models to data that followed a beta-binomial distribution resulted in an overestimation of abundance even for moderate levels of correlation. In contrast, the beta-binomial mixture model performed considerably better in our simulation scenarios. We also present a goodness-of-fit procedure to evaluate the fit of beta-binomial mixture models. 4. We illustrate our approach by fitting both binomial and beta-binomial mixture models to aerial survey data of manatees in Florida. We found that the binomial mixture model did not fit the data, whereas there was no evidence of lack of fit for the beta-binomial mixture model. This example helps illustrate the importance of using simulations and assessing goodness-of-fit when analysing ecological data with N-mixture models. Indeed, both the simulations and the goodness-of-fit procedure highlighted the limitations of the standard binomial mixture model for aerial manatee surveys. 5. Overestimation of abundance by binomial mixture models owing to non-independent detections is problematic for ecological studies, but also for conservation. For example, in the case of endangered species, it could lead to inappropriate management decisions, such as downlisting. These issues will be increasingly relevant as more ecologists apply flexible N-mixture models to ecological data.
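The correlated-detection mechanism can be mimicked by swapping the binomial observation model for a beta-binomial whose intraclass correlation rho controls the non-independence. A hedged simulation sketch with scipy (parameter values are illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

lam, p, rho = 40.0, 0.4, 0.2    # abundance, detection, correlation
a = p * (1 - rho) / rho         # beta-binomial shape parameters implied by
b = (1 - p) * (1 - rho) / rho   # mean p and intraclass correlation rho

N = rng.poisson(lam, size=1000)                          # true site abundances
counts = stats.betabinom.rvs(N, a, b, random_state=rng)  # correlated detections

# Under independent detection the count variance would average N*p*(1-p);
# the beta-binomial inflates it, which is what biases binomial N-mixture fits.
print(counts.mean(), counts.var())
print((N * p * (1 - p)).mean())   # smaller than the observed variance
```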
Perceived Prevalence of Teasing and Bullying Predicts High School Dropout Rates
ERIC Educational Resources Information Center
Cornell, Dewey; Gregory, Anne; Huang, Francis; Fan, Xitao
2013-01-01
This prospective study of 276 Virginia public high schools found that the prevalence of teasing and bullying (PTB) as perceived by both 9th-grade students and teachers was predictive of dropout rates for this cohort 4 years later. Negative binomial regression indicated that one standard deviation increases in student- and teacher-reported PTB were…
Estimating relative risks for common outcome using PROC NLP.
Yu, Binbing; Wang, Zhuoqiao
2008-05-01
In cross-sectional or cohort studies with binary outcomes, it is biologically interpretable and of interest to estimate the relative risk or prevalence ratio, especially when the response rates are not rare. Several methods have been used to estimate the relative risk, among which the log-binomial models yield the maximum likelihood estimate (MLE) of the parameters. Because of restrictions on the parameter space, the log-binomial models often run into convergence problems. Some remedies, e.g., the Poisson and Cox regressions, have been proposed. However, these methods may give out-of-bound predicted response probabilities. In this paper, a new computation method using the SAS Nonlinear Programming (NLP) procedure is proposed to find the MLEs. The proposed NLP method was compared to the COPY method, a modified method to fit the log-binomial model. Issues in the implementation are discussed. For illustration, both methods were applied to data on the prevalence of microalbuminuria (micro-protein leakage into urine) for kidney disease patients from the Diabetes Control and Complications Trial. The sample SAS macro for calculating relative risk is provided in the appendix.
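The MLE the paper computes with PROC NLP corresponds to a binomial GLM with a log link; a Python sketch with statsmodels is below. Note that statsmodels likewise does not enforce the parameter-space restriction, so the same convergence caveats can apply (data simulated, effect sizes hypothetical):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)

n = 5000
exposure = rng.integers(0, 2, n)
# True model: log P(y=1) = log(0.2) + log(1.5)*exposure, i.e. RR = 1.5
prob = np.exp(np.log(0.2) + np.log(1.5) * exposure)
y = rng.binomial(1, prob)

X = sm.add_constant(exposure)
# Binomial family with a log (non-canonical) link = log-binomial model;
# exp(coefficient) is then a relative risk rather than an odds ratio.
fit = sm.GLM(y, X, family=sm.families.Binomial(link=sm.families.links.Log())).fit()
print(np.exp(fit.params))   # approximately [0.2, 1.5]
```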
Meta-analysis of studies with bivariate binary outcomes: a marginal beta-binomial model approach
Chen, Yong; Hong, Chuan; Ning, Yang; Su, Xiao
2018-01-01
When conducting a meta-analysis of studies with bivariate binary outcomes, challenges arise when the within-study correlation and between-study heterogeneity should be taken into account. In this paper, we propose a marginal beta-binomial model for the meta-analysis of studies with binary outcomes. This model is based on the composite likelihood approach, and has several attractive features compared to the existing models such as bivariate generalized linear mixed model (Chu and Cole, 2006) and Sarmanov beta-binomial model (Chen et al., 2012). The advantages of the proposed marginal model include modeling the probabilities in the original scale, not requiring any transformation of probabilities or any link function, having closed-form expression of likelihood function, and no constraints on the correlation parameter. More importantly, since the marginal beta-binomial model is only based on the marginal distributions, it does not suffer from potential misspecification of the joint distribution of bivariate study-specific probabilities. Such misspecification is difficult to detect and can lead to biased inference using current methods. We compare the performance of the marginal beta-binomial model with the bivariate generalized linear mixed model and the Sarmanov beta-binomial model by simulation studies. Interestingly, the results show that the marginal beta-binomial model performs better than the Sarmanov beta-binomial model, whether or not the true model is Sarmanov beta-binomial, and the marginal beta-binomial model is more robust than the bivariate generalized linear mixed model under model misspecifications. Two meta-analyses of diagnostic accuracy studies and a meta-analysis of case-control studies are conducted for illustration. PMID:26303591
Educational attainment has a limited impact on disease management outcomes in heart failure.
Smith, Brad; Forkner, Emma; Krasuski, Richard A; Galbreath, Autumn Dawn; Freeman, Gregory L
2006-06-01
The objective of this study was to assess whether educational attainment moderates outcomes in the intervention group in a trial of disease management in heart failure (HF). Data were collected from a sample of 654 patients enrolled in the disease management arm of a community-based study of HF patients. The full sample was used to analyze two primary outcomes: all-cause mortality and cardiac event-free survival. Two other primary outcomes (rates of HF-related emergency department (ED) visits and inpatient admissions) and secondary outcomes (patient self-confidence in managing HF symptoms and daily dietary sodium intake in milligrams) were analyzed in a smaller sample of 602 patients who completed at least 6 months of disease management. One-way analysis of variance and chi-square tests were used to assess differences in baseline demographic and clinical characteristics. Survival analyses were conducted with proportional hazards regression, while negative binomial regression was used to assess educational differences in ED usage and inpatient admissions. Repeated measures analysis of variance models were used to assess whether secondary outcomes differed across educational strata and/or over time. All outcome analyses were adjusted for confounders. Patients with the least education fared the poorest for all-cause mortality, but education-related differences failed to achieve statistical significance. No education-related differences were observed for cardiac event-free survival, or for the rates of inpatient admission and ED usage. For secondary outcomes, sodium intake differed significantly by education (p = 0.04), with the largest drop (-838 mg/day) observed in the least well-educated group. Confidence increased an approximately equal amount (2.1-3.0 points on a 100-point scale) across all educational strata (p = ns). Low educational attainment may not be a barrier to effective disease management.
The Binomial Model in Fluctuation Analysis of Quantal Neurotransmitter Release
Quastel, D. M. J.
1997-01-01
The mathematics of the binomial model for quantal neurotransmitter release is considered in general terms, to explore what information might be extractable from statistical aspects of data. For an array of N statistically independent release sites, each with a release probability p, the compound binomial always pertains, with mean quantal content m̄ = Np̄, p′ ≡ 1 − var(m)/m̄ = p̄(1 + cv_p²), and n′ ≡ m̄/p′ = N/(1 + cv_p²). Unless n′ is invariant with ambient conditions or stimulation paradigms, the simple binomial (cv_p = 0) is untenable and n′ is neither N nor the number of “active” sites or sites with a quantum available. At each site p = p_o·p_A, where p_o is the output probability if a site is “eligible” or “filled” despite previous quantal discharge, and p_A (eligibility probability) depends at least on the replenishment rate, p_o, and interstimulus time. Assuming stochastic replenishment, a simple algorithm allows calculation of the full statistical composition of outputs for any hypothetical combinations of p_o's and refill rates, for any stimulation paradigm and spontaneous release. A rise in n′ (reduced cv_p) tends to occur whenever p_o varies widely between sites, with a raised stimulation frequency or factors tending to increase p_o's. Unlike…
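Taking the reconstructed identities above at face value, p′ = p̄(1 + cv_p²) and n′ = N/(1 + cv_p²) can be checked by simulation, since for independent sites var(m) = Σ p_i(1 − p_i) exactly. A short sketch (site probabilities drawn arbitrarily for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)

N = 50
p_sites = rng.beta(2, 5, N)           # heterogeneous release probabilities
p_bar = p_sites.mean()
cv_p2 = p_sites.var() / p_bar**2      # squared coefficient of variation of p

# Simulate quantal output m = sum of independent Bernoulli sites
m = rng.binomial(1, p_sites, size=(200_000, N)).sum(axis=1)

p_prime_sim = 1 - m.var() / m.mean()   # the estimator from the abstract
p_prime_formula = p_bar * (1 + cv_p2)
n_prime_formula = N / (1 + cv_p2)

print(p_prime_sim, p_prime_formula)             # should agree closely
print(m.mean() / p_prime_sim, n_prime_formula)  # n' = mean / p'
```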
[Evaluation of estimation of prevalence ratio using Bayesian log-binomial regression model].
Gao, W L; Lin, H; Liu, X N; Ren, X W; Li, J S; Shen, X P; Zhu, S L
2017-03-10
To evaluate the estimation of the prevalence ratio (PR) using a Bayesian log-binomial regression model and its application, we estimated the PR of medical care-seeking prevalence relative to caregivers' recognition of risk signs of diarrhea in their infants, using a Bayesian log-binomial regression model in OpenBUGS software. The results showed that caregivers' recognition of infants' risk signs of diarrhea was significantly associated with a 13% increase in medical care-seeking. Meanwhile, we compared the differences in the point and interval estimates of the PR and the convergence of three models (model 1: not adjusting for covariates; model 2: adjusting for duration of caregivers' education; model 3: adjusting for distance between village and township and child month-age, based on model 2) between the Bayesian log-binomial regression model and the conventional log-binomial regression model. All three Bayesian log-binomial regression models converged, with estimated PRs of 1.130 (95% CI: 1.005-1.265), 1.128 (95% CI: 1.001-1.264), and 1.132 (95% CI: 1.004-1.267), respectively. Conventional log-binomial regression models 1 and 2 converged, with PRs of 1.130 (95% CI: 1.055-1.206) and 1.126 (95% CI: 1.051-1.203), respectively, but model 3 failed to converge, so the COPY method was used to estimate the PR, which was 1.125 (95% CI: 1.051-1.200). In addition, the point and interval estimates of the PRs from the three Bayesian log-binomial regression models differed only slightly from those of the conventional log-binomial regression models, showing good consistency in estimating the PR. Therefore, the Bayesian log-binomial regression model can effectively estimate the PR with fewer convergence problems and has advantages in application compared with the conventional log-binomial regression model.
ERIC Educational Resources Information Center
Bergee, Martin J.; Westfall, Claude R.
2005-01-01
This is the third study in a line of inquiry whose purpose has been to develop a theoretical model of selected extramusical variables' influence on solo and small-ensemble festival ratings. Authors of the second of these (Bergee & McWhirter, 2005) had used binomial logistic regression as the basis for their model-formulation strategy. Their…
Aly, Sharif S; Zhao, Jianyang; Li, Ben; Jiang, Jiming
2014-01-01
The Intraclass Correlation Coefficient (ICC) is commonly used to estimate the similarity between quantitative measures obtained from different sources. Overdispersed data are traditionally transformed so that a linear mixed model (LMM)-based ICC can be estimated, a common transformation being the natural logarithm. The reliability of environmental sampling of fecal slurry on freestall pens has been estimated for Mycobacterium avium subsp. paratuberculosis using natural-logarithm-transformed culture results. Recently, the negative binomial ICC was defined based on a generalized linear mixed model for negative binomial distributed data. The current study reports on the negative binomial ICC estimate, which includes fixed effects, using culture results of environmental samples. Simulations using a wide variety of inputs and negative binomial distribution parameters (r; p) showed better performance of the new negative binomial ICC compared to the LMM-based ICC, even when the negative binomial data were logarithm- or square-root-transformed. A second comparison that targeted a wider range of ICC values showed that the mean of the estimated ICC closely approximated the true ICC.
Harrison, Xavier A
2015-01-01
Overdispersion is a common feature of models of biological data, but researchers often fail to model the excess variation driving the overdispersion, resulting in biased parameter estimates and standard errors. Quantifying and modeling overdispersion when it is present is therefore critical for robust biological inference. One means to account for overdispersion is to add an observation-level random effect (OLRE) to a model, where each data point receives a unique level of a random effect that can absorb the extra-parametric variation in the data. Although some studies have investigated the utility of OLRE to model overdispersion in Poisson count data, studies doing so for Binomial proportion data are scarce. Here I use a simulation approach to investigate the ability of both OLRE models and Beta-Binomial models to recover unbiased parameter estimates in mixed effects models of Binomial data under various degrees of overdispersion. In addition, as ecologists often fit random intercept terms to models when the random effect sample size is low (<5 levels), I investigate the performance of both model types under a range of random effect sample sizes when overdispersion is present. Simulation results revealed that the efficacy of OLRE depends on the process that generated the overdispersion; OLRE failed to cope with overdispersion generated from a Beta-Binomial mixture model, leading to biased slope and intercept estimates, but performed well for overdispersion generated by adding random noise to the linear predictor. Comparison of parameter estimates from an OLRE model with those from its corresponding Beta-Binomial model readily identified when OLRE were performing poorly due to disagreement between effect sizes, and this strategy should be employed whenever OLRE are used for Binomial data to assess their reliability. Beta-Binomial models performed well across all contexts, but showed a tendency to underestimate effect sizes when modelling non-Beta-Binomial data. Finally, both OLRE and Beta-Binomial models performed poorly when models contained <5 levels of the random intercept term, especially for estimating variance components, and this effect appeared independent of total sample size. These results suggest that OLRE are a useful tool for modelling overdispersion in Binomial data, but that they do not perform well in all circumstances and researchers should take care to verify the robustness of parameter estimates of OLRE models.
Hilton, Michael F; Whiteford, Harvey A
2010-12-01
This study investigates associations between psychological distress and workplace accidents, workplace failures and workplace successes. The Health and Work Performance Questionnaire (HPQ) was distributed to employees of 58 large employers. A total of 60,556 full-time employees were eligible for analysis. The HPQ probed whether the respondent had, in the past 30 days, a workplace accident, success or failure ("yes" or "no"). Psychological distress was quantified using the Kessler 6 (K6) scale and categorised into low, moderate and high psychological distress. Three binomial logistic regressions were performed with the dependent variables being workplace accident, success or failure. Covariates in the models were K6 category, gender, age, marital status, education level, job category, physical health and employment sector. Accounting for all other variables, moderate and high psychological distress significantly (P < 0.0001) increased the odds ratio (OR) for a workplace accident to 1.4 for both levels of distress. Moderate and high psychological distress significantly (P < 0.0001) increased the OR for a workplace failure (OR = 2.3 and 2.6, respectively) and significantly (P < 0.0001) decreased the OR for a workplace success (OR = 0.8 and 0.7, respectively). Moderate and high psychological distress thus increase the ORs for workplace accidents and workplace failures, and decrease the OR for workplace successes, to similar degrees. As the prevalence of moderate psychological distress is approximately double that of high psychological distress, moderate distress consequently has the greater workplace impact.
M-Bonomial Coefficients and Their Identities
ERIC Educational Resources Information Center
Asiru, Muniru A.
2010-01-01
In this note, we introduce M-bonomial (or M-bonacci binomial) coefficients. These are similar to the binomial and the Fibonomial (or Fibonacci-binomial) coefficients and can be displayed in a triangle similar to Pascal's triangle, from which some identities become obvious.
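The note's construction generalizes the Fibonomial coefficients, in which each integer in the binomial coefficient is replaced by the corresponding Fibonacci number. A sketch of the M = 2 (Fibonacci) case follows; the general M-bonacci case would substitute an M-step recurrence, for which the note should be consulted:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n: int) -> int:
    # Fibonacci numbers: the M = 2 case of the M-bonacci recurrence
    return n if n < 2 else fib(n - 1) + fib(n - 2)

def fibonomial(n: int, k: int) -> int:
    """Fibonomial coefficient: prod_{i=1..k} F(n-k+i) / F(i)."""
    num, den = 1, 1
    for i in range(1, k + 1):
        num *= fib(n - k + i)
        den *= fib(i)
    return num // den   # always an integer, like binomial coefficients

# A Pascal-like triangle of Fibonomial coefficients
for n in range(7):
    print([fibonomial(n, k) for k in range(n + 1)])
```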
Feng, Jia; Kramer, Michael R; Dever, Bridget V; Dunlop, Anne L; Williams, Bryan; Jain, Lucky
2013-05-01
Maternal smoking during pregnancy (MSDP) has been reported to be associated with impaired measures of cognitive function, but it remains unclear whether exposure to MSDP has an impact upon offspring school performance. We examined the association between MSDP and failure of the Criterion-Referenced Competency Tests (CRCT) among Georgia first grade students. A retrospective cohort was created by deterministically linking 331 531 children born in Georgia from 1998 to 2002 (inclusive) to their individual CRCT education records from 2005 to 2009. We evaluated the association between MSDP (yes/no) and failure of the CRCT Reading, English/Language Arts (ELA), and Mathematics tests, with adjustment for maternal and child sociodemographic characteristics and birth outcomes. Log-binomial models estimated the risk ratios and 95% confidence intervals. Conditional models were fitted to paired sibling data. MSDP was associated with CRCT failure, with adjusted risk ratios of 1.16 (95% CI 1.12, 1.21) for Reading, 1.12 (95% CI 1.10, 1.15) for ELA, and 1.13 (95% CI 1.10, 1.16) for Mathematics. The association remained significant in paired sibling analyses. MSDP may have independent long-term effects on offspring school performance, which do not appear to operate through smoking-related adverse birth outcomes. © 2013 Blackwell Publishing Ltd.
Effects of Test Level Discrimination and Difficulty on Answer-Copying Indices
ERIC Educational Resources Information Center
Sunbul, Onder; Yormaz, Seha
2018-01-01
In this study, Type I error and power rates of the omega (ω) and GBT (generalized binomial test) indices were investigated for several nominal alpha levels and for 40- and 80-item test lengths with a 10,000-examinee sample size under several test-level restrictions. As a result, Type I error rates of both indices were found to be below the acceptable…
Metaprop: a Stata command to perform meta-analysis of binomial data.
Nyaga, Victoria N; Arbyn, Marc; Aerts, Marc
2014-01-01
Meta-analyses have become an essential tool in synthesizing evidence on clinical and epidemiological questions derived from a multitude of similar studies assessing the particular issue. Appropriate and accessible statistical software is needed to produce the summary statistic of interest. Metaprop is a statistical program implemented to perform meta-analyses of proportions in Stata. It builds further on the existing Stata procedure metan which is typically used to pool effects (risk ratios, odds ratios, differences of risks or means) but which is also used to pool proportions. Metaprop implements procedures which are specific to binomial data and allows computation of exact binomial and score test-based confidence intervals. It provides appropriate methods for dealing with proportions close to or at the margins where the normal approximation procedures often break down, by use of the binomial distribution to model the within-study variability or by allowing Freeman-Tukey double arcsine transformation to stabilize the variances. Metaprop was applied on two published meta-analyses: 1) prevalence of HPV-infection in women with a Pap smear showing ASC-US; 2) cure rate after treatment for cervical precancer using cold coagulation. The first meta-analysis showed a pooled HPV-prevalence of 43% (95% CI: 38%-48%). In the second meta-analysis, the pooled percentage of cured women was 94% (95% CI: 86%-97%). By using metaprop, no studies with 0% or 100% proportions were excluded from the meta-analysis. Furthermore, study specific and pooled confidence intervals always were within admissible values, contrary to the original publication, where metan was used.
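Two of the ingredients metaprop adds over metan are easy to illustrate outside Stata: the exact (Clopper-Pearson) interval for a study-level proportion, and the Freeman-Tukey double arcsine transform used to stabilize variances before pooling. A Python illustration of those pieces only (not a reimplementation of metaprop's pooling):

```python
import numpy as np
from statsmodels.stats.proportion import proportion_confint

# Exact (Clopper-Pearson) binomial CI: well-defined even at 0% or 100%
print(proportion_confint(count=0, nobs=25, alpha=0.05, method="beta"))

# Freeman-Tukey double arcsine transform for a study with x events in n
def ft_double_arcsine(x: int, n: int) -> float:
    return np.arcsin(np.sqrt(x / (n + 1))) + np.arcsin(np.sqrt((x + 1) / (n + 1)))

# Its sampling variance is approximately 1/(n + 0.5) regardless of x,
# which is why the transform stabilizes study weights in the meta-analysis.
print(ft_double_arcsine(0, 25), 1 / (25 + 0.5))
```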
Garg, Madhur K.; Zhao, Fengmin; Palefsky, Joel; Whittington, Richard; Mitchell, Edith P.; Mulcahy, Mary F.; Armstrong, Karin I.; Nabbout, Nassim H.; Kalnicki, Shalom; El-Rayes, Bassel F.; Onitilo, Adedayo A.; Moriarty, Daniel J.; Fitzgerald, Thomas J.; Benson, Al B.
2017-01-01
Purpose Squamous cell carcinoma of the anal canal (SCCAC) is characterized by high locoregional failure (LRF) rates after sphincter-preserving definitive chemoradiation (CRT) and is typically associated with anogenital human papilloma virus infection. Because cetuximab enhances the effect of radiation therapy in human papilloma virus–associated oropharyngeal squamous cell carcinoma, we hypothesized that adding cetuximab to CRT would reduce LRF in SCCAC. Methods Sixty-one patients with stage I to III SCCAC received CRT including cisplatin, fluorouracil, and radiation therapy to the primary tumor and regional lymph nodes (45 to 54 Gy) plus eight once-weekly doses of concurrent cetuximab. The study was designed to detect at least a 50% reduction in 3-year LRF rate (one-sided α, 0.10; power 90%), assuming a 35% LRF rate from historical data. Results Poor risk features included stage III disease in 64% and male sex in 20%. The 3-year LRF rate was 23% (95% CI, 13% to 36%; one-sided P = .03) by binomial proportional estimate using the prespecified end point and 21% (95% CI, 7% to 26%) by Kaplan-Meier estimate in a post hoc analysis using methods consistent with historical data. Three-year rates were 68% (95% CI, 55% to 79%) for progression-free survival and 83% (95% CI, 71% to 91%) for overall survival. Grade 4 toxicity occurred in 32%, and 5% had treatment-associated deaths. Conclusion Although the addition of cetuximab to chemoradiation for SCCAC was associated with lower LRF rates than historical data with CRT alone, toxicity was substantial, and LRF still occurs in approximately 20%, indicating the continued need for more effective and less toxic therapies. PMID:28068178
Lee, Jeannette Y.; Palefsky, Joel; Henry, David H.; Wachsman, William; Rajdev, Lakshmi; Aboulafia, David; Ratner, Lee; Fitzgerald, Thomas J.; Kachnic, Lisa; Mitsuyasu, Ronald
2017-01-01
Purpose Squamous cell carcinoma of the anal canal (SCCAC) is characterized by high locoregional failure (LRF) rates after definitive chemoradiation (CRT), associated with anogenital human papilloma virus, and often appears in HIV infection. Because cetuximab enhances the effect of radiation therapy in human papilloma virus–associated oropharyngeal SCC, we hypothesized that adding cetuximab to CRT would reduce LRF in SCCAC. Methods Forty-five patients with stage I to III SCCAC and HIV infection received CRT: 45 to 54 Gy radiation therapy to the primary tumor and regional lymph nodes plus eight once-weekly doses of concurrent cetuximab and two cycles of cisplatin and fluorouracil. The study was designed to detect at least a 50% reduction in 3-year LRF rate (one-sided α, 0.10; power, 90%), assuming a 35% LRF rate from historical data. Results The 3-year LRF rate was 42% (95% CI, 28% to 56%; one-sided P = .9) by binomial proportional estimate using the prespecified end point (LRF or alive without LRF and followed < 3 years), and 20% (95% CI, 10% to 37%) by Kaplan-Meier estimate in post hoc analysis using definitions and methods consistent with historical data. Three-year rates by Kaplan-Meier estimate were 72% (95% CI, 56% to 84%) for progression-free survival and 79% (95% CI, 63% to 89%) for overall survival. Grade 4 toxicity occurred in 26%, and 4% had treatment-associated deaths. Conclusion HIV-associated SCCAC is potentially curable with definitive CRT. Although addition of cetuximab may result in less LRF, the 20% recurrence and 26% grade 4 toxicity rates indicate the continued need for more-effective and less-toxic therapies. PMID:27937092
Problems on Divisibility of Binomial Coefficients
ERIC Educational Resources Information Center
Osler, Thomas J.; Smoak, James
2004-01-01
Twelve unusual problems involving divisibility of the binomial coefficients are presented in this article. The problems are listed in "The Problems" section. All twelve problems have short solutions which are listed in "The Solutions" section. These problems could be assigned to students in any course in which the binomial theorem and Pascal's…
Application of binomial-edited CPMG to shale characterization
Washburn, Kathryn E.; Birdwell, Justin E.
2014-01-01
Unconventional shale resources may contain a significant amount of hydrogen in organic solids such as kerogen, but it is not possible to directly detect these solids with many NMR systems. Binomial-edited pulse sequences capitalize on magnetization transfer between solids, semi-solids, and liquids to provide an indirect method of detecting solid organic materials in shales. When the organic solids can be directly measured, binomial-editing helps distinguish between different phases. We applied a binomial-edited CPMG pulse sequence to a range of natural and experimentally-altered shale samples. The most substantial signal loss is seen in shales rich in organic solids while fluids associated with inorganic pores seem essentially unaffected. This suggests that binomial-editing is a potential method for determining fluid locations, solid organic content, and kerogen–bitumen discrimination.
NASA Astrophysics Data System (ADS)
Yeh, Cheng-Ta; Lin, Yi-Kuei; Yang, Jo-Yun
2018-07-01
Network reliability is an important performance index for many real-life systems, such as electric power systems, computer systems and transportation systems. These systems can be modelled as stochastic-flow networks (SFNs) composed of arcs and nodes. Most system supervisors seek to maximize network reliability by finding the optimal multi-state resource assignment, in which one resource is assigned to each arc. However, a disaster may cause correlated failures of the assigned resources, affecting the network reliability. This article focuses on determining the optimal resource assignment with maximal network reliability for SFNs. To solve the problem, this study proposes a hybrid algorithm integrating the genetic algorithm and tabu search to determine the optimal assignment, called the hybrid GA-TS algorithm (HGTA), and integrates minimal paths, recursive sum of disjoint products, and the correlated binomial distribution to calculate network reliability. Several practical numerical experiments demonstrate that HGTA has better computational quality than several popular soft computing algorithms.
A Three-Parameter Generalisation of the Beta-Binomial Distribution with Applications
1987-07-01
York. Rust, R.T. and Klompmaker, J.E. (1981). Improving the estimation procedure for the beta binomial t.v. exposure model. Journal of Marketing Research, 18, 442-448. Sabavala, D.J. and Morrison, D.G. (1977). Television show loyalty: a beta-binomial model using recall data. Journal of Advertising…
Distribution-free Inference of Zero-inflated Binomial Data for Longitudinal Studies.
He, H; Wang, W J; Hu, J; Gallop, R; Crits-Christoph, P; Xia, Y L
2015-10-01
Count responses with structural zeros are very common in medical and psychosocial research, especially in alcohol and HIV research, and the zero-inflated Poisson (ZIP) and zero-inflated negative binomial (ZINB) models are widely used for modeling such outcomes. However, as alcohol drinking outcomes such as days of drinking are counts within a given period, their distributions are bounded above by an upper limit (the total days in the period) and thus inherently follow a binomial or zero-inflated binomial (ZIB) distribution, rather than a Poisson or ZIP distribution, in the presence of structural zeros. In this paper, we develop a new semiparametric approach for modeling ZIB-like count responses for cross-sectional as well as longitudinal data. We illustrate this approach with both simulated and real study data.
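The parametric building block here is simple to state: with structural-zero probability pi, P(Y = 0) = pi + (1 − pi)(1 − p)^n and P(Y = k) = (1 − pi)·Binom(k; n, p) for k ≥ 1. A small sketch of that pmf (the paper's approach is semiparametric; this is only the underlying ZIB distribution):

```python
from scipy import stats

def zib_pmf(k: int, n: int, p: float, pi: float) -> float:
    """Zero-inflated binomial pmf: structural zeros occur with probability pi."""
    binom_part = (1 - pi) * stats.binom.pmf(k, n, p)
    return pi + binom_part if k == 0 else binom_part

# Days of drinking in a 30-day period: a point mass at zero (abstainers)
# mixed with a binomial for the remaining subjects
print(zib_pmf(0, n=30, p=0.4, pi=0.35))
print(sum(zib_pmf(k, 30, 0.4, 0.35) for k in range(31)))  # sums to 1
```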
NASA Astrophysics Data System (ADS)
Brenner, Tom; Chen, Johnny; Stait-Gardner, Tim; Zheng, Gang; Matsukawa, Shingo; Price, William S.
2018-03-01
A new family of binomial-like inversion sequences, named jump-and-return sandwiches (JRS), has been developed by inserting a binomial-like sequence into a standard jump-and-return sequence, discovered through use of a stochastic Genetic Algorithm optimisation. Compared to currently used binomial-like inversion sequences (e.g., 3-9-19 and W5), the new sequences afford wider inversion bands and narrower non-inversion bands with an equal number of pulses. As an example, two jump-and-return sandwich 10-pulse sequences achieved 95% inversion at offsets corresponding to 9.4% and 10.3% of the non-inversion band spacing, compared to 14.7% for the binomial-like W5 inversion sequence, i.e., they afforded non-inversion bands about two thirds the width of the W5 non-inversion band.
Influence of Nurse Aide Absenteeism on Nursing Home Quality.
Castle, Nicholas G; Ferguson-Rome, Jamie C
2015-08-01
In this analysis, the association of nurse aide absenteeism with quality is examined. Absenteeism is the failure of nurse aides to report for work when they are scheduled to work. Data used in this investigation came from survey responses from 3,941 nursing homes; Nursing Home Compare; the Online System for Survey, Certification and Administrative Reporting data; and the Area Resource File. Staffing characteristics, quality indicators, facility, and market information from these data sources were all measured in 2008. The specific quality indicators examined are physical restraint use, catheter use, pain management, and pressure sores using negative binomial regression. An average rate of 9.2% for nurse aide absenteeism was reported in the prior week. We find that high levels of absenteeism are associated with poor performance on all four quality indicators examined. The investigation presented, to our knowledge, is one of the first examining the implications of absenteeism in nursing homes. Absenteeism can be a costly staffing issue, one of the potential costs identified in this analysis is an impact on quality of care. © The Author 2014. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Revealing Word Order: Using Serial Position in Binomials to Predict Properties of the Speaker
ERIC Educational Resources Information Center
Iliev, Rumen; Smirnova, Anastasia
2016-01-01
Three studies test the link between word order in binomials and psychological and demographic characteristics of a speaker. While linguists have already suggested that psychological, cultural and societal factors are important in choosing word order in binomials, the vast majority of relevant research was focused on general factors and on broadly…
Campaign Strategies and Voter Approval of School Referenda: A Mixed Methods Analysis
ERIC Educational Resources Information Center
Johnson, Paul A.; Ingle, William Kyle
2009-01-01
Drawing from state administrative data and surveys of superintendents in Ohio, this mixed methods study examined factors associated with voters' approval of local school levies. Utilizing binomial logistic regression, this study found that new levies and poverty rates were significantly associated with a decrease in the likelihood of passage.…
Martina, R; Kay, R; van Maanen, R; Ridder, A
2015-01-01
Clinical studies in overactive bladder have traditionally used analysis of covariance or nonparametric methods to analyse the number of incontinence episodes and other count data. It is known that if the underlying distributional assumptions of a particular parametric method do not hold, an alternative parametric method may be more efficient than a nonparametric one, which makes no assumptions regarding the underlying distribution of the data. Therefore, there are advantages in using methods based on the Poisson distribution or extensions of that method, which incorporate specific features that provide a modelling framework for count data. One challenge with count data is overdispersion, but methods are available that can account for this through the introduction of random effect terms in the modelling, and it is this modelling framework that leads to the negative binomial distribution. These models can also provide clinicians with a clearer and more appropriate interpretation of treatment effects in terms of rate ratios. In this paper, the previously used parametric and non-parametric approaches are contrasted with those based on Poisson regression and various extensions in trials evaluating solifenacin and mirabegron in patients with overactive bladder. In these applications, negative binomial models are seen to fit the data well. Copyright © 2014 John Wiley & Sons, Ltd.
Silva, Guilherme Resende da; Menezes, Liliane Denize Miranda; Lanza, Isabela Pereira; Oliveira, Daniela Duarte de; Silva, Carla Aparecida; Klein, Roger Wilker Tavares; Assis, Débora Cristina Sampaio de; Cançado, Silvana de Vasconcelos
2017-09-01
In order to evaluate the efficiency of the pasteurization process in liquid whole eggs, a UV/visible spectrophotometric method was developed and validated for the assessment of alpha-amylase activity. Samples were collected from 30 lots of raw eggs (n = 30) and divided into three groups: one was reserved for analysis of the raw eggs, the second group was pasteurized at 61.1°C for 3.5 minutes (n = 30), and the third group was pasteurized at 64.4°C for 2.5 minutes (n = 30). In addition to assessing alpha-amylase activity, the microbiological quality of the samples was also evaluated by counting total and thermotolerant coliforms, mesophilic aerobic microorganisms, Staphylococcus spp., and Salmonella spp. The validated spectrophotometric method demonstrated linearity, with a coefficient of determination (R²) greater than 0.99, limits of detection (LOD) and quantification (LOQ) of 0.48 mg kg⁻¹ and 1.16 mg kg⁻¹, respectively, and acceptable precision and accuracy, with relative standard deviation (RSD) values of less than 10% and recovery rates between 98.81% and 105.40%. The results for alpha-amylase activity in the raw egg samples showed high enzyme activity due to near-complete hydrolysis of the starch, while in the eggs pasteurized at 61.1°C, partial inactivation of the enzyme was observed. In the samples of whole eggs pasteurized at 64.4°C, starch hydrolysis did not occur due to enzyme inactivation. The results of the microbiological analyses showed a decrease (P < 0.0001) in the counts for all the studied microorganisms and in the frequency of Salmonella spp. in the pasteurized egg samples for the two time-temperature binomials under investigation, compared to the raw egg samples, which showed high rates of contamination (P < 0.0001). After pasteurization, only one sample (3.33%) was positive for Salmonella spp., indicating failure in the pasteurization process, which was confirmed by the alpha-amylase test. It was concluded that the validated methodology for testing alpha-amylase activity is adequate for assessing the efficiency of the pasteurization process, and that the time-temperature binomials used in this study are suitable to produce pasteurized eggs with high microbiological quality. © 2017 Poultry Science Association Inc.
Choosing a Transformation in Analyses of Insect Counts from Contagious Distributions with Low Means
W.D. Pepper; S.J. Zarnoch; G.L. DeBarr; P. de Groot; C.D. Tangren
1997-01-01
Guidelines based on computer simulation are suggested for choosing a transformation of insect counts from negative binomial distributions with low mean counts and high levels of contagion. Typical values and ranges of negative binomial model parameters were determined by fitting the model to data from 19 entomological field studies. Random sampling of negative binomial...
Adjusted Wald Confidence Interval for a Difference of Binomial Proportions Based on Paired Data
ERIC Educational Resources Information Center
Bonett, Douglas G.; Price, Robert M.
2012-01-01
Adjusted Wald intervals for binomial proportions in one-sample and two-sample designs have been shown to perform about as well as the best available methods. The adjusted Wald intervals are easy to compute and have been incorporated into introductory statistics courses. An adjusted Wald interval for paired binomial proportions is proposed here and…
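The general recipe behind such intervals is to shift the cell counts slightly and then apply the ordinary Wald formula. The sketch below uses an Agresti-Min-style adjustment (adding 0.5 to each cell of the paired 2×2 table) as a stand-in; the exact adjustment proposed in the article may differ, so treat this as illustrative:

```python
import numpy as np

def adjusted_wald_paired(n11, n12, n21, n22, z=1.96, adj=0.5):
    """Wald CI for p1 - p2 from a paired 2x2 table, with every cell shifted
    by `adj` (Agresti-Min style; see the article for its exact variant)."""
    a, b, c, d = (x + adj for x in (n11, n12, n21, n22))
    n = a + b + c + d
    diff = (b - c) / n                         # p1 - p2 = (n12 - n21) / n
    se = np.sqrt(b + c - (b - c) ** 2 / n) / n  # multinomial Wald SE
    return diff - z * se, diff + z * se

# Example: 12 discordant pairs one way, 5 the other, out of 100 pairs
print(adjusted_wald_paired(40, 12, 5, 43))
```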
NASA Astrophysics Data System (ADS)
Arneodo, M.; Arvidson, A.; Aubert, J. J.; Badełek, B.; Beaufays, J.; Bee, C. P.; Benchouk, C.; Berghoff, G.; Bird, I.; Blum, D.; Böhm, E.; de Bouard, X.; Brasse, F. W.; Braun, H.; Broll, C.; Brown, S.; Brück, H.; Calen, H.; Chima, J. S.; Ciborowski, J.; Clifft, R.; Coignet, G.; Combley, F.; Coughlan, J.; D'Agostini, G.; Dahlgren, S.; Dengler, F.; Derado, I.; Dreyer, T.; Drees, J.; Düren, M.; Eckardt, V.; Edwards, A.; Edwards, M.; Ernst, T.; Eszes, G.; Favier, J.; Ferrero, M. I.; Figiel, J.; Flauger, W.; Foster, J.; Ftáčnik, J.; Gabathuler, E.; Gajewski, J.; Gamet, R.; Gayler, J.; Geddes, N.; Grafström, P.; Grard, F.; Haas, J.; Hagberg, E.; Hasert, F. J.; Hayman, P.; Heusse, P.; Jaffré, M.; Jachołkowska, A.; Janata, F.; Jancsó, G.; Johnson, A. S.; Kabuss, E. M.; Kellner, G.; Korbel, V.; Krüger, J.; Kullander, S.; Landgraf, U.; Lanske, D.; Loken, J.; Long, K.; Maire, M.; Malecki, P.; Manz, A.; Maselli, S.; Mohr, W.; Montanet, F.; Montgomery, H. E.; Nagy, E.; Nassalski, J.; Norton, P. R.; Oakham, F. G.; Osborne, A. M.; Pascaud, C.; Pawlik, B.; Payre, P.; Peroni, C.; Peschel, H.; Pessard, H.; Pettinghale, J.; Pietrzyk, B.; Pietrzyk, U.; Pönsgen, B.; Pötsch, M.; Renton, P.; Ribarics, P.; Rith, K.; Rondio, E.; Sandacz, A.; Scheer, M.; Schlagböhmer, A.; Schiemann, H.; Schmitz, N.; Schneegans, M.; Schneider, A.; Scholz, M.; Schröder, T.; Schultze, K.; Sloan, T.; Stier, H. E.; Studt, M.; Taylor, G. N.; Thénard, J. M.; Thompson, J. C.; de La Torre, A.; Toth, J.; Urban, L.; Urban, L.; Wallucks, W.; Whalley, M.; Wheeler, S.; Williams, W. S. C.; Wimpenny, S. J.; Windmolders, R.; Wolf, G.
1987-09-01
The multiplicity distributions of charged hadrons produced in deep inelastic muon-proton scattering at 280 GeV are analysed in various rapidity intervals, as a function of the total hadronic centre of mass energy W ranging from 4 to 20 GeV. Multiplicity distributions for the backward and forward hemispheres are also analysed separately. The data can be well parameterized by binomial distributions, extending their range of applicability to the case of lepton-proton scattering. The energy and rapidity dependence of the parameters is presented, and a smooth transition from the negative binomial distribution via Poissonian to the ordinary binomial is observed.
Censored Hurdle Negative Binomial Regression (Case Study: Neonatorum Tetanus Case in Indonesia)
NASA Astrophysics Data System (ADS)
Yuli Rusdiana, Riza; Zain, Ismaini; Wulan Purnami, Santi
2017-06-01
Hurdle negative binomial regression is a method that can be used for discrete dependent variables with excess zeros and under- or overdispersion. It uses a two-part approach: the first part, a zero hurdle model, models the zero observations of the dependent variable, and the second part, a truncated negative binomial model, models the non-zero (positive integer) observations. The discrete dependent variable in such cases is censored for some values; the type of censoring studied in this research is right censoring. This study aims to obtain the parameter estimator of hurdle negative binomial regression for a right-censored dependent variable, using Maximum Likelihood Estimation (MLE). The censored hurdle negative binomial regression model is applied to the number of neonatorum tetanus cases in Indonesia, count data which contain zero values in some observations and varying values in others. This study also derives the parameter estimator and test statistic for the censored hurdle negative binomial model. Based on the regression results, the factors that influence neonatorum tetanus cases in Indonesia are the percentage of baby health care coverage and neonatal visits.
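The two-part structure can be written compactly for the uncensored case: a hurdle probability for zero, and a zero-truncated negative binomial for the positive counts. A sketch of that pmf (the article's right-censored likelihood builds on this; censoring is omitted here):

```python
from scipy import stats

def hurdle_nb_pmf(k: int, pi: float, r: float, p: float) -> float:
    """Hurdle negative binomial: P(0) = pi; positive counts follow a
    zero-truncated NB(r, p)."""
    if k == 0:
        return pi
    nb0 = stats.nbinom.pmf(0, r, p)   # NB mass at zero, removed by truncation
    return (1 - pi) * stats.nbinom.pmf(k, r, p) / (1 - nb0)

# Sanity check: the pmf sums to 1 (up to truncation of the tail)
print(sum(hurdle_nb_pmf(k, pi=0.6, r=2.0, p=0.5) for k in range(200)))
```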
Explaining reduction of pedestrian-motor vehicle crashes in Arkhangelsk, Russia, in 2005-2010.
Kudryavtsev, Alexander V; Nilssen, Odd; Lund, Johan; Grjibovski, Andrej M; Ytterstad, Børge
2012-01-01
To explain a reduction in pedestrian-motor vehicle crashes in Arkhangelsk, Russia, in 2005-2010. Retrospective ecological study. For 2005-2010, police data on pedestrian-motor vehicle crashes, traffic violations, and total motor vehicles (MVs) were combined with data on changes in national road traffic legislation and municipal road infrastructure. Negative binomial regression was used to investigate trends in monthly rates of pedestrian-motor vehicle crashes per total MVs and estimate changes in these rates per unit changes in the safety measures. During the 6 years, the police registered 2,565 pedestrian-motor vehicle crashes: 1,597 (62%) outside crosswalks, 766 (30%) on non-signalized crosswalks, and 202 (8%) on signalized crosswalks. Crash rates outside crosswalks and on signalized crosswalks decreased on average by 1.1% per month, whereas the crash rate on non-signalized crosswalks remained unchanged. Numbers of signalized and non-signalized crosswalks increased by 14 and 19%, respectively. Also, 10% of non-signalized crosswalks were combined with speed humps, and 4% with light-reflecting vertical signs. Pedestrian penalties for traffic violations increased 4-fold. Driver penalties for ignoring prohibiting signal and failure to give way to pedestrian on non-signalized crosswalk increased 7- and 8-fold, respectively. The rate of total registered drivers' traffic violations per total MVs decreased on average by 0.3% per month. All studied infrastructure and legislative measures had inverse associations with the rate of crashes outside crosswalks. The rate of crashes on signalized crosswalks showed inverse associations with related monetary penalties. The introduction of infrastructure and legislative measures is the most probable explanation of the reduction of pedestrian-motor vehicle crashes in Arkhangelsk. The overall reduction is due to decreases in rates of crashes outside crosswalks and on signalized crosswalks. No change was observed in the rate of crashes on non-signalized crosswalks.
On the p, q-binomial distribution and the Ising model
NASA Astrophysics Data System (ADS)
Lundow, P. H.; Rosengren, A.
2010-08-01
We employ p, q-binomial coefficients, a generalisation of the binomial coefficients, to describe the magnetisation distributions of the Ising model. For the complete graph this distribution corresponds exactly to the limit case p = q. We apply our investigation to the simple d-dimensional lattices for d = 1, 2, 3, 4, 5 and fit p, q-binomial distributions to our data, some of which are exact but most are sampled. For d = 1 and d = 5, the magnetisation distributions are remarkably well-fitted by p, q-binomial distributions. For d = 4 we are only slightly less successful, while for d = 2, 3 we see some deviations (with exceptions!) between the p, q-binomial and the Ising distribution. However, at certain temperatures near T_c the statistical moments of the fitted distribution agree with the moments of the sampled data within the precision of sampling. We begin the paper by giving results on the behaviour of the p, q-distribution and its moment growth exponents given a certain parameterisation of p, q. Since the moment exponents are known for the Ising model (or at least approximately for d = 3) we can predict how p, q should behave and compare this to our measured p, q. The results speak in favour of the p, q-binomial distribution's correctness regarding its general behaviour in comparison to the Ising model. The full extent to which they correctly model the Ising distribution, however, is not settled.
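A hedged numeric sketch of the p, q-binomial coefficients, assuming the standard (p, q)-analogue built from the p, q-integers [n] = (p^n − q^n)/(p − q) (this reduces to the Gaussian q-binomial at p = 1 and to the ordinary binomial at p = q = 1; the paper's exact parameterisation may differ):

```python
import numpy as np

def pq_int(n: int, p: float, q: float) -> float:
    # p,q-integer [n] = (p^n - q^n)/(p - q), with the p -> q limit n*p^(n-1)
    if np.isclose(p, q):
        return n * p ** (n - 1)
    return (p ** n - q ** n) / (p - q)

def pq_binomial(n: int, k: int, p: float, q: float) -> float:
    """p,q-binomial coefficient as a ratio of p,q-factorials."""
    coeff = 1.0
    for i in range(1, k + 1):
        coeff *= pq_int(n - k + i, p, q) / pq_int(i, p, q)
    return coeff

print(pq_binomial(5, 2, 1.0, 1.0))   # ordinary binomial: C(5, 2) = 10
print(pq_binomial(5, 2, 1.0, 0.5))   # Gaussian (q-)binomial at q = 0.5
```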
Tobit analysis of vehicle accident rates on interstate highways.
Anastasopoulos, Panagiotis Ch; Tarko, Andrew P; Mannering, Fred L
2008-03-01
There has been an abundance of research that has used Poisson models and its variants (negative binomial and zero-inflated models) to improve our understanding of the factors that affect accident frequencies on roadway segments. This study explores the application of an alternate method, tobit regression, by viewing vehicle accident rates directly (instead of frequencies) as a continuous variable that is left-censored at zero. Using data from vehicle accidents on Indiana interstates, the estimation results show that many factors relating to pavement condition, roadway geometrics and traffic characteristics significantly affect vehicle accident rates.
The Dirichlet-Multinomial Model for Multivariate Randomized Response Data and Small Samples
ERIC Educational Resources Information Center
Avetisyan, Marianna; Fox, Jean-Paul
2012-01-01
In survey sampling the randomized response (RR) technique can be used to obtain truthful answers to sensitive questions. Although the individual answers are masked due to the RR technique, individual (sensitive) response rates can be estimated when observing multivariate response data. The beta-binomial model for binary RR data will be generalized…
C-5A Cargo Deck Low-Frequency Vibration Environment
1975-02-01
Contents: Sample Vibration Calculations (1. Normal Distribution; 2. Binomial Distribution); IV. Conclusions; V. References…Calculation for Binomial Distribution (Vertical Acceleration, Right Rear Cargo Deck)…I. INTRODUCTION. The availability of large transport…the end of taxi. These peaks could then be used directly to compile the probability of occurrence of specific values of acceleration using the binomial…
Identifiability in N-mixture models: a large-scale screening test with bird data.
Kéry, Marc
2018-02-01
Binomial N-mixture models have proven very useful in ecology, conservation, and monitoring: they allow estimation and modeling of abundance separately from detection probability using simple counts. Recently, doubts about parameter identifiability have been voiced. I conducted a large-scale screening test with 137 bird data sets from 2,037 sites. I found virtually no identifiability problems for Poisson and zero-inflated Poisson (ZIP) binomial N-mixture models, but negative-binomial (NB) models had problems in 25% of all data sets. The corresponding multinomial N-mixture models had no problems. Parameter estimates under Poisson and ZIP binomial and multinomial N-mixture models were extremely similar. Identifiability problems became a little more frequent with smaller sample sizes (267 and 50 sites), but were unaffected by whether the models did or did not include covariates. Hence, binomial N-mixture model parameters with Poisson and ZIP mixtures typically appeared identifiable. In contrast, NB mixtures were often unidentifiable, which is worrying since these were often selected by Akaike's information criterion. Identifiability of binomial N-mixture models should always be checked. If problems are found, simpler models, integrated models that combine different observation models or the use of external information via informative priors or penalized likelihoods, may help. © 2017 by the Ecological Society of America.
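The binomial N-mixture likelihood being screened marginalizes the latent abundance at each site over a finite upper bound. A minimal sketch of that likelihood for the Poisson case, fitted with a scipy optimizer on simulated data (no covariates, unlike the full screening):

```python
import numpy as np
from scipy import stats, optimize
from scipy.special import logsumexp

rng = np.random.default_rng(6)

# Simulate repeated counts: 100 sites, 3 visits, lambda = 5, p = 0.4
N_true = rng.poisson(5, size=100)
y = rng.binomial(N_true[:, None], 0.4, size=(100, 3))

def nll(params, y, N_max=80):
    lam = np.exp(params[0])           # abundance mean (log scale)
    p = 1 / (1 + np.exp(-params[1]))  # detection probability (logit scale)
    N = np.arange(N_max + 1)
    ll = 0.0
    for counts in y:                  # marginalize the latent N at each site
        terms = stats.poisson.logpmf(N, lam)
        for c in counts:
            terms = terms + stats.binom.logpmf(c, N, p)  # -inf where N < c
        ll += logsumexp(terms)
    return -ll

fit = optimize.minimize(nll, x0=[np.log(3.0), 0.0], args=(y,), method="Nelder-Mead")
print(np.exp(fit.x[0]), 1 / (1 + np.exp(-fit.x[1])))  # approx 5 and 0.4
```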
ERIC Educational Resources Information Center
Levin, Eugene M.
1981-01-01
Student access to programmable calculators and computer terminals, coupled with a familiarity with baseball, provides opportunities to enhance their understanding of the binomial distribution and other aspects of analysis. (MP)
NASA Astrophysics Data System (ADS)
Handayani, Dewi; Cahyaning Putri, Hera; Mahmudah, AMH
2017-12-01
The Solo-Ngawi toll road project is part of the Trans Java toll road mega-project initiated by the government and is still under construction. PT Solo Ngawi Jaya (SNJ), the Solo-Ngawi toll road management company, needs to set a toll fare consistent with its business plan. Setting appropriate toll rates affects regional economic sustainability and can reduce traffic congestion; such policy instruments are crucial for achieving environmentally sustainable transport. The objective of this research is therefore to determine the sensitivity of the Solo-Ngawi toll fare based on willingness to pay (WTP). Primary data were obtained by distributing stated-preference questionnaires to four-wheeled vehicle users on the Kartasura-Palang Joglo arterial road segment; the data were analysed with logit and probit models. The analysis found that WTP was more sensitive to fare changes under the binomial logit model than under the probit model for the same travel conditions: the range of tariff changes against WTP values in the binomial logit model is 20% greater than in the probit model. On the other hand, the choice probabilities from the binomial logit and binary probit models differ by less than 1%.
Chen, Wansu; Shi, Jiaxiao; Qian, Lei; Azen, Stanley P
2014-06-26
To estimate relative risks or risk ratios for common binary outcomes, the most popular model-based methods are the robust (also known as modified) Poisson and the log-binomial regression. Of the two methods, it is believed that the log-binomial regression yields more efficient estimators because it is maximum likelihood based, while the robust Poisson model may be less affected by outliers. Evidence to support the robustness of robust Poisson models in comparison with log-binomial models is very limited. In this study a simulation was conducted to evaluate the performance of the two methods in several scenarios where outliers existed. The findings indicate that for data coming from a population where the relationship between the outcome and the covariate was in a simple form (e.g. log-linear), the two models yielded comparable biases and mean square errors. However, if the true relationship contained a higher order term, the robust Poisson models consistently outperformed the log-binomial models even when the level of contamination is low. The robust Poisson models are more robust (or less sensitive) to outliers compared to the log-binomial models when estimating relative risks or risk ratios for common binary outcomes. Users should be aware of the limitations when choosing appropriate models to estimate relative risks or risk ratios.
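The robust Poisson estimator compared here is a Poisson GLM applied to the binary outcome with a sandwich (heteroskedasticity-robust) covariance; exponentiated coefficients are again relative risks. A brief hedged sketch on simulated data:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)

n = 5000
exposure = rng.integers(0, 2, n)
y = rng.binomial(1, np.exp(np.log(0.2) + np.log(1.5) * exposure))  # true RR = 1.5

X = sm.add_constant(exposure)
# Poisson GLM on binary data; the sandwich covariance (HC0) repairs the
# misspecified variance, giving the "robust Poisson" relative-risk estimator.
fit = sm.GLM(y, X, family=sm.families.Poisson()).fit(cov_type="HC0")
print(np.exp(fit.params), np.exp(fit.conf_int()))
```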
Yim, Cindi K; Barrón, Yolanda; Moore, Stanley; Murtaugh, Chris; Lala, Anuradha; Aldridge, Melissa; Goldstein, Nathan; Gelfman, Laura P
2017-03-01
Patients with advanced heart failure (HF) enroll in hospice at low rates, and data on their acute medical service utilization after hospice enrollment are limited. We performed a descriptive analysis of Medicare fee-for-service beneficiaries, with at least one home health claim between July 1, 2009, and June 30, 2010, and at least 2 HF hospitalizations between July 1, 2009, and December 31, 2009, who subsequently enrolled in hospice between July 1, 2009, and December 31, 2009. We estimated panel-negative binomial models on a subset of beneficiaries to compare their acute medical service utilization before and after enrollment. Our sample included 5073 beneficiaries: 55% were female, 45% were ≥85 years of age, 13% were non-white, and the mean comorbidity count was 2.38 (standard deviation 1.22). The median number of days between the second HF hospital discharge and hospice enrollment was 45. The median number of days enrolled in hospice was 15, and 39% of the beneficiaries died within 7 days of enrollment. During the study period, 11% of the beneficiaries disenrolled from hospice at least once. The adjusted mean number of hospital, intensive care unit, and emergency room admissions decreased from 2.56, 0.87, and 1.17 before hospice enrollment to 0.53, 0.19, and 0.76 after hospice enrollment. Home health care Medicare beneficiaries with advanced HF who enrolled in hospice had lower acute medical service utilization after their enrollment. Their pattern of hospice use suggests that earlier referral and improved retention may benefit this population. Further research is necessary to understand hospice referral and palliative care needs of advanced HF patients. © 2017 American Heart Association, Inc.
Marginalized zero-inflated negative binomial regression with application to dental caries
Preisser, John S.; Das, Kalyan; Long, D. Leann; Divaris, Kimon
2015-01-01
The zero-inflated negative binomial regression model (ZINB) is often employed in diverse fields such as dentistry, health care utilization, highway safety, and medicine to examine relationships between exposures of interest and overdispersed count outcomes exhibiting many zeros. The regression coefficients of ZINB have latent class interpretations for a susceptible subpopulation at risk for the disease/condition under study with counts generated from a negative binomial distribution and for a non-susceptible subpopulation that provides only zero counts. The ZINB parameters, however, are not well-suited for estimating overall exposure effects, specifically, in quantifying the effect of an explanatory variable in the overall mixture population. In this paper, a marginalized zero-inflated negative binomial regression (MZINB) model for independent responses is proposed to model the population marginal mean count directly, providing straightforward inference for overall exposure effects based on maximum likelihood estimation. Through simulation studies, the finite sample performance of MZINB is compared to marginalized zero-inflated Poisson, Poisson, and negative binomial regression. The MZINB model is applied in the evaluation of a school-based fluoride mouthrinse program on dental caries in 677 children. PMID:26568034
Assessing historical rate changes in global tsunami occurrence
Geist, E.L.; Parsons, T.
2011-01-01
The global catalogue of tsunami events is examined to determine if transient variations in tsunami rates are consistent with a Poisson process commonly assumed for tsunami hazard assessments. The primary data analyzed are tsunamis with maximum sizes >1m. The record of these tsunamis appears to be complete since approximately 1890. A secondary data set of tsunamis >0.1m is also analyzed that appears to be complete since approximately 1960. Various kernel density estimates used to determine the rate distribution with time indicate a prominent rate change in global tsunamis during the mid-1990s. Less prominent rate changes occur in the early- and mid-20th century. To determine whether these rate fluctuations are anomalous, the distribution of annual event numbers for the tsunami catalogue is compared to Poisson and negative binomial distributions, the latter of which includes the effects of temporal clustering. Compared to a Poisson distribution, the negative binomial distribution model provides a consistent fit to tsunami event numbers for the >1m data set, but the Poisson null hypothesis cannot be falsified for the shorter duration >0.1m data set. Temporal clustering of tsunami sources is also indicated by the distribution of interevent times for both data sets. Tsunami event clusters consist only of two to four events, in contrast to protracted sequences of earthquakes that make up foreshock-main shock-aftershock sequences. From past studies of seismicity, it is likely that there is a physical triggering mechanism responsible for events within the tsunami source 'mini-clusters'. In conclusion, prominent transient rate increases in the occurrence of global tsunamis appear to be caused by temporal grouping of geographically distinct mini-clusters, in addition to the random preferential location of global M >7 earthquakes along offshore fault zones.
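The Poisson-versus-negative-binomial comparison can be sketched as follows on hypothetical annual event counts (the catalogue data themselves are not reproduced here); note that the Poisson is the boundary case of the negative binomial, so the usual chi-square reference for the likelihood ratio is conservative:

    import numpy as np
    from scipy import stats
    from scipy.optimize import minimize

    # Hypothetical annual counts of tsunami events (not the catalogue data)
    counts = np.array([4, 7, 3, 9, 5, 12, 6, 8, 15, 7, 5, 10, 6, 9, 11])

    # Poisson fit: the MLE of the rate is the sample mean
    lam = counts.mean()
    ll_pois = stats.poisson.logpmf(counts, lam).sum()

    # Negative binomial fit by numerical MLE (r = size, q = success prob)
    def nb_nll(theta):
        r = np.exp(theta[0])                    # keep r positive
        q = 1.0 / (1.0 + np.exp(-theta[1]))     # keep q in (0, 1)
        return -stats.nbinom.logpmf(counts, r, q).sum()

    fit = minimize(nb_nll, x0=[1.0, 0.0])
    ll_nb = -fit.fun

    # Large values favour the clustered (overdispersed) model
    print("2 * (llNB - llPois) =", 2.0 * (ll_nb - ll_pois))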
Speech-discrimination scores modeled as a binomial variable.
Thornton, A R; Raffin, M J
1978-09-01
Many studies have reported variability data for tests of speech discrimination, and the disparate results of these studies have not been given a simple explanation. Arguments over the relative merits of 25- vs 50-word tests have ignored the basic mathematical properties inherent in the use of percentage scores. The present study models performance on clinical tests of speech discrimination as a binomial variable. A binomial model was developed, and some of its characteristics were tested against data from 4120 scores obtained on the CID Auditory Test W-22. A table for determining significant deviations between scores was generated and compared to observed differences in half-list scores for the W-22 tests. Good agreement was found between predicted and observed values. Implications of the binomial characteristics of speech-discrimination scores are discussed.
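The kind of significance table described can be generated directly from the binomial model; a small sketch, with the word-list length and first score as placeholders:

    from scipy import stats

    def critical_range(score_correct, n_items, alpha=0.05):
        """Range of retest scores not significantly different from test 1,
        treating each word as an independent Bernoulli item (binomial model)."""
        p_hat = score_correct / n_items
        lo = stats.binom.ppf(alpha / 2.0, n_items, p_hat)
        hi = stats.binom.ppf(1.0 - alpha / 2.0, n_items, p_hat)
        return int(lo), int(hi)

    # e.g. 40 of 50 words correct on the first list:
    print(critical_range(40, 50))   # retest scores outside this range differ

The wide ranges this produces for short lists are the mathematical basis of the 25- versus 50-word argument mentioned above.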
Statistical methods for the beta-binomial model in teratology.
Yamamoto, E; Yanagimoto, T
1994-01-01
The beta-binomial model is widely used for analyzing teratological data involving littermates. Recent developments in statistical analyses of teratological data are briefly reviewed with emphasis on this model. For statistical inference on the parameters of the beta-binomial distribution, separation of the likelihood leads to a likelihood-based inference that reduces the biases of estimators and improves the accuracy of the empirical significance levels of tests. Separate inference on the parameters can be conducted in a unified way. PMID:8187716
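A minimal maximum-likelihood fit of the beta-binomial to littermate-style data, using scipy's betabinom distribution; the counts below are invented for illustration:

    import numpy as np
    from scipy import stats
    from scipy.optimize import minimize

    # Hypothetical litters: number affected (y) out of litter size (m)
    y = np.array([0, 1, 0, 3, 2, 0, 5, 1, 0, 2])
    m = np.array([8, 10, 7, 12, 9, 8, 11, 10, 9, 10])

    def nll(theta):
        a, b = np.exp(theta)                    # keep alpha, beta positive
        return -stats.betabinom.logpmf(y, m, a, b).sum()

    fit = minimize(nll, x0=[0.0, 1.0])
    a, b = np.exp(fit.x)
    print("response probability:", a / (a + b))
    print("intra-litter correlation:", 1.0 / (a + b + 1.0))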
Grennan, J Troy; Loutfy, Mona R; Su, DeSheng; Harrigan, P Richard; Cooper, Curtis; Klein, Marina; Machouf, Nima; Montaner, Julio S G; Rourke, Sean; Tsoukas, Christos; Hogg, Bob; Raboud, Janet
2012-04-15
The importance of human immunodeficiency virus (HIV) blip magnitude on virologic rebound has been raised in clinical guidelines relating to viral load assays. Antiretroviral-naive individuals initiating combination antiretroviral therapy (cART) after 1 January 2000 and achieving virologic suppression were studied. Negative binomial models were used to identify blip correlates. Recurrent event models were used to determine the association between blips and rebound by incorporating multiple periods of virologic suppression per individual. 3550 participants (82% male; median age, 40 years) were included. In a multivariable negative binomial regression model, the Amplicor assay was associated with a lower blip rate than branched DNA (rate ratio, 0.69; P < .01), controlling for age, sex, region, baseline HIV-1 RNA and CD4 count, AIDS-defining illnesses, year of cART initiation, cART type, and HIV-1 RNA testing frequency. In a multivariable recurrent event model controlling for age, sex, intravenous drug use, cART start year, cART type, assay type, and HIV-1 RNA testing frequency, blips of 500-999 copies/mL were associated with virologic rebound (hazard ratio, 2.70; P = .002), whereas blips of 50-499 were not. HIV-1 RNA assay was an important determinant of blip rates and should be considered in clinical guidelines. Blips ≥500 copies/mL were associated with increased rebound risk.
Prognostic Significance of Baseline Serum Sodium in Heart Failure With Preserved Ejection Fraction.
Patel, Yash R; Kurgansky, Katherine E; Imran, Tasnim F; Orkaby, Ariela R; McLean, Robert R; Ho, Yuk-Lam; Cho, Kelly; Gaziano, J Michael; Djousse, Luc; Gagnon, David R; Joseph, Jacob
2018-06-13
The purpose of this study was to evaluate the relationship between serum sodium at the time of diagnosis and long term clinical outcomes in a large national cohort of patients with heart failure with preserved ejection fraction. We studied 25 440 patients with heart failure with preserved ejection fraction treated at Veterans Affairs medical centers across the United States between 2002 and 2012. Serum sodium at the time of heart failure diagnosis was analyzed as a continuous variable and in categories as follows: low (115.00-134.99 mmol/L), low-normal (135.00-137.99 mmol/L), referent group (138.00-140.99 mmol/L), high normal (141.00-143.99 mmol/L), and high (144.00-160.00 mmol/L). Multivariable Cox regression and negative binomial regression were performed to estimate hazard ratios (95% confidence interval [CI]) and incidence density ratios (95% CI) for the associations of serum sodium with mortality and hospitalizations (heart failure and all-cause), respectively. The average age of patients was 70.8 years, 96.2% were male, and 14% were black. Compared with the referent group, low, low-normal, and high sodium values were associated with 36% (95% CI, 28%-44%), 6% (95% CI, 1%-12%), and 9% (95% CI, 1%-17%) higher risk of all-cause mortality, respectively. Low and low-normal serum sodium were associated with 48% (95% CI, 10%-100%) and 38% (95% CI, 8%-77%) higher risk of number of days of heart failure hospitalizations per year, and with 44% (95% CI, 32%-56%) and 18% (95% CI, 10%-27%) higher risk of number of days of all-cause hospitalizations per year, respectively. Both elevated and reduced serum sodium, including values currently considered within normal range, are associated with adverse outcomes in patients with heart failure with preserved ejection fraction. © 2018 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
NASA Astrophysics Data System (ADS)
Amaliana, Luthfatul; Sa'adah, Umu; Wayan Surya Wardhani, Ni
2017-12-01
Tetanus Neonatorum is an infectious disease that can be prevented by immunization. The number of Tetanus Neonatorum cases in East Java Province was the highest in Indonesia until 2015. The Tetanus Neonatorum data exhibit overdispersion and a large proportion of zero counts. Negative Binomial (NB) regression is an alternative method when overdispersion occurs in Poisson regression. However, data containing both overdispersion and zero-inflation are more appropriately analyzed using Zero-Inflated Negative Binomial (ZINB) regression. The purposes of this study are: (1) to model Tetanus Neonatorum cases in East Java Province, with 71.05 percent of observations being zeros, using NB and ZINB regression, and (2) to obtain the best model. The results indicate that ZINB performs better than NB regression, with a smaller AIC.
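A sketch of an NB-versus-ZINB comparison by AIC on simulated zero-inflated counts, assuming statsmodels' ZeroInflatedNegativeBinomialP class; the covariate and the 0.6 structural-zero probability are invented stand-ins for the district-level data:

    import numpy as np
    import statsmodels.api as sm
    from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

    rng = np.random.default_rng(0)
    n = 300
    x = rng.normal(size=n)                      # hypothetical district covariate
    X = np.column_stack([np.ones(n), x])

    # Simulate zero-inflated NB counts: structural zeros with probability 0.6
    mu = np.exp(0.5 + 0.4 * x)
    nb_draws = rng.negative_binomial(2.0, 2.0 / (2.0 + mu))
    y = np.where(rng.uniform(size=n) < 0.6, 0, nb_draws)

    nb = sm.NegativeBinomial(y, X).fit(disp=0)
    zinb = ZeroInflatedNegativeBinomialP(y, X, exog_infl=X, p=2).fit(
        disp=0, maxiter=200)
    print("AIC NB:  ", round(nb.aic, 1))
    print("AIC ZINB:", round(zinb.aic, 1))   # smaller AIC = better fit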
Simulation on Poisson and negative binomial models of count road accident modeling
NASA Astrophysics Data System (ADS)
Sapuan, M. S.; Razali, A. M.; Zamzuri, Z. H.; Ibrahim, K.
2016-11-01
Accident count data have often been shown to exhibit overdispersion, and they may also contain excess zero counts. A simulation study was conducted to create scenarios in which accidents happen at a T-junction, with the dependent variable generated from a given distribution, namely the Poisson or the negative binomial, for sample sizes ranging from n=30 to n=500. The study objective was accomplished by fitting Poisson regression, negative binomial regression, and hurdle negative binomial models to the simulated data. Comparing the fitted models shows that, for a given sample size, not every model fits the data well even when the data were generated from that model's own distribution, especially when the sample size is large. Furthermore, larger sample sizes produced more zero accident counts in the dataset.
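A condensed version of such a simulate-and-refit experiment (the distribution parameters and covariate are invented, and the hurdle model is omitted for brevity):

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(42)
    for n in (30, 100, 500):                   # sample sizes as in the study
        x = rng.uniform(0.0, 1.0, n)           # hypothetical traffic covariate
        mu = np.exp(0.2 + 1.0 * x)
        y = rng.negative_binomial(1.5, 1.5 / (1.5 + mu))   # overdispersed counts
        X = sm.add_constant(x)
        pois = sm.GLM(y, X, family=sm.families.Poisson()).fit()
        nb = sm.NegativeBinomial(y, X).fit(disp=0)
        print(n, "AIC Poisson:", round(pois.aic, 1), " AIC NB:", round(nb.aic, 1))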
Analysis of generalized negative binomial distributions attached to hyperbolic Landau levels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chhaiba, Hassan, E-mail: chhaiba.hassan@gmail.com; Demni, Nizar, E-mail: nizar.demni@univ-rennes1.fr; Mouayn, Zouhair, E-mail: mouayn@fstbm.ac.ma
2016-07-15
To each hyperbolic Landau level of the Poincaré disc is attached a generalized negative binomial distribution. In this paper, we compute the moment generating function of this distribution and supply its atomic decomposition as a perturbation of the negative binomial distribution by a finitely supported measure. Using the Mandel parameter, we also discuss the nonclassical nature of the associated coherent states. Next, we derive a Lévy-Khintchine-type representation of its characteristic function when the latter does not vanish and deduce that it is quasi-infinitely divisible except for the lowest hyperbolic Landau level corresponding to the negative binomial distribution. By considering the total variation of the obtained quasi-Lévy measure, we introduce a new infinitely divisible distribution for which we derive the characteristic function.
Binomial leap methods for simulating stochastic chemical kinetics.
Tian, Tianhai; Burrage, Kevin
2004-12-01
This paper discusses efficient simulation methods for stochastic chemical kinetics. Based on the tau-leap and midpoint tau-leap methods of Gillespie [D. T. Gillespie, J. Chem. Phys. 115, 1716 (2001)], binomial random variables are used in these leap methods rather than Poisson random variables. The motivation for this approach is to improve the efficiency of the Poisson leap methods by using larger stepsizes. Unlike Poisson random variables, whose range of sample values is from zero to infinity, binomial random variables have a finite range of sample values. This probabilistic property has been used to restrict possible reaction numbers and to avoid negative molecular numbers in stochastic simulations when larger stepsizes are used. In this approach a binomial random variable is defined for a single reaction channel in order to keep the reaction number of this channel below the number of molecules that undergo this reaction channel. A sampling technique is also designed for the total reaction number of a reactant species that undergoes two or more reaction channels. Samples for the total reaction number are not greater than the molecular number of this species. In addition, probability properties of the binomial random variables provide stepsize conditions for restricting reaction numbers in a chosen time interval. These stepsize conditions are important properties of robust leap control strategies. Numerical results indicate that the proposed binomial leap methods can be applied to a wide range of chemical reaction systems with very good accuracy and significant improvements in efficiency over existing approaches. (c) 2004 American Institute of Physics.
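To make the bounding idea concrete, here is a toy binomial leap step for two irreversible channels competing for one reactant; the rate constants, step size, and sequential allocation of the remaining molecules are illustrative choices, not the paper's algorithm in full:

    import numpy as np

    def binomial_leap_step(x, c1, c2, tau, rng):
        """One toy binomial leap step for two irreversible channels that
        consume the same reactant: R1: S0 -> S1 and R2: S0 -> S2.
        Reaction counts are binomial, bounded by the molecules available,
        so copy numbers cannot go negative."""
        # channel R1: each S0 molecule fires with probability c1*tau (capped)
        k1 = rng.binomial(x[0], min(1.0, c1 * tau)) if x[0] > 0 else 0
        # channel R2 draws only from the molecules left after R1
        rem = x[0] - k1
        k2 = rng.binomial(rem, min(1.0, c2 * tau)) if rem > 0 else 0
        return x + np.array([-(k1 + k2), k1, k2])

    rng = np.random.default_rng(3)
    x = np.array([1000, 0, 0])
    for _ in range(20):
        x = binomial_leap_step(x, 0.3, 0.1, 0.1, rng)
    print(x)   # total molecule count is conserved at 1000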
Predictive accuracy of particle filtering in dynamic models supporting outbreak projections.
Safarishahrbijari, Anahita; Teyhouee, Aydin; Waldner, Cheryl; Liu, Juxin; Osgood, Nathaniel D
2017-09-26
While a new generation of computational statistics algorithms and availability of data streams raises the potential for recurrently regrounding dynamic models with incoming observations, the effectiveness of such arrangements can be highly subject to specifics of the configuration (e.g., frequency of sampling and representation of behaviour change), and there has been little attempt to identify effective configurations. Combining dynamic models with particle filtering, we explored a solution focusing on creating quickly formulated models regrounded automatically and recurrently as new data becomes available. Given a latent underlying case count, we assumed that observed incident case counts followed a negative binomial distribution. In accordance with the condensation algorithm, each such observation led to updating of particle weights. We evaluated the effectiveness of various particle filtering configurations against each other and against an approach without particle filtering according to the accuracy of the model in predicting future prevalence, given data to a certain point and a norm-based discrepancy metric. We examined the effectiveness of particle filtering under varying times between observations, negative binomial dispersion parameters, and rates with which the contact rate could evolve. We observed that more frequent observations of empirical data yielded super-linearly improved accuracy in model predictions. We further found that for the data studied here, the most favourable assumptions to make regarding the parameters associated with the negative binomial distribution and changes in contact rate were robust across observation frequency and the observation point in the outbreak. Combining dynamic models with particle filtering can perform well in projecting future evolution of an outbreak. Most importantly, the remarkable improvements in predictive accuracy resulting from more frequent sampling suggest that investments to achieve efficient reporting mechanisms may be more than paid back by improved planning capacity. The robustness of the results on particle filter configuration in this case study suggests that it may be possible to formulate effective standard guidelines and regularized approaches for such techniques in particular epidemiological contexts. Most importantly, the work tentatively suggests potential for health decision makers to secure strong guidance when anticipating outbreak evolution for emerging infectious diseases by combining even very rough models with particle filtering method.
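A compact sketch of a condensation-style update with a negative binomial observation model on a latent case count; the process model, dispersion, particle count, and observations are all hypothetical:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(11)
    P = 2000                                    # number of particles

    state = np.full(P, 50.0)                    # latent case count per particle
    weights = np.full(P, 1.0 / P)

    def assimilate(obs, state, weights, dispersion=5.0):
        """Propagate the latent count with a log-scale random walk, then
        reweight by the negative binomial likelihood of the observation
        (mean = latent count, size = dispersion)."""
        state = state * np.exp(rng.normal(0.0, 0.1, state.size))
        r = dispersion
        weights = weights * stats.nbinom.pmf(obs, r, r / (r + state))
        weights = weights / weights.sum()
        if 1.0 / (weights ** 2).sum() < state.size / 2:   # low effective size
            idx = rng.choice(state.size, state.size, p=weights)
            state, weights = state[idx], np.full(state.size, 1.0 / state.size)
        return state, weights

    for obs in [48, 55, 70, 90, 85]:            # hypothetical weekly reports
        state, weights = assimilate(obs, state, weights)
    print("posterior mean latent count:", (state * weights).sum())

Running the update more often, as the study found, is what keeps the particle cloud regrounded to the incoming data.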
Núñez, Julio; Rabinovich, Gabriel A.; Sandino, Justo; Mainar, Luis; Palau, Patricia; Santas, Enrique; Villanueva, Maria Pilar; Núñez, Eduardo; Bodí, Vicent; Chorro, Francisco J.; Miñana, Gema; Sanchis, Juan
2015-01-01
Aims Galectin-3 (Gal-3) and carbohydrate antigen 125 (CA125) have emerged as robust prognostic biomarkers in heart failure. Experimental data have also suggested a potential molecular interaction between CA125 and Gal-3; however, the biological and clinical relevance of this interaction is still uncertain. We sought to evaluate, in patients admitted for acute heart failure, the association of plasma Gal-3 with all-cause mortality and the risk of rehospitalization at high and low levels of CA125. Methods and Results We included 264 consecutive patients admitted for acute heart failure to the Cardiology Department of a third-level center. Both biomarkers were measured on admission. Negative binomial and Cox regression models were used to evaluate the prognostic effect of the interaction between Gal-3 and CA125 (dichotomized at its median) on hospital readmission and all-cause mortality, respectively. During a median follow-up of 2 years (IQR = 1-2.8), 108 patient deaths (40.9%) and 365 rehospitalizations in 171 (69.5%) patients were registered. In a multivariable setting, the effect of Gal-3 on mortality and rehospitalization was differentially mediated by CA125 (p = 0.007 and p < 0.001, respectively). Indeed, in patients with CA125 above the median (>67 U/ml), values across the continuum of Gal-3 showed a positive and almost linear relationship with the risk of death or rehospitalization. Conversely, when CA125 was below the median (≤67 U/ml), Gal-3 lacked any prognostic effect on both endpoints. Conclusion In patients with acute heart failure, Gal-3 was strongly associated with higher risk of long-term mortality and repeated rehospitalizations, but only in those patients exhibiting higher values of CA125 (above 67 U/ml). PMID:25875367
The Binomial Distribution in Shooting
ERIC Educational Resources Information Center
Chalikias, Miltiadis S.
2009-01-01
The binomial distribution is used to predict the winner of the 49th International Shooting Sport Federation World Championship in double trap shooting held in 2006 in Zagreb, Croatia. The outcome of the competition was definitely unexpected.
Preisser, John S; Long, D Leann; Stamm, John W
2017-01-01
Marginalized zero-inflated count regression models have recently been introduced for the statistical analysis of dental caries indices and other zero-inflated count data as alternatives to traditional zero-inflated and hurdle models. Unlike the standard approaches, the marginalized models directly estimate overall exposure or treatment effects by relating covariates to the marginal mean count. This article discusses model interpretation and model class choice according to the research question being addressed in caries research. Two data sets, one consisting of fictional dmft counts in 2 groups and the other on DMFS among schoolchildren from a randomized clinical trial comparing 3 toothpaste formulations to prevent incident dental caries, are analyzed with negative binomial hurdle, zero-inflated negative binomial, and marginalized zero-inflated negative binomial models. In the first example, estimates of treatment effects vary according to the type of incidence rate ratio (IRR) estimated by the model. Estimates of IRRs in the analysis of the randomized clinical trial were similar despite their distinctive interpretations. The choice of statistical model class should match the study's purpose, while accounting for the broad decline in children's caries experience, such that dmft and DMFS indices more frequently generate zero counts. Marginalized (marginal mean) models for zero-inflated count data should be considered for direct assessment of exposure effects on the marginal mean dental caries count in the presence of high frequencies of zero counts. © 2017 S. Karger AG, Basel.
The magnetisation distribution of the Ising model - a new approach
NASA Astrophysics Data System (ADS)
Hakan Lundow, Per; Rosengren, Anders
2010-03-01
A completely new approach to the Ising model in 1 to 5 dimensions is developed. We employ a generalisation of the binomial coefficients to describe the magnetisation distributions of the Ising model. For the complete graph this distribution is exact. For simple lattices of dimensions d=1 and d=5 the magnetisation distributions are remarkably well fitted by the generalised binomial distributions. For d=4 we are only slightly less successful, while for d=2,3 we see some deviations (with exceptions!) between the generalised binomial and the Ising distributions. The results speak in favour of the generalised binomial distributions' correctness regarding their general behaviour in comparison to the Ising model. A theoretical analysis of the distributions' moments also lends support to their being asymptotically correct, including the logarithmic corrections in d=4. The full extent to which they correctly model the Ising distribution, and for which graph families, is not settled though.
Abstract knowledge versus direct experience in processing of binomial expressions
Morgan, Emily; Levy, Roger
2016-01-01
We ask whether word order preferences for binomial expressions of the form A and B (e.g. bread and butter) are driven by abstract linguistic knowledge of ordering constraints referencing the semantic, phonological, and lexical properties of the constituent words, or by prior direct experience with the specific items in question. Using forced-choice and self-paced reading tasks, we demonstrate that online processing of never-before-seen binomials is influenced by abstract knowledge of ordering constraints, which we estimate with a probabilistic model. In contrast, online processing of highly frequent binomials is primarily driven by direct experience, which we estimate from corpus frequency counts. We propose a trade-off wherein processing of novel expressions relies upon abstract knowledge, while reliance upon direct experience increases with increased exposure to an expression. Our findings support theories of language processing in which both compositional generation and direct, holistic reuse of multi-word expressions play crucial roles. PMID:27776281
Calibration of short rate term structure models from bid-ask coupon bond prices
NASA Astrophysics Data System (ADS)
Gomes-Gonçalves, Erika; Gzyl, Henryk; Mayoral, Silvia
2018-02-01
In this work we use the method of maximum entropy in the mean to provide a model-free, non-parametric methodology that uses only market data to obtain the prices of zero coupon bonds and, from them, a term structure of short rates. The data used consist of the bid-ask price ranges of a few coupon bonds quoted in the market. The zero coupon bond prices obtained in the first stage are then used as input to solve a recursive set of equations to determine a binomial recombinant model of the short-rate term structure.
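The first stage can be illustrated with a toy example: recover zero coupon prices from bid-ask midpoints of coupon bonds and read off per-period short rates. All bond data below are invented, and the actual paper resolves this stage by maximum entropy in the mean over the bid-ask ranges rather than by exact inversion:

    import numpy as np

    # Three annual-pay coupon bonds maturing in 1, 2, 3 years;
    # prices are bid-ask midpoints per 100 face value (numbers invented).
    mids = np.array([(100.8 + 101.2), (101.9 + 102.5), (102.7 + 103.5)]) / 2.0

    # Cash-flow matrix: row i = payments of bond i at t = 1, 2, 3
    C = np.array([[102.0,   0.0,   0.0],
                  [  3.0, 103.0,   0.0],
                  [  4.0,   4.0, 104.0]])

    # Zero coupon prices d solve C d = mid prices
    d = np.linalg.solve(C, mids)
    # Per-period short (forward) rates implied by the discount factors
    short = np.r_[1.0 / d[0] - 1.0, d[:-1] / d[1:] - 1.0]
    print("zero prices:", d.round(4))
    print("short rates:", short.round(4))

These short rates are the inputs that the recursive second stage would spread across a recombining binomial lattice.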
Oral health of schoolchildren in Western Australia.
Arrow, P
2016-09-01
The West Australian School Dental Service (SDS) provides free, statewide, primary dental care to schoolchildren aged 5-17 years. This study reports on an evaluation of the oral health of children examined during the 2014 calendar year. Children were sampled, based on their date of birth, and SDS clinicians collected the clinical information. Weighted mean values of caries experience were presented. Negative binomial regression modelling was undertaken to test for factors of significance in the rate of caries occurrence. Data from children aged 5-15 years were used (girls = 4616, boys = 4900). Mean dmft (5-10-year-olds), 1.42 SE 0.03; mean DMFT (6-15-year-olds), 0.51 SE 0.01. Negative binomial regression model of permanent tooth caries found higher rates of caries in children who were from non-fluoridated areas (RR 2.1); Aboriginal (RR 2.4); had gingival inflammation (RR 1.5); lower ICSEA level (RR 1.4); and recalled at more than 24-month interval (RR 1.8). The study highlighted poor dental health associated with living in non-fluoridated areas, Aboriginal identity, poor oral hygiene, lower socioeconomic level and having extended intervals between dental checkups. Timely assessments and preventive measures targeted at groups, including extending community water fluoridation, may assist in further improving the oral health of children in Western Australia. © 2015 Australian Dental Association.
Community covariates of malnutrition based mortality among older adults.
Lee, Matthew R; Berthelot, Emily R
2010-05-01
The purpose of this study was to identify community level covariates of malnutrition-based mortality among older adults. A community level framework was delineated which explains rates of malnutrition-related mortality among older adults as a function of community levels of socioeconomic disadvantage, disability, and social isolation among members of this group. County level data on malnutrition mortality of people 65 years of age and older for the period 2000-2003 were drawn from the CDC WONDER system databases. County level measures of older adult socioeconomic disadvantage, disability, and social isolation were derived from the 2000 US Census of Population and Housing. Negative binomial regression models adjusting for the size of the population at risk, racial composition, urbanism, and region were estimated to assess the relationships among these indicators. Results from negative binomial regression analysis yielded the following: a standard deviation increase in socioeconomic/physical disadvantage was associated with a 12% increase in the rate of malnutrition mortality among older adults (p < 0.001), whereas a standard deviation increase in social isolation was associated with a 5% increase in malnutrition mortality among older adults (p < 0.05). Community patterns of malnutrition based mortality among older adults are partly a function of levels of socioeconomic and physical disadvantage and social isolation among older adults. 2010 Elsevier Inc. All rights reserved.
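Such county-level rate models are typically negative binomial regressions with the population at risk entering as an exposure offset; a hedged sketch on simulated data (all covariates and coefficients are invented):

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(5)
    n = 200                                     # hypothetical counties
    pop65 = rng.integers(2_000, 60_000, n)      # older-adult population at risk
    disadvantage = rng.normal(0.0, 1.0, n)      # standardized z-scores
    isolation = rng.normal(0.0, 1.0, n)

    # True model: the death rate rises 12% per SD of disadvantage and
    # 5% per SD of isolation (coefficients invented)
    mu = np.exp(np.log(pop65) - 9.0 + 0.12 * disadvantage + 0.05 * isolation)
    deaths = rng.negative_binomial(3.0, 3.0 / (3.0 + mu))

    X = sm.add_constant(np.column_stack([disadvantage, isolation]))
    m = sm.GLM(deaths, X, family=sm.families.NegativeBinomial(alpha=1.0 / 3.0),
               offset=np.log(pop65)).fit()
    print(np.exp(m.params))   # exponentiated coefficients = rate ratios per SD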
Choi, Seung Hoan; Labadorf, Adam T; Myers, Richard H; Lunetta, Kathryn L; Dupuis, Josée; DeStefano, Anita L
2017-02-06
Next generation sequencing provides a count of RNA molecules in the form of short reads, yielding discrete, often highly non-normally distributed gene expression measurements. Although Negative Binomial (NB) regression has been generally accepted in the analysis of RNA sequencing (RNA-Seq) data, its appropriateness has not been exhaustively evaluated. We explore logistic regression as an alternative method for RNA-Seq studies designed to compare cases and controls, where disease status is modeled as a function of RNA-Seq reads using simulated and Huntington disease data. We evaluate the effect of adjusting for covariates that have an unknown relationship with gene expression. Finally, we incorporate the data adaptive method in order to compare false positive rates. When the sample size is small or the expression levels of a gene are highly dispersed, the NB regression shows inflated Type-I error rates but the Classical logistic and Bayes logistic (BL) regressions are conservative. Firth's logistic (FL) regression performs well or is slightly conservative. Large sample size and low dispersion generally make Type-I error rates of all methods close to nominal alpha levels of 0.05 and 0.01. However, Type-I error rates are controlled after applying the data adaptive method. The NB, BL, and FL regressions gain increased power with large sample size, large log2 fold-change, and low dispersion. The FL regression has comparable power to NB regression. We conclude that implementing the data adaptive method appropriately controls Type-I error rates in RNA-Seq analysis. Firth's logistic regression provides a concise statistical inference process and reduces spurious associations from inaccurately estimated dispersion parameters in the negative binomial framework.
Homicide mortality rates in Canada, 2000-2009: Youth at increased risk.
Basham, C Andrew; Snider, Carolyn
2016-10-20
To estimate and compare Canadian homicide mortality rates (HMRs) and trends in HMRs across age groups, with a focus on trends for youth. Data for the period of 2000 to 2009 were collected from Statistics Canada's CANSIM (Canadian Statistical Information Management) Table 102-0540 with the following ICD-10-CA coded external causes of death: X85 to Y09 (assault) and Y87.1 (sequelae of assault). Annual population counts from 2000 to 2009 were obtained from Statistics Canada's CANSIM Table 051-0001. Both death and population counts were organized into five-year age groups. A random effects negative binomial regression analysis was conducted to estimate age group-specific rates, rate ratios, and trends in homicide mortality. There were 9,878 homicide deaths in Canada during the study period. The increase in the overall homicide mortality rate (HMR) of 0.3% per year was not statistically significant (95% CI: -1.1% to +1.8%). Canadians aged 15-19 years and 20-24 years had the highest HMRs during the study period, and experienced statistically significant annual increases in their HMRs of 3% and 4% respectively (p < 0.05). A general, though not statistically significant, decrease in the HMR was observed for all age groups 50+ years. A fixed effects negative binomial regression model showed that the HMR for males was higher than for females over the study period [RRfemale/male = 0.473 (95% CI: 0.361, 0.621)], but no significant difference in sex-specific trends in the HMR was found. An increasing risk of homicide mortality was identified among Canadian youth, ages 15-24, over the 10-year study period. Research that seeks to understand the reasons for the increased homicide risk facing Canada's youth, and public policy responses to reduce this risk, are warranted.
On extinction time of a generalized endemic chain-binomial model.
Aydogmus, Ozgur
2016-09-01
We considered a chain-binomial epidemic model not conferring immunity after infection. Mean field dynamics of the model has been analyzed and conditions for the existence of a stable endemic equilibrium are determined. The behavior of the chain-binomial process is probabilistically linked to the mean field equation. As a result of this link, we were able to show that the mean extinction time of the epidemic increases at least exponentially as the population size grows. We also present simulation results for the process to validate our analytical findings. Copyright © 2016 Elsevier Inc. All rights reserved.
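A minimal Reed-Frost-style simulation of a chain binomial without immunity, illustrating how mean extinction time grows rapidly with population size; the transmission parameter and population sizes are arbitrary, and this is a sketch rather than the paper's exact model:

    import numpy as np

    def extinction_time(N, beta, i0=5, rng=None, tmax=20_000):
        """Chain binomial without immunity: each generation all infectives
        recover, and each of the N - I susceptibles is infected with
        probability 1 - (1 - beta/N)**I."""
        rng = rng or np.random.default_rng()
        I, t = i0, 0
        while I > 0 and t < tmax:
            I = rng.binomial(N - I, 1.0 - (1.0 - beta / N) ** I)
            t += 1
        return t

    rng = np.random.default_rng(2)
    for N in (20, 40, 80):
        times = [extinction_time(N, beta=1.1, rng=rng) for _ in range(100)]
        print(N, np.mean(times))   # mean extinction time grows rapidly with N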
Density of wild prey modulates lynx kill rates on free-ranging domestic sheep.
Odden, John; Nilsen, Erlend B; Linnell, John D C
2013-01-01
Understanding the factors shaping the dynamics of carnivore-livestock conflicts is vital to facilitate large carnivore conservation in multi-use landscapes. We investigated how the density of their main wild prey, roe deer Capreolus capreolus, modulates individual Eurasian lynx Lynx lynx kill rates on free-ranging domestic sheep Ovis aries across a range of sheep and roe deer densities. Lynx kill rates on free-ranging domestic sheep were collected in south-eastern Norway from 1995 to 2011 along a gradient of different livestock and wild prey densities using VHF and GPS telemetry. We used zero-inflated negative binomial (ZINB) models including lynx sex, sheep density and an index of roe deer density as explanatory variables to model observed kill rates on sheep, and ranked the models based on their AICc values. The model including the effects of lynx sex and sheep density in the zero-inflation model and the effect of lynx sex and roe deer density in the negative binomial part received most support. Irrespective of sheep density and sex, we found the lowest sheep kill rates in areas with high densities of roe deer. As roe deer density decreased, males killed sheep at higher rates, and this pattern held for both high and low sheep densities. Similarly, females killed sheep at higher rates in areas with high densities of sheep and low densities of roe deer. However, when sheep densities were low females rarely killed sheep irrespective of roe deer density. Our quantification of depredation rates can be the first step towards establishing fairer compensation systems based on more accurate and area specific estimation of losses. This study demonstrates how we can use ecological theory to predict where losses of sheep will be greatest, and can be used to identify areas where mitigation measures are most likely to be needed.
Prolonged resuscitation of metabolic acidosis after trauma is associated with more complications.
Weinberg, Douglas S; Narayanan, Arvind S; Moore, Timothy A; Vallier, Heather A
2015-09-24
Optimal patterns for fluid management are controversial in the resuscitation of major trauma. Similarly, appropriate surgical timing is often unclear in orthopedic polytrauma. Early appropriate care (EAC) has recently been introduced as an objective model to determine readiness for surgery based on the resuscitation of metabolic acidosis. EAC is an objective treatment algorithm that recommends fracture fixation within 36 h when lactate is <4.0 mmol/L, pH is ≥7.25, or base excess (BE) is ≥-5.5 mmol/L. The aim of this study is to better characterize the relationship between post-operative complications and the time required for resuscitation of metabolic acidosis using EAC. At an adult level 1 trauma center, 332 patients with major trauma (Injury Severity Score (ISS) ≥16) were prospectively treated with EAC. The time from injury to EAC resuscitation was determined in all patients. Age, race, gender, ISS, American Society of Anesthesiologists score (ASA), body mass index (BMI), outside hospital transfer status, number of fractures, and the specific fractures were also reviewed. Complications in the 6-month post-operative period were adjudicated by an independent multidisciplinary committee of trauma physicians and included infection, sepsis, pulmonary embolism, deep venous thrombosis, renal failure, multiorgan failure, pneumonia, and acute respiratory distress syndrome. Univariate analysis and binomial logistic regression analysis were used to compare complications between groups. Sixty-six patients (19.9%) developed complications, less than in a historical cohort of 1,441 patients (22.1%). ISS (p < 0.0005) and time to EAC resuscitation (p = 0.041) were independent predictors of complication rate. A 2.7-h increase in time to resuscitation had odds for sustaining a complication equivalent to a 1-unit increase on the ISS. EAC guidelines were safe, effective, and practically implemented in a level 1 trauma center. During the resuscitation course, increased exposure to acidosis was associated with a higher complication rate. Identifying the innate differences in the response, regulation, and resolution of acidosis in these critically injured patients is an important area for trauma research. Level 1: prognostic study.
Lara, Jesus R; Hoddle, Mark S
2015-08-01
Oligonychus perseae Tuttle, Baker, & Abatiello is a foliar pest of 'Hass' avocados [Persea americana Miller (Lauraceae)]. The recommended action threshold is 50-100 motile mites per leaf, but this count range and other ecological factors associated with O. perseae infestations limit the application of enumerative sampling plans in the field. Consequently, a comprehensive modeling approach was implemented to compare the practical application of various binomial sampling models for decision-making for O. perseae in California. An initial set of sequential binomial sampling models was developed using three mean-proportion modeling techniques (i.e., Taylor's power law, maximum likelihood, and an empirical model) in combination with two leaf-infestation tally thresholds of either one or two mites. Model performance was evaluated using a robust mite count database consisting of >20,000 Hass avocado leaves infested with varying densities of O. perseae and collected from multiple locations. Operating characteristic and average sample number results for sequential binomial models were used as the basis to develop and validate a standardized fixed-size binomial sampling model with guidelines on sample tree and leaf selection within blocks of avocado trees. This final validated model requires a leaf sampling cost of 30 leaves and takes into account the spatial dynamics of O. perseae to make reliable mite density classifications for a 50-mite action threshold. Recommendations for implementing this fixed-size binomial sampling plan to assess densities of O. perseae in commercial California avocado orchards are discussed. © The Authors 2015. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
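The operating characteristic of a fixed-size binomial plan follows directly from the binomial tail; a sketch with invented plan constants (30 leaves sampled, a treat decision at 12 or more infested leaves), not the published plan's calibration:

    import numpy as np
    from scipy import stats

    def p_treat(p_infested, n_leaves=30, crit=12):
        """Probability the plan classifies a block as over threshold when at
        least `crit` of `n_leaves` sampled leaves are 'infested' (exceed the
        per-leaf mite tally). Plan constants here are hypothetical."""
        return stats.binom.sf(crit - 1, n_leaves, p_infested)

    for p in np.arange(0.1, 1.0, 0.1):
        print(round(p, 1), round(p_treat(p), 3))

Plotting this curve against the proportion of infested leaves corresponding to the 50-mite action threshold is how such plans are typically evaluated.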
Using the Binomial Series to Prove the Arithmetic Mean-Geometric Mean Inequality
ERIC Educational Resources Information Center
Persky, Ronald L.
2003-01-01
In 1968, Leon Gerber compared (1 + x)^a to its kth partial sum as a binomial series. His result is stated and, as an application of this result, a proof of the arithmetic mean-geometric mean inequality is presented.
Four Bootstrap Confidence Intervals for the Binomial-Error Model.
ERIC Educational Resources Information Center
Lin, Miao-Hsiang; Hsiung, Chao A.
1992-01-01
Four bootstrap methods are identified for constructing confidence intervals for the binomial-error model. The extent to which similar results are obtained and the theoretical foundation of each method and its relevance and ranges of modeling the true score uncertainty are discussed. (SLD)
Possibility and Challenges of Conversion of Current Virus Species Names to Linnaean Binomials.
Postler, Thomas S; Clawson, Anna N; Amarasinghe, Gaya K; Basler, Christopher F; Bavari, Sbina; Benko, Mária; Blasdell, Kim R; Briese, Thomas; Buchmeier, Michael J; Bukreyev, Alexander; Calisher, Charles H; Chandran, Kartik; Charrel, Rémi; Clegg, Christopher S; Collins, Peter L; Juan Carlos, De La Torre; Derisi, Joseph L; Dietzgen, Ralf G; Dolnik, Olga; Dürrwald, Ralf; Dye, John M; Easton, Andrew J; Emonet, Sébastian; Formenty, Pierre; Fouchier, Ron A M; Ghedin, Elodie; Gonzalez, Jean-Paul; Harrach, Balázs; Hewson, Roger; Horie, Masayuki; Jiang, Dàohóng; Kobinger, Gary; Kondo, Hideki; Kropinski, Andrew M; Krupovic, Mart; Kurath, Gael; Lamb, Robert A; Leroy, Eric M; Lukashevich, Igor S; Maisner, Andrea; Mushegian, Arcady R; Netesov, Sergey V; Nowotny, Norbert; Patterson, Jean L; Payne, Susan L; PaWeska, Janusz T; Peters, Clarence J; Radoshitzky, Sheli R; Rima, Bertus K; Romanowski, Victor; Rubbenstroth, Dennis; Sabanadzovic, Sead; Sanfaçon, Hélène; Salvato, Maria S; Schwemmle, Martin; Smither, Sophie J; Stenglein, Mark D; Stone, David M; Takada, Ayato; Tesh, Robert B; Tomonaga, Keizo; Tordo, Noël; Towner, Jonathan S; Vasilakis, Nikos; Volchkov, Viktor E; Wahl-Jensen, Victoria; Walker, Peter J; Wang, Lin-Fa; Varsani, Arvind; Whitfield, Anna E; Zerbini, F Murilo; Kuhn, Jens H
2017-05-01
Botanical, mycological, zoological, and prokaryotic species names follow the Linnaean format, consisting of an italicized Latinized binomen with a capitalized genus name and a lower case species epithet (e.g., Homo sapiens). Virus species names, however, do not follow a uniform format, and, even when binomial, are not Linnaean in style. In this thought exercise, we attempted to convert all currently official names of species included in the virus family Arenaviridae and the virus order Mononegavirales to Linnaean binomials, and to identify and address associated challenges and concerns. Surprisingly, this endeavor was not as complicated or time-consuming as even the authors of this article expected when conceiving the experiment. [Arenaviridae; binomials; ICTV; International Committee on Taxonomy of Viruses; Mononegavirales; virus nomenclature; virus taxonomy.]. Published by Oxford University Press on behalf of Society of Systematic Biologists 2016. This work is written by a US Government employee and is in the public domain in the US.
The arcsine is asinine: the analysis of proportions in ecology.
Warton, David I; Hui, Francis K C
2011-01-01
The arcsine square root transformation has long been standard procedure when analyzing proportional data in ecology, with applications in data sets containing binomial and non-binomial response variables. Here, we argue that the arcsine transform should not be used in either circumstance. For binomial data, logistic regression has greater interpretability and higher power than analyses of transformed data. However, it is important to check the data for additional unexplained variation, i.e., overdispersion, and to account for it via the inclusion of random effects in the model if found. For non-binomial data, the arcsine transform is undesirable on the grounds of interpretability, and because it can produce nonsensical predictions. The logit transformation is proposed as an alternative approach to address these issues. Examples are presented in both cases to illustrate these advantages, comparing various methods of analyzing proportions including untransformed, arcsine- and logit-transformed linear models and logistic regression (with or without random effects). Simulations demonstrate that logistic regression usually provides a gain in power over other methods.
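The power comparison can be reproduced in miniature on simulated binomial proportions (trial counts and coefficients invented), assuming the statsmodels OLS and GLM interfaces:

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(9)
    n_trials, n_obs = 20, 150
    x = rng.uniform(-2.0, 2.0, n_obs)
    p = 1.0 / (1.0 + np.exp(-(0.5 + 1.2 * x)))     # true logistic relationship
    y = rng.binomial(n_trials, p)                  # binomial counts

    X = sm.add_constant(x)

    # Traditional approach: linear model on arcsine-square-root proportions
    z = np.arcsin(np.sqrt(y / n_trials))
    lm = sm.OLS(z, X).fit()

    # Recommended approach: binomial GLM (logistic regression) on counts
    glm = sm.GLM(np.column_stack([y, n_trials - y]), X,
                 family=sm.families.Binomial()).fit()

    print("arcsine LM slope p-value:  ", lm.pvalues[1])
    print("logistic GLM slope p-value:", glm.pvalues[1])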
Alekseeva, N P; Alekseev, A O; Vakhtin, Iu B; Kravtsov, V Iu; Kuzovatov, S N; Skorikova, T I
2008-01-01
Distributions of nuclear morphology anomalies in transplantable rhabdomyosarcoma RA-23 cell populations were investigated under the effect of ionizing radiation from 0 to 45 Gy. Internuclear bridges, nuclear protrusions, and dumbbell-shaped nuclei were taken as morphological anomalies. Empirical distributions of the number of anomalies per 100 nuclei were used. A reentrant binomial distribution was found to be an adequate model: the sum of a binomially distributed number of binomial random variables has such a distribution. The averages of these random variables were named the internal and external average reentrant components, respectively. Their maximum likelihood estimates were obtained, and the statistical properties of these estimates were investigated by means of statistical modeling. It was found that, while the radiation dose correlates equally significantly with the average number of nuclear anomalies in cell populations two to three cell cycles after irradiation in vivo, the dose correlates significantly with the internal average reentrant component, whereas in remote descendants of cell transplants irradiated in vitro it correlates with the external one.
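The reentrant construction is easy to simulate; a small sketch with invented parameters, using the fact that a sum of M iid Bin(n, p) variables is Bin(nM, p):

    import numpy as np

    rng = np.random.default_rng(4)
    # Reentrant binomial: S is a sum of M binomial variables, where the
    # number of summands M is itself binomial (all parameters invented).
    n_outer, p_outer = 100, 0.05    # M ~ Bin(100, 0.05): external component
    n_inner, p_inner = 10, 0.30     # each summand ~ Bin(10, 0.30): internal

    M = rng.binomial(n_outer, p_outer, size=100_000)
    S = rng.binomial(n_inner * M, p_inner)   # sum of M iid Bin(n, p)
    print("simulated mean:  ", S.mean())
    print("theoretical mean:", n_outer * p_outer * n_inner * p_inner)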
The Difference Calculus and the Negative Binomial Distribution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowman, Kimiko o; Shenton, LR
2007-01-01
In a previous paper we state the dominant term in the third central moment of the maximum likelihood estimator of the parameter k in the negative binomial probability function, where the probability generating function is (p + 1 - pt)^(-k). A partial sum of the series Σ 1/(k + x)^3 is involved, where x is a negative binomial random variate. In expectation this sum can only be found numerically using the computer. Here we give a simple definite integral on (0,1) for the generalized case. This means that we now have a valid expression for √β₁₁(k) and √β₁₁(p). In addition we use the finite difference operator Δ, and E = 1 + Δ, to set up formulas for low-order moments. Other examples of the operators are quoted relating to the orthogonal set of polynomials associated with the negative binomial probability function used as a weight function.
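As an illustration of how a definite integral on (0,1) can replace the numerical summation (a standard identity, not necessarily the authors' exact expression): since \int_0^1 t^{a-1}(\ln(1/t))^2\,dt = 2/a^3, taking a = k + x and using the stated probability generating function gives

    \mathbb{E}\!\left[\frac{1}{(k+X)^{3}}\right]
      = \sum_{x \ge 0} \frac{\Pr(X = x)}{(k+x)^{3}}
      = \frac{1}{2}\int_{0}^{1} t^{k-1}\left(\ln\frac{1}{t}\right)^{2}
        \mathbb{E}\!\left[t^{X}\right] dt
      = \frac{1}{2}\int_{0}^{1} t^{k-1}\left(\ln\frac{1}{t}\right)^{2}
        (p + 1 - pt)^{-k}\, dt,

which can then be evaluated by ordinary quadrature on (0,1).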
Buffon, Georges Louis Leclerc Comte de (1707-88)
NASA Astrophysics Data System (ADS)
Murdin, P.
2000-11-01
French naturalist. Discovered the binomial theorem and worked on probability theory. In astronomy he suggested that the Earth might have been created by the collision of a comet with the Sun. Based on the cooling rate of iron, he calculated in Théorie de la Terre that the age of the Earth was 75 000 years. This estimate, so much larger than the official 6000 years, was condemned by the Catholic Church.
NASA Astrophysics Data System (ADS)
Leier, André; Marquez-Lago, Tatiana T.; Burrage, Kevin
2008-05-01
The delay stochastic simulation algorithm (DSSA) by Barrio et al. [Plos Comput. Biol. 2, 117(E) (2006)] was developed to simulate delayed processes in cell biology in the presence of intrinsic noise, that is, when there are small-to-moderate numbers of certain key molecules present in a chemical reaction system. These delayed processes can faithfully represent complex interactions and mechanisms that imply a number of spatiotemporal processes often not explicitly modeled such as transcription and translation, basic in the modeling of cell signaling pathways. However, for systems with widely varying reaction rate constants or large numbers of molecules, the simulation time steps of both the stochastic simulation algorithm (SSA) and the DSSA can become very small causing considerable computational overheads. In order to overcome the limit of small step sizes, various τ-leap strategies have been suggested for improving computational performance of the SSA. In this paper, we present a binomial τ-DSSA method that extends the τ-leap idea to the delay setting and avoids drawing insufficient numbers of reactions, a common shortcoming of existing binomial τ-leap methods that becomes evident when dealing with complex chemical interactions. The resulting inaccuracies are most evident in the delayed case, even when considering reaction products as potential reactants within the same time step in which they are produced. Moreover, we extend the framework to account for multicellular systems with different degrees of intercellular communication. We apply these ideas to two important genetic regulatory models, namely, the hes1 gene, implicated as a molecular clock, and a Her1/Her 7 model for coupled oscillating cells.
Turi, Christina E; Murch, Susan J
2013-07-09
Ethnobotanical research and the study of plants used for rituals, ceremonies and to connect with the spirit world have led to the discovery of many novel psychoactive compounds such as nicotine, caffeine, and cocaine. In North America, spiritual and ceremonial uses of plants are well documented and can be accessed online via the University of Michigan's Native American Ethnobotany Database. The objective of the study was to compare Residual, Bayesian, Binomial and Imprecise Dirichlet Model (IDM) analyses of ritual, ceremonial and spiritual plants in Moerman's ethnobotanical database and to identify genera that may be good candidates for the discovery of novel psychoactive compounds. The database was queried with the following format "Family Name AND Ceremonial OR Spiritual" for 263 North American botanical families. Spiritual and ceremonial flora consisted of 86 families with 517 species belonging to 292 genera. Spiritual taxa were then grouped further into ceremonial medicines and items categories. Residual, Bayesian, Binomial and IDM analyses were performed to identify over- and under-utilized families. The 4 statistical approaches were in good agreement when identifying under-utilized families, but large families (>393 species) were underemphasized for over-utilization by the Binomial, Bayesian and IDM approaches. Residual, Binomial, and IDM analyses identified similar families as over-utilized in the medium (92-392 species) and small (<92 species) classes. The families Apiaceae, Asteraceae, Ericaceae, Pinaceae and Salicaceae were identified as significantly over-utilized as ceremonial medicines among medium and large families. Analysis of genera within the Apiaceae and Asteraceae suggests that the genera Ligusticum and Artemisia are good candidates for facilitating the discovery of novel psychoactive compounds. The 4 statistical approaches were not consistent in identifying over-utilized flora. Residual analysis revealed overall trends that were supported by Binomial analysis when families were separated into small, medium and large classes. The Bayesian, Binomial and IDM approaches identified different genera as potentially important. Species belonging to the genera Artemisia and Ligusticum were most consistently identified and may be valuable in future ethnopharmacological studies. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
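The binomial analysis of over-utilization reduces to a one-sided binomial test of a family's ceremonial-species count against its share of the total flora; a sketch with invented counts:

    from scipy import stats

    # Hypothetical counts: a family contributing 45 ceremonial species out of
    # its 250 species, against a flora-wide baseline of 517 ceremonial species
    # among 18,000 species (numbers invented for illustration).
    k, n = 45, 250
    baseline = 517.0 / 18000.0

    res = stats.binomtest(k, n, baseline, alternative="greater")
    print("one-sided p-value for over-utilization:", res.pvalue)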
Aitken, C G
1999-07-01
It is thought that, in a consignment of discrete units, a certain proportion of the units contain illegal material. A sample of the consignment is to be inspected. Various methods for the determination of the sample size are compared. The consignment will be considered as a random sample from some super-population of units, a certain proportion of which contain drugs. For large consignments, a probability distribution, known as the beta distribution, for the proportion of the consignment which contains illegal material is obtained. This distribution is based on prior beliefs about the proportion. Under certain specific conditions the beta distribution gives the same numerical results as an approach based on the binomial distribution. The binomial distribution provides a probability for the number of units in a sample which contain illegal material, conditional on knowing the proportion of the consignment which contains illegal material. This is in contrast to the beta distribution which provides probabilities for the proportion of a consignment which contains illegal material, conditional on knowing the number of units in the sample which contain illegal material. The interpretation when the beta distribution is used is much more intuitively satisfactory. It is also much more flexible in its ability to cater for prior beliefs which may vary given the different circumstances of different crimes. For small consignments, a distribution, known as the beta-binomial distribution, for the number of units in the consignment which are found to contain illegal material, is obtained, based on prior beliefs about the number of units in the consignment which are thought to contain illegal material. As with the beta and binomial distributions for large samples, it is shown that, in certain specific conditions, the beta-binomial and hypergeometric distributions give the same numerical results. However, the beta-binomial distribution, as with the beta distribution, has a more intuitively satisfactory interpretation and greater flexibility. The beta and the beta-binomial distributions provide methods for the determination of the minimum sample size to be taken from a consignment in order to satisfy a certain criterion. The criterion requires the specification of a proportion and a probability.
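As a concrete illustration of the large-consignment criterion, the following sketch (a reconstruction under stated assumptions, not Aitken's code) finds the smallest sample size m such that, if every inspected unit contains drugs, the Beta posterior assigns the required probability to the proportion exceeding a target value; the uniform Beta(1, 1) prior is only one possible encoding of prior beliefs.

```python
from scipy.stats import beta

def min_sample_size(theta, prob_required, a=1.0, b=1.0, m_max=500):
    # Posterior after m inspected units, all positive: Beta(a + m, b)
    for m in range(1, m_max + 1):
        if beta.sf(theta, a + m, b) >= prob_required:  # P(proportion > theta)
            return m
    return None

# Smallest m with P(proportion > 0.5) >= 0.95 under a uniform prior: m = 4
print(min_sample_size(theta=0.5, prob_required=0.95))
```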
Heroic Reliability Improvement in Manned Space Systems
NASA Technical Reports Server (NTRS)
Jones, Harry W.
2017-01-01
System reliability can be significantly improved by a strong continued effort to identify and remove all the causes of actual failures. Newly designed systems often have unexpectedly high failure rates which can be reduced by successive design improvements until the final operational system has an acceptable failure rate. There are many causes of failures and many ways to remove them. New systems may have poor specifications, design errors, or mistaken operations concepts. Correcting unexpected problems as they occur can produce large early gains in reliability. Improved technology in materials, components, and design approaches can increase reliability. The reliability growth is achieved by repeatedly operating the system until it fails, identifying the failure cause, and fixing the problem. The failure rate reduction that can be obtained depends on the number and the failure rates of the correctable failures. Under the strong assumption that the failure causes can be removed, the decline in overall failure rate can be predicted. If a failure occurs at the rate of lambda per unit time, the expected time before the failure occurs and can be corrected is 1/lambda, the Mean Time Before Failure (MTBF). Finding and fixing a less frequent failure with the rate of lambda/2 per unit time requires twice as long, a time of 1/(2 lambda). Cutting the failure rate in half therefore requires doubling the test and redesign time spent finding and eliminating failure causes. Reducing the failure rate significantly requires a heroic reliability improvement effort.
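The doubling argument is easy to make concrete. The toy calculation below assumes a purely illustrative geometric ladder of failure-cause rates, each half as frequent as the last, and accumulates the expected exposure time needed to observe (and thus fix) each one.

```python
lam = 1.0e-3          # rate of the most frequent failure cause, per hour
total_time = 0.0
for k in range(5):
    rate = lam / 2 ** k          # next cause is half as frequent
    mtbf = 1.0 / rate            # expected hours of operation to observe it
    total_time += mtbf
    print(f"cause {k}: rate {rate:.2e}/h, "
          f"expected exposure {mtbf:,.0f} h, cumulative {total_time:,.0f} h")
```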
HYPERSAMP - HYPERGEOMETRIC ATTRIBUTE SAMPLING SYSTEM BASED ON RISK AND FRACTION DEFECTIVE
NASA Technical Reports Server (NTRS)
De, Salvo L. J.
1994-01-01
HYPERSAMP is a demonstration of an attribute sampling system developed to determine the minimum sample size required for any preselected value for consumer's risk and fraction of nonconforming. This statistical method can be used in place of MIL-STD-105E sampling plans when a minimum sample size is desirable, such as when tests are destructive or expensive. HYPERSAMP utilizes the Hypergeometric Distribution and can be used for any fraction nonconforming. The program employs an iterative technique that circumvents the obstacle presented by the factorial of a non-whole number. HYPERSAMP provides the required Hypergeometric sample size for any equivalent real number of nonconformances in the lot or batch under evaluation. Many currently used sampling systems, such as the MIL-STD-105E, utilize the Binomial or the Poisson equations as an estimate of the Hypergeometric when performing inspection by attributes. However, this is primarily because of the difficulty in calculation of the factorials required by the Hypergeometric. Sampling plans based on the Binomial or Poisson equations will result in the maximum sample size possible with the Hypergeometric. The difference in the sample sizes between the Poisson or Binomial and the Hypergeometric can be significant. For example, a lot size of 400 devices with an error rate of 1.0% and a confidence of 99% would require a sample size of 400 (all units would need to be inspected) for the Binomial sampling plan and only 273 for a Hypergeometric sampling plan. The Hypergeometric results in a savings of 127 units, a significant reduction in the required sample size. HYPERSAMP is a demonstration program and is limited to sampling plans with zero defectives in the sample (acceptance number of zero). Since it is only a demonstration program, the sample size determination is limited to sample sizes of 1500 or less. The Hypergeometric Attribute Sampling System demonstration code is a spreadsheet program written for IBM PC compatible computers running DOS and Lotus 1-2-3 or Quattro Pro. This program is distributed on a 5.25 inch 360K MS-DOS format diskette, and the program price includes documentation. This statistical method was developed in 1992.
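The zero-acceptance search HYPERSAMP performs can be approximated with a standard hypergeometric routine; the sketch below is not the original spreadsheet code, but it reproduces the 400-versus-273 comparison quoted in the abstract.

```python
from math import ceil, log
from scipy.stats import hypergeom

def hypergeometric_sample_size(lot_size, fraction_nc, confidence):
    defectives = ceil(lot_size * fraction_nc)     # nonconforming units in lot
    for n in range(1, lot_size + 1):
        # P(zero nonconforming in a sample of n), without replacement
        if hypergeom.pmf(0, lot_size, defectives, n) <= 1 - confidence:
            return n
    return lot_size

def binomial_sample_size(lot_size, fraction_nc, confidence):
    # Binomial approximation: (1 - p)^n <= 1 - confidence
    return min(ceil(log(1 - confidence) / log(1 - fraction_nc)), lot_size)

print(hypergeometric_sample_size(400, 0.01, 0.99))  # 273, as in the abstract
print(binomial_sample_size(400, 0.01, 0.99))        # 400: the whole lot
```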
Williams, Rachael; de Vries, Frank; Kothny, Wolfgang; Serban, Carmen; Lopez-Leon, Sandra; Chu, Changan; Schlienger, Raymond
2017-10-01
The aim of this non-interventional, multi-database, analytical cohort study was to assess the cardiovascular (CV) safety of vildagliptin vs other non-insulin antidiabetic drugs (NIADs) using real-world data from 5 European electronic healthcare databases. Patients with type 2 diabetes aged ≥18 years on NIAD treatment were enrolled. Adjusted incidence rate ratios (IRRs) and 95% confidence intervals (CIs) for the outcomes of interest (myocardial infarction [MI], acute coronary syndrome [ACS], stroke, congestive heart failure [CHF], individually and as a composite) were estimated using negative binomial regression. Approximately 2.8% of the enrolled patients (n = 738 054) used vildagliptin at any time during the study, with an average follow-up time of 1.4 years, resulting in a cumulative current vildagliptin exposure of 28 330 person-years. The adjusted IRRs (vildagliptin [±other NIADs] vs other NIADs) were in the range of 0.61 to 0.97 (MI), 0.55 to 1.60 (ACS), 0.02 to 0.77 (stroke), 0.49 to 1.03 (CHF), and 0.22 to 1.02 (composite CV outcomes). The IRRs and their 95% CIs were close to 1, demonstrating no increased risk of adverse CV events, including the risk of CHF, with vildagliptin vs other NIADs in real-world conditions. © 2017 Crown copyright. Diabetes, Obesity and Metabolism © 2017 John Wiley & Sons Ltd.
Selecting Tools to Model Integer and Binomial Multiplication
ERIC Educational Resources Information Center
Pratt, Sarah Smitherman; Eddy, Colleen M.
2017-01-01
Mathematics teachers frequently provide concrete manipulatives to students during instruction; however, the rationale for using certain manipulatives in conjunction with concepts may not be explored. This article focuses on area models that are currently used in classrooms to provide concrete examples of integer and binomial multiplication. The…
Cao, Qingqing; Wu, Zhenqiang; Sun, Ying; Wang, Tiezhu; Han, Tengwei; Gu, Chaomei; Sun, Yehuan
2011-11-01
To explore the application of negative binomial regression and modified Poisson regression in analyzing the factors influencing injury frequency and the risk factors associated with increased injury frequency. A total of 2917 primary and secondary school students were selected from Hefei by a cluster random sampling method and surveyed by questionnaire. Modified Poisson regression and negative binomial regression models were fitted to the count data on injury events, and the risk factors associated with increased unintentional injury frequency among the students were explored in order to assess the performance of the two models in studying the factors influencing injury frequency. A Lagrange multiplier test showed over-dispersion in the Poisson model (P < 0.0001); the over-dispersed data were therefore better fitted by the modified Poisson regression and negative binomial regression models. Both models showed that male gender, younger age, having a father working away from the hometown, a guardian with education above junior high school level, and smoking might be associated with higher injury frequencies. For clustered injury frequency data, both modified Poisson regression and negative binomial regression analyses can be used. However, based on our data, the modified Poisson regression fitted better, and this model could give a more accurate interpretation of the relevant factors affecting injury frequency.
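For readers who want to reproduce this kind of comparison, the sketch below fits both models to simulated over-dispersed counts with statsmodels; the covariates and effect sizes are invented stand-ins, not the Hefei survey data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2917
x = rng.integers(0, 2, size=(n, 2)).astype(float)      # e.g., male, smoker
X = sm.add_constant(x)
mu = np.exp(-1.0 + 0.4 * x[:, 0] + 0.3 * x[:, 1])
counts = rng.negative_binomial(1.5, 1.5 / (1.5 + mu))  # over-dispersed counts

# Modified Poisson: Poisson GLM with robust (sandwich) standard errors
poisson_fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit(cov_type="HC0")
# Negative binomial GLM (dispersion alpha fixed here, for illustration)
nb_fit = sm.GLM(counts, X, family=sm.families.NegativeBinomial(alpha=1.0)).fit()

print(poisson_fit.params, nb_fit.params)
```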
Martínez-Ferrer, María Teresa; Ripollés, José Luís; Garcia-Marí, Ferran
2006-06-01
The spatial distribution of the citrus mealybug, Planococcus citri (Risso) (Homoptera: Pseudococcidae), was studied in citrus groves in northeastern Spain. Constant precision sampling plans were designed for all developmental stages of citrus mealybug under the fruit calyx, for late stages on fruit, and for females on trunks and main branches; more than 66, 286, and 101 data sets, respectively, were collected from nine commercial fields during 1992-1998. Dispersion parameters were determined using Taylor's power law, indicating aggregated spatial patterns for citrus mealybug populations in the three locations of the tree sampled. A significant relationship between the number of insects per organ and the percentage of occupied organs was established using either Wilson and Room's binomial model or Kono and Sugino's empirical formula. Constant precision (E = 0.25) sampling plans (i.e., enumerative plans) for estimating mean densities were developed using Green's equation and the two binomial models. For making management decisions, enumerative counts may be less labor-intensive than binomial sampling. Therefore, we recommend enumerative sampling plans for use in an integrated pest management program in citrus. Required sample sizes for the range of population densities near current management thresholds, in the three plant locations (calyx, fruit, and trunk), were 50, 110-330, and 30, respectively. Binomial sampling, especially the empirical model, required a larger sample size to achieve equivalent levels of precision.
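Constant-precision sample sizes of this kind follow from Taylor's power law (variance = a·mean^b) via Green's equation; the sketch below uses hypothetical coefficients a and b, not the values estimated from the citrus data.

```python
import numpy as np

def required_sample_size(mean_density, a, b, precision=0.25):
    # Green's equation: n = variance / (precision * mean)^2
    #                     = a * mean^(b - 2) / precision^2
    return a * mean_density ** (b - 2.0) / precision ** 2

for m in [0.1, 0.5, 1.0, 5.0]:
    n = required_sample_size(m, a=2.0, b=1.4)
    print(f"mean {m:4.1f} insects/organ -> about {np.ceil(n):.0f} organs")
```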
Bennett, Bradley C; Husby, Chad E
2008-03-28
Botanical pharmacopoeias are non-random subsets of floras, with some taxonomic groups over- or under-represented. Moerman [Moerman, D.E., 1979. Symbols and selectivity: a statistical analysis of Native American medical ethnobotany, Journal of Ethnopharmacology 1, 111-119] introduced linear regression/residual analysis to examine these patterns. However, regression, the commonly employed analysis, suffers from several statistical flaws. We use contingency table and binomial analyses to examine patterns of Shuar medicinal plant use (from Amazonian Ecuador). We first analyzed the Shuar data using Moerman's approach, modified to better meet the requirements of linear regression analysis. Second, we assessed the exact randomization contingency table test for goodness of fit. Third, we developed a binomial model to test for non-random selection of plants in individual families. Modified regression models (which accommodated assumptions of linear regression) reduced R(2) from 0.59 to 0.38, but did not eliminate all problems associated with regression analyses. Contingency table analyses revealed that the entire flora departs from the null model of equal proportions of medicinal plants in all families. In the binomial analysis, only 10 angiosperm families (of 115) differed significantly from the null model. These 10 families are largely responsible for patterns seen at higher taxonomic levels. Contingency table and binomial analyses offer an easy and statistically valid alternative to the regression approach.
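The per-family binomial test has a simple form: under the null model, each of the flora's medicinal species falls into family i with probability equal to that family's share of the flora. A toy version with made-up counts (not the Shuar data) might look like this:

```python
from scipy.stats import binomtest

flora_total = 3000       # species in the whole flora (hypothetical)
medicinal_total = 600    # medicinal species overall (hypothetical)

families = {"Asteraceae": (250, 80), "Poaceae": (200, 15)}  # (species, medicinal)
for name, (n_species, n_medicinal) in families.items():
    expected_p = n_species / flora_total
    result = binomtest(n_medicinal, n=medicinal_total, p=expected_p)
    print(name, f"p = {result.pvalue:.4f}")   # small p: over- or under-utilized
```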
Estimating the Parameters of the Beta-Binomial Distribution.
ERIC Educational Resources Information Center
Wilcox, Rand R.
1979-01-01
For some situations the beta-binomial distribution might be used to describe the marginal distribution of test scores for a particular population of examinees. Several different methods of approximating the maximum likelihood estimate were investigated, and it was found that the Newton-Raphson method should be used when it yields admissible…
Multilevel Models for Binary Data
ERIC Educational Resources Information Center
Powers, Daniel A.
2012-01-01
The methods and models for categorical data analysis cover considerable ground, ranging from regression-type models for binary and binomial data, count data, to ordered and unordered polytomous variables, as well as regression models that mix qualitative and continuous data. This article focuses on methods for binary or binomial data, which are…
Rørth, Rasmus; Fosbøl, Emil L; Mogensen, Ulrik M; Kragholm, Kristian; Numé, Anna-Karin; Gislason, Gunnar H; Jhund, Pardeep S; Petrie, Mark C; McMurray, John J V; Torp-Pedersen, Christian; Køber, Lars; Kristensen, Søren L
2018-02-01
Employment status at time of first heart failure (HF) hospitalization may be an indicator of both self-perceived and objective health status. In this study, we examined the association between employment status and the risk of all-cause mortality and recurrent HF hospitalization in a nationwide cohort of patients with HF. We identified all patients of working age (18-60 years) with a first HF hospitalization in the period 1997-2015 in Denmark, categorized according to whether or not they were part of the workforce at time of the index admission. The primary outcome was death from any cause and the secondary outcome was readmission for HF. Cumulative incidence curves, binomial regression and Cox regression models were used to assess outcomes. Of 25 571 patients with a first hospitalization for HF, 15 428 (60%) were part of the workforce at baseline. Patients in the workforce were significantly younger (53 vs. 55 years), more likely to be male (75% vs. 64%), and less likely to have diabetes (13% vs. 22%) and chronic obstructive pulmonary disease (5% vs. 10%) (all P < 0.0001). Not being part of the workforce was associated with a significantly higher risk of death [hazard ratio (HR) 1.59; 95% confidence interval (CI) 1.50-1.68] and rehospitalization for HF (HR 1.09; 95% CI 1.05-1.14), in analyses adjusted for age, sex, co-morbidities, education level, calendar time, and duration of first HF hospitalization. Not being part of the workforce at time of first HF hospitalization was independently associated with increased mortality and recurrent HF hospitalization. © 2017 The Authors. European Journal of Heart Failure © 2017 European Society of Cardiology.
A statistical model to estimate the impact of a hepatitis A vaccination programme.
Oviedo, Manuel; Pilar Muñoz, M; Domínguez, Angela; Borras, Eva; Carmona, Gloria
2008-11-11
A program of routine hepatitis A+B vaccination in preadolescents was introduced in 1998 in Catalonia, a region situated in the northeast of Spain. The objective of this study was to quantify the reduction in the incidence of hepatitis A in order to differentiate the natural reduction in incidence from that produced by the vaccination programme, and to predict the evolution of the disease in forthcoming years. A generalized linear model (GLM) using negative binomial regression was used to estimate the incidence rates of hepatitis A in Catalonia by year, age group and vaccination. Introduction of the vaccine reduced cases by 5.5 per year (p-value<0.001), but there was a significant interaction between the year of report and vaccination that smoothed this reduction (p-value<0.001). The reduction was not equal in all age groups, being greater in the 12-18 years age group, which fell from a mean rate of 8.15 per 100,000 person-years in the pre-vaccination period (1992-1998) to 1.4 in the vaccination period (1999-2005). The model predicts the evolution accurately for the group of vaccinated subjects. Negative binomial regression is more appropriate than Poisson regression when the observed variance exceeds the observed mean (overdispersed count data); ignoring this overdispersion can make a variable appear to contribute more to the model than it really does.
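A negative binomial GLM of this general shape models case counts with person-years as an exposure offset, so the coefficients act on rates rather than raw counts. The sketch below uses simulated data and invented coefficients, not the Catalonia dataset or the published model specification.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
years = np.arange(1992, 2006, dtype=float)
vaccinated = (years >= 1999).astype(float)       # programme starts in 1999
person_years = np.full(years.shape, 1e5)
mu = person_years * np.exp(-9.4 - 0.05 * (years - 1992) - 0.8 * vaccinated)
cases = rng.poisson(mu)                          # toy incidence counts

X = sm.add_constant(np.column_stack([years - 1992, vaccinated]))
fit = sm.GLM(cases, X, family=sm.families.NegativeBinomial(alpha=0.5),
             offset=np.log(person_years)).fit()
print(np.exp(fit.params))                        # rate ratios
```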
Possibility and Challenges of Conversion of Current Virus Species Names to Linnaean Binomials
Postler, Thomas S.; Clawson, Anna N.; Amarasinghe, Gaya K.; Basler, Christopher F.; Bavari, Sina; Benkő, Mária; Blasdell, Kim R.; Briese, Thomas; Buchmeier, Michael J.; Bukreyev, Alexander; Calisher, Charles H.; Chandran, Kartik; Charrel, Rémi; Clegg, Christopher S.; Collins, Peter L.; De La Torre, Juan Carlos; Derisi, Joseph L.; Dietzgen, Ralf G.; Dolnik, Olga; Dürrwald, Ralf; Dye, John M.; Easton, Andrew J.; Emonet, Sébastian; Formenty, Pierre; Fouchier, Ron A. M.; Ghedin, Elodie; Gonzalez, Jean-Paul; Harrach, Balázs; Hewson, Roger; Horie, Masayuki; Jiāng, Dàohóng; Kobinger, Gary; Kondo, Hideki; Kropinski, Andrew M.; Krupovic, Mart; Kurath, Gael; Lamb, Robert A.; Leroy, Eric M.; Lukashevich, Igor S.; Maisner, Andrea; Mushegian, Arcady R.; Netesov, Sergey V.; Nowotny, Norbert; Patterson, Jean L.; Payne, Susan L.; Paweska, Janusz T.; Peters, Clarence J.; Radoshitzky, Sheli R.; Rima, Bertus K.; Romanowski, Victor; Rubbenstroth, Dennis; Sabanadzovic, Sead; Sanfaçon, Hélène; Salvato, Maria S.; Schwemmle, Martin; Smither, Sophie J.; Stenglein, Mark D.; Stone, David M.; Takada, Ayato; Tesh, Robert B.; Tomonaga, Keizo; Tordo, Noël; Towner, Jonathan S.; Vasilakis, Nikos; Volchkov, Viktor E.; Wahl-Jensen, Victoria; Walker, Peter J.; Wang, Lin-Fa; Varsani, Arvind; Whitfield, Anna E.; Zerbini, F. Murilo; Kuhn, Jens H.
2017-01-01
Botanical, mycological, zoological, and prokaryotic species names follow the Linnaean format, consisting of an italicized Latinized binomen with a capitalized genus name and a lower case species epithet (e.g., Homo sapiens). Virus species names, however, do not follow a uniform format, and, even when binomial, are not Linnaean in style. In this thought exercise, we attempted to convert all currently official names of species included in the virus family Arenaviridae and the virus order Mononegavirales to Linnaean binomials, and to identify and address associated challenges and concerns. Surprisingly, this endeavor was not as complicated or time-consuming as even the authors of this article expected when conceiving the experiment.
Emukule, Gideon O; Spreeuwenberg, Peter; Chaves, Sandra S; Mott, Joshua A; Tempia, Stefano; Bigogo, Godfrey; Nyawanda, Bryan; Nyaguara, Amek; Widdowson, Marc-Alain; van der Velden, Koos; Paget, John W
2017-01-01
Influenza and respiratory syncytial virus (RSV) associated mortality has not been well established in tropical Africa. We used the negative binomial regression method and the rate-difference method (i.e., deaths during low and high influenza/RSV activity months) to estimate excess mortality attributable to influenza and RSV, using verbal autopsy data collected through a health and demographic surveillance system in Western Kenya, 2007-2013. Excess mortality rates were calculated for a) all-cause mortality, b) respiratory deaths (including pneumonia), c) HIV-related deaths, and d) pulmonary tuberculosis (TB) related deaths. Using the negative binomial regression method, the mean annual all-cause excess mortality rate associated with influenza and RSV was 14.1 (95% confidence interval [CI] 0.0-93.3) and 17.1 (95% CI 0.0-111.5) per 100,000 person-years (PY), respectively; and 10.5 (95% CI 0.0-28.5) and 7.3 (95% CI 0.0-27.3) per 100,000 PY for respiratory deaths, respectively. The highest mortality rates associated with influenza were among persons aged ≥50 years, particularly those with TB (41.6 [95% CI 0.0-122.7]); the highest rates associated with RSV were among children aged <5 years. Using the rate-difference method, the excess mortality rate for influenza and RSV was 44.8 (95% CI 36.8-54.4) and 19.7 (95% CI 14.7-26.5) per 100,000 PY, respectively, for all-cause deaths; and 9.6 (95% CI 6.3-14.7) and 6.6 (95% CI 3.9-11.0) per 100,000 PY, respectively, for respiratory deaths. Our study shows a substantial excess mortality associated with influenza and RSV in Western Kenya, especially among children <5 years and older persons with TB, supporting recommendations for influenza vaccination and efforts to develop RSV vaccines.
Binomial Coefficients Modulo a Prime--A Visualization Approach to Undergraduate Research
ERIC Educational Resources Information Center
Bardzell, Michael; Poimenidou, Eirini
2011-01-01
In this article we present, as a case study, results of undergraduate research involving binomial coefficients modulo a prime "p." We will discuss how undergraduates were involved in the project, even with a minimal mathematical background beforehand. There are two main avenues of exploration described to discover these binomial…
Using the β-binomial distribution to characterize forest health
S.J. Zarnoch; R.L. Anderson; R.M. Sheffield
1995-01-01
The β-binomial distribution is suggested as a model for describing and analyzing the dichotomous data obtained from programs monitoring the health of forests in the United States. Maximum likelihood estimation of the parameters is given as well as asymptotic likelihood ratio tests. The procedure is illustrated with data on dogwood anthracnose infection (caused...
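Maximum likelihood fitting of the β-binomial reduces to a two-parameter optimization; the sketch below uses hypothetical plot counts (not the dogwood anthracnose data) and a log-parameterization to keep both shape parameters positive.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import betabinom

n_per_plot = 10
infected = np.array([0, 1, 0, 3, 7, 2, 0, 0, 5, 1])   # made-up plot counts

def neg_log_lik(params):
    a, b = np.exp(params)                 # alpha, beta > 0
    return -betabinom.logpmf(infected, n_per_plot, a, b).sum()

fit = minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
alpha, beta = np.exp(fit.x)
print(f"alpha = {alpha:.3f}, beta = {beta:.3f}")
```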
Integer Solutions of Binomial Coefficients
ERIC Educational Resources Information Center
Gilbertson, Nicholas J.
2016-01-01
A good formula is like a good story, rich in description, powerful in communication, and eye-opening to readers. The formula presented in this article for determining the coefficients of the binomial expansion of (x + y)^n is one such "good read." The beauty of this formula is in its simplicity--both describing a quantitative situation…
Confidence Intervals for Weighted Composite Scores under the Compound Binomial Error Model
ERIC Educational Resources Information Center
Kim, Kyung Yong; Lee, Won-Chan
2018-01-01
Reporting confidence intervals with test scores helps test users make important decisions about examinees by providing information about the precision of test scores. Although a variety of estimation procedures based on the binomial error model are available for computing intervals for test scores, these procedures assume that items are randomly…
Cai, Qing; Lee, Jaeyoung; Eluru, Naveen; Abdel-Aty, Mohamed
2016-08-01
This study attempts to explore the viability of dual-state models (i.e., zero-inflated and hurdle models) for traffic analysis zone (TAZ)-based pedestrian and bicycle crash frequency analysis. Additionally, spatial spillover effects are explored in the models by employing exogenous variables from neighboring zones. The dual-state models, namely the zero-inflated negative binomial and hurdle negative binomial models (with and without spatial effects), are compared with the conventional single-state model (i.e., negative binomial). The model comparison for pedestrian and bicycle crashes revealed that the models that considered observed spatial effects perform better than the models that did not. Across the models with spatial spillover effects, the dual-state models, especially the zero-inflated negative binomial model, offered better performance compared to the single-state models. Moreover, the model results clearly highlighted the importance of various traffic, roadway, and sociodemographic characteristics of the TAZ, as well as of neighboring TAZs, on pedestrian and bicycle crash frequency. Copyright © 2016 Elsevier Ltd. All rights reserved.
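The single-state versus dual-state comparison can be mimicked with the statsmodels count models; in the sketch below the data are simulated zero-inflated counts, and the lone covariate is a stand-in for the TAZ characteristics used in the paper.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

rng = np.random.default_rng(2)
n = 500
x = rng.normal(size=(n, 1))
X = sm.add_constant(x)
mu = np.exp(0.5 + 0.6 * x[:, 0])
counts = rng.poisson(mu) * (rng.uniform(size=n) > 0.3)   # excess zeros

nb = sm.NegativeBinomial(counts, X).fit(disp=False)
zinb = ZeroInflatedNegativeBinomialP(counts, X, exog_infl=X).fit(disp=False,
                                                                 maxiter=200)
print("NB AIC:", nb.aic, " ZINB AIC:", zinb.aic)   # lower AIC is preferred
```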
Jiang, Honghua; Ni, Xiao; Huster, William; Heilmann, Cory
2015-01-01
Hypoglycemia has long been recognized as a major barrier to achieving normoglycemia with intensive diabetic therapies. It is a common safety concern for diabetes patients. Therefore, it is important to apply appropriate statistical methods when analyzing hypoglycemia data. Here, we carried out bootstrap simulations to investigate the performance of four commonly used statistical models (Poisson, negative binomial, analysis of covariance [ANCOVA], and rank ANCOVA) based on data from a diabetes clinical trial. The zero-inflated Poisson (ZIP) model and the zero-inflated negative binomial (ZINB) model were also evaluated. Simulation results showed that the Poisson model inflated type I error, while the negative binomial model was overly conservative. However, after adjusting for dispersion, both Poisson and negative binomial models yielded slightly inflated type I errors, close to the nominal level, and reasonable power. Reasonable control of type I error was associated with the ANCOVA model. The rank ANCOVA model was associated with the greatest power and with reasonable control of type I error. Inflated type I error was observed with the ZIP and ZINB models.
Pricing American Asian options with higher moments in the underlying distribution
NASA Astrophysics Data System (ADS)
Lo, Keng-Hsin; Wang, Kehluh; Hsu, Ming-Feng
2009-01-01
We develop a modified Edgeworth binomial model with higher moment consideration for pricing American Asian options. With a lognormal underlying distribution for benchmark comparison, our algorithm is as precise as that of Chalasani et al. [P. Chalasani, S. Jha, F. Egriboyun, A. Varikooty, A refined binomial lattice for pricing American Asian options, Rev. Derivatives Res. 3 (1) (1999) 85-105] as the number of time steps increases. If the underlying distribution displays negative skewness and leptokurtosis, as often observed for stock index returns, our estimates can work better than those in Chalasani et al. and are very similar to the benchmarks in Hull and White [J. Hull, A. White, Efficient procedures for valuing European and American path-dependent options, J. Derivatives 1 (Fall) (1993) 21-31]. The numerical analysis shows that our modified Edgeworth binomial model can value American Asian options with greater accuracy and speed given higher moments in their underlying distribution.
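The lattice machinery such models refine is the standard binomial backward induction; for orientation, the sketch below prices a plain American put on a Cox-Ross-Rubinstein lognormal tree, with neither the Edgeworth moment adjustment nor the Asian path-dependence handled by the refined lattices discussed above. All parameters are illustrative.

```python
import numpy as np

def american_put_crr(S0, K, r, sigma, T, steps):
    dt = T / steps
    u = np.exp(sigma * np.sqrt(dt))
    d = 1.0 / u
    p = (np.exp(r * dt) - d) / (u - d)     # risk-neutral up probability
    disc = np.exp(-r * dt)
    prices = S0 * u ** np.arange(steps, -1, -1) * d ** np.arange(0, steps + 1)
    values = np.maximum(K - prices, 0.0)   # payoff at maturity
    for i in range(steps, 0, -1):
        prices = prices[:i] * d            # node prices one level back
        cont = disc * (p * values[:i] + (1 - p) * values[1:i + 1])
        values = np.maximum(K - prices, cont)   # early-exercise check
    return values[0]

print(american_put_crr(S0=100, K=100, r=0.05, sigma=0.2, T=1.0, steps=500))
```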
Discrimination of numerical proportions: A comparison of binomial and Gaussian models.
Raidvee, Aire; Lember, Jüri; Allik, Jüri
2017-01-01
Observers discriminated the numerical proportion of two sets of elements (N = 9, 13, 33, and 65) that differed either by color or orientation. According to the standard Thurstonian approach, the accuracy of proportion discrimination is determined by irreducible noise in the nervous system that stochastically transforms the number of presented visual elements onto a continuum of psychological states representing numerosity. As an alternative to this customary approach, we propose a Thurstonian-binomial model, which assumes discrete perceptual states, each of which is associated with a certain visual element. It is shown that the probability β with which each visual element can be noticed and registered by the perceptual system can explain numerical proportion discrimination data at least as well as the continuous Thurstonian-Gaussian model, and better, if the greater parsimony of the Thurstonian-binomial model is taken into account using AIC model selection. We conclude that the Gaussian and binomial models represent two different fundamental principles, internal noise vs. the use of only a fraction of the available information, both of which are plausible descriptions of visual perception.
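The Thurstonian-binomial account invites a direct simulation: if each element registers independently with probability β and the observer compares registered counts, discrimination accuracy follows. The sketch below adds one simplifying assumption not taken from the paper, that ties are resolved by guessing.

```python
import numpy as np

def discrimination_accuracy(n_red, n_green, beta, trials=100_000, seed=3):
    rng = np.random.default_rng(seed)
    red = rng.binomial(n_red, beta, size=trials)      # registered red elements
    green = rng.binomial(n_green, beta, size=trials)  # registered green ones
    correct = (red > green) if n_red > n_green else (green > red)
    ties = red == green
    return correct.mean() + 0.5 * ties.mean()         # guess on ties

print(discrimination_accuracy(n_red=35, n_green=30, beta=0.7))
```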
Van der Heyden, H; Dutilleul, P; Brodeur, L; Carisse, O
2014-06-01
Spatial distribution of single-nucleotide polymorphisms (SNPs) related to fungicide resistance was studied for Botrytis cinerea populations in vineyards and for B. squamosa populations in onion fields. Heterogeneity in this distribution was characterized by performing geostatistical analyses based on semivariograms and through the fitting of discrete probability distributions. Two SNPs known to be responsible for boscalid resistance (H272R and H272Y), both located on the B subunit of the succinate dehydrogenase gene, and one SNP known to be responsible for dicarboximide resistance (I365S) were chosen for B. cinerea in grape. For B. squamosa in onion, one SNP responsible for dicarboximide resistance (I365S homologous) was chosen. One onion field was sampled in 2009 and another one was sampled in 2010 for B. squamosa, and two vineyards were sampled in 2011 for B. cinerea, for a total of four sampled sites. Cluster sampling was carried out on a 10-by-10 grid, each of the 100 nodes being the center of a 10-by-10-m quadrat. In each quadrat, 10 samples were collected and analyzed by restriction fragment length polymorphism polymerase chain reaction (PCR) or allele-specific PCR. Mean SNP incidence varied from 16 to 68%, with an overall mean incidence of 43%. In the geostatistical analyses, omnidirectional variograms showed spatial autocorrelation characterized by ranges of 21 to 1 m. Various levels of anisotropy were detected, however, with variograms computed in four directions (at 0°, 45°, 90°, and 135° from the within-row direction used as reference), indicating that spatial autocorrelation was prevalent or characterized by a longer range in one direction. For all eight data sets, the β-binomial distribution was found to fit the data better than the binomial distribution. This indicates local aggregation of fungicide resistance among sampling units, as supported by estimates of the parameter θ of the β-binomial distribution of 0.09 to 0.23 (overall median value = 0.20). On the basis of the observed spatial distribution patterns of SNP incidence, sampling curves were computed for different levels of reliability, emphasizing the importance of sample size for the detection of mutation incidence below the risk threshold for control failure.
Zero adjusted models with applications to analysing helminths count data.
Chipeta, Michael G; Ngwira, Bagrey M; Simoonga, Christopher; Kazembe, Lawrence N
2014-11-27
It is common in public health and epidemiology that the outcome of interest is a count of event occurrences. Analysing such data using classical linear models is mostly inappropriate, even after transformation of the outcome variable, due to overdispersion. Zero-adjusted mixture count models such as zero-inflated and hurdle count models are applied to count data when over-dispersion and excess zeros exist. The main objective of the current paper is to apply such models to analyse risk factors associated with human helminths (S. haematobium), particularly in a case where there is a high proportion of zero counts. The data were collected during a community-based randomised control trial assessing the impact of mass drug administration (MDA) with praziquantel in Malawi, and a school-based cross-sectional epidemiology survey in Zambia. Count data models including traditional (Poisson and negative binomial) models, zero-modified models (zero-inflated Poisson and zero-inflated negative binomial) and hurdle models (Poisson logit hurdle and negative binomial logit hurdle) were fitted and compared. Using the Akaike information criterion (AIC), the negative binomial logit hurdle (NBLH) and zero-inflated negative binomial (ZINB) models showed the best performance in both datasets. With regard to capturing zero counts, these models performed better than the others. This paper showed that the zero-modified NBLH and ZINB models are more appropriate methods for the analysis of data with excess zeros. The choice between the hurdle and zero-inflated models should be based on the aim and endpoints of the study.
Coleman, Marlize; Coleman, Michael; Mabuza, Aaron M; Kok, Gerdalize; Coetzee, Maureen; Durrheim, David N
2008-04-27
To evaluate the performance of a novel malaria outbreak identification system in the epidemic-prone rural area of Mpumalanga Province, South Africa, for timely identification of malaria outbreaks and guiding integrated public health responses. Using five years of historical notification data, two binomial thresholds were determined for each primary health care facility in the highest malaria risk area of Mpumalanga province. Whenever the thresholds were exceeded at health facility level (tier 1), primary health care staff notified the malaria control programme, which then confirmed adequate stocks of malaria treatment to manage potential increased cases. The cases were followed up at household level to verify the likely source of infection. The binomial thresholds were reviewed at village/town level (tier 2) to determine whether additional response measures were required. In addition, an automated electronic outbreak identification system at town/village level (tier 2) was integrated into the case notification database (tier 3) to ensure that unexpected increases in case notification were not missed. The performance of these binomial outbreak thresholds was evaluated against other currently recommended thresholds using retrospective data. The acceptability of the system at primary health care level was evaluated through structured interviews with health facility staff. Eighty-four percent of health facilities reported outbreaks within 24 hours (n = 95), 92% (n = 104) within 48 hours and 100% (n = 113) within 72 hours. Appropriate responses to all malaria outbreaks (n = 113, tier 1; n = 46, tier 2) were achieved within 24 hours. The system was positively viewed by all health facility staff. When compared to other epidemiological systems for a specified 12-month outbreak season (June 2003 to July 2004), the binomial exact thresholds produced one false weekly outbreak, the C-sum 12 false weekly outbreaks, and the mean + 2 SD nine false weekly outbreaks. Exceeding the binomial level 1 threshold triggered an alert four weeks prior to an outbreak, but exceeding the binomial level 2 threshold identified an outbreak as it occurred. The malaria outbreak surveillance system using binomial thresholds achieved its primary goal of identifying outbreaks early, facilitating appropriate local public health responses aimed at averting a possible large-scale epidemic in a low, and unstable, malaria transmission setting.
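A binomial exact threshold of the kind described can be sketched in a few lines: estimate a facility's baseline weekly risk from its notification history and flag any week whose count clears the upper tail. The baseline construction and alert level below are illustrative assumptions, not the programme's calibrated values.

```python
from scipy.stats import binom

def weekly_threshold(historical_weekly_cases, catchment_population, alpha=0.01):
    # Baseline weekly risk per person, from the same week in past seasons
    p = (sum(historical_weekly_cases)
         / (len(historical_weekly_cases) * catchment_population))
    # Smallest count whose upper-tail probability falls below alpha
    return int(binom.ppf(1 - alpha, catchment_population, p)) + 1

history = [3, 5, 2, 4, 6]          # same calendar week, five past seasons
print(weekly_threshold(history, catchment_population=10_000))
```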
Salas-Wright, Christopher P; Vaughn, Michael G; Goings, Trenette Clark
2017-10-01
To examine the prevalence of self-reported criminal and violent behavior, substance use disorders, and mental disorders among Mexican immigrants vis-à-vis the US born. Study findings are based on national data collected between 2012 and 2013. Binomial logistic regression was employed to examine the relationship between immigrant status and behavioral/psychiatric outcomes. Mexican immigrants report substantially lower levels of criminal and violent behaviors, substance use disorders, and mental disorders compared to US-born individuals. While some immigrants from Mexico have serious behavioral and psychiatric problems, Mexican immigrants in general experience such problems at far lower rates than US-born individuals.
Palau, Patricia; Domínguez, Eloy; Núñez, Eduardo; Ramón, José María; López, Laura; Melero, Joana; Sanchis, Juan; Bellver, Alejandro; Santas, Enrique; Bayes-Genis, Antoni; Chorro, Francisco J; Núñez, Julio
2018-04-01
Heart failure with preserved ejection fraction (HFpEF) is a highly prevalent syndrome with an elevated risk of morbidity and mortality. To date, there is scarce evidence on the role of peak exercise oxygen uptake (peak VO2) for predicting the morbidity burden in HFpEF. We sought to evaluate the association between peak VO2 and the risk of recurrent hospitalizations in patients with HFpEF. A total of 74 stable symptomatic patients with HFpEF underwent a cardiopulmonary exercise test between June 2012 and May 2016. A negative binomial regression method was used to determine the association between the percentage of predicted peak VO2 (pp-peak VO2) and recurrent hospitalizations. Risk estimates are reported as incidence rate ratios (IRRs). The mean age was 72.5 ± 9.1 years, 53% were women, and all patients were in New York Heart Association functional class II to III. Mean peak VO2 and median pp-peak VO2 were 10 ± 2.8 mL/min/kg and 60% (range, 47-67), respectively. During a median follow-up of 276 days [interquartile range, 153-1231], 84 all-cause hospitalizations in 31 patients (41.9%) were registered. A total of 15 (20.3%) deaths were also recorded. On multivariate analysis, accounting for mortality as a terminal event, pp-peak VO2 was independently and linearly associated with the risk of recurrent admission. Thus, modeled as continuous, a 10% decrease in pp-peak VO2 increased the risk of recurrent hospitalizations by 32% (IRR, 1.32; 95% CI, 1.03-1.68; P = .028). In symptomatic elderly patients with HFpEF, pp-peak VO2 predicts all-cause recurrent admission. Copyright © 2017 Sociedad Española de Cardiología. Published by Elsevier España, S.L.U. All rights reserved.
Smisc - A collection of miscellaneous functions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Landon Sego, PNNL
2015-08-31
A collection of functions for statistical computing and data manipulation. These include routines for rapidly aggregating heterogeneous matrices, manipulating file names, loading R objects, sourcing multiple R files, formatting datetimes, multi-core parallel computing, stream editing, specialized plotting, etc. Selected contents:
Smisc-package: A collection of miscellaneous functions
allMissing: Identifies missing rows or columns in a data frame or matrix
as.numericSilent: Silent wrapper for coercing a vector to numeric
comboList: Produces all possible combinations of a set of linear model predictors
cumMax: Computes the maximum of the vector up to the current index
cumsumNA: Computes the cumulative sum of a vector without propagating NAs
d2binom: Probability functions for the sum of two independent binomials
dataIn: A flexible way to import data into R
dbb: The Beta-Binomial Distribution
df2list: Row-wise conversion of a data frame to a list
dfplapply: Parallelized single row processing of a data frame
dframeEquiv: Examines the equivalence of two dataframes or matrices
dkbinom: Probability functions for the sum of k independent binomials
factor2character: Converts all factor variables in a dataframe to character variables
findDepMat: Identify linearly dependent rows or columns in a matrix
formatDT: Converts date or datetime strings into alternate formats
getExtension, getPath, grabLast: Filename manipulations: remove the extension or path, extract the extension or path
ifelse1: Non-vectorized version of ifelse
integ: Simple numerical integration routine
interactionPlot: Two-way interaction plot with error bars
linearMap: Linear mapping of a numerical vector or scalar
list2df: Convert a list to a data frame
loadObject: Loads and returns the object(s) in an ".Rdata" file
more: Display the contents of a file to the R terminal
movAvg2: Calculate the moving average using a 2-sided window
openDevice: Opens a graphics device based on the filename extension
p2binom: Probability functions for the sum of two independent binomials
padZero: Pad a vector of numbers with zeros
parseJob: Parses a collection of elements into (almost) equal sized groups
pbb: The Beta-Binomial Distribution
pcbinom: A continuous version of the binomial cdf
pkbinom: Probability functions for the sum of k independent binomials
plapply: Simple parallelization of lapply
plotFun: Plot one or more functions on a single plot
PowerData: An example of power data
pvar: Prints the name and value of one or more objects
qbb: The Beta-Binomial Distribution
rbb: The Beta-Binomial Distribution; and numerous others (space limits reporting).
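For illustration, the convolution behind dkbinom/pkbinom, the exact distribution of a sum of k independent binomials with different success probabilities, can be written compactly (in Python here for consistency with the other sketches; Smisc itself is an R package and this is not its code):

```python
import numpy as np
from scipy.stats import binom

def sum_of_binomials_pmf(params):
    # params: iterable of (n, p) pairs; returns the pmf over 0..sum(n)
    pmf = np.array([1.0])
    for n, p in params:
        pmf = np.convolve(pmf, binom.pmf(np.arange(n + 1), n, p))
    return pmf

pmf = sum_of_binomials_pmf([(10, 0.2), (5, 0.6)])
print(pmf.sum(), pmf.argmax())   # ~1.0, and the mode of the sum
```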
An analytical framework for estimating aquatic species density from environmental DNA
Chambert, Thierry; Pilliod, David S.; Goldberg, Caren S.; Doi, Hideyuki; Takahara, Teruhiko
2018-01-01
Environmental DNA (eDNA) analysis of water samples is on the brink of becoming a standard monitoring method for aquatic species. This method has improved detection rates over conventional survey methods and thus has demonstrated effectiveness for estimation of site occupancy and species distribution. The frontier of eDNA applications, however, is to infer species density. Building upon previous studies, we present and assess a modeling approach that aims at inferring animal density from eDNA. The modeling combines eDNA and animal count data from a subset of sites to estimate species density (and associated uncertainties) at other sites where only eDNA data are available. As a proof of concept, we first perform a cross-validation study using experimental data on carp in mesocosms. In these data, fish densities are known without error, which allows us to test the performance of the method with known data. We then evaluate the model using field data from a study on a stream salamander species to assess the potential of this method to work in natural settings, where density can never be known with absolute certainty. Two alternative distributions (Normal and Negative Binomial) to model variability in eDNA concentration data are assessed. Assessment based on the proof of concept data (carp) revealed that the Negative Binomial model provided much more accurate estimates than the model based on a Normal distribution, likely because eDNA data tend to be overdispersed. Greater imprecision was found when we applied the method to the field data, but the Negative Binomial model still provided useful density estimates. We call for further model development in this direction, as well as further research targeted at sampling design optimization. It will be important to assess these approaches on a broad range of study systems.
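The calibration-then-inversion idea can be caricatured in a few lines. The sketch below assumes a log-linear mean and simulated counts, and it inverts the fitted curve by treating an observed count as its expected value; the authors' hierarchical model and its uncertainty propagation are considerably richer.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
density = rng.uniform(1, 20, size=40)              # known animal densities
mu = 3.0 * density                                 # expected eDNA copy count
counts = rng.negative_binomial(5, 5 / (5 + mu))    # overdispersed counts

X = sm.add_constant(np.log(density))
fit = sm.GLM(counts, X, family=sm.families.NegativeBinomial(alpha=0.2)).fit()
b0, b1 = fit.params

new_count = 120.0                                  # eDNA-only site
print("estimated density:", np.exp((np.log(new_count) - b0) / b1))
```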
I Remember You: Independence and the Binomial Model
ERIC Educational Resources Information Center
Levine, Douglas W.; Rockhill, Beverly
2006-01-01
We focus on the problem of ignoring statistical independence. A binomial experiment is used to determine whether judges could match, based on looks alone, dogs to their owners. The experimental design introduces dependencies such that the probability of a given judge correctly matching a dog and an owner changes from trial to trial. We show how…
Sundaram, Aparna; Vaughan, Barbara; Kost, Kathryn; Bankole, Akinrinola; Finer, Lawrence; Singh, Susheela; Trussell, James
2017-03-01
Contraceptive failure rates measure a woman's probability of becoming pregnant while using a contraceptive. Information about these rates enables couples to make informed contraceptive choices. Failure rates were last estimated for 2002, and social and economic changes that have occurred since then necessitate a reestimation. To estimate failure rates for the most commonly used reversible methods in the United States, data from the 2006-2010 National Survey of Family Growth were used; some 15,728 contraceptive use intervals, contributed by 6,683 women, were analyzed. Data from the Guttmacher Institute's 2008 Abortion Patient Survey were used to adjust for abortion underreporting. Kaplan-Meier methods were used to estimate the associated single-decrement probability of failure by duration of use. Failure rates were compared with those from 1995 and 2002. Long-acting reversible contraceptives (the IUD and the implant) had the lowest failure rates of all methods (1%), while condoms and withdrawal carried the highest probabilities of failure (13% and 20%, respectively). However, the failure rate for the condom had declined significantly since 1995 (from 18%), as had the failure rate for all hormonal methods combined (from 8% to 6%). The failure rate for all reversible methods combined declined from 12% in 2002 to 10% in 2006-2010. These broad-based declines in failure rates reverse a long-term pattern of minimal change. Future research should explore what lies behind these trends, as well as possibilities for further improvements. © 2017 The Authors. Perspectives on Sexual and Reproductive Health published by Wiley Periodicals, Inc., on behalf of the Guttmacher Institute.
Failure rates of mini-implants placed in the infrazygomatic region.
Uribe, Flavio; Mehr, Rana; Mathur, Ajay; Janakiraman, Nandakumar; Allareddy, Veerasathpurush
2015-01-01
The purpose of this pilot study was to evaluate the failure rates of mini-implants placed in the infrazygomatic region and to evaluate factors that affect their stability. A retrospective cohort study of 30 consecutive patients (55 mini-implants) who had infrazygomatic mini-implants placed at a university clinic was conducted to evaluate failure rates. Patient, mini-implant, orthodontic, surgical, and mini-implant maintenance factors were evaluated by univariate logistic regression models for association with failure rates. A 21.8% failure rate of mini-implants placed in the infrazygomatic region was observed. None of the predictor variables were significantly associated with higher or lower odds of implant failure. Failure rates for infrazygomatic mini-implants were slightly higher than those reported for other maxilla-mandibular osseous locations. No predictor variables were found to be associated with the failure rates.
Aquilanti, Vincenzo; Coutinho, Nayara Dantas; Carvalho-Silva, Valter Henrique
2017-04-28
This article surveys the empirical information which originated both by laboratory experiments and by computational simulations, and expands previous understanding of the rates of chemical processes in the low-temperature range, where deviations from linearity of Arrhenius plots were revealed. The phenomenological two-parameter Arrhenius equation requires improvement for applications where interpolation or extrapolations are demanded in various areas of modern science. Based on Tolman's theorem, the dependence of the reciprocal of the apparent activation energy as a function of reciprocal absolute temperature permits the introduction of a deviation parameter d covering uniformly a variety of rate processes, from those where quantum mechanical tunnelling is significant and d < 0, to those where d > 0, corresponding to the Pareto-Tsallis statistical weights: these generalize the Boltzmann-Gibbs weight, which is recovered for d = 0. It is shown here how the weights arise, relaxing the thermodynamic equilibrium limit, either for a binomial distribution if d > 0 or for a negative binomial distribution if d < 0, formally corresponding to Fermion-like or Boson-like statistics, respectively. The current status of the phenomenology is illustrated emphasizing case studies; specifically (i) the super-Arrhenius kinetics, where transport phenomena accelerate processes as the temperature increases; (ii) the sub-Arrhenius kinetics, where quantum mechanical tunnelling propitiates low-temperature reactivity; (iii) the anti-Arrhenius kinetics, where processes with no energetic obstacles are rate-limited by molecular reorientation requirements. Particular attention is given for case (i) to the treatment of diffusion and viscosity, for case (ii) to formulation of a transition rate theory for chemical kinetics including quantum mechanical tunnelling, and for case (iii) to the stereodirectional specificity of the dynamics of reactions strongly hindered by the increase of temperature. This article is part of the themed issue 'Theoretical and computational studies of non-equilibrium and non-statistical dynamics in the gas phase, in the condensed phase and at interfaces'. © 2017 The Author(s).
Raw and Central Moments of Binomial Random Variables via Stirling Numbers
ERIC Educational Resources Information Center
Griffiths, Martin
2013-01-01
We consider here the problem of calculating the moments of binomial random variables. It is shown how formulae for both the raw and the central moments of such random variables may be obtained in a recursive manner utilizing Stirling numbers of the first kind. Suggestions are also provided as to how students might be encouraged to explore this…
Justin S. Crotteau; Martin W. Ritchie; J. Morgan Varner
2014-01-01
Many western USA fire regimes are typified by mixed-severity fire, which compounds the variability inherent to natural regeneration densities in associated forests. Tree regeneration data are often discrete and nonnegative; accordingly, we fit a series of Poisson and negative binomial variation models to conifer seedling counts across four distinct burn severities and...
Yelland, Lisa N; Salter, Amy B; Ryan, Philip
2011-10-15
Modified Poisson regression, which combines a log Poisson regression model with robust variance estimation, is a useful alternative to log binomial regression for estimating relative risks. Previous studies have shown both analytically and by simulation that modified Poisson regression is appropriate for independent prospective data. This method is often applied to clustered prospective data, despite a lack of evidence to support its use in this setting. The purpose of this article is to evaluate the performance of the modified Poisson regression approach for estimating relative risks from clustered prospective data, by using generalized estimating equations to account for clustering. A simulation study is conducted to compare log binomial regression and modified Poisson regression for analyzing clustered data from intervention and observational studies. Both methods generally perform well in terms of bias, type I error, and coverage. Unlike log binomial regression, modified Poisson regression is not prone to convergence problems. The methods are contrasted by using example data sets from 2 large studies. The results presented in this article support the use of modified Poisson regression as an alternative to log binomial regression for analyzing clustered prospective data when clustering is taken into account by using generalized estimating equations.
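In code, the estimator the article evaluates amounts to a Poisson GEE with an exchangeable working correlation and robust variance; the sketch below uses simulated clusters and an invented effect size.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n_clusters, size = 100, 10
treat = np.repeat(rng.integers(0, 2, n_clusters), size)
u = np.repeat(rng.normal(0.0, 0.2, n_clusters), size)     # cluster effect
risk = np.clip(0.2 * np.exp(0.4 * treat + u), 0.0, 1.0)
df = pd.DataFrame({"y": rng.binomial(1, risk),
                   "treat": treat,
                   "cluster": np.repeat(np.arange(n_clusters), size)})

fit = smf.gee("y ~ treat", groups="cluster", data=df,
              family=sm.families.Poisson(),
              cov_struct=sm.cov_struct.Exchangeable()).fit()
print(np.exp(fit.params["treat"]))   # estimated relative risk
```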
The effects of heart rate control in chronic heart failure with reduced ejection fraction.
Grande, Dario; Iacoviello, Massimo; Aspromonte, Nadia
2018-07-01
Elevated heart rate has been associated with worse prognosis both in the general population and in patients with heart failure. Heart rate is finely modulated by neurohormonal signals and it reflects the balance between the sympathetic and the parasympathetic limbs of the autonomic nervous system. For this reason, elevated heart rate in heart failure has been considered an epiphenomenon of the sympathetic hyperactivation during heart failure. However, experimental and clinical evidence suggests that high heart rate could have a direct pathogenetic role. Consequently, heart rate might act as a pathophysiological mediator of heart failure as well as a marker of adverse outcome. This hypothesis has been supported by the observation that the positive effect of beta-blockade could be linked to the degree of heart rate reduction. In addition, the selective heart rate control with ivabradine has recently been demonstrated to be beneficial in patients with heart failure and left ventricular systolic dysfunction. The objective of this review is to examine the pathophysiological implications of elevated heart rate in chronic heart failure and explore the mechanisms underlying the effects of pharmacological heart rate control.
Savannah River Site generic data base development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blanton, C.H.; Eide, S.A.
This report describes the results of a project to improve the generic component failure data base for the Savannah River Site (SRS). A representative list of components and failure modes for SRS risk models was generated by reviewing existing safety analyses and component failure data bases and from suggestions from SRS safety analysts. Then sources of data or failure rate estimates were identified and reviewed for applicability. A major source of information was the Nuclear Computerized Library for Assessing Reactor Reliability, or NUCLARR. This source includes an extensive collection of failure data and failure rate estimates for commercial nuclear power plants. A recent Idaho National Engineering Laboratory report on failure data from the Idaho Chemical Processing Plant was also reviewed. From these and other recent sources, failure data and failure rate estimates were collected for the components and failure modes of interest. This information was aggregated to obtain a recommended generic failure rate distribution (mean and error factor) for each component failure mode.
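Failure rate distributions of this kind are commonly summarized as a lognormal mean and error factor. Below is a minimal sketch of that convention, assuming the usual PRA definition of the error factor as the ratio of the 95th percentile to the median; the report's exact aggregation procedure is not reproduced here.

```python
# A minimal sketch of the usual PRA lognormal convention (an assumption):
# error factor EF = 95th percentile / median = exp(1.645 * sigma),
# mean = median * exp(sigma**2 / 2).
import math

def lognormal_summary(median, error_factor):
    sigma = math.log(error_factor) / 1.645
    mean = median * math.exp(sigma**2 / 2)
    return mean, sigma

mean, sigma = lognormal_summary(median=1e-5, error_factor=10)
print(f"mean failure rate = {mean:.2e}/h, sigma = {sigma:.2f}")
```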
Tellier, Stéphanie; Dallocchio, Aymeric; Guigonis, Vincent; Saint-Marcoux, Frank; Llanas, Brigitte; Ichay, Lydia; Bandin, Flavio; Godron, Astrid; Morin, Denis; Brochard, Karine; Gandia, Peggy; Bouchet, Stéphane; Marquet, Pierre; Decramer, Stéphane; Harambat, Jérôme
2016-10-07
Therapeutic drug monitoring of mycophenolic acid can improve clinical outcome in organ transplantation and lupus, but data are scarce in idiopathic nephrotic syndrome. The aim of our study was to investigate whether mycophenolic acid pharmacokinetics are associated with disease control in children receiving mycophenolate mofetil for the treatment of steroid-dependent nephrotic syndrome. This was a retrospective multicenter study including 95 children with steroid-dependent nephrotic syndrome treated with mycophenolate mofetil with or without steroids. Area under the concentration-time curve of mycophenolic acid was determined in all children on the basis of sampling times at 20, 60, and 180 minutes postdose, using Bayesian estimation. The association between a threshold value of the area under the concentration-time curve of mycophenolic acid and the relapse rate was assessed using a negative binomial model. In total, 140 areas under the concentration-time curve of mycophenolic acid were analyzed. The findings indicate individual dose adaptation in 53 patients (38%) to achieve an area under the concentration-time curve target of 30-60 mg·h/L. In a multivariable negative binomial model including sex, age at disease onset, time to start of mycophenolate mofetil, previous immunomodulatory treatment, and concomitant prednisone dose, a level of area under the concentration-time curve of mycophenolic acid >45 mg·h/L was significantly associated with a lower relapse rate (rate ratio, 0.65; 95% confidence interval, 0.46 to 0.89; P =0.01). Therapeutic drug monitoring leading to individualized dosing may improve the efficacy of mycophenolate mofetil in steroid-dependent nephrotic syndrome. Additional prospective studies are warranted to determine the optimal target for area under the concentration-time curve of mycophenolic acid in this population. Copyright © 2016 by the American Society of Nephrology.
[Association between cesarean birth and the risk of obesity in 6-17 year-olds].
Wang, Z H; Xu, R B; Dong, Y H; Yang, Y D; Wang, S; Wang, X J; Yang, Z G; Zou, Z Y; Ma, J
2017-12-10
Objective: To explore the association between cesarean section and obesity in children and adolescents. Methods: A total of 42 758 primary and middle school students aged 6-17 years were selected using stratified cluster sampling in 93 primary and middle schools in Hunan, Ningxia, Tianjin, Chongqing, Liaoning, Shanghai and Guangdong provinces and autonomous regions. A log-binomial regression model was used to analyze the association between cesarean section and obesity in childhood and adolescence. Results: The mean age of the subjects was (10.5±3.2) years. The overall rate of cesarean birth among the subjects was 42.3% (55.9% in boys and 40.6% in girls), a statistically significant difference (P<0.001). The rate of obesity among those delivered by cesarean section (17.6%) was significantly higher than among those delivered vaginally (10.2%) (P<0.001). The log-binomial regression model showed that cesarean section significantly increased the risk of obesity in children and adolescents (OR=1.72, 95% CI: 1.63-1.82; P<0.001). After adjusting for factors such as sex, residential area (urban or rural), feeding pattern, frequency of milk feeding, consumption of high-energy and fried foods, parental education, family income, parental obesity, physical activity level, gestational age and birth weight, the association remained statistically significant (OR=1.48, 95% CI: 1.39-1.57; P<0.001). Conclusion: The rate of cesarean delivery among pregnant women in China is high, and cesarean birth may significantly increase the risk of obesity in children and adolescents.
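A minimal sketch of the log-binomial model named above, assuming statsmodels and simulated data (not the survey's): a binomial GLM with a log link, whose exponentiated coefficients are risk (prevalence) ratios rather than odds ratios.

```python
# A minimal log-binomial sketch, assuming statsmodels: binomial GLM with a
# log link; exp(coefficient) estimates the risk ratio for the exposure.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
cesarean = rng.integers(0, 2, n)
p = 0.10 * np.exp(0.4 * cesarean)        # true risk ratio exp(0.4) ~ 1.49
obese = rng.binomial(1, p)

X = sm.add_constant(cesarean.astype(float))
fit = sm.GLM(obese, X,
             family=sm.families.Binomial(link=sm.families.links.Log())).fit()
print(np.exp(fit.params[1]))             # estimated risk ratio
```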
Use of the negative binomial-truncated Poisson distribution in thunderstorm prediction
NASA Technical Reports Server (NTRS)
Cohen, A. C.
1971-01-01
A probability model is presented for the distribution of thunderstorms over a small area given that thunderstorm events (1 or more thunderstorms) are occurring over a larger area. The model incorporates the negative binomial and truncated Poisson distributions. Probability tables for Cape Kennedy for spring, summer, and fall months and seasons are presented. The computer program used to compute these probabilities is appended.
Bianca N.I. Eskelson; Hailemariam Temesgen; Tara M. Barrett
2009-01-01
Cavity tree and snag abundance data are highly variable and contain many zero observations. We predict cavity tree and snag abundance from variables that are readily available from forest cover maps or remotely sensed data using negative binomial (NB), zero-inflated NB, and zero-altered NB (ZANB) regression models as well as nearest neighbor (NN) imputation methods....
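A minimal sketch of one of the count models listed above, assuming statsmodels and simulated data (variable names are illustrative, not the inventory's): a zero-inflated negative binomial fit to counts with structural zeros.

```python
# A minimal zero-inflated negative binomial (ZINB) sketch, assuming
# statsmodels; simulated counts with ~30% structural zeros.
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

rng = np.random.default_rng(42)
n = 2000
elev = rng.normal(0, 1, n)                       # illustrative covariate
mu = np.exp(0.5 + 0.3 * elev)                    # NB mean
counts = rng.negative_binomial(n=2, p=2 / (2 + mu))  # NB2 with dispersion 2
zeros = rng.binomial(1, 0.3, n)                  # structural zeros
counts = np.where(zeros == 1, 0, counts)

X = sm.add_constant(elev)
zinb = ZeroInflatedNegativeBinomialP(counts, X,
                                     exog_infl=np.ones((n, 1)), p=2)
print(zinb.fit(maxiter=200).summary())
```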
ERIC Educational Resources Information Center
Abrahamson, Dor
2009-01-01
This article reports on a case study from a design-based research project that investigated how students make sense of the disciplinary tools they are taught to use, and specifically, what personal, interpersonal, and material resources support this process. The probability topic of binomial distribution was selected due to robust documentation of…
Analysis of overdispersed count data by mixtures of Poisson variables and Poisson processes.
Hougaard, P; Lee, M L; Whitmore, G A
1997-12-01
Count data often show overdispersion compared to the Poisson distribution. Overdispersion is typically modeled by a random effect for the mean, based on the gamma distribution, leading to the negative binomial distribution for the count. This paper considers a larger family of mixture distributions, including the inverse Gaussian mixture distribution. It is demonstrated that it gives a significantly better fit for a data set on the frequency of epileptic seizures. The same approach can be used to generate counting processes from Poisson processes, where the rate or the time is random. A random rate corresponds to variation between patients, whereas a random time corresponds to variation within patients.
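The gamma-mixture construction in this abstract can be checked in a few lines. Here is a minimal simulation (illustrative parameters, not the epilepsy data) showing that a Poisson count with a gamma-distributed random mean is overdispersed exactly as the negative binomial predicts.

```python
# A minimal simulation of the mixture idea: a Poisson count whose mean is
# gamma-distributed across subjects is marginally negative binomial, with
# variance exceeding the mean (overdispersion).
import numpy as np

rng = np.random.default_rng(7)
n, shape, scale = 100_000, 2.0, 1.5
lam = rng.gamma(shape, scale, n)     # subject-specific random rates
counts = rng.poisson(lam)            # gamma-mixed Poisson draws

print(counts.mean(), counts.var())   # ~3.0 vs ~7.5: var = mu + mu**2/shape
```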
Relaxed Poisson cure rate models.
Rodrigues, Josemar; Cordeiro, Gauss M; Cancho, Vicente G; Balakrishnan, N
2016-03-01
The purpose of this article is to make the standard promotion cure rate model (Yakovlev and Tsodikov) more flexible by assuming that the number of lesions or altered cells after a treatment follows a fractional Poisson distribution (Laskin). It is proved that the well-known Mittag-Leffler relaxation function (Berberan-Santos) is a simple way to obtain a new cure rate model that is a compromise between the promotion and geometric cure rate models, allowing for superdispersion. The relaxed cure rate model developed here can therefore be considered a natural and less restrictive extension of the popular Poisson cure rate model, at the cost of an additional parameter, and a competitor to negative-binomial cure rate models (Rodrigues et al.). Some mathematical properties of a proper relaxed Poisson density are explored. A simulation study and an illustration of the proposed cure rate model from the Bayesian point of view are finally presented. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Civic communities and urban violence.
Doucet, Jessica M; Lee, Matthew R
2015-07-01
Civic communities have a spirit of entrepreneurialism, a locally invested population and an institutional structure fostering civic engagement. Prior research, mainly confined to studying rural communities and fairly large geographic areas, has demonstrated that civic communities have lower rates of violence. The current study analyzes the associations between the components of civic communities and homicide rates for New Orleans neighborhoods (census tracts) in the years following Hurricane Katrina. Results from negative binomial regression models adjusting for spatial autocorrelation reveal that community homicide rates are lower where an entrepreneurial business climate is more pronounced and where there is more local investment. Additionally, an interaction between the availability of civic institutions and resource disadvantage reveals that the protective effects of civic institutions are only evident in disadvantaged communities. Copyright © 2015 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Williams, R. E.; Kruger, R.
1980-01-01
Estimation procedures are described for measuring component failure rates, for comparing the failure rates of two different groups of components, and for formulating confidence intervals for testing hypotheses (based on failure rates) that the two groups perform similarly or differently. Appendix A contains an example of an analysis in which these methods are applied to investigate the characteristics of two groups of spacecraft components. The estimation procedures are adaptable to system level testing and to monitoring failure characteristics in orbit.
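A minimal sketch of the kind of estimation the report describes, under the usual constant-failure-rate (homogeneous Poisson) assumption; this is not the report's exact procedure. It computes a chi-square confidence interval for a failure rate, plus the classical conditional binomial test for comparing the rates of two component groups.

```python
# A minimal sketch (an assumption, not the report's method): chi-square CI
# for a constant failure rate, and an exact two-group comparison using the
# fact that, conditional on the total count, group-1 failures are
# Binomial(n1 + n2, T1 / (T1 + T2)) under equal rates.
from scipy import stats

def rate_ci(n_fail, hours, conf=0.95):
    a = 1 - conf
    lo = stats.chi2.ppf(a / 2, 2 * n_fail) / (2 * hours) if n_fail else 0.0
    hi = stats.chi2.ppf(1 - a / 2, 2 * (n_fail + 1)) / (2 * hours)
    return n_fail / hours, (lo, hi)

rate, (lo, hi) = rate_ci(n_fail=8, hours=50_000)
print(f"{rate:.2e}/h  95% CI ({lo:.2e}, {hi:.2e})")

# Two-group comparison: 8 failures in 50,000 h vs 3 in 40,000 h.
p0 = 50_000 / (50_000 + 40_000)
print(stats.binomtest(8, 8 + 3, p0).pvalue)   # conditional exact test
```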
Galvan, T L; Burkness, E C; Hutchison, W D
2007-06-01
To develop a practical integrated pest management (IPM) system for the multicolored Asian lady beetle, Harmonia axyridis (Pallas) (Coleoptera: Coccinellidae), in wine grapes, we assessed the spatial distribution of H. axyridis and developed eight sampling plans to estimate adult density or infestation level in grape clusters. We used 49 data sets collected from commercial vineyards in 2004 and 2005, in Minnesota and Wisconsin. Enumerative plans were developed using two precision levels (0.10 and 0.25); the six binomial plans reflected six unique action thresholds (3, 7, 12, 18, 22, and 31% of cluster samples infested with at least one H. axyridis). The spatial distribution of H. axyridis in wine grapes was aggregated, independent of cultivar and year, but it was more randomly distributed as mean density declined. The average sample number (ASN) for each sampling plan was determined using resampling software. For research purposes, an enumerative plan with a precision level of 0.10 (SE/X) resulted in a mean ASN of 546 clusters. For IPM applications, the enumerative plan with a precision level of 0.25 resulted in a mean ASN of 180 clusters. In contrast, the binomial plans resulted in much lower ASNs and provided high probabilities of arriving at correct "treat or no-treat" decisions, making these plans more efficient for IPM applications. For a tally threshold of one adult per cluster, the operating characteristic curves for the six action thresholds provided binomial sequential sampling plans with mean ASNs of only 19-26 clusters, and probabilities of making correct decisions between 83 and 96%. The benefits of the binomial sampling plans are discussed within the context of improving IPM programs for wine grapes.
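A minimal sketch of the operating characteristic computation behind such plans, simplified to a fixed-sample binomial plan (the study's plans were sequential, and all numbers here are illustrative): the probability of a "treat" decision as a function of the true proportion of infested clusters.

```python
# A minimal OC-curve sketch for a fixed-sample binomial plan (a
# simplification of the sequential plans described in the abstract): with a
# tally threshold of one adult per cluster, decide "treat" when the number
# of infested clusters exceeds n * action_threshold.
from scipy.stats import binom

def prob_treat(p_infested, n_clusters, action_threshold):
    cutoff = int(n_clusters * action_threshold)
    return 1 - binom.cdf(cutoff, n_clusters, p_infested)

for p in (0.05, 0.12, 0.25, 0.40):
    print(p, round(prob_treat(p, n_clusters=25, action_threshold=0.12), 3))
```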
Zhang, Zhongheng; Hongying, Ni
2012-01-01
Regional citrate anticoagulation (RCA) is an attractive anticoagulation mode in continuous renal replacement therapy (CRRT) because it restricts the anticoagulatory effect to the extracorporeal circuit. In recent years, several randomized controlled trials have been conducted to investigate its superiority over other anticoagulation modes. We therefore performed a systematic review of the available evidence on the efficacy and safety of RCA. A systematic review of randomized controlled trials investigating the efficacy and safety of RCA was performed. PubMed, Current Contents, CINAHL, and EMBASE databases were searched to identify relevant articles. Data on circuit life span, bleeding events, metabolic derangement, and mortality were abstracted. Mean difference was used for continuous variables, and risk ratio was used for binomial variables. The random effects or fixed effect model was used to combine these data according to heterogeneity. The software Review Manager 5.1 was used for the meta-analysis. Six studies met our inclusion criteria, involving a total of 658 circuits. In these six studies, patients with liver failure or a high risk of bleeding were excluded. The circuit life span in the RCA group was significantly longer than that in the control group, with a mean difference of 23.03 h (95% CI 0.45-45.61 h). RCA was able to reduce the risk of bleeding, with a risk ratio of 0.28 (95% CI 0.15-0.50). Metabolic stability (electrolyte and acid-base stability) with RCA was comparable to that of other anticoagulation modes, and metabolic derangements (hypernatremia, metabolic alkalosis, and hypocalcemia) could be easily controlled without significant clinical consequences. Two studies compared mortality rates between the RCA and control groups, one reporting a similar mortality rate and the other reporting superiority of RCA over the control group (hazard ratio 0.7). RCA is effective in maintaining circuit patency and reducing the risk of bleeding, and can thus be recommended for CRRT if and when metabolic monitoring is adequate and the protocol is followed. However, the safety of citrate in patients with liver failure cannot be concluded from the current analysis. Metabolic stability can be easily controlled during RCA. Survival benefit from RCA is still controversial due to limited evidence.
A retrospective survey of the causes of bracket- and tube-bonding failures.
Roelofs, Tom; Merkens, Nico; Roelofs, Jeroen; Bronkhorst, Ewald; Breuning, Hero
2017-01-01
To investigate the causes of bonding failures of orthodontic brackets and tubes and the effect of premedication for saliva reduction. Premedication with atropine sulfate was administered randomly. The failure rate of brackets and tubes placed in a group of 158 consecutive patients was evaluated over a mean period of 67 weeks after bonding. The failure rate in the group without atropine sulfate premedication was 2.4%. In the group with premedication, the failure rate was 2.7%. The Cox regression analysis of these groups showed that atropine application did not lead to a reduction in bond failures. Statistically significant differences in the hazard ratio were found for the bracket regions and for the dental assistants who prepared for the bonding procedure. Premedication did not lead to fewer bracket failures. The roles of the dental assistant and patient in preventing failures were relevant. A significantly higher failure rate for orthodontic appliances was found in the posterior regions.
Siegel, Michael; Negussie, Yamrot; Vanture, Sarah; Pleskunas, Jane; Ross, Craig S; King, Charles
2014-10-01
We examined the relationship between gun ownership and stranger versus nonstranger homicide rates. Using data from the Supplemental Homicide Reports of the Federal Bureau of Investigation's Uniform Crime Reports for all 50 states for 1981 to 2010, we modeled stranger and nonstranger homicide rates as a function of state-level gun ownership, measured by a proxy, controlling for potential confounders. We used a negative binomial regression model with fixed effects for year, accounting for clustering of observations among states by using generalized estimating equations. We found no robust, statistically significant correlation between gun ownership and stranger firearm homicide rates. However, we found a positive and significant association between gun ownership and nonstranger firearm homicide rates. The incidence rate ratio for nonstranger firearm homicide rate associated with gun ownership was 1.014 (95% confidence interval=1.009, 1.019). Our findings challenge the argument that gun ownership deters violent crime, in particular, homicides.
Abuse and diversion of buprenorphine sublingual tablets and film.
Lavonas, Eric J; Severtson, S Geoffrey; Martinez, Erin M; Bucher-Bartelson, Becki; Le Lait, Marie-Claire; Green, Jody L; Murrelle, Lenn E; Cicero, Theodore J; Kurtz, Steven P; Rosenblum, Andrew; Surratt, Hilary L; Dart, Richard C
2014-07-01
Buprenorphine abuse is common worldwide. Rates of abuse and diversion of three sublingual buprenorphine formulations (single-ingredient tablets, naloxone combination tablets, and naloxone combination film) were compared. Data were obtained from the Researched Abuse, Diversion, and Addiction-Related Surveillance (RADARS) System Poison Center, Drug Diversion, Opioid Treatment (OTP), Survey of Key Informants' Patients (SKIP), and College Survey Programs through December 2012. To control for drug availability, event ratios (rates) were calculated quarterly, based on the number of patients filling prescriptions for each formulation ("unique recipients of a dispensed drug," URDD), and were averaged and compared using negative binomial regression. Abuse rates in the OTP, SKIP, and College Survey Programs were greatest for single-ingredient tablets, and abuse rates in the Poison Center Program and illicit diversion rates were greatest for the combination tablets. Combination film rates were significantly less than the rates for either tablet formulation in all programs. No geographic pattern could be discerned. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
Martín-González, F; González-Robledo, J; Sánchez-Hernández, F; Moreno-García, M N; Barreda-Mellado, I
2016-01-01
To assess the effectiveness and identify predictors of failure of noninvasive ventilation. A retrospective, longitudinal descriptive study was conducted. Adult patients with acute respiratory failure. A total of 410 consecutive patients treated with noninvasive ventilation in an Intensive Care Unit of a tertiary university hospital from 2006 to 2011. Noninvasive ventilation. Demographic variables and clinical and laboratory test parameters at the start and two hours after the start of noninvasive ventilation. Evolution during admission to the Unit and until hospital discharge. The failure rate was 50%, with an overall mortality rate of 33%. A total of 156 patients had hypoxemic respiratory failure, 87 postextubation respiratory failure, 78 exacerbation of chronic obstructive pulmonary disease, 61 hypercapnic respiratory failure without chronic obstructive pulmonary disease, and 28 had acute pulmonary edema. The failure rates were 74%, 54%, 27%, 31% and 21%, respectively. The etiology of respiratory failure, serum bilirubin at the start, APACHE II score, radiological findings, the need for sedation to tolerate noninvasive ventilation, changes in level of consciousness, and the PaO2/FIO2 ratio, respiratory rate and heart rate at the start and two hours after the start of noninvasive ventilation were independently associated with failure. The effectiveness of noninvasive ventilation varies according to the etiology of respiratory failure. Its use in hypoxemic respiratory failure and postextubation respiratory failure should be assessed individually. Predictors of failure could be useful to prevent delayed intubation. Copyright © 2015 Elsevier España, S.L.U. and SEMICYUC. All rights reserved.
Dyrda, Katia; Roy, Denis; Leduc, Hugues; Talajic, Mario; Stevenson, Lynne Warner; Guerra, Peter G; Andrade, Jason; Dubuc, Marc; Macle, Laurent; Thibault, Bernard; Rivard, Lena; Khairy, Paul
2015-12-01
Rate and rhythm control strategies for atrial fibrillation (AF) are not always effective or well tolerated in patients with congestive heart failure (CHF). We assessed reasons for treatment failure, associated characteristics, and effects on survival. A total of 1,376 patients enrolled in the AF-CHF trial were followed for 37 ± 19 months, 206 (15.0%) of whom failed initial therapy, leading to crossover. Rhythm control was abandoned more frequently than rate control (21.0% vs. 9.1%, P < 0.0001). Crossovers from rhythm to rate control were driven by inefficacy, whereas worsening heart failure was the most common reason to cross over from rate to rhythm control. In multivariate analyses, failure of rhythm control was associated with female sex, higher serum creatinine, functional class III or IV symptoms, lack of digoxin, and oral anticoagulation. Factors independently associated with failure of rate control were paroxysmal (vs. persistent) AF, statin therapy, and presence of an implantable cardioverter-defibrillator. Crossovers were not associated with cardiovascular mortality (hazard ratio [HR] 1.11 from rhythm to rate control, 95% confidence interval [CI] 0.73-1.73, P = 0.6069; HR 1.29 from rate to rhythm control, 95% CI 0.73-2.25, P = 0.3793) or all-cause mortality (HR 1.16 from rhythm to rate control, 95% CI 0.79-1.72, P = 0.4444; HR 1.15 from rate to rhythm control, 95% CI 0.69-1.91, P = 0.5873). Rhythm control is abandoned more frequently than rate control in patients with AF and CHF. The most common reasons for treatment failure are inefficacy for rhythm control and worsening heart failure for rate control. Changing strategies does not impact survival. © 2015 Wiley Periodicals, Inc.
Gillis, Jennifer; Bayoumi, Ahmed M; Burchell, Ann N; Cooper, Curtis; Klein, Marina B; Loutfy, Mona; Machouf, Nima; Montaner, Julio Sg; Tsoukas, Chris; Hogg, Robert S; Raboud, Janet
2015-10-26
As the average age of the HIV-positive population increases, there is increasing need to monitor patients for the development of comorbidities as well as for drug toxicities. We examined factors associated with the frequency of measurement of liver enzymes, renal function tests, and lipid levels among participants of the Canadian Observational Cohort (CANOC) collaboration which follows people who initiated HIV antiretroviral therapy in 2000 or later. We used zero-inflated negative binomial regression models to examine the associations of demographic and clinical characteristics with the rates of measurement during follow-up. Generalized estimating equations with a logit link were used to examine factors associated with gaps of 12 months or more between measurements. Electronic laboratory data were available for 3940 of 7718 CANOC participants. The median duration of electronic follow-up was 3.5 years. The median (interquartile) rates of tests per year were 2.76 (1.60, 3.73), 2.55 (1.44, 3.38) and 1.42 (0.50, 2.52) for liver, renal and lipid parameters, respectively. In multivariable zero-inflated negative binomial regression models, individuals infected through injection drug use (IDU) were significantly less likely to have any measurements. Among participants with at least one measurement, rates of measurement of liver, renal and lipid tests were significantly lower for younger individuals and Aboriginal Peoples. Hepatitis C co-infected individuals with a history of IDU had lower rates of measurement and were at greater risk of having 12 month gaps between measurements. Hepatitis C co-infected participants infected through IDU were at increased risk of gaps in testing, despite publicly funded health care and increased risk of comorbid conditions. This should be taken into consideration in analyses examining factors associated with outcomes based on laboratory parameters.
Guillaumin, Julien; Olp, Nichole M; Magnusson, Karissa D; Butler, Amy L; Daniels, Joshua B
2017-09-01
To assess the rate of bacterial contamination of fluid and ports in intravenous bags in a veterinary emergency room (ER) and intensive care unit (ICU). Experimental model. Ninety intravenous fluid bags of lactated balanced-electrolytes solution (1 L) hung in a university hospital. Bags were hung in 2 different locations in the ER (sink and bins) and one location in the ICU (sink) for 11 days. Bags were punctured 3 times daily with a sterile needle to simulate clinical use. Injection ports were swabbed and 50 mL of fluid were collected in duplicates on days 0, 2, 4, 7, and 10. Aerobic bacterial cultures were performed on the fluid and injection port. Contamination was defined as bacterial growth of a similar phenotype across 2 consecutive times. Increase in the fluid contamination rate from day 0 was tested using an exact binomial test. Port contamination rate between locations was tested using Fisher's exact test. Combined bacterial growth on injection ports reached a mean (95% confidence interval) of 8.1 (0.005-16.2) cfu/port on day 10. The combined port contamination was 3.3%, 11.1%, 17.8%, and 31.1% on days 0, 2, 4, and 7, respectively. Port contamination was similar between ER and ICU. However, port contamination was higher in the sink versus the bins area (38.3% vs 16.7%, P = 0.032). No fluid bag was contaminated at days 0 and 2. The contamination rate of fluid bag was 1.1% and 4.4% on days 4 and 7, respectively. All bags with contaminated fluid were in the ER (6.7%, 95% exact binomial confidence interval 1.9-16.2%). Injection port contamination reached 31.1% on day 7. Contamination was more likely when the bags were hung next to a sink. In our model of bag puncture, fluid contamination occurred between days 2 and 4. © Veterinary Emergency and Critical Care Society 2017.
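The exact binomial interval quoted above is easy to reproduce, assuming scipy; the counts below (4 of 60) are an illustrative choice consistent with the quoted 6.7% rate and approximate 1.9-16.2% interval, not the paper's stated denominators.

```python
# A minimal sketch of an exact (Clopper-Pearson) binomial CI via scipy;
# k=4, n=60 are hypothetical counts, not taken from the paper.
from scipy.stats import binomtest

res = binomtest(k=4, n=60)
ci = res.proportion_ci(confidence_level=0.95, method="exact")
print(round(res.statistic, 3), ci)   # 0.067, CI roughly (0.019, 0.162)
```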
Reliability analysis of C-130 turboprop engine components using artificial neural network
NASA Astrophysics Data System (ADS)
Qattan, Nizar A.
In this study, we predict the failure rate of the Lockheed C-130 engine turbine. More than thirty years of local operational field data were used for failure rate prediction and validation. The Weibull regression model and artificial neural network models (feed-forward back-propagation, radial basis function, and multilayer perceptron) are utilized in this study. For this purpose, the thesis is divided into five major parts. The first part deals with the Weibull regression model, used to predict the turbine's general failure rate and the rate of failures that require overhaul maintenance. The second part covers the artificial neural network (ANN) model utilizing the feed-forward back-propagation algorithm as a learning rule; MATLAB is used to build and design a code to simulate the given data, where the inputs to the neural network are the independent variables and the outputs are the general failure rate of the turbine and the failures which required overhaul maintenance. In the third part, we predict the general failure rate of the turbine and the failures which require overhaul maintenance using a radial basis neural network model in the MATLAB toolbox. In the fourth part, we compare the predictions of the feed-forward back-propagation model with those of the Weibull regression model and the radial basis neural network model. The results show that the failure rates predicted by the feed-forward back-propagation and radial basis neural network models are in close agreement with each other, and closer to the actual field data than the failure rate predicted by the Weibull model. Finally, we forecast the general failure rate of the Lockheed C-130 engine turbine, the failures which required overhaul maintenance, and six categorical failures using a multilayer perceptron (MLP) neural network model in the DTREG commercial software. The results also give an insight into the reliability of the engine turbine under actual operating conditions, which can be used by aircraft operators for assessing system and component failures and customizing the maintenance programs recommended by the manufacturer.
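A minimal sketch of the Weibull piece of such an analysis, assuming scipy and simulated failure times (not the C-130 field data): fit the shape and scale by maximum likelihood and evaluate the hazard (failure rate) function h(t) = (beta/eta) * (t/eta)**(beta - 1).

```python
# A minimal Weibull hazard sketch; the failure times are simulated, not the
# study's field data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
times = rng.weibull(1.8, 200) * 1000.0                 # simulated hours to failure

beta, loc, eta = stats.weibull_min.fit(times, floc=0)  # MLE, location fixed at 0

def hazard(t):
    return (beta / eta) * (t / eta) ** (beta - 1)

print(beta, eta, hazard(500.0))
```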
Contraceptive failure rates: new estimates from the 1995 National Survey of Family Growth.
Fu, H; Darroch, J E; Haas, T; Ranjit, N
1999-01-01
Unintended pregnancy remains a major public health concern in the United States. Information on pregnancy rates among contraceptive users is needed to guide medical professionals' recommendations and individuals' choices of contraceptive methods. Data were taken from the 1995 National Survey of Family Growth (NSFG) and the 1994-1995 Abortion Patient Survey (APS). Hazards models were used to estimate method-specific contraceptive failure rates during the first six months and during the first year of contraceptive use for all U.S. women. In addition, rates were corrected to take into account the underreporting of induced abortion in the NSFG. Corrected 12-month failure rates were also estimated for subgroups of women by age, union status, poverty level, race or ethnicity, and religion. When contraceptive methods are ranked by effectiveness over the first 12 months of use (corrected for abortion underreporting), the implant and injectables have the lowest failure rates (2-3%), followed by the pill (8%), the diaphragm and the cervical cap (12%), the male condom (14%), periodic abstinence (21%), withdrawal (24%) and spermicides (26%). In general, failure rates are highest among cohabiting and other unmarried women, among those with an annual family income below 200% of the federal poverty level, among black and Hispanic women, among adolescents and among women in their 20s. For example, adolescent women who are not married but are cohabiting experience a failure rate of about 31% in the first year of contraceptive use, while the 12-month failure rate among married women aged 30 and older is only 7%. Black women have a contraceptive failure rate of about 19%, and this rate does not vary by family income; in contrast, overall 12-month rates are lower among Hispanic women (15%) and white women (10%), but vary by income, with poorer women having substantially greater failure rates than more affluent women. Levels of contraceptive failure vary widely by method, as well as by personal and background characteristics. Income's strong influence on contraceptive failure suggests that access barriers and the general disadvantage associated with poverty seriously impede effective contraceptive practice in the United States.
Lipscomb, Hester J; Schoenfisch, Ashley; Cameron, Wilfrid
2013-07-01
We evaluated work-related injuries involving a hand or fingers and associated costs among a cohort of 24,830 carpenters between 1989 and 2008. Injury rates and rate ratios were calculated by using Poisson regression to explore higher risk on the basis of age, sex, time in the union, predominant work, and calendar time. Negative binomial regression was used to model dollars paid per claim after adjustment for inflation and discounting. Hand injuries accounted for 21.1% of reported injuries and 9.5% of paid lost time injuries. Older carpenters had proportionately more amputations, fractures, and multiple injuries, but their rates of these more severe injuries were not higher. Costs exceeded $21 million, a cost burden of $0.11 per hour worked. Older carpenters' higher proportion of serious injuries in the absence of higher rates likely reflects age-related reporting differences.
Comparison of Sprint Fidelis and Riata defibrillator lead failure rates.
Fazal, Iftikhar A; Shepherd, Ewen J; Tynan, Margaret; Plummer, Christopher J; McComb, Janet M
2013-09-30
Sprint Fidelis and Riata defibrillator leads are prone to early failure. Few data exist on the comparative failure rates and mortality related to lead failure. The aims of this study were to determine the failure rate of Sprint Fidelis and Riata leads, and to compare failure rates and mortality rates in both groups. Patients implanted with Sprint Fidelis leads and Riata leads at a single centre were identified and in July 2012, records were reviewed to ascertain lead failures, deaths, and relationship to device/lead problems. 113 patients had Sprint Fidelis leads implanted between June 2005 and September 2007; Riata leads were implanted in 106 patients between January 2003 and February 2008. During 53.0 ± 22.3 months of follow-up there were 13 Sprint Fidelis lead failures (11.5%, 2.60% per year) and 25 deaths. Mean time to failure was 45.1 ± 15.5 months. In the Riata lead cohort there were 32 deaths, and 13 lead failures (11.3%, 2.71% per year) over 54.8 ± 26.3 months follow-up with a mean time to failure of 53.5 ± 24.5 months. There were no significant differences in the lead failure-free Kaplan-Meier survival curve (p=0.77), deaths overall (p=0.17), or deaths categorised as sudden/cause unknown (p=0.54). Sprint Fidelis and Riata leads have a significant but comparable failure rate at 2.60% per year and 2.71% per year of follow-up respectively. The number of deaths in both groups is similar and no deaths have been identified as being related to lead failure in either cohort. Copyright © 2012. Published by Elsevier Ireland Ltd.
Use of the binomial distribution to predict impairment: application in a nonclinical sample.
Axelrod, Bradley N; Wall, Jacqueline R; Estes, Bradley W
2008-01-01
A mathematical model based on binomial theory was developed to illustrate when abnormal score variations occur by chance in a multitest battery (Ingraham & Aiken, 1996). It has been successfully used as a comparison for obtained test scores in clinical samples, but not in nonclinical samples. In the current study, this model was applied to demographically corrected scores on the Halstead-Reitan Neuropsychological Test Battery, obtained from a sample of 94 nonclinical college students. Results found that 15% of the sample had impairments suggested by the Halstead Impairment Index, using criteria established by Reitan and Wolfson (1993). In addition, one-half of the sample obtained impaired scores on one or two tests. These results were compared to those predicted by the binomial model and found to be consistent. The model therefore serves as a useful resource for clinicians considering the probability of impaired test performance.
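A minimal sketch of the binomial logic described above, reduced to its core identity with illustrative numbers: the chance that an unimpaired examinee produces k or more scores in the "impaired" range across m independent tests, each with base rate p.

```python
# A minimal sketch of the binomial model's core computation (illustrative
# numbers, assuming independence across tests as the model does).
from scipy.stats import binom

def p_at_least(k, m, p):
    return 1 - binom.cdf(k - 1, m, p)

# e.g., 10 tests, 10% impairment base rate per test:
for k in (1, 2, 3):
    print(k, round(p_at_least(k, m=10, p=0.10), 3))  # 0.651, 0.264, 0.070
```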
Poisson and negative binomial item count techniques for surveys with sensitive question.
Tian, Guo-Liang; Tang, Man-Lai; Wu, Qin; Liu, Yin
2017-04-01
Although the item count technique is useful in surveys with sensitive questions, the privacy of those respondents who possess the sensitive characteristic of interest may not be well protected due to a defect in its original design. In this article, we propose two new survey designs (namely, the Poisson item count technique and the negative binomial item count technique) which replace the several independent Bernoulli random variables required by the original item count technique with a single Poisson or negative binomial random variable, respectively. The proposed models not only provide a closed-form variance estimate and a confidence interval within [0, 1] for the sensitive proportion, but also simplify the survey design of the original item count technique. Most importantly, the new designs do not leak respondents' privacy. Empirical results show that the proposed techniques perform satisfactorily in the sense that they yield accurate parameter estimates and confidence intervals.
Reliability Growth in Space Life Support Systems
NASA Technical Reports Server (NTRS)
Jones, Harry W.
2014-01-01
A hardware system's failure rate often increases over time due to wear and aging, but not always. Some systems instead show reliability growth, a decreasing failure rate with time, due to effective failure analysis and remedial hardware upgrades. Reliability grows when failure causes are removed by improved design. A mathematical reliability growth model allows the reliability growth rate to be computed from the failure data. The space shuttle was extensively maintained, refurbished, and upgraded after each flight and it experienced significant reliability growth during its operational life. In contrast, the International Space Station (ISS) is much more difficult to maintain and upgrade and its failure rate has been constant over time. The ISS Carbon Dioxide Removal Assembly (CDRA) reliability has slightly decreased. Failures on ISS and with the ISS CDRA continue to be a challenge.
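A minimal sketch of one common mathematical reliability growth model, the Crow-AMSAA power-law process; this is an assumption, since the abstract does not name its model. The shape parameter beta estimated from cumulative failure times indicates growth when beta < 1 (a decreasing failure intensity) and a roughly constant failure rate when beta is near 1.

```python
# A minimal Crow-AMSAA (power-law NHPP) sketch, assumed here as one standard
# reliability-growth model; failure times are illustrative.
import numpy as np

def crow_amsaa_mle(failure_times, total_time):
    t = np.asarray(failure_times, dtype=float)
    n = len(t)
    beta = n / np.sum(np.log(total_time / t))   # shape; < 1 means growth
    lam = n / total_time**beta                  # scale
    return beta, lam

times = [40, 110, 300, 700, 1400, 2500]         # illustrative failure times (h)
beta, lam = crow_amsaa_mle(times, total_time=3000)
rocof = lam * beta * 3000**(beta - 1)           # instantaneous failure intensity
print(beta, rocof)
```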
Chisti, Mohammod Jobayer; Salam, Mohammed Abdus; Bardhan, Pradip Kumar; Faruque, Abu S. G.; Shahid, Abu S. M. S. B.; Shahunja, K. M.; Das, Sumon Kumar; Hossain, Md Iqbal; Ahmed, Tahmeed
2015-01-01
Background: Appropriate intervention is critical in reducing deaths among under-five children with severe acute malnutrition (SAM) and danger signs of severe pneumonia; however, there is a paucity of data on the outcome of World Health Organisation (WHO) recommended interventions in SAM children with severe pneumonia. We sought to evaluate the outcome of these interventions in such children. Methods: We prospectively enrolled SAM children aged 0-59 months, admitted to the Intensive Care Unit (ICU) or Acute Respiratory Infection (ARI) ward of the Dhaka Hospital of the International Centre for Diarrhoeal Disease Research, Bangladesh (icddr,b), between April 2011 and June 2012 with cough or respiratory difficulty and radiological pneumonia. All the enrolled children were treated with ampicillin and gentamicin, and micronutrients, as recommended by the WHO. Comparison was made between pneumonic children with (n = 111) and without (n = 296) WHO-defined danger signs of severe pneumonia. The outcomes of interest were treatment failure (defined as a child requiring a change of antibiotics) and death during hospitalization. Further comparisons were made between those who did and did not develop treatment failure, and between survivors and non-survivors. Results: SAM children with danger signs of severe pneumonia more often experienced treatment failure (58% vs. 20%; p<0.001) and a fatal outcome (21% vs. 4%; p<0.001) compared to those without danger signs. Only 6/111 (5.4%) SAM children with danger signs of severe pneumonia and 12/296 (4.0%) without danger signs had bacterial isolates from blood. In a log-linear binomial regression analysis, after adjusting for potential confounders, danger signs of severe pneumonia, dehydration, hypocalcaemia, and bacteraemia were independently associated both with treatment failure and with death in SAM children presenting with cough or respiratory difficulty and radiological pneumonia (p<0.01). Conclusion and Significance: The results suggest that SAM children with cough or respiratory difficulty and radiological pneumonia who had WHO-defined danger signs of severe pneumonia more often had treatment failure and a fatal outcome compared to those without the danger signs. In addition to danger signs of severe pneumonia, other factors commonly associated with both treatment failure and death were dehydration, hypocalcaemia, and bacteraemia on admission. The results underscore the importance of further research, especially a randomized controlled clinical trial, to validate standard WHO therapy in SAM children with pneumonia, especially those with danger signs of severe pneumonia, in order to reduce treatment failures and deaths. PMID:26451603
Lin, Huan-Tang; Liu, Fu-Chao; Lin, Jr-Rung; Pang, See-Tong; Yu, Huang-Ping
2018-06-04
Most patients with uraemia must undergo chronic dialysis while awaiting kidney transplantation; however, the role of the pretransplant dialysis modality in the outcomes of kidney transplantation remains obscure. The objective of this study was to clarify the associations between the pretransplant dialysis modality, namely haemodialysis (HD) or peritoneal dialysis (PD), and the development of post-transplant de novo diseases, allograft failure and all-cause mortality for kidney-transplant recipients. Retrospective nationwide cohort study. Data were retrieved from the Taiwan National Health Insurance Research Database. The National Health Insurance database was explored for patients who received kidney transplantation in Taiwan during 1998-2011 and underwent dialysis >90 days before transplantation. The pretransplant characteristics, complications during kidney transplantation and post-transplant outcomes were statistically analysed and compared between the HD and PD groups. Cox regression analysis was used to evaluate the HR of the dialysis modality for graft failure and all-cause mortality. The primary outcomes were long-term post-transplant death-censored allograft failure and all-cause mortality, starting after 90 days of kidney transplantation until the end of follow-up. The secondary outcomes were events during kidney transplantation and post-transplant de novo diseases, adjusted by propensity score in a log-binomial model. There were 1812 patients included in our cohort, among whom 1209 (66.7%) and 603 (33.3%) recipients received pretransplant HD and PD, respectively. Recipients on chronic HD were generally older and male, and had higher risks of developing post-transplant de novo ischaemic heart disease, tuberculosis and hepatitis C after adjustment. Pretransplant HD contributed to higher graft failure in the multivariate analysis (HR 1.38, p<0.05) after adjustment for recipient age, sex, duration of dialysis and pretransplant diseases. There was no significant between-group difference in overall survival. Pretransplant HD contributed to higher risks of death-censored allograft failure after kidney transplantation when compared with PD. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Evaluation of possible prognostic factors for the success, survival, and failure of dental implants.
Geckili, Onur; Bilhan, Hakan; Geckili, Esma; Cilingir, Altug; Mumcu, Emre; Bural, Canan
2014-02-01
To analyze the prognostic factors that are associated with the success, survival, and failure rates of dental implants. Data including implant sizes, insertion time, implant location, and prosthetic treatment were collected for 1656 implants, and the association of these factors with the success, survival, and failure of implants was analyzed. The success rate was lower for short and maxillary implants. The failure rate of maxillary implants exceeded that of mandibular implants, and the failure rate of implants placed in the anterior maxillary region was significantly higher than in other regions. The failure rates of implants placed 5 or more years earlier were higher than those of implants placed more recently. The anterior maxilla is more critical for implant loss than other sites. Implants in the anterior mandible show better success compared with other locations, and longer implants show better success rates. The learning curve of the clinician influences the survival and success rates of dental implants.
Human evolution in the age of the intelligent machine
NASA Technical Reports Server (NTRS)
Mclaughlin, W. I.
1983-01-01
A systems analysis of the future evolution of man can be conducted by analyzing the biological material of the galaxy into three subsystems: man, intelligent machines, and intelligent extraterrestrial organisms. A binomial interpretation is applied to this system wherein each of the subsystems is assigned a designation of success or failure. For man the two alternatives are, respectively, 'decline' or 'flourish', for machine they are 'become intelligent' or 'stay dumb', while for extraterrestrial intelligence the dichotomy is that of 'existence' or 'nonexistence'. The choices for each of three subsystems yield a total of eight possible states for the system. The relative lack of integration between brain components makes man a weak evolutionary contestant compared to machines. It is judged that machines should become dominant on earth within 100 years, probably by means of continuing development of existing man-machine systems. Advanced forms of extraterrestrial intelligence may exist but are too difficult to observe. The prospects for communication with extraterrestrial intelligence are reviewed.
Automatic variance analysis of multistage care pathways.
Li, Xiang; Liu, Haifeng; Zhang, Shilei; Mei, Jing; Xie, Guotong; Yu, Yiqin; Li, Jing; Lakshmanan, Geetika T
2014-01-01
A care pathway (CP) is a standardized process that consists of multiple care stages, clinical activities and their relations, aimed at ensuring and enhancing the quality of care. However, actual care may deviate from the planned CP, and analysis of these deviations can help clinicians refine the CP and reduce medical errors. In this paper, we propose a CP variance analysis method to automatically identify the deviations between actual patient traces in electronic medical records (EMR) and a multistage CP. As the care stage information is usually unavailable in EMR, we first align every trace with the CP using a hidden Markov model. From the aligned traces, we report three types of deviations for every care stage: additional activities, absent activities and violated constraints, which are identified by using the techniques of temporal logic and binomial tests. The method has been applied to a CP for the management of congestive heart failure and real world EMR, providing meaningful evidence for the further improvement of care quality.
Failure rate and reliability of the KOMATSU hydraulic excavator in surface limestone mine
NASA Astrophysics Data System (ADS)
Harish Kumar N., S.; Choudhary, R. P.; Murthy, Ch. S. N.
2018-04-01
A model with a bathtub-shaped failure rate function is helpful in the reliability analysis of any system, and particularly in reliability-related preventive maintenance. The usual Weibull distribution is, however, not capable of modelling the complete lifecycle of a system with a bathtub-shaped failure rate function. In this paper, a failure rate and reliability analysis of the KOMATSU hydraulic excavator/shovel in a surface mine is presented, with the aim of improving the reliability and decreasing the failure rate of each subsystem of the shovel through preventive maintenance. The bathtub-shaped model for the shovel can also be seen as a simplification of the Weibull distribution.
Actuarial calculation for PSAK-24 purposes post-employment benefit using market-consistent approach
NASA Astrophysics Data System (ADS)
Effendie, Adhitya Ronnie
2015-12-01
In this paper we use a market-consistent approach to calculate the present value of the obligation of a company's post-employment benefit in accordance with PSAK-24 (the Indonesian accounting standard). We set actuarial assumptions such as the Indonesian TMI 2011 mortality tables for mortality, an accumulated salary function for wages, a disability assumption scaled to mortality, and a pre-defined turnover rate for termination. For the economic assumption, we use the binomial tree method with the estimated discount rate as its average movement. In accordance with PSAK-24, the Projected Unit Credit method has been adopted to determine the present value of the obligation (actuarial liability), so we use this method with a modification in its discount function.
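A minimal sketch of discounting a single future benefit payment on a recombining binomial short-rate tree, an assumption about the "binomial tree method" named above; up/down factors, probabilities, and the starting rate are illustrative, not the paper's calibration.

```python
# A minimal binomial-tree discounting sketch: value a cash flow at year N by
# backward induction over a recombining short-rate tree (illustrative inputs).
import numpy as np

def pv_binomial_tree(cash_flow, n_years, r0, u=1.2, d=1.0 / 1.2, q=0.5):
    # rates at step t: r0 * u**j * d**(t - j), j = number of up moves
    values = np.full(n_years + 1, cash_flow, dtype=float)
    for t in range(n_years - 1, -1, -1):
        j = np.arange(t + 1)
        rates = r0 * u**j * d**(t - j)
        values = (q * values[1:t + 2] + (1 - q) * values[:t + 1]) / (1 + rates)
    return values[0]

print(pv_binomial_tree(cash_flow=100.0, n_years=10, r0=0.07))
```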
Hopelessness as a Predictor of Suicide Ideation in Depressed Male and Female Adolescent Youth.
Wolfe, Kristin L; Nakonezny, Paul A; Owen, Victoria J; Rial, Katherine V; Moorehead, Alexandra P; Kennard, Beth D; Emslie, Graham J
2017-12-21
We examined hopelessness as a predictor of suicide ideation in depressed youth after acute medication treatment. A total of 158 depressed adolescents were administered the Children's Depression Rating Scale-Revised (CDRS-R) and Columbia Suicide Severity Rating Scale (C-SSRS) as part of a larger battery at baseline and at weekly visits across 6 weeks of acute fluoxetine treatment. The Beck Hopelessness Scale (BHS) was administered at baseline and week 6. A negative binomial regression model via a generalized estimating equation analysis of repeated measures was used to estimate suicide ideation over the 6 weeks of acute treatment from baseline measure of hopelessness. Depression severity and gender were included as covariates in the model. The negative binomial analysis was also conducted separately for the sample of males and females (in a gender-stratified analysis). Mean CDRS-R total scores were 60.30 ± 8.93 at baseline and 34.65 ± 10.41 at week 6. Mean baseline and week 6 BHS scores were 9.57 ± 5.51 and 5.59 ± 5.38, respectively. Per the C-SSRS, 43.04% and 83.54% reported having no suicide ideation at baseline and at week 6, respectively. The analyses revealed that baseline hopelessness was positively related to suicide ideation over treatment (p = .0027), independent of changes in depression severity. This significant finding persisted only for females (p = .0024). These results indicate the importance of early identification of hopelessness. © 2017 The American Association of Suicidology.
VLSI (Very Large Scale Integrated Circuits) Device Reliability Models.
1984-12-01
[Extraction residue from the original report: fragments of tables (5.1.2.5-19) listing C1 and C2 circuit-complexity failure rates for MOS SSI/MSI and linear devices, in failures per 10^6 hours, and a partial list of surveyed semiconductor manufacturers (National Semiconductor, Nitron, Raytheon, Sprague, Synertek, Teledyne Crystalonics, TRW Semiconductor, Zilog).]
Data mining of tree-based models to analyze freeway accident frequency.
Chang, Li-Yen; Chen, Wen-Chieh
2005-01-01
Statistical models, such as Poisson or negative binomial regression models, have been employed to analyze vehicle accident frequency for many years. However, these models have their own model assumptions and pre-defined underlying relationships between dependent and independent variables. If these assumptions are violated, the model could lead to erroneous estimation of accident likelihood. Classification and Regression Tree (CART), one of the most widely applied data mining techniques, has been commonly employed in business administration, industry, and engineering. CART does not require any pre-defined underlying relationship between the target (dependent) variable and predictors (independent variables) and has been shown to be a powerful tool, particularly for dealing with prediction and classification problems. This study collected the 2001-2002 accident data of National Freeway 1 in Taiwan. A CART model and a negative binomial regression model were developed to establish the empirical relationship between traffic accidents and highway geometric variables, traffic characteristics, and environmental factors. The CART findings indicated that the average daily traffic volume and precipitation variables were the key determinants of freeway accident frequency. By comparing the prediction performance of the CART and negative binomial regression models, this study demonstrates that CART is a good alternative method for analyzing freeway accident frequencies.
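A minimal sketch of the CART side of such a comparison, assuming scikit-learn and simulated data (not the National Freeway 1 records): a regression tree for accident counts grown with the Poisson splitting criterion, whose printed rules expose the key split variables.

```python
# A minimal CART sketch for count data, assuming scikit-learn; covariates and
# the accident-generating process are simulated for illustration.
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(11)
n = 3000
adt = rng.uniform(10, 120, n)                 # average daily traffic (1000s)
rain = rng.uniform(0, 30, n)                  # precipitation (mm)
mu = np.exp(-1.5 + 0.02 * adt + 0.03 * rain)  # true accident rate
y = rng.poisson(mu)

X = np.column_stack([adt, rain])
tree = DecisionTreeRegressor(criterion="poisson", max_depth=3,
                             min_samples_leaf=200).fit(X, y)
print(export_text(tree, feature_names=["adt", "rain"]))
```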
Nakagawa, Shinichi; Johnson, Paul C D; Schielzeth, Holger
2017-09-01
The coefficient of determination R² quantifies the proportion of variance explained by a statistical model and is an important summary statistic of biological interest. However, estimating R² for generalized linear mixed models (GLMMs) remains challenging. We have previously introduced a version of R² that we called R²GLMM for Poisson and binomial GLMMs, but not for other distributional families. Similarly, we earlier discussed how to estimate intra-class correlation coefficients (ICCs) using Poisson and binomial GLMMs. In this paper, we generalize our methods to all other non-Gaussian distributions, in particular to negative binomial and gamma distributions that are commonly used for modelling biological data. While expanding our approach, we highlight two useful concepts for biologists, Jensen's inequality and the delta method, both of which help us in understanding the properties of GLMMs. Jensen's inequality has important implications for biologically meaningful interpretation of GLMMs, whereas the delta method allows a general derivation of the variance associated with non-Gaussian distributions. We also discuss some special considerations for binomial GLMMs with binary or proportion data. We illustrate the implementation of our extension with worked examples from the field of ecology and evolution in the R environment. However, our method can be used across disciplines and regardless of statistical environment. © 2017 The Author(s).
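As a pointer to the delta method mentioned above, its first-order form can be stated in one line (a standard statement, not quoted from the paper):

```latex
% First-order delta method: for a smooth transformation g of a random
% variable X with mean \mu and variance \sigma^2,
\operatorname{Var}\bigl(g(X)\bigr) \approx \bigl(g'(\mu)\bigr)^{2}\,\sigma^{2}
% e.g. on the log scale, g(x) = \ln x gives
% \operatorname{Var}(\ln X) \approx \sigma^{2}/\mu^{2},
% the kind of observation-level variance needed when extending R^2 beyond
% Poisson and binomial families.
```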
Markov and semi-Markov processes as a failure rate
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabski, Franciszek
2016-06-08
In this paper the reliability function is defined by the stochastic failure rate process with a non negative and right continuous trajectories. Equations for the conditional reliability functions of an object, under assumption that the failure rate is a semi-Markov process with an at most countable state space are derived. A proper theorem is presented. The linear systems of equations for the appropriate Laplace transforms allow to find the reliability functions for the alternating, the Poisson and the Furry-Yule failure rate processes.
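As a sketch of the quantity being derived, the conditional reliability function can be approximated by Monte Carlo as R(t) = E[exp(-Integral of lambda(s) ds from 0 to t)]; the two-state alternating process and all parameter values below are assumptions chosen for illustration, not the paper's worked cases.

```python
import numpy as np

rng = np.random.default_rng(2)
lam = (0.2, 1.0)             # failure rates in states 0 and 1
mean_sojourn = (3.0, 1.5)    # mean time spent in each state

def integrated_rate(t_end):
    """Accumulate the integral of lambda(s) over one alternating trajectory."""
    t, state, acc = 0.0, 0, 0.0
    while t < t_end:
        stay = rng.exponential(mean_sojourn[state])
        dt = min(stay, t_end - t)
        acc += lam[state] * dt
        t += dt
        state = 1 - state    # alternate between the two states
    return acc

T = 5.0
R = np.mean([np.exp(-integrated_rate(T)) for _ in range(20_000)])
print(f"R({T}) ~ {R:.3f}")
```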
Negative Binomial Process Count and Mixture Modeling.
Zhou, Mingyuan; Carin, Lawrence
2015-02-01
The seemingly disjoint problems of count and mixture modeling are united under the negative binomial (NB) process. A gamma process is employed to model the rate measure of a Poisson process, whose normalization provides a random probability measure for mixture modeling and whose marginalization leads to an NB process for count modeling. A draw from the NB process consists of a Poisson distributed finite number of distinct atoms, each of which is associated with a logarithmic distributed number of data samples. We reveal relationships between various count- and mixture-modeling distributions and construct a Poisson-logarithmic bivariate distribution that connects the NB and Chinese restaurant table distributions. Fundamental properties of the models are developed, and we derive efficient Bayesian inference. It is shown that with augmentation and normalization, the NB process and gamma-NB process can be reduced to the Dirichlet process and hierarchical Dirichlet process, respectively. These relationships highlight theoretical, structural, and computational advantages of the NB process. A variety of NB processes, including the beta-geometric, beta-NB, marked-beta-NB, marked-gamma-NB and zero-inflated-NB processes, with distinct sharing mechanisms, are also constructed. These models are applied to topic modeling, with connections made to existing algorithms under Poisson factor analysis. Example results show the importance of inferring both the NB dispersion and probability parameters.
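The gamma-Poisson construction at the heart of the NB process can be sanity-checked in a few lines: mixing Poisson counts over gamma-distributed rates reproduces the NB marginal. The shape and scale values below are arbitrary.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
k, theta = 2.0, 1.5                       # gamma shape and scale
rates = rng.gamma(k, theta, 200_000)
counts = rng.poisson(rates)               # gamma-mixed Poisson draws

p = 1.0 / (1.0 + theta)                   # matching NB(k, p) marginal
direct = stats.nbinom.rvs(k, p, size=200_000, random_state=4)
print(counts.mean(), direct.mean())       # both ~ k*theta
print(counts.var(), direct.var())         # both ~ k*theta*(1+theta)
```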
Goodness-of-fit tests and model diagnostics for negative binomial regression of RNA sequencing data.
Mi, Gu; Di, Yanming; Schafer, Daniel W
2015-01-01
This work is about assessing model adequacy for negative binomial (NB) regression, particularly (1) assessing the adequacy of the NB assumption, and (2) assessing the appropriateness of models for NB dispersion parameters. Tools for the first are appropriate for NB regression generally; those for the second are primarily intended for RNA sequencing (RNA-Seq) data analysis. The typically small number of biological samples and large number of genes in RNA-Seq analysis motivate us to address the trade-offs between robustness and statistical power using NB regression models. One widely-used power-saving strategy, for example, is to assume some commonalities of NB dispersion parameters across genes via simple models relating them to mean expression rates, and many such models have been proposed. As RNA-Seq analysis is becoming ever more popular, it is appropriate to make more thorough investigations into power and robustness of the resulting methods, and into practical tools for model assessment. In this article, we propose simulation-based statistical tests and diagnostic graphics to address model adequacy. We provide simulated and real data examples to illustrate that our proposed methods are effective for detecting the misspecification of the NB mean-variance relationship as well as judging the adequacy of fit of several NB dispersion models.
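A hedged sketch of the simulation-based idea (not the authors' exact tests): fit an NB2 regression, then use a parametric bootstrap of the Pearson statistic to judge the adequacy of the assumed mean-variance relationship. Data and parameters are synthetic.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 200
x = rng.uniform(0, 1, n)
X = sm.add_constant(x)
mu = np.exp(0.5 + 1.0 * x)
y = rng.negative_binomial(2.0, 2.0 / (2.0 + mu))

def pearson(fit, y, X):
    mu_hat = fit.predict(X)
    alpha = fit.params[-1]                 # NB2 dispersion parameter
    return np.sum((y - mu_hat) ** 2 / (mu_hat + alpha * mu_hat**2))

fit = sm.NegativeBinomial(y, X).fit(disp=0)
obs = pearson(fit, y, X)

mu_hat = fit.predict(X)
k_hat = 1.0 / fit.params[-1]
boot = []
for _ in range(200):                       # bootstrap null distribution
    yb = rng.negative_binomial(k_hat, k_hat / (k_hat + mu_hat))
    fb = sm.NegativeBinomial(yb, X).fit(disp=0)
    boot.append(pearson(fb, yb, X))
print("bootstrap p-value:", np.mean(np.array(boot) >= obs))
```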
Design and analysis of three-arm trials with negative binomially distributed endpoints.
Mütze, Tobias; Munk, Axel; Friede, Tim
2016-02-20
A three-arm clinical trial design with an experimental treatment, an active control, and a placebo control, commonly referred to as the gold standard design, enables testing of non-inferiority or superiority of the experimental treatment compared with the active control. In this paper, we propose methods for designing and analyzing three-arm trials with negative binomially distributed endpoints. In particular, we develop a Wald-type test with a restricted maximum-likelihood variance estimator for testing non-inferiority or superiority. For this test, sample size and power formulas as well as optimal sample size allocations will be derived. The performance of the proposed test will be assessed in an extensive simulation study with regard to type I error rate, power, sample size, and sample size allocation. For the purpose of comparison, Wald-type statistics with a sample variance estimator and an unrestricted maximum-likelihood estimator are included in the simulation study. We found that the proposed Wald-type test with a restricted variance estimator performed well across the considered scenarios and is therefore recommended for application in clinical trials. The methods proposed are motivated and illustrated by a recent clinical trial in multiple sclerosis. The R package ThreeArmedTrials, which implements the methods discussed in this paper, is available on CRAN. Copyright © 2015 John Wiley & Sons, Ltd.
Spacecraft Parachute Recovery System Testing from a Failure Rate Perspective
NASA Technical Reports Server (NTRS)
Stewart, Christine E.
2013-01-01
Spacecraft parachute recovery systems, especially those with a parachute cluster, require testing to identify and reduce failures. This is especially important when the spacecraft in question is human-rated. Due to the recent effort to make spaceflight affordable, the importance of determining a minimum requirement for testing has increased. The number of tests required to achieve a mature design, with a relatively constant failure rate, can be estimated from a review of previous complex spacecraft recovery systems. Examination of the Apollo parachute testing and the Shuttle Solid Rocket Booster recovery chute system operation will clarify at which point in those programs the system reached maturity. This examination will also clarify the risks inherent in not performing a sufficient number of tests prior to operation with humans on board. When looking at complex parachute systems used in spaceflight landing systems, a pattern begins to emerge regarding the minimum amount of testing required to wring out the failure modes and reduce the failure rate of the parachute system to an acceptable level for human spaceflight. Not only sufficient system-level testing, but also the ability to update the design as failure modes are found, is required to drive the failure rate of the system down to an acceptable level. In addition, sufficient data and images are necessary to identify incipient failure modes or to identify failure causes when a system failure occurs. In order to demonstrate the need for sufficient system-level testing to achieve an acceptable failure rate, the Apollo Earth Landing System (ELS) test program and the Shuttle Solid Rocket Booster Recovery System failure history will be examined, and some experiences from the Orion Capsule Parachute Assembly System will be noted.
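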
Moineddin, Rahim; Meaney, Christopher; Agha, Mohammad; Zagorski, Brandon; Glazier, Richard Henry
2011-08-19
Emergency departments are medical treatment facilities, designed to provide episodic care to patients suffering from acute injuries and illnesses as well as patients who are experiencing sporadic flare-ups of underlying chronic medical conditions which require immediate attention. Supply and demand for emergency department services varies across geographic regions and time. Some persons do not rely on the service at all, whereas others use the service on repeated occasions. Issues regarding increased wait times for services and crowding illustrate the need to investigate which factors are associated with increased frequency of emergency department utilization. The evidence from this study can help inform policy makers on the appropriate mix of supply and demand targeted health care policies necessary to ensure that patients receive appropriate health care delivery in an efficient and cost-effective manner. The purpose of this report is to assess those factors resulting in increased demand for emergency department services in Ontario. We assess how utilization rates vary according to the severity of patient presentation in the emergency department. We are specifically interested in the impact that access to primary care physicians has on the demand for emergency department services. Additionally, we wish to investigate these trends using a series of novel regression models for count outcomes which have yet to be employed in the domain of emergency medical research. Data regarding the frequency of emergency department visits for respondents of the Canadian Community Health Survey (CCHS) during our study interval (2003-2005) are obtained from the National Ambulatory Care Reporting System (NACRS). Patients' emergency department utilizations were linked with information from the Canadian Community Health Survey (CCHS), which provides individual-level medical, socio-demographic, psychological and behavioral information for investigating predictors of increased emergency department utilization. Six different multiple regression models for count data were fitted to assess the influence of predictors on demand for emergency department services, including: Poisson, Negative Binomial, Zero-Inflated Poisson, Zero-Inflated Negative Binomial, Hurdle Poisson, and Hurdle Negative Binomial. Competing models were compared using the Vuong test statistic. The CCHS cycle 2.1 respondents were a roughly equal mix of males (50.4%) and females (49.6%). The majority (86.2%) were young to middle-aged adults between the ages of 20-64, living in predominantly urban environments (85.9%), with mid-to-high household incomes (92.2%), and well-educated, having received at least a high-school diploma (84.1%). Many participants reported no chronic disease (51.9%), fell into a small number (0-5) of ambulatory diagnostic groups (62.3%), and perceived their health status as good/excellent (88.1%); however, they were projected to have high Resource Utilization Band levels of health resource utilization (68.2%). These factors were largely stable for CCHS cycle 3.1 respondents. Factors influencing demand for emergency department services varied according to the severity of triage scores at initial presentation.
For example, although a non-significant predictor of the odds of emergency department utilization in high-severity cases, access to a primary care physician was a statistically significant predictor of both the likelihood of emergency department utilization (OR: 0.69; 95% CI OR: 0.63-0.75) and the rate of emergency department utilization (RR: 0.57; 95% CI RR: 0.50-0.66) in low-severity cases. Using a theoretically appropriate hurdle negative binomial regression model, this unique study illustrates that access to a primary care physician is an important predictor of both the odds and the rate of emergency department utilization in Ontario. Restructuring primary care services, with the aim of increasing access for undersupplied populations, may decrease emergency department utilization rates by approximately 43% for low-severity triage level cases.
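For readers unfamiliar with the model menu above, the sketch below fits several of the candidate count models to synthetic utilization data and compares them by AIC; statsmodels exposes the zero-inflated variants directly (hurdle models exist in recent statsmodels releases but are omitted here). All names and values are invented.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import (ZeroInflatedPoisson,
                                              ZeroInflatedNegativeBinomialP)

rng = np.random.default_rng(6)
n = 1000
access = rng.integers(0, 2, n)                # has a primary care physician?
X = sm.add_constant(access.astype(float))
mu = np.exp(0.8 - 0.5 * access)               # fewer ED visits with access
never = rng.random(n) < 0.4                   # structural zeros (non-users)
y = np.where(never, 0, rng.poisson(mu))

# Zero-inflated fits can be sensitive to starting values; maxiter helps.
fits = {
    "Poisson": sm.Poisson(y, X).fit(disp=0),
    "NegBin": sm.NegativeBinomial(y, X).fit(disp=0),
    "ZIP": ZeroInflatedPoisson(y, X).fit(disp=0, maxiter=500),
    "ZINB": ZeroInflatedNegativeBinomialP(y, X).fit(disp=0, maxiter=500),
}
for name, f in fits.items():
    print(name, f.aic)
```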
Faires, Meredith C; Pearl, David L; Ciccotelli, William A; Berke, Olaf; Reid-Smith, Richard J; Weese, J Scott
2014-05-12
In hospitals, Clostridium difficile infection (CDI) surveillance relies on unvalidated guidelines or threshold criteria to identify outbreaks. This can result in false-positive and -negative cluster alarms. The application of statistical methods to identify and understand CDI clusters may be a useful alternative or complement to standard surveillance techniques. The objectives of this study were to investigate the utility of the temporal scan statistic for detecting CDI clusters and determine if there are significant differences in the rate of CDI cases by month, season, and year in a community hospital. Bacteriology reports of patients identified with a CDI from August 2006 to February 2011 were collected. For patients detected with CDI from March 2010 to February 2011, stool specimens were obtained. Clostridium difficile isolates were characterized by ribotyping and investigated for the presence of toxin genes by PCR. CDI clusters were investigated using a retrospective temporal scan test statistic. Statistically significant clusters were compared to known CDI outbreaks within the hospital. A negative binomial regression model was used to identify associations between year, season, month and the rate of CDI cases. Overall, 86 CDI cases were identified. Eighteen specimens were analyzed and nine ribotypes were classified with ribotype 027 (n = 6) the most prevalent. The temporal scan statistic identified significant CDI clusters at the hospital (n = 5), service (n = 6), and ward (n = 4) levels (P ≤ 0.05). Three clusters were concordant with the one C. difficile outbreak identified by hospital personnel. Two clusters were identified as potential outbreaks. The negative binomial model indicated years 2007-2010 (P ≤ 0.05) had decreased CDI rates compared to 2006 and spring had an increased CDI rate compared to the fall (P = 0.023). Application of the temporal scan statistic identified several clusters, including potential outbreaks not detected by hospital personnel. The identification of time periods with decreased or increased CDI rates may have been a result of specific hospital events. Understanding the clustering of CDIs can aid in the interpretation of surveillance data and lead to the development of better early detection systems.
Hill, Anne-Marie; Etherton-Beer, Christopher; McPhail, Steven M; Morris, Meg E; Flicker, Leon; Shorr, Ronald; Bulsara, Max; Lee, Den-Ching; Francis-Coad, Jacqueline; Waldron, Nicholas; Boudville, Amanda; Haines, Terry
2017-02-02
Older adults frequently fall after discharge from hospital. Older people may have low self-perceived risk of falls and poor knowledge about falls prevention. The primary aim of the study is to evaluate the effect of providing tailored falls prevention education in addition to usual care on falls rates in older people after discharge from hospital compared to providing a social intervention in addition to usual care. The 'Back to My Best' study is a multisite, single blind, parallel-group randomised controlled trial with blinded outcome assessment and intention-to-treat analysis, adhering to CONSORT guidelines. Patients (n=390) (aged 60 years or older; score more than 7/10 on the Abbreviated Mental Test Score; discharged to community settings) from aged care rehabilitation wards in three hospitals will be recruited and randomly assigned to one of two groups. Participants allocated to the control group shall receive usual care plus a social visit. Participants allocated to the experimental group shall receive usual care and a falls prevention programme incorporating a video, workbook and individualised follow-up from an expert health professional to foster capability and motivation to engage in falls prevention strategies. The primary outcome is falls rates in the first 6 months after discharge, analysed using negative binomial regression with adjustment for participant's length of observation in the study. Secondary outcomes are injurious falls rates, the proportion of people who become fallers, functional status and health-related quality of life. Healthcare resource use will be captured from four sources for 6 months after discharge. The study is powered to detect a 30% relative reduction in the rate of falls (negative binomial incidence ratio 0.70) for a control rate of 0.80 falls per person over 6 months. Results will be presented in peer-reviewed journals and at conferences worldwide. This study is approved by hospital and university Human Research Ethics Committees. ACTRN12615000784516. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
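The stated power target can be checked by simulation. The sketch below assumes an NB shape parameter (k = 0.8) and equal 6-month follow-up for everyone, so no offset is needed; neither assumption comes from the protocol, and the resulting power depends strongly on the assumed dispersion.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n_per_arm, k = 195, 0.8              # 390 participants; k is an assumed NB shape
rate = np.r_[np.full(n_per_arm, 0.80),
             np.full(n_per_arm, 0.80 * 0.70)]   # control vs. rate ratio 0.70
group = np.r_[np.zeros(n_per_arm), np.ones(n_per_arm)]
X = sm.add_constant(group)

hits, n_sim = 0, 300
for _ in range(n_sim):
    y = rng.negative_binomial(k, k / (k + rate))   # falls over 6 months
    fit = sm.NegativeBinomial(y, X).fit(disp=0)
    hits += abs(fit.params[1] / fit.bse[1]) > 1.96  # Wald test of the IRR
print("simulated power:", hits / n_sim)
```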
Bennett, Bradley C; Balick, Michael J
2014-03-28
Medical research on plant-derived compounds requires a breadth of expertise from field to laboratory and clinical skills. Too often basic botanical skills are evidently lacking, especially with respect to plant taxonomy and botanical nomenclature. Binomial and familial names, synonyms and author citations are often misconstrued. The correct botanical name, linked to a vouchered specimen, is the sine qua non of phytomedical research. Without the unique identifier of a proper binomial, research cannot accurately be linked to the existing literature. Perhaps more significant is the ambiguity of species determinations that ensues from poor taxonomic practices. This uncertainty, not surprisingly, obstructs reproducibility of results, the cornerstone of science. Based on our combined six decades of experience with medicinal plants, we discuss the problems of inaccurate taxonomy and botanical nomenclature in biomedical research. These problems appear all too frequently in manuscripts and grant applications that we review, and they extend to the published literature. We also review the literature on the importance of taxonomy in other disciplines that relate to medicinal plant research. In most cases, questions regarding orthography, synonymy, author citations, and current family designations of most plant binomials can be resolved using widely-available online databases and other electronic resources. Some complex problems require consultation with a professional plant taxonomist, which also is important for accurate identification of voucher specimens. Researchers should provide the currently accepted binomial and complete author citation, provide relevant synonyms, and employ the Angiosperm Phylogeny Group III family name. Taxonomy is a vital adjunct not only to plant-medicine research but to virtually every field of science. Medicinal plant researchers can increase the precision and utility of their investigations by following sound practices with respect to botanical nomenclature. Correct spellings, accepted binomials, author citations, synonyms, and current family designations can readily be found in reliable online databases. When questions arise, researchers should consult plant taxonomists. © 2013 Published by Elsevier Ireland Ltd.
Hosseinpour, Mehdi; Yahaya, Ahmad Shukri; Sadullah, Ahmad Farhan
2014-01-01
Head-on crashes are among the most severe collision types and of great concern to road safety authorities, which justifies greater effort to reduce both their frequency and severity. To this end, it is necessary to first identify the factors associated with crash occurrence. This can be done by developing crash prediction models that relate crash outcomes to a set of contributing factors. This study intends to identify the factors affecting both the frequency and severity of head-on crashes that occurred on 448 segments of five federal roads in Malaysia. Data on road characteristics and crash history were collected on the study segments during a 4-year period between 2007 and 2010. The frequency of head-on crashes was modelled by developing and comparing seven count-data models, including Poisson, standard negative binomial (NB), random-effect negative binomial, hurdle Poisson, hurdle negative binomial, zero-inflated Poisson, and zero-inflated negative binomial models. To model crash severity, a random-effect generalized ordered probit model (REGOPM) was used given a head-on crash had occurred. With respect to crash frequency, the random-effect negative binomial (RENB) model was found to outperform the other models according to goodness-of-fit measures. Based on the results of the model, the variables horizontal curvature, terrain type, heavy-vehicle traffic, and access points were found to be positively related to the frequency of head-on crashes, while posted speed limit and shoulder width decreased the crash frequency. With regard to crash severity, the results of the REGOPM showed that horizontal curvature, paved shoulder width, terrain type, and side friction were associated with more severe crashes, whereas land use, access points, and presence of a median reduced the probability of severe crashes. Based on the results of this study, some potential countermeasures were proposed to minimize the risk of head-on crashes. Copyright © 2013 Elsevier Ltd. All rights reserved.
Milner, Alison J; Niven, Heather; LaMontagne, Anthony D
2015-09-21
Previous research showed an increase in Australian suicide rates during the Global Financial Crisis (GFC). There has been no research investigating whether suicide rates by occupational class changed during the GFC. The aim of this study was to investigate whether the GFC-associated increase in suicide rates in employed Australians may have masked changes by occupational class. Negative binomial regression models were used to investigate Rate Ratios (RRs) in suicide by occupational class. Years of the GFC (2007, 2008, 2009) were compared to the baseline years 2001-2006. There were widening disparities between a number of the lower class occupations and the highest class occupations during the years 2007, 2008, and 2009 for males, but less evidence of differences for females. Occupational disparities in suicide rates widened over the GFC period. There is a need for programs to be responsive to economic downturns, and to prioritise the occupational groups most affected.
A simulation model for the determination of tabarru' rate in a family takaful
NASA Astrophysics Data System (ADS)
Ismail, Hamizun bin
2014-06-01
The concept of tabarru' that is incorporated in family takaful serves to eliminate the element of uncertainty in the contract, as each participant agrees to relinquish as a donation a certain portion of his contribution. The most important feature of family takaful is that it does not guarantee a definite return on a participant's contribution, unlike its conventional counterpart, where a premium is paid in return for a guaranteed amount of insurance benefit. In other words, investment returns on the funds contributed by participants are based on actual investment experience. The objective of this study is to set up a framework for the determination of the tabarru' rate by simulation. The model is based on a binomial death process. Specifically, a linear tabarru' rate and a flat tabarru' rate are introduced. The results of the simulation trials show that the linear assumption on the tabarru' rate has an advantage over its flat counterpart as far as the risk of the investment accumulation at maturity is concerned.
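A minimal sketch of the flat-rate case under the stated binomial death process, with invented parameters: it prices the annual tabarru' so that pooled donations cover simulated claims in roughly 95% of trials, and, as a simplification not in the paper, counts contributions from all initial participants for the full term.

```python
import numpy as np

rng = np.random.default_rng(8)
n0, years, q = 1000, 20, 0.005     # participants, term, annual death probability
benefit = 100_000                  # sum covered per death

def simulate_total_claims():
    alive, total = n0, 0.0
    for _ in range(years):
        deaths = rng.binomial(alive, q)     # binomial death process
        total += deaths * benefit
        alive -= deaths
    return total

sims = np.array([simulate_total_claims() for _ in range(5000)])
# Flat annual tabarru' per participant funding claims in ~95% of trials
# (simplification: assumes n0 contributors for all `years` periods).
flat = np.quantile(sims, 0.95) / (n0 * years)
print(f"flat tabarru' ~ {flat:.2f} per participant-year")
```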
Gregory, Joy; Lalor, Karin; Hall, Gillian V.; Becker, Niels
2012-01-01
We calculated rates of foodborne and waterborne infections reported to the health department in Victoria, Australia, during 2000–2009 for elderly residents of long-term care facilities (LTCFs) and the community. We used negative binomial regression to estimate incidence rate ratios, adjusting for age, sex, and reporting period. We analyzed 8,277 infections in elderly persons. Rates of campylobacteriosis, legionellosis, listeriosis, toxigenic Escherichia coli infections, and shigellosis were higher in community residents, and rates of Salmonella infection were higher in LTCF residents. Each year, 61.7 Campylobacter infections were reported per 100,000 LTCF residents, compared with 97.6 per 100,000 community residents. LTCF residents were at higher risk for S. enterica serotype Typhimurium associated with outbreaks. Rates of foodborne infections (except salmonellosis) were similar to or lower for LTCF residents than for community residents. These findings may indicate that food preparation practices in LTCFs are safer than those used by elderly persons in the community. PMID:22377177
Amick, Benjamin C; Hogg-Johnson, Sheilah; Latour-Villamil, Desiree; Saunders, Ron
2015-12-01
Do Ontario unionized construction firms have lower workers' compensation claims rates compared with nonunion firms? Building trade and construction trade association lists of union contractors were linked to Workplace Safety and Insurance Board claims data for 2006 to 2012. Data were pooled for 2006 to 2012, and negative binomial regressions conducted with adjustment to estimate a union safety effect. The sample included 5797 unionized and 38,626 nonunion construction firms. Total claims rates were 13% higher (1.13, 1.09 to 1.18) in unionized firms because of higher allowed no-lost-time claim rates (1.28, 1.23 to 1.34), whereas the lost-time claims rate was 14% lower (0.86, 0.82 to 0.91). Unionized construction firms compared with nonunion firms have higher no-lost-time and lower lost-time claims rates. Unionized firms may encourage occupational injury reporting and reduce risks through training and hazard identification and control strategies.
The relationship between gun ownership and firearm homicide rates in the United States, 1981-2010.
Siegel, Michael; Ross, Craig S; King, Charles
2013-11-01
We examined the relationship between levels of household firearm ownership, as measured directly and by a proxy-the percentage of suicides committed with a firearm-and age-adjusted firearm homicide rates at the state level. We conducted a negative binomial regression analysis of panel data from the Centers for Disease Control and Prevention's Web-Based Injury Statistics Query and Reporting Systems database on gun ownership and firearm homicide rates across all 50 states during 1981 to 2010. We determined fixed effects for year, accounted for clustering within states with generalized estimating equations, and controlled for potential state-level confounders. Gun ownership was a significant predictor of firearm homicide rates (incidence rate ratio = 1.009; 95% confidence interval = 1.004, 1.014). This model indicated that for each percentage point increase in gun ownership, the firearm homicide rate increased by 0.9%. We observed a robust correlation between higher levels of gun ownership and higher firearm homicide rates. Although we could not determine causation, we found that states with higher rates of gun ownership had disproportionately large numbers of deaths from firearm-related homicides.
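A hedged sketch of this modelling strategy on synthetic panel data: a count regression with year effects, a population offset, exchangeable within-state correlation via GEE, and the coefficient exponentiated into an incidence rate ratio. (The study fitted negative binomial regression; the GEE family here is Poisson for simplicity.)

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(9)
states, years = 50, 30
df = pd.DataFrame({"state": np.repeat(np.arange(states), years),
                   "year": np.tile(np.arange(years), states)})
df["gun_own"] = rng.uniform(20, 60, len(df))     # % households owning guns
df["pop"] = rng.uniform(0.5e6, 3.0e7, len(df))   # state population
lam = np.exp(-11.5 + 0.009 * df["gun_own"]) * df["pop"]
df["homicides"] = rng.poisson(lam)

fit = smf.gee("homicides ~ gun_own + C(year)", groups="state", data=df,
              family=sm.families.Poisson(),
              offset=np.log(df["pop"]),                 # population exposure
              cov_struct=sm.cov_struct.Exchangeable()).fit()
print("IRR per percentage point:", np.exp(fit.params["gun_own"]))
```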
The effectiveness of avalanche airbags.
Haegeli, Pascal; Falk, Markus; Procter, Emily; Zweifel, Benjamin; Jarry, Frédéric; Logan, Spencer; Kronholm, Kalle; Biskupič, Marek; Brugger, Hermann
2014-09-01
Asphyxia is the primary cause of death among avalanche victims. Avalanche airbags can lower mortality by directly reducing grade of burial, the single most important factor for survival. This study aims to provide an updated perspective on the effectiveness of this safety device. A retrospective analysis of avalanche accidents involving at least one airbag user between 1994 and 2012 in Austria, Canada, France, Norway, Slovakia, Switzerland and the United States. A multivariate analysis was used to calculate adjusted absolute risk reduction and estimate the effectiveness of airbags on grade of burial and mortality. A univariate analysis was used to examine causes of non-deployment. Binomial linear regression models showed main effects for airbag use, avalanche size and injuries on critical burial, and for grade of burial, injuries and avalanche size on mortality. The adjusted risk of critical burial is 47% with non-inflated airbags and 20% with inflated airbags. The adjusted mortality is 44% for critically buried victims and 3% for non-critically buried victims. The adjusted absolute mortality reduction for inflated airbags is -11 percentage points (22% to 11%; 95% confidence interval: -4 to -18 percentage points) and adjusted risk ratio is 0.51 (95% confidence interval: 0.29 to 0.72). Overall non-inflation rate is 20%, 60% of which is attributed to deployment failure by the user. Although the impact on survival is smaller than previously reported, these results confirm the effectiveness of airbags. Non-deployment remains the most considerable limitation to effectiveness. Development of standardized data collection protocols is encouraged to facilitate further research. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Binomial test statistics using Psi functions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowman, Kimiko O.
2007-01-01
For the negative binomial model (probability generating function (p + 1 - pt)^(-k)), a logarithmic derivative is the Psi function difference ψ(k + x) - ψ(k); this and its derivatives lead to a test statistic to decide on the validity of a specified model. The test statistic uses a data base, so a comparison is available between theory and application. Note that the test function is not dominated by outliers. Applications to (i) Fisher's tick data, (ii) accidents data, and (iii) Weldon's dice data are included.
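The Psi-function difference above is the k-derivative of the NB log-likelihood, which makes a direct maximum likelihood solution for k easy to sketch; the counts below are hypothetical overdispersed data, not Fisher's tick data.

```python
import numpy as np
from scipy.special import psi
from scipy.optimize import brentq

# Hypothetical overdispersed counts: frequencies of the values 0..8
x = np.repeat(np.arange(9), [70, 38, 17, 10, 9, 3, 2, 1, 0])
xbar = x.mean()

def score_k(k):
    # d/dk of the NB log-likelihood with the mean profiled out at xbar:
    # sum[psi(k + x_i) - psi(k)] + n * log(k / (k + xbar))
    return np.sum(psi(k + x) - psi(k)) + x.size * np.log(k / (k + xbar))

k_hat = brentq(score_k, 0.01, 100.0)   # root of the score equation
print("ML estimate of k:", k_hat)
```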
Lytras, Theodore; Georgakopoulou, Theano; Tsiodras, Sotirios
2018-04-01
Greece is currently experiencing a large measles outbreak, in the context of multiple similar outbreaks across Europe. We devised and applied a modified chain-binomial epidemic model, requiring very simple data, to estimate the transmission parameters of this outbreak. Model results indicate sustained measles transmission among the Greek Roma population, necessitating a targeted mass vaccination campaign to halt further spread of the epidemic. Our model may be useful for other countries facing similar measles outbreaks.
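For orientation, a plain (unmodified) chain-binomial simulator of the Reed-Frost type is only a few lines; the paper's model modifies this basic scheme, and the parameters below are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(11)

def chain_binomial(S0, I0, p, generations):
    """Reed-Frost chain binomial: each susceptible independently escapes
    infection from each of the I current infectives with probability 1 - p."""
    S, I, history = S0, I0, []
    for _ in range(generations):
        p_inf = 1.0 - (1.0 - p) ** I
        new_I = rng.binomial(S, p_inf)   # binomial draw of new cases
        S, I = S - new_I, new_I
        history.append(new_I)
    return history

print(chain_binomial(S0=200, I0=2, p=0.01, generations=10))
```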
Bayesian hierarchical modeling for detecting safety signals in clinical trials.
Xia, H Amy; Ma, Haijun; Carlin, Bradley P
2011-09-01
Detection of safety signals from clinical trial adverse event data is critical in drug development, but carries a challenging statistical multiplicity problem. Bayesian hierarchical mixture modeling is appealing for its ability to borrow strength across subgroups in the data, as well as moderate extreme findings most likely due merely to chance. We implement such a model for subject incidence (Berry and Berry, 2004 ) using a binomial likelihood, and extend it to subject-year adjusted incidence rate estimation under a Poisson likelihood. We use simulation to choose a signal detection threshold, and illustrate some effective graphics for displaying the flagged signals.
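The Poisson piece of this approach reduces, for a single adverse-event type with a gamma prior, to a conjugate update; the sketch below shows only that reduced case (the paper's three-level hierarchical mixture is much richer), with assumed prior values.

```python
import numpy as np
from scipy import stats

a, b = 0.5, 1.0                      # assumed Gamma(a, b) prior on the rate
events, exposure = 7, 350.0          # one adverse-event type, subject-years

# Gamma(a, b) prior + Poisson likelihood -> Gamma(a + events, b + exposure)
post = stats.gamma(a + events, scale=1.0 / (b + exposure))
print("posterior mean rate:", post.mean())     # events per subject-year
print("95% credible interval:", post.ppf([0.025, 0.975]))
```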
How and why of orthodontic bond failures: An in vivo study
Vijayakumar, R. K.; Jagadeep, Raju; Ahamed, Fayyaz; Kanna, Aprose; Suresh, K.
2014-01-01
Introduction: The bonding of orthodontic brackets and their failure rates by both direct and indirect procedures are well-documented in the orthodontic literature. Over the years, different adhesive materials and various indirect bonding transfer procedures have been compared and evaluated for bond failure rates. The aim of our study is to highlight the use of a simple, inexpensive, and easily manipulated single thermoplastic transfer tray and a single light-cure adhesive to evaluate bond failure rates in clinical situations. Materials and Methods: A total of 30 patients were randomly divided into two groups (Group A and Group B). A split-mouth study design was used for both groups so that they were distributed equally without bias. After initial prophylaxis, both procedures were done as per the manufacturer's instructions. All patients were initially motivated and reviewed for bond failure rates for 6 months. Results: Bond failure rates were assessed for the overall direct and indirect procedures, the anterior and posterior arches, and each individual tooth. A Z-test was used to statistically analyze the normal distribution of the sample in a split-mouth study. The results of the two groups were compared, and the P value was calculated using the Z-proportion test to assess the significance of bond failure. Conclusion: Overall bond failure was greater for direct bonding. Anterior bracket failure was greater with direct bonding, whereas the indirect procedure showed more posterior bracket failures. For individual teeth, mandibular incisor and premolar brackets showed the most failures, followed by maxillary premolars and canines. PMID:25210392
Thompson, Keith A; Morrissey, Ryan P; Phan, Anita; Schwarz, Ernst R
2012-08-01
To determine the effects of the US economy on heart failure hospitalization rates, we hypothesized that the recession, associated with worsening unemployment, loss of private insurance and prescription medication benefits, and medication nonadherence, would ultimately increase rates of hospitalization for heart failure. We compared hospitalization rates at a large, single, academic medical center from July 1, 2006 to February 28, 2007, a time of economic stability, and July 1, 2008 to February 28, 2009, a time of economic recession in the United States. Significantly fewer patients had private medical insurance during the economic recession than during the control period (36.5% vs 46%; P = 0.04). Despite this, there were no differences in the heart failure hospitalization or readmission rates, length of hospitalization, need for admission to an intensive care unit, in-hospital mortality, or use of guideline-recommended heart failure medications between the 2 study periods. We conclude that despite significant effects on medical insurance coverage, rates of heart failure hospitalization at our institution were not significantly affected by the recession. Additional large-scale population-based research is needed to better understand the effects of fluctuations in the US economy on heart failure hospitalization rates. © 2012 Wiley Periodicals, Inc.
Roche, Jesús; Guerra-Neira, Ana; Raso, José; Benito, Agustîn
2003-05-01
From 1992 to 1999, we assessed the therapeutic efficacy of three malaria treatment regimens (chloroquine 25 mg/kg over three days, pyrimethamine/sulfadoxine 1.25/25 mg/kg in one dose, and quinine 25-30 mg/kg daily in three oral doses over a four-, five-, or seven-day period) in 1,189 children under age 10 at Malabo Regional Hospital in Equatorial Guinea. Of those children, 958 were followed up clinically and parasitologically for 14 days. With chloroquine, the failure rate varied from 55% in 1996 to 40% in 1999; the early treatment failure rate increased progressively over the years, from 6% in 1992 to 30% in 1999. With pyrimethamine/sulfadoxine, the failure rate varied from 0% in 1996 to 16% in 1995. The short quinine treatment regimens used in 1992 and 1993 (four and five days, respectively) resulted in significantly higher failure rates (19% and 22%, respectively) than the seven-day regimen (3-5.5%). We conclude that: a) failure rates for chloroquine are in the change period (> 25%), and urgent action is needed; b) pyrimethamine/sulfadoxine failure rates are in the alert period (6-15%), and surveillance must be continued; and c) quinine failure rates are in the grace period (< 6%), so quinine can be recommended.
A study of Mariner 10 flight experiences and some flight piece part failure rate computations
NASA Technical Reports Server (NTRS)
Paul, F. A.
1976-01-01
The problems and failures encountered in the Mariner 10 flight are discussed, and the data available through a quantitative accounting of all electronic piece parts on the spacecraft are summarized. Computed failure rates for electronic piece parts are also presented. These computed data are intended for use in the continued updating of the failure rate base used for trade-off studies and predictions for future JPL space missions.
Carvalho, Vitor Oliveira; Guimarães, Guilherme Veiga; Ciolac, Emmanuel Gomes; Bocchi, Edimar Alcides
2008-01-01
BACKGROUND Calculating the maximum heart rate for age is one method to characterize the maximum effort of an individual. Although this method is commonly used, little is known about heart rate dynamics in optimized beta-blocked heart failure patients. AIM The aim of this study was to evaluate heart rate dynamics (basal, peak and % heart rate increase) in optimized beta-blocked heart failure patients compared to sedentary, normal individuals (controls) during a treadmill cardiopulmonary exercise test. METHODS Twenty-five heart failure patients (49±11 years, 76% male), with an average LVEF of 30±7%, and fourteen controls were included in the study. Patients with atrial fibrillation, a pacemaker or noncardiovascular functional limitations or whose drug therapy was not optimized were excluded. Optimization was considered to be 50 mg/day or more of carvedilol, with a basal heart rate between 50 to 60 bpm that was maintained for 3 months. RESULTS Basal heart rate was lower in heart failure patients (57±3 bpm) compared to controls (89±14 bpm; p<0.0001). Similarly, the peak heart rate (% maximum predicted for age) was lower in HF patients (65.4±11.1%) compared to controls (98.6±2.2; p<0.0001). Maximum respiratory exchange ratio did not differ between the groups (1.2±0.5 for controls and 1.15±1 for heart failure patients; p=0.42). All controls reached the maximum heart rate for their age, while no patients in the heart failure group reached the maximum. Moreover, the % increase of heart rate from rest to peak exercise between heart failure (48±9%) and control (53±8%) was not different (p=0.157). CONCLUSION No patient in the heart failure group reached the maximum heart rate for their age during a treadmill cardiopulmonary exercise test, despite the fact that the percentage increase of heart rate was similar to sedentary normal subjects. A heart rate increase in optimized beta-blocked heart failure patients during cardiopulmonary exercise test over 65% of the maximum age-adjusted value should be considered an effort near the maximum. This information may be useful in rehabilitation programs and ischemic tests, although further studies are required. PMID:18719758
Chen, Han-Yang; Chauhan, Suneet P; Ananth, Cande V; Vintzileos, Anthony M; Abuhamad, Alfred Z
2011-06-01
To examine the association between electronic fetal heart rate monitoring and neonatal and infant mortality, as well as neonatal morbidity. We used the United States 2004 linked birth and infant death data. Multivariable log-binomial regression models were fitted to estimate risk ratio for association between electronic fetal heart rate monitoring and mortality, while adjusting for potential confounders. In 2004, 89% of singleton pregnancies had electronic fetal heart rate monitoring. Electronic fetal heart rate monitoring was associated with significantly lower infant mortality (adjusted relative risk, 0.75); this was mainly driven by the lower risk of early neonatal mortality (adjusted relative risk, 0.50). In low-risk pregnancies, electronic fetal heart rate monitoring was associated with decreased risk for Apgar scores <4 at 5 minutes (relative risk, 0.54); in high-risk pregnancies, with decreased risk of neonatal seizures (relative risk, 0.65). In the United States, the use of electronic fetal heart rate monitoring was associated with a substantial decrease in early neonatal mortality and morbidity that lowered infant mortality. Copyright © 2011 Mosby, Inc. All rights reserved.
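The adjusted relative risks above come from log-binomial regression, which a GLM with binomial family and log link reproduces; the sketch below uses synthetic data with a true risk ratio of 0.75 and no confounders, so the exponentiated coefficient is directly a risk ratio rather than an odds ratio.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(13)
n = 50_000
efm = rng.integers(0, 2, n)                     # monitored vs. not
risk = 0.004 * np.where(efm == 1, 0.75, 1.0)    # rare outcome, true RR = 0.75
death = rng.binomial(1, risk)

X = sm.add_constant(efm.astype(float))
fit = sm.GLM(death, X, family=sm.families.Binomial(
    link=sm.families.links.Log())).fit()        # log link -> risk ratios
print("adjusted RR:", np.exp(fit.params[1]))
```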
Optimal estimation for discrete time jump processes
NASA Technical Reports Server (NTRS)
Vaca, M. V.; Tretter, S. A.
1978-01-01
Optimum estimates of nonobservable random variables or random processes which influence the rate functions of a discrete time jump process (DTJP) are derived. The approach is based on the a posteriori probability of a nonobservable event expressed in terms of the a priori probability of that event and of the sample function probability of the DTJP. A general representation is thus obtained for optimum estimates, and recursive equations are derived for minimum mean-squared error (MMSE) estimates. In general, MMSE estimates are nonlinear functions of the observations. The problem of estimating the rate of a DTJP is considered for the case in which the rate is a random variable with a beta probability density function and the jump amplitudes are binomially distributed. It is shown that the MMSE estimates are linear. The class of beta density functions is rather rich, which explains why there are insignificant differences between optimum unconstrained and linear MMSE estimates in a variety of problems.
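The beta-binomial case is easy to verify numerically: the posterior mean, which is the MMSE estimate, is an affine function of the accumulated counts. All parameter values below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(14)
a, b = 2.0, 5.0                     # beta prior on the rate p
m = 10                              # binomial trials per time step

p = rng.beta(a, b)                  # true (nonobservable) rate
x = rng.binomial(m, p, size=25)     # observed jump amplitudes

# Posterior after t steps is Beta(a + sum(x), b + t*m - sum(x)), so the MMSE
# estimate (posterior mean) is affine in the running sum of observations.
t = np.arange(1, x.size + 1)
mmse = (a + np.cumsum(x)) / (a + b + t * m)
print(mmse[-1], "vs true p =", p)   # estimate approaches the true rate
```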
The impact of vaccine failure rate on epidemic dynamics in responsive networks.
Liang, Yu-Hao; Juang, Jonq
2015-04-01
An SIS model based on the microscopic Markov-chain approximation is considered in this paper. It is assumed that individual vaccination behavior depends on contact awareness and on local and global information about an epidemic. To better simulate the real situation, the vaccine failure rate is also taken into consideration. Our main conclusions are as follows. First, we show that if the vaccine failure rate α is zero, then the epidemic eventually dies out regardless of what the network structure is or how large the effective spreading rate and the immunization response rates of an epidemic are. Second, we show that for any positive α, there exists a positive epidemic threshold depending on an adjusted network structure, which is determined only by the structure of the original network, the positive vaccine failure rate, and the immunization response rate for contact awareness. Moreover, the epidemic threshold increases with the strength of the immunization response rate for contact awareness. Finally, if the vaccine failure rate and the immunization response rate for contact awareness are positive, then there exists a critical vaccine failure rate αc > 0 such that the disease-free equilibrium (DFE) is stable (resp., unstable) if α < αc (resp., α > αc). Numerical simulations illustrating the effectiveness of our theoretical results are also provided.
Sterilization failures in Singapore: an examination of ligation techniques and failure rates.
Cheng, M C; Wong, Y M; Rochat, R W; Ratnam, S S
1977-04-01
The University Department of Obstetrics and Gynecology, Kandang Kerbau Hospital in Singapore, initiated a study in early 1974 of failure rates for various methods of sterilization and the factors responsible for the failures. During the period January 1974 to March 1976, 51 cases of first pregnancy following ligation were discovered. Cumulative failure rates at 24 months were 0.34 per 100 women for abdominal sterilization, 1.67 for culdoscopic, 3.12 for vaginal, and 4.49 for laparoscopic procedures. Findings for 35 patients who underwent religation showed that recanalization and the establishment of a fistulous opening caused the majority of failures. Clearly, more effective methods of tubal occlusion in sterilization are needed.
Bakuza, Jared S.; Denwood, Matthew J.; Nkwengulila, Gamba
2017-01-01
Background Schistosoma mansoni is a parasite of major public health importance in developing countries, where it causes a neglected tropical disease known as intestinal schistosomiasis. However, the distribution of the parasite within many endemic regions is currently unknown, which hinders effective control. The purpose of this study was to characterize the prevalence and intensity of infection of S. mansoni in a remote area of western Tanzania. Methodology/Principal findings Stool samples were collected from 192 children and 147 adults residing in Gombe National Park and four nearby villages. Children were actively sampled in local schools, and adults were sampled passively by voluntary presentation at the local health clinics. The two datasets were therefore analysed separately. Faecal worm egg count (FWEC) data were analysed using negative binomial and zero-inflated negative binomial (ZINB) models with explanatory variables of site, sex, and age. The ZINB models indicated that a substantial proportion of the observed zero FWEC reflected a failure to detect eggs in truly infected individuals, meaning that the estimated true prevalence was much higher than the apparent prevalence as calculated based on the simple proportion of non-zero FWEC. For the passively sampled data from adults, the data were consistent with close to 100% true prevalence of infection. Both the prevalence and intensity of infection differed significantly between sites, but there were no significant associations with sex or age. Conclusions/Significance Overall, our data suggest a more widespread distribution of S. mansoni in this part of Tanzania than was previously thought. The apparent prevalence estimates substantially under-estimated the true prevalence as determined by the ZINB models, and the two types of sampling strategies also resulted in differing conclusions regarding prevalence of infection. We therefore recommend that future surveillance programmes designed to assess risk factors should use active sampling whenever possible, in order to avoid the self-selection bias associated with passive sampling. PMID:28934206
Analysis of railroad tank car releases using a generalized binomial model.
Liu, Xiang; Hong, Yili
2015-11-01
The United States is experiencing an unprecedented boom in shale oil production, leading to a dramatic growth in petroleum crude oil traffic by rail. In 2014, U.S. railroads carried over 500,000 tank carloads of petroleum crude oil, up from 9500 in 2008 (a 5300% increase). In light of continual growth in crude oil by rail, there is an urgent national need to manage this emerging risk. This need has been underscored in the wake of several recent crude oil release incidents. In contrast to highway transport, which usually involves a tank trailer, a crude oil train can carry a large number of tank cars, having the potential for a large, multiple-tank-car release incident. Previous studies exclusively assumed that railroad tank car releases in the same train accident are mutually independent, thereby estimating the number of tank cars releasing given the total number of tank cars derailed based on a binomial model. This paper specifically accounts for dependent tank car releases within a train accident. We estimate the number of tank cars releasing given the number of tank cars derailed based on a generalized binomial model. The generalized binomial model provides a significantly better description for the empirical tank car accident data through our numerical case study. This research aims to provide a new methodology and new insights regarding the further development of risk management strategies for improving railroad crude oil transportation safety. Copyright © 2015 Elsevier Ltd. All rights reserved.
Dispersion and sampling of adult Dermacentor andersoni in rangeland in Western North America.
Rochon, K; Scoles, G A; Lysyk, T J
2012-03-01
A fixed precision sampling plan was developed for off-host populations of adult Rocky Mountain wood tick, Dermacentor andersoni (Stiles) based on data collected by dragging at 13 locations in Alberta, Canada; Washington; and Oregon. In total, 222 site-date combinations were sampled. Each site-date combination was considered a sample, and each sample ranged in size from 86 to 250 10 m2 quadrats. Analysis of simulated quadrats ranging in size from 10 to 50 m2 indicated that the most precise sample unit was the 10 m2 quadrat. Samples taken when abundance < 0.04 ticks per 10 m2 were more likely to not depart significantly from statistical randomness than samples taken when abundance was greater. Data were grouped into ten abundance classes and assessed for fit to the Poisson and negative binomial distributions. The Poisson distribution fit only data in abundance classes < 0.02 ticks per 10 m2, while the negative binomial distribution fit data from all abundance classes. A negative binomial distribution with common k = 0.3742 fit data in eight of the 10 abundance classes. Both the Taylor and Iwao mean-variance relationships were fit and used to predict sample sizes for a fixed level of precision. Sample sizes predicted using the Taylor model tended to underestimate actual sample sizes, while sample sizes estimated using the Iwao model tended to overestimate actual sample sizes. Using a negative binomial with common k provided estimates of required sample sizes closest to empirically calculated sample sizes.
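The fixed-precision machinery above is compact enough to sketch: fit Taylor's power law s² = a·m^b by log-log regression, then convert to a required sample size n = a·m^(b-2)/D² for precision D (standard error as a fraction of the mean). The mean-variance pairs below are invented, not the tick data.

```python
import numpy as np

m = np.array([0.02, 0.05, 0.1, 0.3, 0.8, 2.0])     # mean ticks per 10 m^2
s2 = np.array([0.03, 0.09, 0.21, 0.75, 2.4, 7.1])  # sample variances

b, log_a = np.polyfit(np.log(m), np.log(s2), 1)    # log-log regression fit
a = np.exp(log_a)

D = 0.25                                           # target precision (SE/mean)
n = a * m ** (b - 2.0) / D**2                      # quadrats needed at each mean
for mi, ni in zip(m, n):
    print(f"mean {mi:5.2f}: need ~{ni:7.0f} quadrats")
```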
Su, Chun-Lung; Gardner, Ian A; Johnson, Wesley O
2004-07-30
The two-test two-population model, originally formulated by Hui and Walter for estimation of test accuracy and prevalence, assumes conditionally independent tests, constant accuracy across populations, and binomial sampling. The binomial assumption is incorrect if all individuals in a population (e.g. a child-care centre, a village in Africa, or a cattle herd) are sampled, or if the sample size is large relative to the population size. In this paper, we develop statistical methods for evaluating diagnostic test accuracy and prevalence estimation based on finite sample data in the absence of a gold standard. Moreover, two tests are often applied simultaneously for the purpose of obtaining a 'joint' testing strategy that has either higher overall sensitivity or specificity than either of the two tests considered singly. Sequential versions of such strategies are often applied in order to reduce the cost of testing. We thus discuss joint (simultaneous and sequential) testing strategies and inference for them. Using the developed methods, we analyse two real data sets and one simulated data set, and we compare 'hypergeometric' and 'binomial-based' inferences. Our findings indicate that the posterior standard deviations for prevalence (but not sensitivity and specificity) based on finite population sampling tend to be smaller than their counterparts for infinite population sampling. Finally, we make recommendations about how small the sample size should be relative to the population size to warrant use of the binomial model for prevalence estimation. Copyright 2004 John Wiley & Sons, Ltd.
Analysis of multiple tank car releases in train accidents.
Liu, Xiang; Liu, Chang; Hong, Yili
2017-10-01
There are annually over two million carloads of hazardous materials transported by rail in the United States. The American railroads use large blocks of tank cars to transport petroleum crude oil and other flammable liquids from production to consumption sites. Unlike roadway transport of hazardous materials, a train accident can potentially result in the derailment and release of multiple tank cars, with significant consequences. The prior literature predominantly assumes that the occurrence of multiple tank car releases in a train accident is a series of independent Bernoulli processes, and thus uses the binomial distribution to estimate the total number of tank cars releasing given the number of tank cars derailed or damaged. This paper shows that the traditional binomial model can misestimate multiple-tank-car release probability by orders of magnitude in certain circumstances, thereby significantly affecting railroad safety and risk analysis. To bridge this knowledge gap, this paper proposes a novel, alternative Correlated Binomial (CB) model that accounts for the possible correlations of multiple tank car releases in the same train. We test three distinct correlation structures in the CB model and find that they all outperform the conventional binomial model on empirical tank car accident data. The analysis shows that considering tank car release correlations results in a significantly better fit to the empirical data. Consequently, it is prudent to consider alternative modeling techniques when analyzing the probability of multiple tank car releases in railroad accidents. Copyright © 2017 Elsevier Ltd. All rights reserved.
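To see why correlation matters, compare the independent-release binomial with a beta-binomial, one simple way (not the paper's CB model) to induce positive within-train correlation while keeping the same marginal release probability; all values below are invented.

```python
import numpy as np
from scipy import stats

n_derailed, p = 15, 0.3              # cars derailed, marginal release probability
rho = 0.25                           # assumed within-train correlation
a = p * (1.0 - rho) / rho            # beta-binomial parameters matching mean p
b = (1.0 - p) * (1.0 - rho) / rho    # and intraclass correlation rho

k = np.arange(n_derailed + 1)
indep = stats.binom.pmf(k, n_derailed, p)
corr = stats.betabinom.pmf(k, n_derailed, a, b)
print("P(>= 10 cars release):",
      indep[10:].sum(), "(independent) vs", corr[10:].sum(), "(correlated)")
```

The correlated model puts far more mass on the extreme outcomes, which is exactly the tail behavior that matters for multi-car release risk.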
Payload maintenance cost model for the space telescope
NASA Technical Reports Server (NTRS)
White, W. L.
1980-01-01
An optimum maintenance cost model for the space telescope over a fifteen-year mission cycle was developed. Various documents were reviewed, and subsequent updates of failure rates and configurations were made. The reliability of the space telescope at one year, two and one half years, and five years was determined using the failure rates and configurations. The failure rates and configurations were also used in the maintenance simulation computer model, which simulates the failure patterns over the fifteen-year mission life of the space telescope. Cost algorithms associated with the maintenance options indicated by the failure patterns were developed and integrated into the model.
Infant Mortality and Income in 4 World Cities: New York, London, Paris, and Tokyo
Rodwin, Victor G.; Neuberg, Leland G.
2005-01-01
Objectives. We investigated the association between average income or deprivation and infant mortality rate across neighborhoods of 4 world cities. Methods. Using a maximum likelihood negative binomial regression model that controls for births, we analyzed data for 1988–1992 and 1993–1997. Results. In Manhattan, for both periods, we found an association (.05% significance level) between income and infant mortality. In Tokyo, for both periods, and in Paris and London for period 1, we found none (5% significance level). For period 2, the association just missed statistical significance for Paris, whereas for London it was significant (5% level). Conclusions. In stark contrast to Tokyo, Paris, and London, the association of income and infant mortality rate was strongly evident in Manhattan. PMID:15623865
Expanding Paramedicine in the Community (EPIC): study protocol for a randomized controlled trial.
Drennan, Ian R; Dainty, Katie N; Hoogeveen, Paul; Atzema, Clare L; Barrette, Norm; Hawker, Gillian; Hoch, Jeffrey S; Isaranuwatchai, Wanrudee; Philpott, Jane; Spearen, Chris; Tavares, Walter; Turner, Linda; Farrell, Melissa; Filosa, Tom; Kane, Jennifer; Kiss, Alex; Morrison, Laurie J
2014-12-02
The incidence of chronic diseases, including diabetes mellitus (DM), heart failure (HF) and chronic obstructive pulmonary disease (COPD), is on the rise. The existing health care system must evolve to meet the growing needs of patients with these chronic diseases and reduce the strain on both acute care and hospital-based health care resources. Paramedics are an allied health care resource consisting of highly trained practitioners who are comfortable working independently and in collaboration with other resources in the out-of-hospital setting. Expanding the paramedic's scope of practice to include community-based care may decrease the utilization of acute care and hospital-based health care resources by patients with chronic disease. This will be a pragmatic, randomized controlled trial comparing a community paramedic intervention to standard of care for patients with one of three chronic diseases. The objective of the trial is to determine whether community paramedics conducting regular home visits, including health assessments and evidence-based treatments, in partnership with primary care physicians and other community-based resources, will decrease the rate of hospitalization and emergency department use by patients with DM, HF and COPD. The primary outcome measure will be the rate of hospitalization at one year; it will be analysed using both Poisson regression and negative binomial regression. Secondary outcomes will include measures of health system utilization, overall health status, and cost-effectiveness of the intervention over the same time period. The results of this study will be used to inform decisions around the implementation of community paramedic programs. If the intervention is successful in preventing hospitalizations, the program could be scaled up to other regions, both nationally and internationally. The methods described in this paper will serve as a basis for future work related to this study. ClinicalTrials.gov: NCT02034045. Date: 9 January 2014.
Rate of change of heart size before congestive heart failure in dogs with mitral regurgitation.
Lord, P; Hansson, K; Kvart, C; Häggström, J
2010-04-01
The objective of the study was to examine the changes in vertebral heart scale and in left atrial and ventricular dimensions before and at the onset of congestive heart failure in cavalier King Charles spaniels with mitral regurgitation. Records and radiographs from 24 cavalier King Charles spaniels with mitral regurgitation were used. Vertebral heart scale (24 dogs), left atrial dimension, and left ventricular end-diastolic and end-systolic diameters (18 dogs), together with their rates of increase, were measured at intervals over several years up to the onset of congestive heart failure and plotted against time to onset. Dimensions and rates of change of all parameters were highest at the onset of congestive heart failure, the difference between observed and chance outcomes being highly significant on a two-tailed chi-square test (P<0.001). The left heart chambers increase in size rapidly only in the last year before the onset of congestive heart failure. Increasing left ventricular end-systolic dimension is suggestive of myocardial failure before the onset of congestive heart failure. The rate of increase of heart dimensions may be a useful indicator of impending congestive heart failure.
Heart failure and atrial fibrillation: current concepts and controversies.
Van den Berg, M. P.; Tuinenburg, A. E.; Crijns, H. J.; Van Gelder, I. C.; Gosselink, A. T.; Lie, K. I.
1997-01-01
Heart failure and atrial fibrillation are very common, particularly in the elderly. Owing to common risk factors both disorders are often present in the same patient. In addition, there is increasing evidence of a complex, reciprocal relation between heart failure and atrial fibrillation. Thus heart failure may cause atrial fibrillation, with electromechanical feedback and neurohumoral activation playing an important mediating role. In addition, atrial fibrillation may promote heart failure; in particular, when there is an uncontrolled ventricular rate, tachycardiomyopathy may develop and thereby heart failure. Eventually, a vicious circle between heart failure and atrial fibrillation may form, in which neurohumoral activation and subtle derangement of rate control are involved. Treatment should aim at unloading of the heart, adequate control of ventricular rate, and correction of neurohumoral activation. Angiotensin converting enzyme inhibitors may help to achieve these goals. Treatment should also include an attempt to restore sinus rhythm through electrical cardioversion, though appropriate timing of cardioversion is difficult. His bundle ablation may be used to achieve adequate rate control in drug refractory cases. PMID:9155607
Mirelman, Andrew J; Rose, Sherri; Khan, Jahangir Am; Ahmed, Sayem; Peters, David H; Niessen, Louis W; Trujillo, Antonio J
2016-07-01
In low-income countries, a growing proportion of the disease burden is attributable to non-communicable diseases (NCDs). There is little knowledge, however, of their impact on wealth, human capital, economic growth or household poverty. This article estimates the risk of being poor after an NCD death in the rural, low-income area of Matlab, Bangladesh. In a matched cohort study, we estimated the 2-year relative risk (RR) of being poor in Matlab households with an NCD death in 2010. Three separate measures of household economic status were used as outcomes: an asset-based index, self-rated household economic condition and total household landholding. Several estimation methods were used, including contingency tables, log-binomial regression, regression standardization and machine learning. Households with an NCD death had a large and significant risk of being poor. The unadjusted RR of being poor after a death was 1.19, 1.14 and 1.10 for the asset quintile, self-rated condition and landholding outcomes, respectively. Adjusting for household- and individual-level independent variables with log-binomial regression gave RRs of 1.19 [standard error (SE) 0.09], 1.16 (SE 0.07) and 1.14 (SE 0.06), which were identical under regression standardization (SEs: 0.09, 0.05, 0.03). Machine learning-based standardization produced slightly smaller RRs, though of the same order of magnitude. The findings show that efforts to address the burden of NCDs may also combat household poverty and provide a return beyond improved health. Future work should attempt to disentangle the mechanisms through which economic impacts from an NCD death occur. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.
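Log-binomial regression estimates relative risks directly: with a log link on a binomial outcome, exponentiated coefficients are RRs rather than odds ratios. A minimal sketch on synthetic data, assuming the standard GLM formulation; the variable names and the planted RR of 1.19 are only for illustration.

```python
# Log-binomial model: exp(coefficient) is a relative risk (synthetic data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2000
ncd_death = rng.integers(0, 2, n)                        # household had an NCD death
p_poor = 0.20 * np.where(ncd_death == 1, 1.19, 1.0)      # true RR of 1.19
poor = rng.binomial(1, p_poor)

X = sm.add_constant(ncd_death)
fit = sm.GLM(poor, X,
             family=sm.families.Binomial(link=sm.families.links.Log())).fit()
print(np.exp(fit.params))                                # second entry ~ 1.19
```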
Polcicová, Gabriela; Tino, Peter
2004-01-01
We introduce topographic versions of two latent class models (LCM) for collaborative filtering. Latent classes are topologically organized on a square grid. Topographic organization of latent classes makes orientation in the rating/preference patterns captured by the latent classes easier and more systematic. The variation in film rating patterns is modelled by multinomial and binomial distributions with varying independence assumptions. In the first stage of topographic LCM construction, self-organizing maps with a neural field organized according to the LCM topology are employed. We apply our system to a large collection of user ratings for films. The system can provide useful visualization plots unveiling user preference patterns buried in the data, without losing its potential to be a good recommender model. It appears that the multinomial distribution is most adequate if the model is regularized by tight grid topologies. Since we deal with probabilistic models of the data, we can readily use tools from probability and information theory to interpret and visualize the information extracted by our system.
Effect of Brazil's conditional cash transfer programme on tuberculosis incidence.
Nery, J S; Rodrigues, L C; Rasella, D; Aquino, R; Barreira, D; Torrens, A W; Boccia, D; Penna, G O; Penna, M L F; Barreto, M L; Pereira, S M
2017-07-01
To evaluate the impact of the Brazilian cash transfer programme (Bolsa Família Programme, BFP) on tuberculosis (TB) incidence in Brazil from 2004 to 2012. We studied tuberculosis surveillance data using a combination of an ecological multiple-group and time-trend design covering 2458 Brazilian municipalities. The main independent variable was BFP coverage and the outcome was the TB incidence rate. All study variables were obtained from national databases. We used fixed-effects negative binomial models for panel data adjusted for selected covariates and a variable representing time. After controlling for covariates, TB incidence rates were significantly reduced in municipalities with high BFP coverage compared with those with low and intermediate coverage (in a model with a time variable incidence rate ratio = 0.96, 95%CI 0.93-0.99). This was the first evidence of a statistically significant association between the increase in cash transfer programme coverage and a reduction in TB incidence rate. Our findings provide support for social protection interventions for tackling TB worldwide.
Taxonomy of the order Mononegavirales: update 2017.
Amarasinghe, Gaya K; Bào, Yīmíng; Basler, Christopher F; Bavari, Sina; Beer, Martin; Bejerman, Nicolás; Blasdell, Kim R; Bochnowski, Alisa; Briese, Thomas; Bukreyev, Alexander; Calisher, Charles H; Chandran, Kartik; Collins, Peter L; Dietzgen, Ralf G; Dolnik, Olga; Dürrwald, Ralf; Dye, John M; Easton, Andrew J; Ebihara, Hideki; Fang, Qi; Formenty, Pierre; Fouchier, Ron A M; Ghedin, Elodie; Harding, Robert M; Hewson, Roger; Higgins, Colleen M; Hong, Jian; Horie, Masayuki; James, Anthony P; Jiāng, Dàohóng; Kobinger, Gary P; Kondo, Hideki; Kurath, Gael; Lamb, Robert A; Lee, Benhur; Leroy, Eric M; Li, Ming; Maisner, Andrea; Mühlberger, Elke; Netesov, Sergey V; Nowotny, Norbert; Patterson, Jean L; Payne, Susan L; Paweska, Janusz T; Pearson, Michael N; Randall, Rick E; Revill, Peter A; Rima, Bertus K; Rota, Paul; Rubbenstroth, Dennis; Schwemmle, Martin; Smither, Sophie J; Song, Qisheng; Stone, David M; Takada, Ayato; Terregino, Calogero; Tesh, Robert B; Tomonaga, Keizo; Tordo, Noël; Towner, Jonathan S; Vasilakis, Nikos; Volchkov, Viktor E; Wahl-Jensen, Victoria; Walker, Peter J; Wang, Beibei; Wang, David; Wang, Fei; Wang, Lin-Fa; Werren, John H; Whitfield, Anna E; Yan, Zhichao; Ye, Gongyin; Kuhn, Jens H
2017-08-01
In 2017, the order Mononegavirales was expanded by the inclusion of a total of 69 novel species. Five new rhabdovirus genera and one new nyamivirus genus were established to harbor 41 of these species, whereas the remaining new species were assigned to already established genera. Furthermore, non-Latinized binomial species names replaced all paramyxovirus and pneumovirus species names, thereby accomplishing application of binomial species names throughout the entire order. This article presents the updated taxonomy of the order Mononegavirales as now accepted by the International Committee on Taxonomy of Viruses (ICTV).
Categorical Data Analysis Using a Skewed Weibull Regression Model
NASA Astrophysics Data System (ADS)
Caron, Renault; Sinha, Debajyoti; Dey, Dipak; Polpo, Adriano
2018-03-01
In this paper, we present a Weibull link (skewed) model for categorical response data arising from binomial as well as multinomial models. We show that, for such types of categorical data, the most commonly used models (logit, probit and complementary log-log) can be obtained as limiting cases. We further compare the proposed model with some other asymmetrical models. Bayesian as well as frequentist estimation procedures for binomial and multinomial data responses are presented in detail. Two data sets are analyzed to demonstrate the efficiency of the proposed model.
Public health consequences of macrolide use in food animals: a deterministic risk assessment.
Hurd, H Scott; Doores, Stephanie; Hayes, Dermot; Mathew, Alan; Maurer, John; Silley, Peter; Singer, Randall S; Jones, Ronald N
2004-05-01
The potential impact on human health from antibiotic-resistant bacteria selected by use of antibiotics in food animals has resulted in many reports and recommended actions. The U.S. Food and Drug Administration Center for Veterinary Medicine has issued Guidance Document 152, which advises veterinary drug sponsors of one potential process for conducting a qualitative risk assessment of drug use in food animals. Using this guideline, we developed a deterministic model to assess the risk from two macrolide antibiotics, tylosin and tilmicosin. The scope of modeling included all label-claim uses of both macrolides in poultry, swine, and beef cattle. The Guidance Document was followed to define the hazard, which is illness (i) caused by foodborne bacteria with a resistance determinant, (ii) attributed to a specified animal-derived meat commodity, and (iii) treated with a human use drug of the same class. Risk was defined as the probability of this hazard combined with the consequence of treatment failure due to resistant Campylobacter spp. or Enterococcus faecium. A binomial event model was applied to estimate the annual risk for the U.S. general population. Parameters were derived from industry drug use surveys, scientific literature, medical guidelines, and government documents. This unique farm-to-patient risk assessment demonstrated that use of tylosin and tilmicosin in food animals presents a very low risk of human treatment failure, with an approximate annual probability of less than 1 in 10 million for Campylobacter-derived risk and approximately 1 in 3 billion for E. faecium-derived risk.
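The core arithmetic of a binomial event model is simple: if each exposure carries a small independent probability p of the full hazard chain, the annual probability of at least one event across n exposures is 1 - (1-p)^n. A toy version with invented numbers, chosen only to land near the order of magnitude the abstract reports:

```python
# Binomial event model in miniature: annual risk of at least one hazard event.
p = 1e-14          # per-serving probability of the full hazard chain (hypothetical)
n = 5_000_000      # servings consumed per year (hypothetical)

risk = 1 - (1 - p) ** n
print(f"annual risk of >=1 event: {risk:.2e}")   # ~ n*p when p is tiny
```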
Model analysis of the link between interest rates and crashes
NASA Astrophysics Data System (ADS)
Broga, Kristijonas M.; Viegas, Eduardo; Jensen, Henrik Jeldtoft
2016-09-01
We analyse the effect of distinct levels of interest rates on the stability of the financial network under our modelling framework. We demonstrate that banking failures are likely to emerge early on under sustained high interest rates, and at a much later stage, with higher probability, under a sustained low interest rate scenario. Moreover, we demonstrate that those bank failures are of a different nature: high interest rates tend to result in significantly more bankruptcies associated with credit losses, whereas lack of liquidity tends to be the primary cause of failures under lower rates.
The influence of mandibular skeletal characteristics on inferior alveolar nerve block anesthesia.
You, Tae Min; Kim, Kee-Deog; Huh, Jisun; Woo, Eun-Jung; Park, Wonse
2015-09-01
The inferior alveolar nerve block (IANB) is the most common anesthetic technique in dentistry; however, its success rate is low. The purpose of this study was to determine the correlation between IANB failure and mandibular skeletal characteristics. In total, 693 cases of lower third molar extraction (n = 575 patients) were examined in this study. The ratio of the condylar and coronoid distances from the mandibular foramen (condyle-coronoid ratio [CC ratio]) was calculated, and the mandibular skeleton was then classified as normal, retrognathic, or prognathic. The correlation between IANB failure and sex, treatment side, and the CC ratio was assessed. The IANB failure rates for normal, retrognathic, and prognathic mandibles were 7.3%, 14.5%, and 9.5%, respectively, and the failure rate was highest among those with a CC ratio < 0.8 (severely retrognathic mandibles). The failure rate was significantly higher in the retrognathic group than in the normal group (P = 0.019), with no statistically significant difference between the other two groups. IANB failure could be attributable, in part, to the skeletal characteristics of the mandible.
A quantitative model of honey bee colony population dynamics.
Khoury, David S; Myerscough, Mary R; Barron, Andrew B
2011-04-18
Since 2006 the rate of honey bee colony failure has increased significantly. As an aid to testing hypotheses for the causes of colony failure, we have developed a compartment model of honey bee colony population dynamics to explore the impact of different death rates of forager bees on colony growth and development. The model predicts a critical threshold forager death rate beneath which colonies regulate a stable population size. If death rates are sustained above this threshold, rapid population decline is predicted and colony failure is inevitable. The model also predicts that high forager death rates draw hive bees into the foraging population at much younger ages than normal, which acts to accelerate colony failure. The model suggests that colony failure can be understood in terms of observed principles of honey bee population dynamics, and provides a theoretical framework for experimental investigation of the problem.
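The paper's model is a system of differential equations for hive bees and foragers. The sketch below is a simplified caricature of that kind of compartment model, not the authors' exact equations, and all parameter values are illustrative.

```python
# Two-compartment colony caricature: eclosion feeds hive bees, hive bees are
# recruited to foraging (with social inhibition), and foragers die at rate m.
import numpy as np
from scipy.integrate import odeint

L, w = 2000.0, 27000.0                 # max eclosion rate, brood saturation constant
alpha, sigma, m = 0.25, 0.75, 0.24     # recruitment, social inhibition, forager death rate

def colony(y, t):
    H, F = y                               # hive bees, foragers
    eclosion = L * (H + F) / (w + H + F)   # new adult bees
    recruit = H * max(alpha - sigma * F / (H + F), 0.0)
    return [eclosion - recruit, recruit - m * F]

t = np.linspace(0, 200, 400)
H, F = odeint(colony, [12000.0, 4000.0], t).T
print("final colony size:", H[-1] + F[-1])   # collapses once m exceeds a threshold
```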
Pullicino, Patrick M.; Thompson, John L.P.; Sacco, Ralph L.; Sanford, Alexandra R.; Qian, Min; Teerlink, John R.; Haddad, Haissam; Diek, Monika; Freudenberger, Ronald S.; Labovitz, Arthur J.; Di Tullio, Marco R.; Lok, Dirk J.; Ponikowski, Piotr; Anker, Stefan D.; Graham, Susan; Mann, Douglas L.; Mohr, J.P.; Homma, Shunichi
2014-01-01
Background The Warfarin versus Aspirin in Reduced Cardiac Ejection Fraction trial found no difference between warfarin and aspirin in patients with low ejection fraction in sinus rhythm for the primary outcome: first to occur of 84 incident ischemic strokes (IIS), 7 intracerebral hemorrhages or 531 deaths. Prespecified secondary analysis showed a 48% hazard ratio reduction (p = 0.005) for warfarin in IIS. Cardioembolism is likely the main pathogenesis of stroke in heart failure. We examined the IIS benefit for warfarin in more detail in post hoc secondary analyses. Methods We subtyped IIS into definite, possible and noncardioembolic using the Stroke Prevention in Atrial Fibrillation method. Statistical tests, stratified by prior ischemic stroke or transient ischemic attack, were the conditional binomial for independent Poisson variables for rates, the Cochran-Mantel-Haenszel test for stroke subtype and the van Elteren test for modified Rankin Score (mRS) and National Institute of Health Stroke Scale (NIHSS) distributions, and an exact test for proportions. Results Twenty-nine of 1,142 warfarin and 55 of 1,163 aspirin patients had IIS. The warfarin IIS rate (0.727/100 patient-years, PY) was lower than for aspirin (1.36/100 PY, p = 0.003). Definite cardioembolic IIS was less frequent on warfarin than aspirin (0.22 vs. 0.55/100 PY, p = 0.012). Possible cardioembolic IIS tended to be less frequent on warfarin than aspirin (0.37 vs. 0.67/100 PY, p = 0.063) but noncardioembolic IIS showed no difference: 5 (0.12/100 PY) versus 6 (0.15/100 PY, p = 0.768). Among patients experiencing IIS, there were no differences by treatment arm in fatal IIS, baseline mRS, mRS 90 days after IIS, and change from baseline to post-IIS mRS. The warfarin arm showed a trend to a lower proportion of severe nonfatal IIS [mRS 3–5; 3/23 (13.0%) vs. 16/48 (33.3%), p = 0.086]. There was no difference in NIHSS at the time of stroke (p = 0.825) or in post-IIS mRS (p = 0.948) between cardioembolic, possible cardioembolic and noncardioembolic stroke including both warfarin and aspirin groups. Conclusions The observed benefits in the reduction of IIS for warfarin compared to aspirin are most significant for cardioembolic IIS among patients with low ejection fraction in sinus rhythm. This is supported by trends to lower frequencies of severe IIS and possible cardioembolic IIS in patients on warfarin compared to aspirin. PMID:23921215
Risk factors for eye bank preparation failure of Descemet membrane endothelial keratoplasty tissue.
Vianna, Lucas M M; Stoeger, Christopher G; Galloway, Joshua D; Terry, Mark; Cope, Leslie; Belfort, Rubens; Jun, Albert S
2015-05-01
To assess the results of a single eye bank preparing a high volume of Descemet membrane endothelial keratoplasty (DMEK) tissues using multiple technicians, to provide an overview of the experience, and to identify possible risk factors for DMEK preparation failure. Cross-sectional study. Setting: Lions VisionGift and Wilmer Eye Institute at Johns Hopkins Hospital. All 563 corneal tissues processed by technicians at Lions VisionGift for DMEK between October 2011 and May 2014 inclusive. Tissues were divided into 2 groups: DMEK preparation success and DMEK preparation failure. We compared donor characteristics, including past medical history. The overall tissue preparation failure rate was 5.2%. Univariate analysis showed diabetes mellitus (P = .000028) and its duration (P = .023), hypertension (P = .021), and hyperlipidemia or obesity (P = .0004) were more common in the failure group. Multivariate analysis showed diabetes mellitus (P = .0001) and hyperlipidemia or obesity (P = .0142) were more common in the failure group. Eliminating tissues from donors with either diabetes or hyperlipidemia or obesity reduced the failure rate from 5.2% to 2.2%. Trends toward lower failure rates with increased technician experience were also found. Our work showed that tissues from donors with diabetes mellitus (especially with longer disease duration) and hyperlipidemia or obesity were associated with higher failure rates in DMEK preparation, and that eliminating tissues from such donors reduced the failure rate. In addition, our data may provide useful initial guidelines and benchmark values for eye banks seeking to establish and maintain DMEK programs. Copyright © 2015 Elsevier Inc. All rights reserved.
A Mixed Methods Explanatory Study of the Failure/Drop Rate for Freshman STEM Calculus Students
ERIC Educational Resources Information Center
Worthley, Mary
2013-01-01
In a national context of high failure rates in freshman calculus courses, the purpose of this study was to understand who is struggling, and why. High failure rates are especially alarming given a local environment where students have access to a variety of academic, and personal, assistance. The sample consists of students at Colorado State…
Training of residents in laparoscopic tubal sterilization: Long-term failure rates
Rackow, Beth W.; Rhee, Maria C.; Taylor, Hugh S.
2011-01-01
Objectives Laparoscopic tubal sterilization with bipolar coagulation is a common and effective method of contraception, and a procedure much used to teach laparoscopic surgical skills to Obstetrics and Gynaecology residents (trainees), but it has an inherent risk of failure. This study investigated the long-term failure rate of this procedure when performed by Obstetrics and Gynaecology residents on women treated in their teaching clinics. Methods From 1991 to 1994, Obstetrics and Gynaecology residents carried out 386 laparoscopic tubal sterilizations with bipolar coagulation at Yale-New Haven Hospital. Six to nine years after the procedure, the women concerned were contacted by telephone and data were collected about sterilization failure. Results Two failures of laparoscopic tubal sterilization with bipolar coagulation were identified: an ectopic pregnancy and a spontaneous abortion. For this time period, the long-term sterilization failure rate was 1.9% (0–4.4%). Conclusions The long-term sterilization failure rate for laparoscopic tubal sterilization with bipolar coagulation performed by residents is comparable to the results of prior studies. These findings can be used to properly counsel women at a teaching clinic about the risks of sterilization failure with this procedure, and attest to the adequacy of residents' training and supervision. PMID:18465476
An analysis of the value of spermicides in contraception.
1979-11-01
Development of the so-called modern methods of contraception has somewhat eclipsed interest in traditional methods. However, spermicides are still important for many couples and their use appears to be increasing. A brief history of the use of and research into spermicidal contraceptives is presented. The limitations of spermicides are the necessity for use at the time of intercourse and their high failure rate. Estimates of the failure rates of spermicides have ranged from 0.3 pregnancies per 100 woman-years of use to nearly 40, depending on the product used and the population tested. Just as their use depends on various social factors, so does their failure rate. Characteristics of the user determine failure rates. Motivation is important in lowering failure rates, as are education, the intracouple relationship, and previous experience with spermicides. Method failure is also caused by defects in the product, either in the active ingredient of the spermicide or in the base carrier. The main advantage of spermicidal contraception is its safety. Limited research is currently being conducted on spermicides. Areas for improvement in existing spermicides and areas for possible innovation are mentioned.
Zipkin, Elise F.; Leirness, Jeffery B.; Kinlan, Brian P.; O'Connell, Allan F.; Silverman, Emily D.
2014-01-01
Determining appropriate statistical distributions for modeling animal count data is important for accurate estimation of abundance, distribution, and trends. In the case of sea ducks along the U.S. Atlantic coast, managers want to estimate local and regional abundance to detect and track population declines, to define areas of high and low use, and to predict the impact of future habitat change on populations. In this paper, we used a modified marked point process to model survey data that recorded flock sizes of Common eiders, Long-tailed ducks, and Black, Surf, and White-winged scoters. The data come from an experimental aerial survey, conducted by the United States Fish & Wildlife Service (USFWS) Division of Migratory Bird Management, during which east-west transects were flown along the Atlantic Coast from Maine to Florida during the winters of 2009–2011. To model the number of flocks per transect (the points), we compared the fit of four statistical distributions (zero-inflated Poisson, zero-inflated geometric, zero-inflated negative binomial and negative binomial) to data on the number of species-specific sea duck flocks that were recorded for each transect flown. To model the flock sizes (the marks), we compared the fit of flock size data for each species to seven statistical distributions: positive Poisson, positive negative binomial, positive geometric, logarithmic, discretized lognormal, zeta and Yule–Simon. Akaike’s Information Criterion and Vuong’s closeness tests indicated that the negative binomial and discretized lognormal were the best distributions for all species for the points and marks, respectively. These findings have important implications for estimating sea duck abundances as the discretized lognormal is a more skewed distribution than the Poisson and negative binomial, which are frequently used to model avian counts; the lognormal is also less heavy-tailed than the power law distributions (e.g., zeta and Yule–Simon), which are becoming increasingly popular for group size modeling. Choosing appropriate statistical distributions for modeling flock size data is fundamental to accurately estimating population summaries, determining required survey effort, and assessing and propagating uncertainty through decision-making processes.
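The model-selection step, fitting candidate count distributions by maximum likelihood and ranking them by AIC, can be sketched as follows on synthetic overdispersed counts. This is a minimal two-candidate version, not the authors' code.

```python
# Fit Poisson and negative binomial to count data by MLE and compare AIC.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(2)
counts = rng.negative_binomial(1.2, 0.15, 500)   # synthetic overdispersed counts

def nb_nll(params):
    r, p = params
    return -stats.nbinom.logpmf(counts, r, p).sum()

pois_lam = counts.mean()                         # Poisson MLE is the sample mean
pois_aic = 2 * 1 + 2 * (-stats.poisson.logpmf(counts, pois_lam).sum())

res = optimize.minimize(nb_nll, x0=[1.0, 0.5],
                        bounds=[(1e-6, None), (1e-6, 1 - 1e-6)])
nb_aic = 2 * 2 + 2 * res.fun
print(f"Poisson AIC {pois_aic:.1f}  vs  NB AIC {nb_aic:.1f}")   # NB should win here
```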
Holsclaw, Tracy; Hallgren, Kevin A; Steyvers, Mark; Smyth, Padhraic; Atkins, David C
2015-12-01
Behavioral coding is increasingly used for studying mechanisms of change in psychosocial treatments for substance use disorders (SUDs). However, behavioral coding data typically include features that can be problematic in regression analyses, including measurement error in independent variables, non-normal distributions of count outcome variables, and conflation of predictor and outcome variables with third variables, such as session length. Methodological research in econometrics has shown that these issues can lead to biased parameter estimates, inaccurate standard errors, and increased Type I and Type II error rates, yet these statistical issues are not widely known within SUD treatment research, or more generally, within psychotherapy coding research. Using minimally technical language intended for a broad audience of SUD treatment researchers, the present paper illustrates the nature in which these data issues are problematic. We draw on real-world data and simulation-based examples to illustrate how these data features can bias estimation of parameters and interpretation of models. A weighted negative binomial regression is introduced as an alternative to ordinary linear regression that appropriately addresses the data characteristics common to SUD treatment behavioral coding data. We conclude by demonstrating how to use and interpret these models with data from a study of motivational interviewing. SPSS and R syntax for weighted negative binomial regression models is included in online supplemental materials. (c) 2016 APA, all rights reserved.
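One plausible way to set up such a model, with session length as exposure and per-session weights, is shown below. This is a sketch under assumed specifications on synthetic data, not the paper's supplemental syntax (which is in SPSS and R).

```python
# Negative binomial regression for coded counts: session length as exposure,
# reliability weights on sessions (illustrative names and values).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 300
minutes = rng.uniform(20, 90, n)               # session length
therapist_mi = rng.normal(0, 1, n)             # predictor: MI skill score
mu = minutes * np.exp(-2.0 + 0.4 * therapist_mi)
change_talk = rng.negative_binomial(2, 2 / (2 + mu))
weights = rng.uniform(0.5, 1.0, n)             # e.g. coder-reliability weights

X = sm.add_constant(therapist_mi)
fit = sm.GLM(change_talk, X, family=sm.families.NegativeBinomial(alpha=0.5),
             exposure=minutes, var_weights=weights).fit()
print(fit.params)
```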
Xiao, Chuan-Le; Chen, Xiao-Zhou; Du, Yang-Li; Sun, Xuesong; Zhang, Gong; He, Qing-Yu
2013-01-04
Mass spectrometry has become one of the most important technologies in proteomic analysis. Tandem mass spectrometry (LC-MS/MS) is a major tool for the analysis of peptide mixtures from protein samples. The key step of MS data processing is the identification of peptides from experimental spectra by searching public sequence databases. Although a number of algorithms to identify peptides from MS/MS data have been already proposed, e.g. Sequest, OMSSA, X!Tandem, Mascot, etc., they are mainly based on statistical models considering only peak-matches between experimental and theoretical spectra, but not peak intensity information. Moreover, different algorithms gave different results from the same MS data, implying their probable incompleteness and questionable reproducibility. We developed a novel peptide identification algorithm, ProVerB, based on a binomial probability distribution model of protein tandem mass spectrometry combined with a new scoring function, making full use of peak intensity information and, thus, enhancing the ability of identification. Compared with Mascot, Sequest, and SQID, ProVerB identified significantly more peptides from LC-MS/MS data sets than the current algorithms at 1% False Discovery Rate (FDR) and provided more confident peptide identifications. ProVerB is also compatible with various platforms and experimental data sets, showing its robustness and versatility. The open-source program ProVerB is available at http://bioinformatics.jnu.edu.cn/software/proverb/ .
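The binomial idea at the heart of such scoring can be shown in a few lines: if a random spectrum matches any theoretical fragment with probability p, the chance of k or more matches among n fragments is a binomial tail. This is a generic sketch of the technique, not ProVerB's actual scoring function; all numbers are illustrative.

```python
# Binomial tail score for peptide-spectrum peak matching.
from scipy.stats import binom
import math

n = 40        # theoretical fragment ions for a candidate peptide
k = 18        # experimental peaks matched within tolerance
p = 0.05      # chance-match probability per peak (depends on bin width)

p_value = binom.sf(k - 1, n, p)          # P(X >= k) under the chance model
score = -math.log10(p_value)
print(f"match p-value {p_value:.2e}, score {score:.1f}")
```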
Influence of enamel preservation on failure rates of porcelain laminate veneers.
Gurel, Galip; Sesma, Newton; Calamita, Marcelo A; Coachman, Christian; Morimoto, Susana
2013-01-01
The purpose of this study was to evaluate the failure rates of porcelain laminate veneers (PLVs) and the influence of clinical parameters on these rates in a retrospective survey of up to 12 years. Five hundred eighty laminate veneers were bonded in 66 patients. The following parameters were analyzed: type of preparation (depth and margin), crown lengthening, presence of restoration, diastema, crowding, discoloration, abrasion, and attrition. Survival was analyzed using the Kaplan-Meier method. Cox regression modeling was used to determine which factors would predict PLV failure. Forty-two veneers (7.2%) failed in 23 patients, and an overall cumulative survival rate of 86% was observed. A statistically significant association was noted between failure and the limits of the prepared tooth surface (margin and depth). The most frequent failure type was fracture (n = 20). The results revealed no significant influence of crown lengthening apically, presence of restoration, diastema, discoloration, abrasion, or attrition on failure rates. Multivariable analysis (Cox regression model) also showed that PLVs bonded to dentin and teeth with preparation margins in dentin were approximately 10 times more likely to fail than PLVs bonded to enamel. Moreover, coronal crown lengthening increased the risk of PLV failure by 2.3 times. A survival rate of 99% was observed for veneers with preparations confined to enamel and 94% for veneers with enamel only at the margins. Laminate veneers have high survival rates when bonded to enamel and provide a safe and predictable treatment option that preserves tooth structure.
Sileshi, G
2006-10-01
Researchers and regulatory agencies often make statistical inferences from insect count data using modelling approaches that assume homogeneous variance. Such models do not allow for formal appraisal of variability which in its different forms is the subject of interest in ecology. Therefore, the objectives of this paper were to (i) compare models suitable for handling variance heterogeneity and (ii) select optimal models to ensure valid statistical inferences from insect count data. The log-normal, standard Poisson, Poisson corrected for overdispersion, zero-inflated Poisson, the negative binomial distribution and zero-inflated negative binomial models were compared using six count datasets on foliage-dwelling insects and five families of soil-dwelling insects. Akaike's and Schwarz Bayesian information criteria were used for comparing the various models. Over 50% of the counts were zeros even in locally abundant species such as Ootheca bennigseni Weise, Mesoplatys ochroptera Stål and Diaecoderus spp. The Poisson model after correction for overdispersion and the standard negative binomial distribution model provided better description of the probability distribution of seven out of the 11 insects than the log-normal, standard Poisson, zero-inflated Poisson or zero-inflated negative binomial models. It is concluded that excess zeros and variance heterogeneity are common data phenomena in insect counts. If not properly modelled, these properties can invalidate the normal distribution assumptions resulting in biased estimation of ecological effects and jeopardizing the integrity of the scientific inferences. Therefore, it is recommended that statistical models appropriate for handling these data properties be selected using objective criteria to ensure efficient statistical inference.
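The kind of comparison the paper reports, a standard count model against a zero-inflated alternative judged by information criteria, can be sketched with statsmodels on synthetic zero-heavy counts (intercept-only models, illustrative data):

```python
# Poisson vs zero-inflated Poisson on counts with excess zeros, compared by AIC.
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

rng = np.random.default_rng(6)
n = 500
lam = rng.gamma(0.5, 4.0, n)                      # heterogeneity across plots
y = rng.poisson(lam) * rng.binomial(1, 0.6, n)    # plus structural zeros

X = np.ones((n, 1))                               # intercept-only models
pois = sm.Poisson(y, X).fit(disp=0)
zip_ = ZeroInflatedPoisson(y, X).fit(disp=0)
print("Poisson AIC:", pois.aic, " ZIP AIC:", zip_.aic)
```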
Wang, Xuezhi; Huang, Xiaotao; Suvorova, Sofia; Moran, Bill
2018-01-01
Golay complementary waveforms can, in theory, yield radar returns of high range resolution with essentially zero sidelobes. In practice, when deployed conventionally, while high signal-to-noise ratios can be achieved for static target detection, significant range sidelobes are generated by target returns of nonzero Doppler, causing unreliable detection. We consider signal processing techniques using Golay complementary waveforms to improve radar detection performance in scenarios involving multiple nonzero Doppler targets. A signal processing procedure based on an existing, so-called Binomial Design algorithm, which alters the transmission order of Golay complementary waveforms and weights the returns, is proposed in an attempt to achieve enhanced illumination performance. The procedure applies one of three proposed waveform transmission ordering algorithms, followed by a pointwise nonlinear processor combining the outputs of the Binomial Design algorithm and one of the ordering algorithms. The computational complexities of the Binomial Design algorithm and the three ordering algorithms are compared, and a statistical analysis of the performance of the pointwise nonlinear processing is given. Estimation of the areas in the Delay–Doppler map occupied by significant range sidelobes for given targets is also discussed. Numerical simulations comparing the performance of the Binomial Design algorithm and the three ordering algorithms are presented for both fixed and randomized target locations. The simulation results demonstrate that the proposed signal processing procedure achieves better detection performance, with lower sidelobes and higher Doppler resolution, in the presence of multiple nonzero Doppler targets than existing methods. PMID:29324708
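The zero-sidelobe property that motivates the whole construction is easy to verify numerically: the autocorrelations of a Golay complementary pair sum to a delta function. A quick check using the standard concatenation recursion (this demonstrates only the defining property, not the paper's Binomial Design or ordering algorithms):

```python
# Build a length-64 Golay pair and confirm the summed autocorrelation is a delta.
import numpy as np

a, b = np.array([1, 1]), np.array([1, -1])
for _ in range(5):                                  # double the length 5 times
    a, b = np.concatenate([a, b]), np.concatenate([a, -b])

acorr = lambda x: np.correlate(x, x, mode="full")
total = acorr(a) + acorr(b)
# 2N at zero lag, exactly zero at every other lag:
print(total[len(a) - 1], np.abs(total[:len(a) - 1]).max())
```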
Some considerations for excess zeroes in substance abuse research.
Bandyopadhyay, Dipankar; DeSantis, Stacia M; Korte, Jeffrey E; Brady, Kathleen T
2011-09-01
Count data collected in substance abuse research often come with an excess of "zeroes," which are typically handled using zero-inflated regression models. However, there is a need to consider the design aspects of those studies before using such a statistical model to ascertain the sources of zeroes. We sought to illustrate hurdle models as alternatives to zero-inflated models to validate a two-stage decision-making process in situations of "excess zeroes." We use data from a study of 45 cocaine-dependent subjects where the primary scientific question was to evaluate whether study participation influences drug-seeking behavior. The outcome, the frequency (count) of cocaine-use days per week, is bounded (ranging from 0 to 7). We fit and compare binomial, Poisson, negative binomial, and the hurdle versions of these models to study the effect of gender, age, time, and study participation on cocaine use. The hurdle binomial model provides the best fit. Gender and time are not predictive of use. Higher odds of use versus no use are associated with age; however, once use is experienced, the odds of further use decrease with increasing age. Participation was associated with higher odds of no cocaine use; once there is use, participation reduced the odds of further use. Age and study participation are significantly predictive of cocaine-use behavior. The two-stage decision process as modeled by a hurdle binomial model (appropriate for bounded count data with excess zeroes) provides interesting insights into the study of covariate effects on count responses of substance use, when all enrolled subjects are believed to be "at risk" of use.
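A minimal sketch of the two-stage hurdle idea for a bounded count (days of use out of 7): a logistic model for crossing the hurdle, then a zero-truncated binomial for the positive counts. Data, names, and coefficients below are synthetic, not the study's.

```python
# Hurdle binomial in two stages: logistic for any use, truncated binomial given use.
import numpy as np
from scipy import stats, optimize
import statsmodels.api as sm

rng = np.random.default_rng(4)
n, days = 400, 7
age = rng.uniform(18, 60, n)
sigmoid = lambda x: 1 / (1 + np.exp(-x))

use_any = rng.binomial(1, sigmoid(-2.0 + 0.05 * age))               # stage 1 truth
y = np.where(use_any == 1, rng.binomial(days, sigmoid(1.0 - 0.04 * age)), 0)
y[(use_any == 1) & (y == 0)] = 1     # crude shortcut: keep users' counts positive

# Stage 1: logistic regression for crossing the hurdle (any use vs none).
hurdle = sm.Logit(use_any, sm.add_constant(age)).fit(disp=0)

# Stage 2: zero-truncated binomial likelihood, fit on users only.
ya, xa = y[y > 0], age[y > 0]
def nll(beta):
    p = sigmoid(beta[0] + beta[1] * xa)
    return -(stats.binom.logpmf(ya, days, p) - np.log1p(-(1 - p) ** days)).sum()

stage2 = optimize.minimize(nll, x0=np.zeros(2))
print(hurdle.params, stage2.x)
```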
Vaughn, Josh; Cohen, Eric; Vopat, Bryan G; Kane, Patrick; Abbood, Emily; Born, Christopher
2015-05-01
Hip fractures are becoming increasingly common, resulting in significant morbidity and mortality and rising healthcare costs. Both short and long cephalomedullary devices are currently employed to treat intertrochanteric hip fractures. However, which device is optimal continues to be debated, as each implant has unique characteristics and theoretical advantages. This study sought to identify rates of complications associated with both long and short cephalomedullary nails for the treatment of intertrochanteric hip fractures. We retrospectively reviewed charts from 2006 to 2011 and identified 256 patients with AO class 31.1-32.3 fractures. Sixty were treated with short nails and 196 with long nails. Radiographs and charts were then analysed for failures and hardware complications. Catastrophic failure and hardware complication rates were not statistically different between short and long cephalomedullary nails. The overall catastrophic failure rate was 3.1%; there was a 5% failure rate in the short-nail group compared with a 2.6% failure rate in the long-nail group (p = 0.191). There was a 3.33% secondary femur fracture rate in the short-nail group, compared with none in the long-nail cohort (p = 0.054). The rate of proximal fixation failure was 1.67% in the short-nail group and 2.0% in the long-nail group (p = 0.406). Our data suggest comparable outcomes, as measured by similar catastrophic failure rates, between short and long cephalomedullary nails for intertrochanteric femur fractures. However, there was an increased risk of secondary femur fracture with short cephalomedullary nails compared with long nails that approached statistical significance.
Goovaerts, Pierre
2009-01-01
This paper presents a geostatistical approach to incorporate individual-level data (e.g. patient residences) and area-based data (e.g. rates recorded at census tract level) into the mapping of late-stage cancer incidence, with an application to breast cancer in three Michigan counties. Spatial trends in cancer incidence are first estimated from census data using area-to-point binomial kriging. This prior model is then updated using indicator kriging and individual-level data. Simulation studies demonstrate the benefits of this two-step approach over methods (kernel density estimation and indicator kriging) that process only residence data.
The Relationship Between Gun Ownership and Firearm Homicide Rates in the United States, 1981–2010
Ross, Craig S.
2013-01-01
Objectives. We examined the relationship between levels of household firearm ownership, as measured directly and by a proxy—the percentage of suicides committed with a firearm—and age-adjusted firearm homicide rates at the state level. Methods. We conducted a negative binomial regression analysis of panel data from the Centers for Disease Control and Prevention’s Web-Based Injury Statistics Query and Reporting Systems database on gun ownership and firearm homicide rates across all 50 states during 1981 to 2010. We determined fixed effects for year, accounted for clustering within states with generalized estimating equations, and controlled for potential state-level confounders. Results. Gun ownership was a significant predictor of firearm homicide rates (incidence rate ratio = 1.009; 95% confidence interval = 1.004, 1.014). This model indicated that for each percentage point increase in gun ownership, the firearm homicide rate increased by 0.9%. Conclusions. We observed a robust correlation between higher levels of gun ownership and higher firearm homicide rates. Although we could not determine causation, we found that states with higher rates of gun ownership had disproportionately large numbers of deaths from firearm-related homicides. PMID:24028252
Ramirez, Marizen; Bedford, Ronald; Wu, Hongqian; Harland, Karisa; Cavanaugh, Joseph E; Peek-Asa, Corinne
2016-01-01
Objective To evaluate the effectiveness of roadway policies for lighting and marking of farm equipment in reducing crashes in Illinois, Iowa, Kansas, Minnesota, Missouri, Nebraska, North Dakota, South Dakota and Wisconsin. Methods In this ecological study, state policies on lighting and marking of farm equipment were scored for compliance with standards of the American Society of Agricultural and Biological Engineers (ASABE). Using negative binomial models fitted with generalized estimating equations, we estimated the relationships between lighting and marking scores and farm equipment crash rates per 100 000 farm operations. Results A total of 7083 crashes involving farm equipment were reported from 2005 to 2010 in the Upper Midwest and Great Plains. As the state lighting and marking score increased by 5 units, crash rates fell by 17% (rate ratio=0.83; 95% CI 0.78 to 0.88). Lighting-only (rate ratio=0.48; 95% CI 0.45 to 0.51) and marking-only policies (rate ratio=0.89; 95% CI 0.83 to 0.96) were each associated with reduced crash rates. Conclusions Aligning lighting and marking policies with ASABE standards may effectively reduce crash rates involving farm equipment. PMID:27405602
Curry, Allison E; Pfeiffer, Melissa R; Elliott, Michael R; Durbin, Dennis R
2015-12-01
New Jersey (NJ) implemented the first-in-the-US Graduated Driver Licensing (GDL) decal provision in May 2010 for young drivers with learner's permits or intermediate licenses. Previous analyses found an association between the provision and crash reduction among intermediate drivers. The aim of this study is to examine the association between NJ's provision and GDL citation and crash rates among drivers aged <21 years with learner's permits. We estimated monthly per-driver rates from January 2006 through June 2012. Negative binomial modeling compared pre- and post-decal crash rates adjusted for gender, age, calendar month, and gas price. The monthly GDL citation rate was two per 10,000 drivers in both the pre-decal and post-decal periods. Crashes were rare, and rates declined similarly before and after the decal provision (adjusted rate ratio of post-decal vs pre-decal slope: 1.04 (0.97 to 1.12)). NJ's GDL decal provision was not associated with a change in citation or crash rates among young NJ drivers with learner's permits. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Shrinkage Estimators for a Composite Measure of Quality Conceptualized as a Formative Construct
Shwartz, Michael; Peköz, Erol A; Christiansen, Cindy L; Burgess, James F; Berlowitz, Dan
2013-01-01
Objective To demonstrate the value of shrinkage estimators when calculating a composite quality measure as the weighted average of a set of individual quality indicators. Data Sources Rates of 28 quality indicators (QIs) calculated from the minimum dataset from residents of 112 Veterans Health Administration nursing homes in fiscal years 2005–2008. Study Design We compared composite scores calculated from the 28 QIs using both observed rates and shrunken rates derived from a Bayesian multivariate normal-binomial model. Principal Findings Shrunken-rate composite scores, because they take into account unreliability of estimates from small samples and the correlation among QIs, have more intuitive appeal than observed-rate composite scores. Facilities can be profiled based on more policy-relevant measures than point estimates of composite scores, and interval estimates can be calculated without assuming the QIs are independent. Usually, shrunken-rate composite scores in 1 year are better able to predict the observed total number of QI events or the observed-rate composite scores in the following year than the initial year observed-rate composite scores. Conclusion Shrinkage estimators can be useful when a composite measure is conceptualized as a formative construct. PMID:22716650
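The flavor of the shrinkage step can be shown with a univariate beta-binomial stand-in for the paper's multivariate normal-binomial model: rates estimated from few residents get pulled strongly toward the pooled rate, while large facilities barely move. The counts and the prior strength below are illustrative.

```python
# Empirical-Bayes shrinkage of facility QI rates toward the pooled rate.
import numpy as np

events = np.array([1, 4, 30])        # QI events per facility (synthetic)
resid  = np.array([10, 40, 300])     # residents assessed per facility

p_bar = events.sum() / resid.sum()   # pooled rate
strength = 50.0                      # prior "sample size" (would be estimated)
a, b = p_bar * strength, (1 - p_bar) * strength

observed = events / resid
shrunken = (events + a) / (resid + a + b)   # posterior means under Beta(a, b)
print(np.round(observed, 3), np.round(shrunken, 3))
```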
Distribution pattern of public transport passenger in Yogyakarta, Indonesia
NASA Astrophysics Data System (ADS)
Narendra, Alfa; Malkhamah, Siti; Sopha, Bertha Maya
2018-03-01
The arrival and departure distribution patterns of Trans Jogja bus passengers are among the fundamental models for simulation. The purpose of this paper is to build models of passenger flows. This research used passenger data from January to May 2014. No operational policy has since changed in a way that affects the nature of this pattern: the roads, buses, land uses, schedule, and people are relatively unchanged. The data were then categorized by direction, day, and location, and each category was fitted to several well-known discrete distributions. The fitted distributions were compared on their AIC and BIC values; the chosen model is the one with the smallest values of both, and the negative binomial distribution was found to have the smallest AIC and BIC. Probability mass function (PMF) plots of the categorical negative binomial models were then compared to derive a generic model. The accepted generic negative binomial distribution has a size parameter of 0.7064 and a mu of 1.4504. The minimum and maximum passenger counts under this distribution are 0 and 41.
Temporal acceleration of spatially distributed kinetic Monte Carlo simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chatterjee, Abhijit; Vlachos, Dionisios G.
The computational intensity of kinetic Monte Carlo (KMC) simulation is a major impediment in simulating large length and time scales. In recent work, an approximate method for KMC simulation of spatially uniform systems, termed the binomial τ-leap method, was introduced [A. Chatterjee, D.G. Vlachos, M.A. Katsoulakis, Binomial distribution based τ-leap accelerated stochastic simulation, J. Chem. Phys. 122 (2005) 024112], where molecular bundles instead of individual processes are executed over coarse-grained time increments. This temporal coarse-graining can lead to significant computational savings but its generalization to spatial lattice KMC simulation has not been realized yet. Here we extend the binomial τ-leap method to lattice KMC simulations by combining it with spatially adaptive coarse-graining. Absolute stability and computational speed-up analyses for spatial systems, along with simulations, provide insights into the conditions where accuracy and substantial acceleration of the new spatio-temporal coarse-graining method are ensured. Model systems demonstrate that the r-time increment criterion of Chatterjee et al. obeys the absolute stability limit for values of r up to near 1.
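The kernel of a binomial τ-leap step is the bundled update itself: rather than executing events one at a time, a binomial count of firings is drawn per coarse time increment, which can never exceed the number of available molecules. A schematic single-species decay example follows; it illustrates the leaping idea only, not the paper's spatially adaptive lattice algorithm.

```python
# Binomial tau-leap for first-order decay: N molecules, per-molecule rate k.
import numpy as np

rng = np.random.default_rng(5)
N, k, tau = 10_000, 0.3, 0.01      # molecules, rate constant, leap size

t = 0.0
while t < 1.0 and N > 0:
    p = 1.0 - np.exp(-k * tau)     # per-molecule firing probability over tau
    N -= rng.binomial(N, p)        # a whole bundle of events in one leap
    t += tau
print("molecules remaining:", N)   # ~ N0 * exp(-k) on average at t = 1
```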
Artificial Immune System for Flight Envelope Estimation and Protection
2014-12-31
[Extraction residue from the report's table of contents and list of figures; recoverable topics: throttle failure, estimation algorithms for sensor abnormal conditions (roll rate sensor bias), reference feature patterns for roll rate sensor failures of low severity, average performance index for different abnormal conditions, and roll and pitch responses under high magnitude stabilator failure.]
Multiple objective optimization in reliability demonstration test
Lu, Lu; Anderson-Cook, Christine Michaela; Li, Mingyang
2016-10-01
Reliability demonstration tests are usually performed in product design or validation processes to demonstrate whether a product meets specified requirements on reliability. For binomial demonstration tests, the zero-failure test has been most commonly used due to its simplicity and use of the minimum sample size to achieve an acceptable consumer's risk level. However, this test can often result in unacceptably high risk for producers as well as a low probability of passing the test even when the product has good reliability. This paper explicitly explores the interrelationship between multiple objectives that are commonly of interest when planning a demonstration test and proposes structured decision-making procedures using a Pareto front approach for selecting an optimal test plan based on simultaneously balancing multiple criteria. Different strategies are suggested for scenarios with different user priorities, and graphical tools are developed to help quantify the trade-offs between choices and to facilitate informed decision making. As a result, potential impacts of some subjective user inputs on the final decision are studied to offer insights and useful guidance for general applications.
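The zero-failure plan the paper takes as its baseline is easy to reproduce: the smallest n such that passing with no failures caps the consumer's risk, together with the producer's side of the trade-off the paper highlights. The values below are the textbook 90/90 case, not the paper's.

```python
# Zero-failure binomial demonstration test: sample size and producer's risk.
import math

R_req, beta = 0.90, 0.10                 # required reliability, consumer's risk
n = math.ceil(math.log(beta) / math.log(R_req))
print("units to test with zero failures allowed:", n)     # 22

R_true = 0.95                            # a genuinely better-than-required product
print("probability it still passes:", R_true ** n)        # ~0.32: high producer risk
```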
An ecological analysis of prison overcrowding and suicide rates in England and Wales, 2000-2014.
van Ginneken, Esther F J C; Sutherland, Alex; Molleman, Toon
Prisoners are at a greatly increased risk of suicide compared to the general population. Differences in suicide risk can be partly explained by individual risk factors, but the contribution of prison characteristics remains unclear. Overcrowded prisons have higher suicide rates, but this may be related to prison function, security level, population size and turnover. The aim of the current study was to investigate the contribution of each of these prison characteristics to suicide rates, using data from the Ministry of Justice for adult prisons in England and Wales from 2000 to 2014. Negative binomial regression analysis showed that larger population size, higher turnover, higher security and public management were associated with higher suicide rates. When controlling for these factors, overcrowding was not found to be related to suicide rates. Questions remain about the causal mechanisms underlying variation in prison suicides and the impact of the lived experience of overcrowding. Further research is needed to examine the relative contribution of prison and prisoner characteristics to suicides. Copyright © 2016 Elsevier Ltd. All rights reserved.
Kessell, Eric R; Alvidrez, Jennifer; McConnell, William A; Shumway, Martha
2009-10-01
This study investigated the association between the racial and ethnic residential composition of San Francisco neighborhoods and the rate of mental health-related 911 calls. A total of 1,341,608 emergency calls (28,197 calls related to mental health) to San Francisco's 911 system were made from January 2001 through June 2003. Police sector data in the call records were overlaid onto U.S. census tracts to estimate sector demographic and socioeconomic characteristics. Negative binomial regression was used to estimate the association between the percentage of black, Asian, Latino, and white residents and rates of mental health-related calls. A one-point increase in a sector's percentage of black residents was associated with a lower rate of mental health-related calls (incidence rate ratio=.99, p<.05). A sector's percentage of Asian and Latino residents had no significant effect. The observed relationship between the percentage of black residents and mental health-related calls is not consistent with known emergency mental health service utilization patterns.
A Bayesian method for inferring transmission chains in a partially observed epidemic.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marzouk, Youssef M.; Ray, Jaideep
2008-10-01
We present a Bayesian approach for estimating transmission chains and rates in the Abakaliki smallpox epidemic of 1967. The epidemic affected 30 individuals in a community of 74; only the dates of appearance of symptoms were recorded. Our model assumes stochastic transmission of the infections over a social network. Distinct binomial random graphs model intra- and inter-compound social connections, while disease transmission over each link is treated as a Poisson process. Link probabilities and rate parameters are objects of inference. Dates of infection and recovery comprise the remaining unknowns. Distributions for smallpox incubation and recovery periods are obtained from historical data. Using Markov chain Monte Carlo, we explore the joint posterior distribution of the scalar parameters and provide an expected connectivity pattern for the social graph and infection pathway.
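A minimal sketch of the model's two stochastic ingredients, with illustrative parameter values (the paper infers these quantities rather than fixing them): a binomial (Erdos-Renyi) random graph for social connections, and exponential waiting times for Poisson-process transmission along an infectious link.

```python
import numpy as np

rng = np.random.default_rng(4)
n, p_link, rate = 74, 0.08, 0.1                   # community size, link prob., infections/day

# Binomial random graph: each unordered pair is linked independently with p_link
upper = np.triu(rng.random((n, n)) < p_link, 1)
adj = upper | upper.T                             # symmetric, no self-links

# Poisson-process transmission: waiting time to infection along one active link
t_infect = rng.exponential(1.0 / rate)
print(adj.sum() // 2, "social links;", round(t_infect, 1), "days to transmission on one link")
```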
Adelman, Ron A; Parnes, Aaron J; Ducournau, Didier
2013-09-01
To study success and failure in the treatment of uncomplicated rhegmatogenous retinal detachments (RRDs). Nonrandomized, multicenter retrospective study. One hundred seventy-six surgeons from 48 countries spanning 5 continents provided information on the primary procedures for 7678 cases of RRDs including 4179 patients with uncomplicated RRDs. Reported data included specific clinical findings, the method of repair, and the outcome after intervention. Final failure of retinal detachment repair (level 1 failure rate), remaining silicone oil at the study's conclusion (level 2 failure rate), and need for additional procedures to repair the detachment (level 3 failure rate). Four thousand one hundred seventy-nine uncomplicated cases of RRD were included. Combining phakic, pseudophakic, and aphakic groups, those treated with scleral buckle alone (n = 1341) had a significantly lower final failure rate than those treated with vitrectomy, with or without a supplemental buckle (n = 2723; P = 0.04). In phakic patients, final failure rate was lower in the scleral buckle group compared with those who had vitrectomy, with or without a supplemental buckle (P = 0.028). In pseudophakic patients, the failure rate of the initial procedure was lower in the vitrectomy group compared with the scleral buckle group (P = 3×10⁻⁸). There was no statistically significant difference in failure rate between segmental (n = 721) and encircling (n = 351) buckles (P = 0.5). Those who underwent vitrectomy with a supplemental scleral buckle (n = 488) had an increased failure rate compared with those who underwent vitrectomy alone (n = 2235; P = 0.048). Pneumatic retinopexy was found to be comparable with scleral buckle when a retinal hole was present (P = 0.65), but not in cases with a flap tear (P = 0.034). In the treatment of uncomplicated phakic retinal detachments, repair using scleral buckle may be a good option. There was no significant difference between segmental versus 360-degree buckle. For pseudophakic uncomplicated retinal detachments, the surgeon should balance the risks and benefits of vitrectomy versus scleral buckle and keep in mind that the single-surgery reattachment rate may be higher with vitrectomy. However, if a vitrectomy is to be performed, these data suggest that a supplemental buckle is not helpful. The author(s) have no proprietary or commercial interest in any materials discussed in this article. Copyright © 2013 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
Young, J Christopher; Roper, Brad L; Arentsen, Timothy J
2016-05-01
A survey of neuropsychologists in the Veterans Health Administration examined symptom/performance validity test (SPVT) practices and estimated base rates for patient response bias. Invitations were emailed to 387 psychologists employed within the Veterans Affairs (VA), identified as likely practicing neuropsychologists, resulting in 172 respondents (44.4% response rate). Practice areas varied, with 72% at least partially practicing in general neuropsychology clinics and 43% conducting VA disability exams. Mean estimated failure rates were 23.0% for clinical outpatient exams, 12.9% for inpatient exams, and 39.4% for disability exams. Failure rates were highest for mTBI and PTSD referrals. Failure rates were positively correlated with the number of cases seen and with the frequency and number of SPVTs used. Respondents disagreed regarding whether one (45%) or two (47%) failures are required to establish patient response bias, with those administering more measures employing the more stringent criterion. Frequency of the use of specific SPVTs is reported. Base rate estimates for SPVT failure in VA disability exams are comparable to those in other medicolegal settings. However, failure in routine clinical exams is much higher in the VA than in other settings, possibly reflecting the hybrid nature of the VA's role in both healthcare and disability determination. Generally speaking, VA neuropsychologists use SPVTs frequently and eschew pejorative terms to describe their failure. Practitioners who require only one SPVT failure to establish response bias may overclassify patients. Those who use few or no SPVTs may fail to identify response bias. Additional clinical and theoretical implications are discussed.
Jun, Jin; Faulkner, Kenneth M
2018-04-01
To review the current literature on hospital nursing factors associated with 30-day readmission rates of patients with heart failure. Heart failure is a common yet debilitating chronic illness with high mortality and morbidity. One in five patients with heart failure will experience unplanned readmission to a hospital within 30 days. Given the significance of heart failure to individuals, families and the healthcare system, the Center for Medicare and Medicaid Services has made reducing 30-day readmission rates a priority. A scoping review, which maps the key concepts of a research area, was used. Primary studies assessing factors related to nurses in hospitals and readmission of patients with heart failure were included, provided they were written in English and published in peer-reviewed journals. The search resulted in 2,782 articles. After removing duplicates and applying the inclusion and exclusion criteria, five articles were selected. Three nursing workforce factors emerged: (i) nurse staffing, (ii) nursing care and work environment, and (iii) nurses' knowledge of heart failure. This is the first scoping review examining the association between hospital nursing factors and 30-day readmission rates of patients with heart failure. Further studies examining the extent of nursing structural and process factors influencing the outcomes of patients with heart failure are needed. Nurses are an integral part of the healthcare system. Identifying the factors related to nurses in hospitals is important to ensure comprehensive delivery of care to the chronically ill population. Hospital administrators, managers and policymakers can use the findings from this review to implement strategies to reduce 30-day readmission rates of patients with heart failure. © 2018 John Wiley & Sons Ltd.
Comparative study of the failure rates among 3 implantable defibrillator leads.
van Malderen, Sophie C H; Szili-Torok, Tamas; Yap, Sing C; Hoeks, Sanne E; Zijlstra, Felix; Theuns, Dominic A M J
2016-12-01
After the introduction of the Biotronik Linox S/SD high-voltage lead, several cases of early failure have been observed. The purpose of this article was to assess the performance of the Linox S/SD lead in comparison to 2 other contemporary leads. We used the prospective Erasmus MC ICD registry to identify all implanted Linox S/SD (n = 408), Durata (St. Jude Medical, model 7122) (n = 340), and Endotak Reliance (Boston Scientific, models 0155, 0138, and 0158) (n = 343) leads. Lead failure was defined by abnormal low- or high-voltage impedance; failure to capture, sense, or defibrillate; or the presence of nonphysiological signals not due to external interference. During a median follow-up of 5.1 years, 24 Linox (5.9%), 5 Endotak (1.5%), and 5 Durata (1.5%) leads failed. At 5-year follow-up, the cumulative failure rate of Linox leads (6.4%) was higher than that of Endotak (0.4%; P < .0001) and Durata (2.0%; P = .003) leads. The incidence rate was higher in Linox leads (1.3 per 100 patient-years) than in Endotak and Durata leads (0.2 and 0.3 per 100 patient-years, respectively; P < .001). A log-log analysis of the cumulative hazard for Linox leads functioning at 3-year follow-up revealed a stable failure rate of 3% per year. The majority of failures consisted of noise (62.5%) and abnormal impedance (33.3%). This study demonstrates a higher failure rate of Linox S/SD high-voltage leads compared to contemporary leads. Although the mechanism of lead failure is unclear, the majority presents with abnormal electrical parameters. Comprehensive monitoring of Linox S/SD high-voltage leads includes remote monitoring to facilitate early detection of lead failure. Copyright © 2016. Published by Elsevier Inc.
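The per-100-patient-years figures quoted above come from simple person-time arithmetic. A minimal sketch, with a hypothetical follow-up denominator chosen to be consistent with the reported Linox rate (the abstract does not state total follow-up years):

```python
# Incidence rate per 100 patient-years: failures divided by total follow-up time.
def incidence_per_100py(failures: int, patient_years: float) -> float:
    return 100.0 * failures / patient_years

# 24 Linox failures over an assumed ~1,850 patient-years of follow-up
print(round(incidence_per_100py(24, 1850.0), 1))  # -> 1.3, matching the reported rate
```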
Losert, C; Schmauß, M; Becker, T; Kilian, R
2012-12-01
Studies in urban areas have identified environmental risk factors for mental illness, but little research on this topic has been performed in rural areas. Hospital admission rates were computed for 174 rural municipalities in the catchment area of the state psychiatric hospital in Günzburg in the years 2006 to 2009 and combined with structural and socio-economic data. Relationships of overall and diagnosis-specific admission rates with municipality characteristics were analysed by means of negative binomial regression models. Admission rates of patients with a diagnosis of schizophrenia and affective disorder combined decrease with increasing population growth, population density, average income and green areas, while admission rates are positively correlated with commuter balance, income inequality, unemployment rates and traffic areas. Admission rates for schizophrenia are negatively related to population growth, average income and agricultural areas, but positively related to mobility index, income inequality and unemployment rate. Admission rates for affective disorders are negatively related to population growth, population density, average income and green areas, while higher admission rates are correlated with commuter balance, high income inequality, unemployment rate and traffic-related areas. Wealth, economic inequality, population density and structural area characteristics thus influence psychiatric admission rates in rural areas as well.
Failure factors in non-life insurance companies in United Kingdom
NASA Astrophysics Data System (ADS)
Samsudin, Humaida Banu
2013-04-01
Failure in an insurance company is a condition of financial distress in which the company has difficulty paying off its financial obligations to its creditors. This study continues earlier research identifying the determinants of run-off for non-life insurance companies in the United Kingdom. The analysis identifies further variables that could lead companies to financial distress, namely macroeconomic factors (GDP rates, inflation rates and interest rates), the total number of companies that failed in the preceding year, and the average size of failed companies. The results indicate that inflation rates, interest rates, the total number of companies that failed in the preceding year and the average size of failed companies are the best predictors. Early detection of failure can prevent companies from bankruptcy and allow management to take action to reduce failure costs.
Failure Rates and Patterns of Recurrence in Patients With Resected N1 Non-Small-Cell Lung Cancer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Varlotto, John M., E-mail: jvarlotto@hmc.psu.edu; Medford-Davis, Laura Nyshel; Recht, Abram
2011-10-01
Purpose: To examine the local and distant recurrence rates and patterns of failure in patients undergoing potentially curative resection of N1 non-small-cell lung cancer. Methods and Materials: The study included 60 consecutive unirradiated patients treated from 2000 to 2006. Median follow-up was 30 months. Failure rates were calculated by the Kaplan-Meier method. A univariate Cox proportional hazard model was used to assess factors associated with recurrence. Results: Local and distant failure rates (as the first site of failure) at 2, 3, and 5 years were 33%, 33%, and 46%; and 26%, 26%, and 32%, respectively. The most common site of local failure was in the mediastinum; 12 of 18 local recurrences would have been included within proposed postoperative radiotherapy fields. Patients who received chemotherapy were found to be at increased risk of local failure, whereas those who underwent pneumonectomy or who had more positive nodes had significantly increased risks of distant failure. Conclusions: Patients with resected non-small-cell lung cancer who have N1 disease are at substantial risk of local recurrence as the first site of relapse, which is greater than the risk of distant failure. The role of postoperative radiotherapy in such patients should be revisited in the era of adjuvant chemotherapy.
Modeling number of claims and prediction of total claim amount
NASA Astrophysics Data System (ADS)
Acar, Aslıhan Şentürk; Karabey, Uǧur
2017-07-01
In this study we focus on the annual number of claims in a private health insurance data set belonging to a local insurance company in Turkey. In addition to the Poisson and negative binomial models, zero-inflated Poisson and zero-inflated negative binomial models are used to model the number of claims in order to account for excess zeros. To investigate the impact of different distributional assumptions for the number of claims on the prediction of total claim amount, the predictive performances of the candidate models are compared using root mean square error (RMSE) and mean absolute error (MAE) criteria.
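A minimal sketch of this comparison workflow, using simulated overdispersed counts in place of the proprietary insurance data and plain Poisson and negative binomial GLMs (the paper also fits the zero-inflated variants):

```python
# Fit Poisson and negative binomial models to claim counts and compare
# predictive accuracy by RMSE and MAE on a holdout split. Data are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2000
age = rng.uniform(20, 70, n)                     # a single illustrative covariate
X = sm.add_constant(age)
mu = np.exp(-1.0 + 0.03 * age)                   # true mean claim count
y = rng.negative_binomial(n=2, p=2 / (2 + mu))   # overdispersed counts, mean mu

train, test = slice(0, 1500), slice(1500, None)
for name, family in [("Poisson", sm.families.Poisson()),
                     ("NegBin", sm.families.NegativeBinomial())]:
    fit = sm.GLM(y[train], X[train], family=family).fit()
    pred = fit.predict(X[test])
    rmse = np.sqrt(np.mean((y[test] - pred) ** 2))
    mae = np.mean(np.abs(y[test] - pred))
    print(f"{name}: RMSE={rmse:.3f} MAE={mae:.3f}")
```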
Umbral Calculus and Holonomic Modules in Positive Characteristic
NASA Astrophysics Data System (ADS)
Kochubei, Anatoly N.
2006-03-01
In the framework of analysis over local fields of positive characteristic, we develop algebraic tools for introducing and investigating various polynomial systems. In this survey paper we describe a function field version of umbral calculus developed on the basis of a relation of binomial type satisfied by the Carlitz polynomials. We consider modules over the Weyl-Carlitz ring, a function field counterpart of the Weyl algebra. It is shown that some basic objects of function field arithmetic, like the Carlitz module, Thakur's hypergeometric polynomials, and analogs of binomial coefficients arising in the positive characteristic version of umbral calculus, generate holonomic modules.
Modelling parasite aggregation: disentangling statistical and ecological approaches.
Yakob, Laith; Soares Magalhães, Ricardo J; Gray, Darren J; Milinovich, Gabriel; Wardrop, Nicola; Dunning, Rebecca; Barendregt, Jan; Bieri, Franziska; Williams, Gail M; Clements, Archie C A
2014-05-01
The overdispersion in macroparasite infection intensity among host populations is commonly simulated using a constant negative binomial aggregation parameter. We describe an alternative to utilising the negative binomial approach and demonstrate important disparities in intervention efficacy projections that can come about from opting for pattern-fitting models that are not process-explicit. We present model output in the context of the epidemiology and control of soil-transmitted helminths due to the significant public health burden imposed by these parasites, but our methods are applicable to other infections with demonstrable aggregation in parasite numbers among hosts. Copyright © 2014. Published by Elsevier Ltd.
FluBreaks: early epidemic detection from Google flu trends.
Pervaiz, Fahad; Pervaiz, Mansoor; Abdur Rehman, Nabeel; Saif, Umar
2012-10-04
The Google Flu Trends service was launched in 2008 to track changes in the volume of online search queries related to flu-like symptoms. Over the last few years, the trend data produced by this service has shown a consistent relationship with the actual number of flu reports collected by the US Centers for Disease Control and Prevention (CDC), often identifying increases in flu cases weeks in advance of CDC records. However, contrary to popular belief, Google Flu Trends is not an early epidemic detection system. Instead, it is designed as a baseline indicator of the trend, or changes, in the number of disease cases. To evaluate whether these trends can be used as a basis for an early warning system for epidemics. We present the first detailed algorithmic analysis of how Google Flu Trends can be used as a basis for building a fully automated system for early warning of epidemics in advance of methods used by the CDC. Based on our work, we present a novel early epidemic detection system, called FluBreaks (dritte.org/flubreaks), based on Google Flu Trends data. We compared the accuracy and practicality of three types of algorithms: normal distribution algorithms, Poisson distribution algorithms, and negative binomial distribution algorithms. We explored the relative merits of these methods, and related our findings to changes in Internet penetration and population size for the regions in Google Flu Trends providing data. Across our performance metrics of percentage true-positives (RTP), percentage false-positives (RFP), percentage overlap (OT), and percentage early alarms (EA), Poisson- and negative binomial-based algorithms performed better in all except RFP. Poisson-based algorithms had average values of 99%, 28%, 71%, and 76% for RTP, RFP, OT, and EA, respectively, whereas negative binomial-based algorithms had average values of 97.8%, 17.8%, 60%, and 55% for RTP, RFP, OT, and EA, respectively. Moreover, the EA was also affected by the region's population size. Regions with larger populations (regions 4 and 6) had higher values of EA than region 10 (which had the smallest population) for negative binomial- and Poisson-based algorithms. The difference was 12.5% and 13.5% on average in negative binomial- and Poisson-based algorithms, respectively. We present the first detailed comparative analysis of popular early epidemic detection algorithms on Google Flu Trends data. We note that realizing this opportunity requires moving beyond the cumulative sum and historical limits method-based normal distribution approaches, traditionally employed by the CDC, to negative binomial- and Poisson-based algorithms to deal with potentially noisy search query data from regions with varying population and Internet penetrations. Based on our work, we have developed FluBreaks, an early warning system for flu epidemics using Google Flu Trends.
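A minimal sketch of the Poisson-threshold idea underlying the distribution-based detectors compared above; this illustrates the principle rather than the FluBreaks code, and the window length, percentile, and counts are invented:

```python
# Raise an alarm when the current week's count exceeds the 99th percentile of a
# Poisson distribution whose mean is the recent moving-average baseline.
import numpy as np
from scipy.stats import poisson

def poisson_alarms(counts, window=8, q=0.99):
    alarms = []
    for t in range(window, len(counts)):
        baseline = np.mean(counts[t - window:t])          # moving-average baseline
        threshold = poisson.ppf(q, mu=max(baseline, 1e-9))
        alarms.append(counts[t] > threshold)
    return alarms

weekly = [12, 15, 11, 13, 14, 12, 16, 13, 14, 15, 30, 45]  # toy flu-query counts
print(poisson_alarms(weekly))  # alarms fire on the surge at the end
```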
A Decreasing Failure Rate, Mixed Exponential Model Applied to Reliability.
1981-06-01
Trident missile systems have been observed. The mixed exponential distribution has been shown to fit the life data for the electronic equipment on ... these systems. This paper discusses some of the estimation problems which occur with the decreasing failure rate mixed exponential distribution when ... assumption of constant or increasing failure rate seemed to be incorrect. However, the design of this electronic equipment indicated that
High-Strain Rate Failure Modeling Incorporating Shear Banding and Fracture
2017-11-22
High Strain Rate Failure Modeling Incorporating Shear Banding and Fracture. The views, opinions and/or findings contained in this report are those of the authors. [Standard report-documentation form fields omitted; legible entries: final report as of 05-Dec-2017, Agreement Number W911NF-13-1-0238, Organization: Columbia University.]
Mechanistic Considerations Used in the Development of the PROFIT PCI Failure Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pankaskie, P. J.
A fuel Pellet-Zircaloy Cladding (thermo-mechanical-chemical) Interactions (PCI) failure model for estimating the probability of failure in transient increases in power (PROFIT) was developed. PROFIT is based on 1) standard statistical methods applied to available PCI fuel failure data and 2) a mechanistic analysis of the environmental and strain-rate-dependent stress versus strain characteristics of Zircaloy cladding. The statistical analysis of fuel failures attributable to PCI suggested that parameters in addition to power, transient increase in power, and burnup are needed to define PCI fuel failures in terms of probability estimates with known confidence limits. The PROFIT model, therefore, introduces an environmental and strain-rate dependent strain energy absorption to failure (SEAF) concept to account for the stress versus strain anomalies attributable to interstitial-dislocation interaction effects in the Zircaloy cladding. Assuming that the power ramping rate is the operating corollary of strain rate in the Zircaloy cladding, the variables of first-order importance in the PCI fuel failure phenomenon are postulated to be: 1. pre-transient fuel rod power, P_I, 2. transient increase in fuel rod power, ΔP, 3. fuel burnup, Bu, and 4. the constitutive material property of the Zircaloy cladding, SEAF.
Dent, Daniel L; Al Fayyadh, Mohammed J; Rawlings, Jeremy A; Hassan, Ramy A; Kempenich, Jason W; Willis, Ross E; Stewart, Ronald M
2018-03-01
It has been suggested that in environments where there is greater fear of litigation, resident autonomy and education are compromised. Our aim was to examine failure rates on American Board of Surgery (ABS) examinations in comparison with medical malpractice payments in 47 US states/territories that have general surgery residency programs. We hypothesized higher ABS examination failure rates for general surgery residents who graduate from residencies in states with higher malpractice risk. We conducted a retrospective review of five-year (2010-2014) pass rates of first-time examinees of the ABS examinations. States' malpractice data were adjusted for population. ABS examination failure rates for programs in states with above- and below-median malpractice payments per capita were 31 and 24 per cent, respectively (P < 0.01). This difference was seen in university and independent programs regardless of size. Pearson correlation confirmed a significant positive correlation between board failure rates and malpractice payments per capita for the Qualifying Examination (P < 0.02), Certifying Examination (P < 0.02), and combined Qualifying and Certifying index (P < 0.01). Malpractice risk correlates positively with graduates' failure rates on ABS examinations regardless of program size or type. We encourage further examination of training environments and their relationship to surgical residency graduate performance.
Heavy metal and trace element concentrations in blood and follicular fluid affect ART outcome.
Tolunay, Harun Egemen; Şükür, Yavuz Emre; Ozkavukcu, Sinan; Seval, Mehmet Murat; Ateş, Can; Türksoy, Vugar Ali; Ecemiş, Tolga; Atabekoğlu, Cem Somer; Özmen, Batuhan; Berker, Bülent; Sönmezer, Murat
2016-03-01
To assess the effects of heavy metal and trace element concentrations in blood and follicular fluid on assisted reproductive technology cycle outcome. A prospective study was conducted between January 2012 and July 2012 in a university hospital infertility clinic. One hundred and one patients with unexplained infertility who underwent intracytoplasmic sperm injection using a GnRH-antagonist protocol were recruited. Concentrations of four toxic metals (Cd, Pb, Hg, As) and three trace elements (Cu, Zn, Fe) were measured in both blood and follicular fluid specimens. Patients were evaluated in two groups; the study group consisted of patients with ongoing pregnancy (n=20) and the reference group consisted of patients who experienced assisted reproductive technology failure, miscarriage or biochemical pregnancy (n=81). Demographics and cycle parameters were comparable between the groups except for the median number of day 3 Grade A embryos. Statistically significant negative correlations were found between blood Pb levels and the number of MII oocytes, implantation, clinical pregnancy and ongoing pregnancy rates. Results of the log binomial regression revealed a 2.2% lower risk of ongoing pregnancy for each 1 μg/dL higher blood Pb concentration while holding the other variables in the model constant (RR 0.978; 95% CI 0.956-0.998; P=.041). Also, the results revealed a 71.9% lower risk of ongoing pregnancy for each 1 μg/dL higher follicular fluid Cu concentration while holding the other variables in the model constant (RR 0.288; 95% CI 0.085-0.92; P=.039). Blood concentrations of Pb and follicular fluid concentrations of Cu seem to have significant impacts on assisted reproductive technology cycle outcome. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Cognitive Decline and Older Driver Crash Risk.
Fraade-Blanar, Laura A; Ebel, Beth E; Larson, Eric B; Sears, Jeanne M; Thompson, Hilaire J; Chan, Kwun Chuen G; Crane, Paul K
2018-04-17
To examine automobile crash risk associated with cognition in older drivers without dementia. Retrospective secondary analysis of a longitudinal cohort study. Our study used data from the Adult Changes in Thought (ACT) Study merged with Washington State crash reports and licensure records. Data were available from 2002 to 2015. Group Health enrollees from Washington State aged 65 and older with active driver's licenses (N=2,615). Cognitive function was assessed using the Cognitive Abilities Screening Instrument scored using item response theory (CASI-IRT). The study outcome was police-reported motor vehicle crash. We used a negative binomial mixed-effects model with robust standard errors clustered on the individual and considered associations between crash risk, level of cognition, and amount of decline since the previous study visit. Covariates included age, sex, education, alcohol, depression, medical comorbidities, eyesight, hearing, and physical function. Individuals were censored at dementia diagnosis, death, or failure to renew their license. Over an average of 7 years of follow-up, 350 (13%) people had at least one crash. A 1-unit lower CASI-IRT score was associated with a higher adjusted crash incidence rate ratio of 1.26 (95% confidence interval=1.08-1.51). Beyond level of cognition, amount of cognitive decline between study visits was not associated with crash risk. This study suggests that, in older drivers, poorer performance on the CASI-IRT may be a risk factor for motor vehicle crashes, even in individuals without diagnosed dementia. Further research is needed to understand driving behavior and inform driving decisions for older adults with poor cognitive function. © 2018, Copyright the Authors Journal compilation © 2018, The American Geriatrics Society.
40 CFR 51.352 - Basic I/M performance standard.
Code of Federal Regulations, 2010 CFR
2010-07-01
... a 20% emission test failure rate among pre-1981 model year vehicles. ... Waiver rate. A 0% waiver rate. ... Requirements § 51.352 Basic I/M performance standard. (a) Basic I/M programs shall be designed and implemented ...
40 CFR 51.352 - Basic I/M performance standard.
Code of Federal Regulations, 2014 CFR
2014-07-01
... a 20% emission test failure rate among pre-1981 model year vehicles. ... Waiver rate. A 0% waiver rate. ... Requirements § 51.352 Basic I/M performance standard. (a) Basic I/M programs shall be designed and implemented ...
40 CFR 51.352 - Basic I/M performance standard.
Code of Federal Regulations, 2011 CFR
2011-07-01
... a 20% emission test failure rate among pre-1981 model year vehicles. ... Waiver rate. A 0% waiver rate. ... Requirements § 51.352 Basic I/M performance standard. (a) Basic I/M programs shall be designed and implemented ...
40 CFR 51.352 - Basic I/M performance standard.
Code of Federal Regulations, 2012 CFR
2012-07-01
... a 20% emission test failure rate among pre-1981 model year vehicles. ... Waiver rate. A 0% waiver rate. ... Requirements § 51.352 Basic I/M performance standard. (a) Basic I/M programs shall be designed and implemented ...
40 CFR 51.352 - Basic I/M performance standard.
Code of Federal Regulations, 2013 CFR
2013-07-01
... a 20% emission test failure rate among pre-1981 model year vehicles. ... Waiver rate. A 0% waiver rate. ... Requirements § 51.352 Basic I/M performance standard. (a) Basic I/M programs shall be designed and implemented ...
RELAV - RELIABILITY/AVAILABILITY ANALYSIS PROGRAM
NASA Technical Reports Server (NTRS)
Bowerman, P. N.
1994-01-01
RELAV (Reliability/Availability Analysis Program) is a comprehensive analytical tool to determine the reliability or availability of any general system which can be modeled as embedded k-out-of-n groups of items (components) and/or subgroups. Both ground and flight systems at NASA's Jet Propulsion Laboratory have utilized this program. RELAV can assess current system performance during the later testing phases of a system design, as well as model candidate designs/architectures or validate and form predictions during the early phases of a design. Systems are commonly modeled as System Block Diagrams (SBDs). RELAV calculates the success probability of each group of items and/or subgroups within the system assuming k-out-of-n operating rules apply for each group. The program operates on a folding basis; i.e. it works its way towards the system level from the most embedded level by folding related groups into single components. The entire folding process involves probabilities; therefore, availability problems are performed in terms of the probability of success, and reliability problems are performed for specific mission lengths. An enhanced cumulative binomial algorithm is used for groups where all probabilities are equal, while a fast algorithm based upon "Computing k-out-of-n System Reliability", Barlow & Heidtmann, IEEE TRANSACTIONS ON RELIABILITY, October 1984, is used for groups with unequal probabilities. Inputs to the program include a description of the system and any one of the following: 1) availabilities of the items, 2) mean time between failures and mean time to repairs for the items from which availabilities are calculated, 3) mean time between failures and mission length(s) from which reliabilities are calculated, or 4) failure rates and mission length(s) from which reliabilities are calculated. The results are probabilities of success of each group and the system in the given configuration. RELAV assumes exponential failure distributions for reliability calculations and infinite repair resources for availability calculations. No more than 967 items or groups can be modeled by RELAV. If larger problems can be broken into subsystems of 967 items or less, the subsystem results can be used as item inputs to a system problem. The calculated availabilities are steady-state values. Group results are presented in the order in which they were calculated (from the most embedded level out to the system level). This provides a good mechanism to perform trade studies. Starting from the system result and working backwards, the granularity gets finer; therefore, system elements that contribute most to system degradation are detected quickly. RELAV is a C-language program originally developed under the UNIX operating system on a MASSCOMP MC500 computer. It has been modified, as necessary, and ported to an IBM PC compatible with a math coprocessor. The current version of the program runs in the DOS environment and requires a Turbo C vers. 2.0 compiler. RELAV has a memory requirement of 103 KB and was developed in 1989. RELAV is a copyrighted work with all copyright vested in NASA.
On rate-state and Coulomb failure models
Gomberg, J.; Beeler, N.; Blanpied, M.
2000-01-01
We examine the predictions of Coulomb failure stress and rate-state frictional models. We study the change in failure time (clock advance) Δt due to stress step perturbations (i.e., coseismic static stress increases) added to "background" stressing at a constant rate (i.e., tectonic loading) at time t0. The predictability of Δt implies a predictable change in seismicity rate r(t)/r0, testable using earthquake catalogs, where r0 is the constant rate resulting from tectonic stressing. Models of r(t)/r0, consistent with general properties of aftershock sequences, must predict an Omori law seismicity decay rate, a sequence duration that is less than a few percent of the mainshock cycle time and a return directly to the background rate. A Coulomb model requires that a fault remains locked during loading, that failure occur instantaneously, and that Δt is independent of t0. These characteristics imply an instantaneous infinite seismicity rate increase of zero duration. Numerical calculations of r(t)/r0 for different state evolution laws show that aftershocks occur on faults extremely close to failure at the mainshock origin time, that these faults must be "Coulomb-like," and that the slip evolution law can be precluded. Real aftershock population characteristics also may constrain rate-state constitutive parameters; a may be lower than laboratory values, the stiffness may be high, and/or normal stress may be lower than lithostatic. We also compare Coulomb and rate-state models theoretically. Rate-state model fault behavior becomes more Coulomb-like as constitutive parameter a decreases relative to parameter b. This is because the slip initially decelerates, representing an initial healing of fault contacts. The deceleration is more pronounced for smaller a, more closely simulating a locked fault. Even when the rate-state Δt has Coulomb characteristics, its magnitude may differ by some constant dependent on b. In this case, a rate-state model behaves like a modified Coulomb failure model in which the failure stress threshold is lowered due to weakening, increasing the clock advance. The deviation from a non-Coulomb response also depends on the loading rate, elastic stiffness, initial conditions, and assumptions about how state evolves.
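The Coulomb clock-advance property the abstract tests against rate-state predictions reduces to one line of arithmetic: for a fault loaded at a constant rate toward a fixed failure threshold, a static stress step advances failure by the step divided by the stressing rate, independent of when in the cycle the step occurs. A sketch with invented values:

```python
# Coulomb clock advance: stress step / tectonic stressing rate, independent of t0.
def coulomb_clock_advance(stress_step_mpa: float, stressing_rate_mpa_per_yr: float) -> float:
    return stress_step_mpa / stressing_rate_mpa_per_yr   # years of advance

print(coulomb_clock_advance(0.5, 0.01))   # a 0.5 MPa step at 0.01 MPa/yr -> 50 yr
```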
Mechanical Properties of Transgenic Silkworm Silk Under High Strain Rate Tensile Loading
NASA Astrophysics Data System (ADS)
Chu, J.-M.; Claus, B.; Chen, W.
2017-12-01
Studies have shown that transgenic silkworm silk may be capable of similar properties to spider silk while being mass-producible. In this research, the tensile stress-strain response of transgenic silkworm silk fiber is systematically characterized using a quasi-static load frame and a tension Kolsky bar over a range of strain rates between 10⁻³ s⁻¹ and 700 s⁻¹. The results show that transgenic silkworm silk tends to have higher overall ultimate stress and failure strain at the high strain rate (700 s⁻¹) compared with quasi-static strain rates, indicating rate sensitivity of the material. The failure strain at the high strain rate is higher than that of spider silk. However, the stress levels are significantly below those of spider silk, and far below those of high-performance fibers. Failure surfaces are examined via scanning electron microscopy and reveal that the failure modes are similar to those of spider silk.
Vocal fold tissue failure: preliminary data and constitutive modeling.
Chan, Roger W; Siegmund, Thomas
2004-08-01
In human voice production (phonation), linear small-amplitude vocal fold oscillation occurs only under restricted conditions. Physiologically, phonation more often involves large-amplitude oscillation associated with tissue stresses and strains beyond their linear viscoelastic limits, particularly in the lamina propria extracellular matrix (ECM). This study reports some preliminary measurements of the tissue deformation and failure response of the vocal fold ECM under large-strain shear. The primary goal was to formulate and test a novel constitutive model for vocal fold tissue failure, based on a standard-linear cohesive-zone (SL-CZ) approach. Tissue specimens of the sheep vocal fold mucosa were subjected to torsional deformation in vitro, at constant strain rates corresponding to twist rates of 0.01, 0.1, and 1.0 rad/s. The vocal fold ECM demonstrated a nonlinear stress-strain and rate-dependent failure response, with a failure strain as low as 0.40 rad. A finite-element implementation of the SL-CZ model was capable of capturing the rate dependence in these preliminary data, demonstrating the model's potential for describing tissue failure. Further studies with additional tissue specimens and model improvements are needed to better understand vocal fold tissue failure.
Zago, Mauro; Bozzo, Samantha; Carrara, Giulia; Mariani, Diego
2017-01-01
To explore the current literature on the failure to rescue and rescue surgery concepts, to identify the key items for decreasing the failure to rescue rate and improving outcome, and to verify whether there is a rationale for centralization of patients suffering postoperative complications. There is a growing awareness of the need to assess and measure the failure to rescue rate on an institutional, regional and national basis. Many factors affect failure to rescue, and all should be individually analyzed and considered. Rescue surgery is one of these factors. Rescue surgery assumes an acute care surgery background. Measurement of the failure to rescue rate should become a standard for quality improvement programs. Implementation of all the clinical and organizational items involved is the key to better outcomes. Preparedness for rescue surgery is a main pillar in this process. Centralization of management, audit, and communication are as important as patient centralization.
An examination of sources of sensitivity of consumer surplus estimates in travel cost models.
Blaine, Thomas W; Lichtkoppler, Frank R; Bader, Timothy J; Hartman, Travis J; Lucente, Joseph E
2015-03-15
We examine sensitivity of estimates of recreation demand using the Travel Cost Method (TCM) to four factors. Three of the four have been routinely and widely discussed in the TCM literature: a) Poisson versus negative binomial regression; b) application of the Englin correction to account for endogenous stratification; c) truncation of the data set to eliminate outliers. A fourth issue we address has not been widely modeled: the potential effect on recreation demand of the interaction between income and travel cost. We provide a straightforward comparison of all four factors, analyzing the impact of each on regression parameters and consumer surplus estimates. Truncation has a modest effect on estimates obtained from the Poisson models but a radical effect on the estimates obtained by way of the negative binomial. Inclusion of an income-travel cost interaction term generally produces a more conservative but not a statistically significantly different estimate of consumer surplus in both Poisson and negative binomial models. It also generates broader confidence intervals. Application of truncation, the Englin correction and the income-travel cost interaction produced the most conservative estimates of consumer surplus and eliminated the statistical difference between the Poisson and the negative binomial. Use of the income-travel cost interaction term reveals that for visitors who face relatively low travel costs, the relationship between income and travel demand is negative, while it is positive for those who face high travel costs. This provides an explanation of the ambiguities in findings regarding the role of income widely observed in the TCM literature. Our results suggest that policies that reduce access to publicly owned resources inordinately impact local low-income recreationists and are contrary to environmental justice. Copyright © 2014 Elsevier Ltd. All rights reserved.
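A minimal sketch of how an income-travel cost interaction feeds into consumer surplus in a semi-log count-data TCM: per-trip surplus is -1 over the travel-cost coefficient, which with an interaction term varies by income. The coefficients below are hypothetical, not the paper's estimates.

```python
# Per-trip consumer surplus in a semi-log count-data travel cost model with an
# income x travel-cost interaction: the effective travel-cost coefficient, and
# hence the surplus, depends on income.
def per_trip_surplus(beta_tc: float, beta_inter: float, income: float) -> float:
    effective = beta_tc + beta_inter * income   # d ln(trips) / d travel_cost
    return -1.0 / effective

# hypothetical estimates: beta_tc = -0.012, interaction = +0.00005 per $1,000 income
for income in (20, 60, 100):                    # income in $1,000s
    print(income, round(per_trip_surplus(-0.012, 0.00005, income), 2))
```

With these invented signs, higher income damps travel-cost sensitivity and raises per-trip surplus; the actual direction depends on the fitted coefficients.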
O’Donnell, Katherine M.; Thompson, Frank R.; Semlitsch, Raymond D.
2015-01-01
Detectability of individual animals is highly variable and nearly always < 1; imperfect detection must be accounted for to reliably estimate population sizes and trends. Hierarchical models can simultaneously estimate abundance and effective detection probability, but there are several different mechanisms that cause variation in detectability. Neglecting temporary emigration can lead to biased population estimates because availability and conditional detection probability are confounded. In this study, we extend previous hierarchical binomial mixture models to account for multiple sources of variation in detectability. The state process of the hierarchical model describes ecological mechanisms that generate spatial and temporal patterns in abundance, while the observation model accounts for the imperfect nature of counting individuals due to temporary emigration and false absences. We illustrate our model’s potential advantages, including the allowance of temporary emigration between sampling periods, with a case study of southern red-backed salamanders Plethodon serratus. We fit our model and a standard binomial mixture model to counts of terrestrial salamanders surveyed at 40 sites during 3–5 surveys each spring and fall 2010–2012. Our models generated similar parameter estimates to standard binomial mixture models. Aspect was the best predictor of salamander abundance in our case study; abundance increased as aspect became more northeasterly. Increased time-since-rainfall strongly decreased salamander surface activity (i.e. availability for sampling), while higher amounts of woody cover objects and rocks increased conditional detection probability (i.e. probability of capture, given an animal is exposed to sampling). By explicitly accounting for both components of detectability, we increased congruence between our statistical modeling and our ecological understanding of the system. We stress the importance of choosing survey locations and protocols that maximize species availability and conditional detection probability to increase population parameter estimate reliability. PMID:25775182
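A minimal sketch of the data-generating process behind such a hierarchical binomial mixture model, with invented parameter values: site abundance is a latent Poisson draw, and each repeated count is a binomial thinning of abundance by availability (surface activity) times conditional detection probability.

```python
# Simulate counts from an N-mixture process with temporary emigration:
# N_i ~ Poisson(lambda); y_ij ~ Binomial(N_i, availability * detection).
import numpy as np

rng = np.random.default_rng(2)
n_sites, n_surveys = 40, 4
lam = 25.0                                     # expected salamanders per site
availability, detection = 0.4, 0.6             # surface activity and capture prob.

N = rng.poisson(lam, n_sites)                  # latent true abundance per site
counts = rng.binomial(N[:, None], availability * detection,
                      size=(n_sites, n_surveys))
print(N[:3], counts[:3])                       # latent truth vs. observed counts
```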
A big data approach to the development of mixed-effects models for seizure count data.
Tharayil, Joseph J; Chiang, Sharon; Moss, Robert; Stern, John M; Theodore, William H; Goldenholz, Daniel M
2017-05-01
Our objective was to develop a generalized linear mixed model for predicting seizure count that is useful in the design and analysis of clinical trials. This model also may benefit the design and interpretation of seizure-recording paradigms. Most existing seizure count models do not include children, and there is currently no consensus regarding the most suitable model that can be applied to children and adults. Therefore, an additional objective was to develop a model that accounts for both adult and pediatric epilepsy. Using data from SeizureTracker.com, a patient-reported seizure diary tool with >1.2 million recorded seizures across 8 years, we evaluated the appropriateness of Poisson, negative binomial, zero-inflated negative binomial, and modified negative binomial models for seizure count data based on minimization of the Bayesian information criterion. Generalized linear mixed-effects models were used to account for demographic and etiologic covariates and for autocorrelation structure. Holdout cross-validation was used to evaluate predictive accuracy in simulating seizure frequencies. For both adults and children, we found that a negative binomial model with autocorrelation over 1 day was optimal. Using holdout cross-validation, the proposed model was found to provide accurate simulation of seizure counts for patients with up to four seizures per day. The optimal model can be used to generate more realistic simulated patient data with very few input parameters. The availability of a parsimonious, realistic virtual patient model can be of great utility in simulations of phase II/III clinical trials, epilepsy monitoring units, outpatient biosensors, and mobile Health (mHealth) applications. Wiley Periodicals, Inc. © 2017 International League Against Epilepsy.
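A minimal sketch of the family-selection step described above, with simulated counts standing in for SeizureTracker data and no mixed effects or autocorrelation, just the Poisson versus negative binomial choice by information criterion (the paper minimizes BIC):

```python
# Compare Poisson and negative binomial fits to overdispersed count data by BIC.
import numpy as np
from statsmodels.discrete.discrete_model import Poisson, NegativeBinomial

rng = np.random.default_rng(3)
n = 1000
mu = 1.5
y = rng.negative_binomial(n=1, p=1 / (1 + mu), size=n)   # overdispersed seizures/day
X = np.ones((n, 1))                                       # intercept-only design

for name, cls in [("Poisson", Poisson), ("NegBin", NegativeBinomial)]:
    fit = cls(y, X).fit(disp=0)
    print(f"{name}: BIC={fit.bic:.1f}")                   # NegBin should win here
```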
Kabaluk, J Todd; Binns, Michael R; Vernon, Robert S
2006-06-01
Counts of green peach aphid, Myzus persicae (Sulzer) (Hemiptera: Aphididae), in potato, Solanum tuberosum L., fields were used to evaluate the performance of the sampling plan from a pest management company. The counts were further used to develop a binomial sampling method, and both full count and binomial plans were evaluated using operating characteristic curves. Taylor's power law provided a good fit of the data (r² = 0.95), with the relationship between the variance (s²) and mean (m) as ln(s²) = 1.81 (±0.02) + 1.55 (±0.01) ln(m). A binomial sampling method was developed using the empirical model ln(m) = c + d ln(−ln(1 − P(T))), to which the data fit well for tally numbers (T) of 0, 1, 3, 5, 7, and 10. Although T = 3 was considered the most reasonable given its operating characteristics and presumed ease of classification above or below critical densities (i.e., action thresholds) of one and 10 M. persicae per leaf, the full count method is shown to be superior. The mean number of sample sites per field visit by the pest management company was 42 ± 19, with more than one-half (54%) of the field visits involving sampling 31-50 sample sites, which was acceptable in the context of operating characteristic curves for a critical density of 10 M. persicae per leaf. Based on operating characteristics, actual sample sizes used by the pest management company can be reduced by at least 50%, on average, for a critical density of 10 M. persicae per leaf. For a critical density of one M. persicae per leaf, used to avert the spread of potato leafroll virus, sample sizes from 50 to 100 were considered more suitable.
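A minimal sketch of how the fitted empirical model converts a presence/absence classification into a density estimate: given the proportion P(T) of leaves with more than T aphids, the mean per leaf follows from ln(m) = c + d ln(−ln(1 − P(T))). The coefficients below are placeholders, since the abstract reports the model form but not the fitted values for each tally number.

```python
# Estimate mean aphids per leaf from the proportion of leaves exceeding tally T.
import math

def mean_from_proportion(p_T: float, c: float, d: float) -> float:
    return math.exp(c + d * math.log(-math.log(1.0 - p_T)))

# e.g. 40% of leaves with more than T aphids; c and d are hypothetical
print(round(mean_from_proportion(0.40, c=1.9, d=1.2), 2))  # ~2.99 aphids per leaf
```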
CUMBIN - CUMULATIVE BINOMIAL PROGRAMS
NASA Technical Reports Server (NTRS)
Bowerman, P. N.
1994-01-01
The cumulative binomial program, CUMBIN, is one of a set of three programs which calculate cumulative binomial probability distributions for arbitrary inputs. The three programs, CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557), can be used independently of one another. CUMBIN can be used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. The program has been used for reliability/availability calculations. CUMBIN calculates the probability that a system of n components has at least k operating if the probability that any one is operating is p and the components are independent. Equivalently, this is the reliability of a k-out-of-n system having independent components with common reliability p. CUMBIN can evaluate the incomplete beta distribution for two positive integer arguments. CUMBIN can also evaluate the cumulative F distribution and the negative binomial distribution, and can determine the sample size in a test design. CUMBIN is designed to work well with all integer values 0 < k <= n. To run the program, the user simply runs the executable version and inputs the information requested by the program. The program is not designed to weed out incorrect inputs, so the user must take care to make sure the inputs are correct. Once all input has been entered, the program calculates and lists the result. The CUMBIN program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly with most C compilers. The program format is interactive. It has been implemented under DOS 3.2 and has a memory requirement of 26K. CUMBIN was developed in 1988.
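The quantity CUMBIN computes, and the incomplete beta identity the description mentions, are both one-liners with scipy (a modern stand-in for the program's C routine; the values are illustrative):

```python
# k-out-of-n reliability with common component reliability p, two ways:
# the binomial upper tail, and the regularized incomplete beta I_p(k, n-k+1).
from scipy.stats import binom
from scipy.special import betainc

n, k, p = 10, 8, 0.95
reliability = binom.sf(k - 1, n, p)          # P(at least k of n operating)
via_beta = betainc(k, n - k + 1, p)          # same value via the beta identity
print(reliability, via_beta)                 # both ~0.9885
```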
Talic, Nabeel F
2016-08-01
This comparative prospective randomized clinical trial examined the in vivo failure rates of fixed mandibular and maxillary lingual retainers bonded with two light-cured flowable composites over 6 months. Consecutive patients were divided into two groups on a 1:1 basis. Two hundred fixed lingual retainers were included, and their failures were followed for 6 months. One group (n = 50) received retainers bonded with a nano-hybrid composite based on nano-optimized technology (Tetric-N-Flow, Ivoclar Vivadent). The other group (n = 50) received retainers bonded with a low-viscosity (LV) composite (Transbond Supreme LV, 3M Unitek). There was no significant difference between the overall failure rates of mandibular retainers bonded with Transbond (8%) and those bonded with Tetric-N-Flow (18%). However, the odds of failure with Tetric-N-Flow were 2.52-fold greater than with Transbond. The failure rate of maxillary retainers bonded with Transbond (14%) was higher than, but not significantly different from, that of maxillary retainers bonded with Tetric-N-Flow (10%). There was no significant difference in the estimated mean survival times of the maxillary and mandibular retainers bonded with the two composites. Both composites tested in the current study can be used to bond fixed maxillary and mandibular lingual retainers, with low failure rates.
NASA Technical Reports Server (NTRS)
White, A. L.
1983-01-01
This paper examines the reliability of three architectures for six components. For each architecture, the probabilities of the failure states are given by algebraic formulas involving the component fault rate, the system recovery rate, and the operating time. The dominant failure modes are identified, and the change in reliability is considered with respect to changes in fault rate, recovery rate, and operating time. The major conclusions concern the influence of system architecture on failure modes and parameter requirements. Without this knowledge, a system designer may pick an inappropriate structure.
Remote operation of an orbital maneuvering vehicle in simulated docking maneuvers
NASA Technical Reports Server (NTRS)
Brody, Adam R.
1990-01-01
Simulated docking maneuvers were performed to assess the effect of initial velocity on docking failure rate, mission duration, and delta v (fuel consumption). Subjects performed simulated docking maneuvers of an orbital maneuvering vehicle (OMV) to a space station. The effect of the removal of the range and rate displays (simulating a ranging instrumentation failure) was also examined. Naive subjects were capable of achieving a high success rate in performing simulated docking maneuvers without extensive training. Failure rate was a function of individual differences; there was no treatment effect on failure rate. The amount of time subjects reserved for final approach increased with starting velocity. Piloting of docking maneuvers was not significantly affected in any way by the removal of range and rate displays. Radial impulse was significant both by subject and by treatment. NASA's 0.1 percent rule, dictating an approach rate no greater than 0.1 percent of the range, is seen to be overly conservative for nominal docking missions.
Analysis of factors affecting failure of glass cermet tunnel restorations in a multi-center study.
Pilebro, C E; van Dijken, J W
2001-06-01
The aim of this study was to analyze factors influencing the failures of tunnel restorations performed with a glass cermet cement (Ketac Silver). Twelve dentists in eight clinics, clinically experienced and familiar with the tunnel technique, placed 374 restorations. The occlusal sections of fifty percent of the restorations were laminated with hybrid resin composite. The results of the yearly clinical and radiographic evaluations over the course of 3 years were correlated, using logistic regression analysis, with factors that could influence the failure rate. At the 3-year recall a cumulative number of 305 restorations were available. The cumulative replacement rate was 20%. The main reasons for replacement were marginal ridge fracture (14%) and dentin caries (3%). Another 7% of the restorations, which had not been replaced, were classified as failures because of untreated dentin caries. Caries activity, lesion size, tunnel cavity opening size, partial or total tunnel, composite lamination and operating time showed no significant correlation with failure rate. The only significant variable was the individual failure rate of the participating dentists, which varied between 9% and 50% (p=0.013).
Increase in hospital admission rates for heart failure in The Netherlands, 1980-1993.
Reitsma, J. B.; Mosterd, A.; de Craen, A. J.; Koster, R. W.; van Capelle, F. J.; Grobbee, D. E.; Tijssen, J. G.
1996-01-01
OBJECTIVE: To study the trend in hospital admission rates for heart failure in the Netherlands from 1980 to 1993. DESIGN: All hospital admissions in the Netherlands with a principal discharge diagnosis of heart failure were analysed. In addition, individual records of heart failure patients from a subset of 7 hospitals were analysed to estimate the frequency and timing of readmissions. RESULTS: The total number of discharges for men increased from 7,377 in 1980 to 13,022 in 1993, and for women from 7,064 to 12,944. From 1980 through 1993 age-adjusted discharge rates rose 48% for men and 40% for women. Age-adjusted in-hospital mortality for heart failure decreased from 19% in 1980 to 15% in 1993. For all age groups in-hospital mortality for men was higher than for women. The mean length of hospital admissions in 1993 was 14.0 days for men and 16.4 days for women. A review of individual patient records from a 6.3% sample of all hospital admissions in the Netherlands indicated that within a 2 year period 18% of the heart failure patients were admitted more than once and 5% more than twice. CONCLUSIONS: For both men and women a pronounced increase in age-adjusted discharge rates for heart failure was observed in the Netherlands from 1980 to 1993. Readmissions were a prominent feature among heart failure patients. Higher survival rates after acute myocardial infarction and the longer survival of patients with heart disease, including heart failure, may have contributed to the observed increase. The importance of advances in diagnostic tools and of possible changes in admission policy remains uncertain. PMID:8944582
Weberndörfer, Vanessa; Nyffenegger, Tobias; Russi, Ian; Brinkert, Miriam; Berte, Benjamin; Toggweiler, Stefan; Kobza, Richard
2018-05-01
Early lead failure has recently been reported in ICD patients with Linox SD leads. We aimed to compare the long-term performance of the successor lead model, Linox Smart SD, with other contemporary high-voltage leads. All patients receiving high-voltage leads at our center between November 2009 and May 2017 were retrospectively analyzed. Lead failure was defined as the occurrence of one or more of the following: non-physiological high-rate episodes, low- or high-voltage impedance anomalies, undersensing, or non-capture. In total, 220 patients were included (Linox Smart SD, n = 113; contemporary lead, n = 107). During a median follow-up of 3.8 years (IQR 1.6-5.9 years), a total of 16 lead failures occurred (14 in the Linox Smart SD group and 2 in the contemporary group), mostly due to non-physiological high-rate sensing or impedance abnormalities. Lead failure incidence rates per 100 person-years were 2.9 (95% CI 1.7-4.9) and 0.6 (95% CI 0.1-2.3) for Linox Smart SD and contemporary leads, respectively. Kaplan-Meier estimates of 5-year lead failure rates were 14.0% (95% CI 8.1-23.6%) and 1.3% (95% CI 0.2-8.9%), respectively (log-rank p = 0.028). Implantation of a Linox Smart SD lead increased the risk of lead failure, with hazard ratios of 4.53 (95% CI 1.03-19.95, p = 0.046) and 4.44 (95% CI 1.00-19.77, p = 0.05) in uni- and multivariable Cox models, respectively. The new Linox Smart SD lead model was associated with high failure rates and should be monitored closely to detect early signs of lead failure.
Lipscomb, Hester J; Schoenfisch, Ashley L; Cameron, Wilfrid; Kucera, Kristen L; Adams, Darrin; Silverstein, Barbara A
2014-09-01
Falls from height (FFH) are a longstanding, serious problem in construction. We report workers' compensation (WC) payments associated with FFH among a cohort (n = 24,830; 1989-2008) of carpenters. Mean/median payments, cost rates, and adjusted rate ratios based on hours worked were calculated using negative-binomial regression. Over the 20-year period FFH accounted for $66.6 million in WC payments or $700 per year for each full-time equivalent (2,000 hr of work). FFH were responsible for 5.5% of injuries but 15.1% of costs. Cost declines were observed, but not monotonically. Reductions were more pronounced for indemnity than medical care. Mean costs were 2.3 times greater among carpenters over 50 than those under 30; cost rates were only modestly higher. Significant progress has been made in reducing WC payments associated with FFH in this cohort, particularly through 1996; the primary gains reflect a reduction in the frequency of falls. FFH that do occur remain costly. © 2014 Wiley Periodicals, Inc.
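As a concrete illustration of the method named above (negative binomial regression of claim counts with hours worked as exposure), here is a minimal sketch using Python and statsmodels; the data frame, column names, and dispersion parameter are illustrative assumptions, not the study's data.

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
# Hypothetical claims data: counts, exposure hours, and an age-group covariate.
claims = pd.DataFrame({
    "n_claims": rng.poisson(2.0, 200),
    "hours_worked": rng.uniform(500.0, 2000.0, 200),
    "age_group": rng.choice(["<30", "30-50", ">50"], 200),
})

# Hours worked enter as a log offset, so exponentiated coefficients are
# adjusted rate ratios per hour worked.
fit = smf.glm(
    "n_claims ~ C(age_group)",
    data=claims,
    family=sm.families.NegativeBinomial(alpha=1.0),  # assumed dispersion
    offset=np.log(claims["hours_worked"]),
).fit()
print(np.exp(fit.params))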
Cosh, Suzanne; Zenter, Nadja; Ay, Esra-Sultan; Loos, Sabine; Slade, Mike; De Rosa, Corrado; Luciano, Mario; Berecz, Roland; Glaub, Theodora; Munk-Jørgensen, Povl; Krogsgaard Bording, Malene; Rössler, Wulf; Kawohl, Wolfram; Puschner, Bernd
2017-09-01
The study explored relationships between preferences for and experiences of clinical decision making (CDM) with service use among persons with severe mental illness. Data from a prospective observational study in six European countries were examined. Associations of baseline staff-rated (N=213) and patient-rated (N=588) preferred and experienced decision making with service use were examined at baseline by using binomial regressions and at 12-month follow-up by using multilevel models. A preference by patients and staff for active patient involvement in decision making, rather than shared or passive decision making, was associated with longer hospital admissions and higher costs at baseline and with increases in admissions over 12 months (p=.043). Low patient-rated satisfaction with an experienced clinical decision was also related to increased costs over the study period (p=.005). A preference for shared decision making may reduce health care costs by reducing inpatient admissions. Patient satisfaction with decisions was a predictor of costs, and clinicians should maximize patient satisfaction with CDM.
Failure to activate the in-hospital emergency team: causes and outcomes.
Barbosa, Vera; Gomes, Ernestina; Vaz, Senio; Azevedo, Gustavo; Fernandes, Gonçalo; Ferreira, Amélia; Araujo, Rui
2016-01-01
To determine the incidence of afferent limb failure of the in-hospital Medical Emergency Team, characterizing it and comparing the mortality between the population experiencing afferent limb failure and the population not experiencing afferent limb failure. A total of 478 activations of the Medical Emergency Team of Hospital Pedro Hispano occurred from January 2013 to July 2015. A sample of 285 activations was obtained after excluding incomplete records and activations for patients with less than 6 hours of hospitalization. The sample was divided into two groups: the group experiencing afferent limb failure and the group not experiencing afferent limb failure of the Medical Emergency Team. Both populations were characterized and compared. Statistical significance was set at p ≤ 0.05. Afferent limb failure was observed in 22.1% of activations. The causal analysis revealed significant differences in Medical Emergency Team activation criteria (p = 0.003) in the group experiencing afferent limb failure, with higher rates of Medical Emergency Team activation for cardiac arrest and cardiovascular dysfunction. Regarding patient outcomes, the group experiencing afferent limb failure had higher immediate mortality rates and higher mortality rates at hospital discharge, although these differences did not reach statistical significance. No significant differences were found for the other parameters. The incidence of cardiac arrest and the mortality rate were higher in patients experiencing failure of the afferent limb of the Medical Emergency Team. This study highlights the need for health units to invest in the training of all healthcare professionals regarding the Medical Emergency Team activation criteria and emergency medical response system operations.
The failure of earthquake failure models
Gomberg, J.
2001-01-01
In this study I show that simple heuristic models and numerical calculations suggest that an entire class of commonly invoked models of earthquake failure processes cannot explain triggering of seismicity by transient or "dynamic" stress changes, such as stress changes associated with passing seismic waves. The models of this class have the common feature that the physical property characterizing failure increases at an accelerating rate when a fault is loaded (stressed) at a constant rate. Examples include models that invoke rate-state friction or subcritical crack growth, in which the properties characterizing failure are slip or crack length, respectively. Failure occurs when the rate at which these grow accelerates to values exceeding some critical threshold. These accelerating failure models do not predict the finite durations of dynamically triggered earthquake sequences (e.g., at aftershock or remote distances). Some of the failure models belonging to this class have been used to explain static stress triggering of aftershocks. This may imply that the physical processes underlying dynamic triggering differ from those underlying static triggering, or that currently applied models of static triggering require modification. If the former is the case, we might appeal to physical mechanisms relying on oscillatory deformations, such as compaction of saturated fault gouge leading to pore pressure increase, or cyclic fatigue. However, if dynamic and static triggering mechanisms differ, one still needs to ask why static triggering models that neglect these dynamic mechanisms appear to explain many observations. If the static and dynamic triggering mechanisms are the same, perhaps assumptions about accelerating failure and/or that triggering advances the failure times of a population of inevitable earthquakes are incorrect.
An Efficient Implementation of Fixed Failure-Rate Ratio Test for GNSS Ambiguity Resolution.
Hou, Yanqing; Verhagen, Sandra; Wu, Jie
2016-06-23
Ambiguity Resolution (AR) plays a vital role in precise GNSS positioning. Correctly-fixed integer ambiguities can significantly improve the positioning solution, while incorrectly-fixed integer ambiguities can bring large positioning errors and, therefore, should be avoided. The ratio test is an extensively used test to validate the fixed integer ambiguities. To choose proper critical values of the ratio test, the Fixed Failure-rate Ratio Test (FFRT) has been proposed, which generates critical values according to user-defined tolerable failure rates. This contribution provides easy-to-implement fitting functions to calculate the critical values. With a massive Monte Carlo simulation, the functions for many different tolerable failure rates are provided, which enriches the choices of critical values for users. Moreover, the fitting functions for the fix rate are also provided, which for the first time allows users to evaluate the conditional success rate, i.e., the success rate once the integer candidates are accepted by FFRT. The superiority of FFRT over the traditional ratio test regarding controlling the failure rate and preventing unnecessary false alarms is shown by a simulation and a real data experiment. In the real data experiment with a baseline of 182.7 km, FFRT achieved much higher fix rates (up to 30% higher) and the same level of positioning accuracy from fixed solutions as compared to the traditional critical value.
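For readers unfamiliar with the underlying test, the acceptance step of the ratio test can be sketched in a few lines; here the critical value mu is a caller-supplied parameter, standing in for the value FFRT would look up from the paper's fitting functions for a given tolerable failure rate. Conventions vary; this sketch uses the best-over-second-best form, with acceptance when the ratio is at most mu.

import numpy as np

def ratio_test(a_float, candidates, Q, mu):
    """Accept the best integer candidate if the ratio of ambiguity-domain
    squared norms (best over second-best) does not exceed mu.
    candidates must be ordered best-first; Q is the float ambiguity covariance."""
    Qinv = np.linalg.inv(Q)
    d = [(a_float - c) @ Qinv @ (a_float - c) for c in candidates[:2]]
    ratio = d[0] / d[1]
    return ratio <= mu, ratio

# Illustrative two-ambiguity example (all numbers are made up).
a_hat = np.array([1.2, 2.8])
Q = np.array([[0.04, 0.01], [0.01, 0.05]])
cands = [np.array([1, 3]), np.array([1, 2])]  # best, then second-best
print(ratio_test(a_hat, cands, Q, mu=0.5))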
Microcircuit Device Reliability. Digital Failure Rate Data
1981-01-01
Center Staff, IIT Research Institute. Under contract to: Rome Air Development Center, Griffiss AFB, NY 13441. Ordering No. MDR-17. MDR-17 presents comparisons between actual field-experienced failure rates and the failure rates predicted by MIL-HDBK-217C, Notice 1.
Quast, Michaela B; Sviggum, Hans P; Hanson, Andrew C; Stoike, David E; Martin, David P; Niesen, Adam D
2018-05-01
Continuous brachial plexus catheters are often used to decrease pain following elbow surgery. This investigation aimed to assess the rate of early failure of infraclavicular (IC) and axillary (AX) nerve catheters following elbow surgery. Retrospective study. Postoperative recovery unit and inpatient hospital floor. 328 patients who received IC or AX nerve catheters and underwent elbow surgery were identified by retrospective query of our institution's database. Data collected included unplanned catheter dislodgement, catheter replacement rate, postoperative pain scores, and opioid administration on postoperative day 1. Catheter failure was defined as unplanned dislodging within 24 h of placement or requirement for catheter replacement and evaluated using a covariate adjusted model. 119 IC catheters and 209 AX catheters were evaluated. There were 8 (6.7%) failed IC catheters versus 13 (6.2%) failed AX catheters. After adjusting for age, BMI, and gender there was no difference in catheter failure rate between IC and AX nerve catheters (p = 0.449). These results suggest that IC and AX nerve catheters do not differ in the rate of early catheter failure, despite differences in anatomic location and catheter placement techniques. Both techniques provided effective postoperative analgesia with median pain scores < 3/10 for patients following elbow surgery. Reasons other than rate of early catheter failure should dictate which approach is performed. Copyright © 2018. Published by Elsevier Inc.
Tu, Jack V.; Nardi, Lorelei; Fang, Jiming; Liu, Juan; Khalid, Laila; Johansen, Helen
2009-01-01
Background Rates of death from cardiovascular and cerebrovascular diseases have been steadily declining over the past few decades. Whether such declines are occurring to a similar degree for common disorders such as acute myocardial infarction, heart failure and stroke is uncertain. We examined recent national trends in mortality and rates of hospital admission for these 3 conditions. Methods We analyzed mortality data from Statistics Canada’s Canadian Mortality Database and data on hospital admissions from the Canadian Institute for Health Information’s Hospital Morbidity Database for the period 1994–2004. We determined age- and sex-standardized rates of death and hospital admissions per 100 000 population aged 20 years and over as well as in-hospital case-fatality rates. Results The overall age- and sex-standardized rate of death from cardiovascular disease in Canada declined 30.0%, from 360.6 per 100 000 in 1994 to 252.5 per 100 000 in 2004. During the same period, the rate fell 38.1% for acute myocardial infarction, 23.5% for heart failure and 28.2% for stroke, with improvements observed across most age and sex groups. The age- and sex-standardized rate of hospital admissions decreased 27.6% for stroke and 27.2% for heart failure. The rate for acute myocardial infarction fell only 9.2%. In contrast, the relative decline in the in-hospital case-fatality rate was greatest for acute myocardial infarction (33.1%; p < 0.001). Much smaller relative improvements in case-fatality rates were noted for heart failure (8.1%) and stroke (8.9%). Interpretation The rates of death and hospital admissions for acute myocardial infarction, heart failure and stroke in Canada changed at different rates over the 10-year study period. Awareness of these trends may guide future efforts for health promotion and health care planning and help to determine priorities for research and treatment. PMID:19546444
Using Generic Data to Establish Dormancy Failure Rates
NASA Technical Reports Server (NTRS)
Reistle, Bruce
2014-01-01
Many hardware items are dormant prior to being operated. The dormant period might be especially long, for example during missions to the moon or Mars. In missions with long dormant periods the risk incurred during dormancy can exceed the active risk contribution. Probabilistic Risk Assessments (PRAs) need to account for the dormant risk contribution as well as the active contribution. A typical method for calculating a dormant failure rate is to multiply the active failure rate by a constant, the dormancy factor. For example, some practitioners use a heuristic and divide the active failure rate by 30 to obtain an estimate of the dormant failure rate. To obtain a more empirical estimate of the dormancy factor, this paper uses the recently updated database NPRD-2011 [1] to arrive at a set of distributions for the dormancy factor. The resulting dormancy factor distributions are significantly different depending on whether the item is electrical, mechanical, or electro-mechanical. Additionally, this paper will show that using a heuristic constant fails to capture the uncertainty of the possible dormancy factors.
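The difference between the constant heuristic and a distributional dormancy factor can be seen in a short Monte Carlo sketch; the active failure rate and the lognormal parameters below are assumptions for illustration, not values taken from NPRD-2011.

import numpy as np

rng = np.random.default_rng(1)
lam_active = 1e-5     # active failure rate, failures/hour (assumed)
t_dormant = 8760.0    # one year of dormancy, hours

# Heuristic: one constant dormancy factor of 30.
p_heuristic = 1.0 - np.exp(-(lam_active / 30.0) * t_dormant)

# Distributional alternative: sample the dormancy factor to propagate
# its uncertainty into the dormant failure probability.
factors = rng.lognormal(mean=np.log(30.0), sigma=0.8, size=100_000)
p_samples = 1.0 - np.exp(-(lam_active / factors) * t_dormant)

print(p_heuristic)
print(np.percentile(p_samples, [5, 50, 95]))  # spread the constant misses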
NASA Technical Reports Server (NTRS)
Owens, Andrew; De Weck, Olivier L.; Stromgren, Chel; Goodliff, Kandyce; Cirillo, William
2017-01-01
Future crewed missions to Mars present a maintenance logistics challenge that is unprecedented in human spaceflight. Mission endurance – defined as the time between resupply opportunities – will be significantly longer than previous missions, and therefore logistics planning horizons are longer and the impact of uncertainty is magnified. Maintenance logistics forecasting typically assumes that component failure rates are deterministically known and uses them to represent aleatory uncertainty, or uncertainty that is inherent to the process being examined. However, failure rates cannot be directly measured; rather, they are estimated based on similarity to other components or statistical analysis of observed failures. As a result, epistemic uncertainty – that is, uncertainty in knowledge of the process – exists in failure rate estimates that must be accounted for. Analyses that neglect epistemic uncertainty tend to significantly underestimate risk. Epistemic uncertainty can be reduced via operational experience; for example, the International Space Station (ISS) failure rate estimates are refined using a Bayesian update process. However, design changes may re-introduce epistemic uncertainty. Thus, there is a tradeoff between changing a design to reduce failure rates and operating a fixed design to reduce uncertainty. This paper examines the impact of epistemic uncertainty on maintenance logistics requirements for future Mars missions, using data from the ISS Environmental Control and Life Support System (ECLS) as a baseline for a case study. Sensitivity analyses are performed to investigate the impact of variations in failure rate estimates and epistemic uncertainty on spares mass. The results of these analyses and their implications for future system design and mission planning are discussed.
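As a minimal sketch of the Bayesian refinement mentioned above, the standard gamma-Poisson conjugate update of a failure rate takes one line; the prior parameters and observed operating experience below are illustrative assumptions, not ISS values.

from scipy import stats

alpha0, beta0 = 2.0, 2.0e5   # prior shape and prior exposure in hours (assumed)
failures, hours = 3, 5.0e5   # observed failures over accumulated operating hours

# Conjugate update: a gamma(alpha, beta) prior with Poisson failures over
# exposure T gives a gamma(alpha + k, beta + T) posterior on the failure rate.
posterior = stats.gamma(a=alpha0 + failures, scale=1.0 / (beta0 + hours))
print(posterior.mean(), posterior.interval(0.9))  # updated rate, 90% interval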
NASA Technical Reports Server (NTRS)
Anderson, Leif F.; Harrington, Sean P.; Omeke, Ojei, II; Schwaab, Douglas G.
2009-01-01
This is a case study on revised estimates of induced failure for International Space Station (ISS) on-orbit replacement units (ORUs). We devise a heuristic to leverage operational experience data by aggregating ORU, associated function (vehicle sub-system), and vehicle "effective" k-factors using actual failure experience. With this input, we determine a significant failure threshold and minimize the difference between the actual and predicted failure rates. We conclude with a discussion of both qualitative and quantitative improvements to the heuristic methods and their potential benefits to ISS supportability engineering analysis.
Bayesian inference for disease prevalence using negative binomial group testing
Pritchard, Nicholas A.; Tebbs, Joshua M.
2011-01-01
Group testing, also known as pooled testing, and inverse sampling are both widely used methods of data collection when the goal is to estimate a small proportion. Taking a Bayesian approach, we consider the new problem of estimating disease prevalence from group testing when inverse (negative binomial) sampling is used. Using different distributions to incorporate prior knowledge of disease incidence and different loss functions, we derive closed form expressions for posterior distributions and resulting point and credible interval estimators. We then evaluate our new estimators, on Bayesian and classical grounds, and apply our methods to a West Nile Virus data set. PMID:21259308
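A grid approximation shows the shape of such a posterior; this sketch assumes a perfect assay, pools of fixed size tested until a set number are positive, and a uniform Beta(1,1) prior. It only mimics numerically the closed-form results derived in the paper, and all numbers are illustrative.

import numpy as np

k, m, y = 10, 5, 40               # pool size; positive pools required; negatives seen
p = np.linspace(1e-6, 0.2, 2000)  # prevalence grid
theta = 1.0 - (1.0 - p) ** k      # P(a pool tests positive) under a perfect assay

# Negative binomial likelihood in theta, times a uniform prior on p.
post = theta ** m * (1.0 - theta) ** y
dp = p[1] - p[0]
post /= post.sum() * dp           # normalize on the grid
print(p[np.argmax(post)], (p * post).sum() * dp)  # posterior mode and mean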
Use of negative binomial distribution to describe the presence of Anisakis in Thyrsites atun.
Peña-Rehbein, Patricio; De los Ríos-Escalante, Patricio
2012-01-01
Nematodes of the genus Anisakis have marine fishes as intermediate hosts. One of these hosts is Thyrsites atun, an important fishery resource in Chile between 38 and 41° S. This paper describes the frequency and number of Anisakis nematodes in the internal organs of Thyrsites atun. An analysis based on spatial distribution models showed that the parasites tend to be clustered. The variation in the number of parasites per host could be described by the negative binomial distribution. The maximum observed number was nine parasites per host. The environmental and zoonotic aspects of the study are also discussed.
Kadam, Shantanu; Vanka, Kumar
2013-02-15
Methods based on the stochastic formulation of chemical kinetics have the potential to accurately reproduce the dynamical behavior of various biochemical systems of interest. However, the computational expense makes them impractical for the study of real systems. Attempts to render these methods practical have led to the development of accelerated methods, where the reaction numbers are modeled by Poisson random numbers. However, for certain systems, such methods give rise to physically unrealistic negative numbers for species populations. The methods which make use of binomial variables, in place of Poisson random numbers, have since become popular, and have been partially successful in addressing this problem. In this manuscript, the development of two new computational methods, based on the representative reaction approach (RRA), has been discussed. The new methods endeavor to solve the problem of negative numbers, by making use of tools like the stochastic simulation algorithm and the binomial method, in conjunction with the RRA. It is found that these newly developed methods perform better than other binomial methods used for stochastic simulations, in resolving the problem of negative populations. Copyright © 2012 Wiley Periodicals, Inc.
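The key property of binomial-based leaping, that a reaction channel cannot fire more times than its reactant population allows, fits in a few lines; this is a generic single-channel sketch for a degradation reaction, not the representative reaction approach itself.

import numpy as np

rng = np.random.default_rng(2)

def binomial_leap_step(x, c, tau):
    """One tau-leap of the degradation reaction X -> 0 with propensity c*x.
    Drawing the firing count from Binomial(x, p) bounds it by x, so the
    population can never go negative, unlike a Poisson(c*x*tau) draw."""
    p = 1.0 - np.exp(-c * tau)   # per-molecule firing probability over tau
    return x - rng.binomial(x, p)

x = 50
for _ in range(200):
    x = binomial_leap_step(x, c=0.1, tau=0.05)
print(x)  # remains >= 0 by construction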
Dorazio, Robert M; Martin, Julien; Edwards, Holly H
2013-07-01
The class of N-mixture models allows abundance to be estimated from repeated, point count surveys while adjusting for imperfect detection of individuals. We developed an extension of N-mixture models to account for two commonly observed phenomena in point count surveys: rarity and lack of independence induced by unmeasurable sources of variation in the detectability of individuals. Rarity increases the number of locations with zero detections in excess of those expected under simple models of abundance (e.g., Poisson or negative binomial). Correlated behavior of individuals and other phenomena, though difficult to measure, increases the variation in detection probabilities among surveys. Our extension of N-mixture models includes a hurdle model of abundance and a beta-binomial model of detectability that accounts for additional (extra-binomial) sources of variation in detections among surveys. As an illustration, we fit this model to repeated point counts of the West Indian manatee, which was observed in a pilot study using aerial surveys. Our extension of N-mixture models provides increased flexibility. The effects of different sets of covariates may be estimated for the probability of occurrence of a species, for its mean abundance at occupied locations, and for its detectability.
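For orientation, the basic N-mixture likelihood that this class of models builds on (Poisson abundance, binomial detection) can be evaluated by summing out the latent abundance up to a truncation point; the hurdle and beta-binomial extensions described above are omitted, and the counts below are made up.

import numpy as np
from scipy import stats

def nmixture_loglik(counts, lam, p, n_max=100):
    """counts: sites x visits array of detections; lam: mean abundance;
    p: per-individual detection probability; n_max: truncation for N."""
    N = np.arange(n_max + 1)
    prior_N = stats.poisson.pmf(N, lam)           # P(N) at each site
    ll = 0.0
    for y in counts:                              # one site at a time
        lik = np.prod(stats.binom.pmf(y[:, None], N[None, :], p), axis=0)
        ll += np.log(np.sum(lik * prior_N))
    return ll

counts = np.array([[0, 1, 0], [2, 1, 3], [0, 0, 0]])
print(nmixture_loglik(counts, lam=1.5, p=0.4))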
Modelling infant mortality rate in Central Java, Indonesia, using the generalized Poisson regression method
NASA Astrophysics Data System (ADS)
Prahutama, Alan; Sudarno
2018-05-01
The infant mortality rate is the number of deaths under one year of age occurring among the live births in a given geographical area during a given year, per 1,000 live births occurring among the population of the given geographical area during the same year. This problem needs to be addressed because it is an important element of a country’s economic development. A high infant mortality rate will disrupt the stability of a country because it relates to the sustainability of the country’s population. One regression model that can be used to analyze the relationship between a discrete dependent variable Y and independent variables X is the Poisson regression model. Regression models commonly used for discrete dependent variables include Poisson regression, negative binomial regression, and generalized Poisson regression. In this research, generalized Poisson regression modeling gives a better AIC value than Poisson regression. The most significant variable is the number of health facilities (X1), while the variable that gives the most influence on the infant mortality rate is the average breastfeeding (X9).
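As a sketch of the comparison described, statsmodels offers both Poisson and generalized Poisson count models whose AIC values can be compared directly; the simulated overdispersed counts and covariates below stand in for the real data.

import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.discrete_model import GeneralizedPoisson, Poisson

rng = np.random.default_rng(3)
X = sm.add_constant(rng.normal(size=(200, 2)))
y = rng.negative_binomial(5, 0.5, size=200)  # overdispersed counts (illustrative)

for Model in (Poisson, GeneralizedPoisson):
    res = Model(y, X).fit(disp=0)
    print(Model.__name__, res.aic)  # lower AIC indicates the better fit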
Pearl, D L; Louie, M; Chui, L; Doré, K; Grimsrud, K M; Martin, S W; Michel, P; Svenson, L W; McEwen, S A
2009-10-01
Using negative binomial and multi-level Poisson models, the authors determined the statistical significance of agricultural and socio-economic risk factors for rates of reported disease associated with Escherichia coli O157 in census subdivisions (CSDs) in Alberta, Canada, 2000-2002. Variables relating to population stability, aboriginal composition of the CSDs, and the economic relationship between CSDs and urban centres were significant risk factors. The percentage of individuals living in low-income households was not a statistically significant risk factor for rates of disease. The statistical significance of cattle density, recorded at a higher geographical level, depended on the method used to correct for overdispersion, the number of levels included in the multi-level models, and the choice of using all reported cases or only sporadic cases. Our results highlight the importance of local socio-economic risk factors in determining rates of disease associated with E. coli O157, but their relationship with individual risk factors requires further evaluation.
Kessell, Eric R.; Alvidrez, Jennifer; McConnell, William A.; Shumway, Martha
2010-01-01
Objective This study investigated the association between San Francisco neighborhoods’ racial/ethnic residential composition and the rate of mental-health-related 911 calls. Methods Calls to the San Francisco 911 system from January 2001 through June 2003 (n=1,341,608) were divided into mental-health-related and other calls. Police sector data in the call records were overlaid onto U.S. Census tracts to estimate sector demographic and socioeconomic characteristics. Negative binomial regression was used to estimate the association between black, Asian, Latino and white resident percentage and rates of mental-health-related calls. Results Percent of black residents was associated with a lower rate of mental-health-related calls (IRR=.99, 95% CI .98–1.00). Percent of Asian and Latino residents had no significant effect. Conclusions The observed relationship between black residents and mental-health-related calls is not consistent with known emergency mental health service utilization patterns. The paradox between underutilization of the 911 system and overutilization of psychiatric emergency services deserves further investigation. PMID:19797379
Predicting Quarantine Failure Rates
2004-01-01
Preemptive quarantine through contact-tracing effectively controls emerging infectious diseases. Occasionally this quarantine fails, however, and infected persons are released. The probability of quarantine failure is typically estimated from disease-specific data. Here a simple, exact estimate of the failure rate is derived that does not depend on disease-specific parameters. This estimate is universally applicable to all infectious diseases. PMID:15109418
Harries, Anthony D.; Kumar, Ajay M. V.; Oo, Myo Minn; Kyaw, Khine Wut Yee; Win, Than; Aung, Thet Ko; Min, Aung Chan; Oo, Htun Nyunt
2017-01-01
Background The number of people living with HIV on antiretroviral treatment (ART) in Myanmar has been increasing rapidly in recent years. This study aimed to estimate rates of virological failure on first-line ART and switching to second-line ART due to treatment failure at the Integrated HIV Care program (IHC). Methods Routinely collected data of all adolescent and adult patients living with HIV who were initiated on first-line ART at IHC between 2005 and 2015 were retrospectively analyzed. The cumulative hazard of virological failure on first-line ART and switching to second-line ART were estimated. Crude and adjusted hazard ratios were calculated using the Cox regression model to identify risk factors associated with the two outcomes. Results Of 23,248 adults and adolescents, 7,888 (34%) were tested for HIV viral load. The incidence rate of virological failure among those tested was 3.2 per 100 person-years follow-up and the rate of switching to second-line ART among all patients was 1.4 per 100 person-years follow-up. Factors associated with virological failure included: being adolescent; being lost to follow-up at least once; having WHO stage 3 and 4 at ART initiation; and having taken first-line ART elsewhere before coming to IHC. Of the 1032 patients who met virological failure criteria, 762 (74%) switched to second-line ART. Conclusions We found high rates of virological failure among one third of patients in the cohort who were tested for viral load. Of those failing virologically on first-line ART, about one quarter were not switched to second-line ART. Routine viral load monitoring, especially for those identified as having a higher risk of treatment failure, should be considered in this setting to detect all patients failing on first-line ART. Strategies also need to be put in place to prevent treatment failure and to treat more of those patients who are actually failing. PMID:28182786
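The rates quoted above are of the events-per-100-person-years form; a minimal sketch of that computation with an exact Poisson confidence interval follows, using illustrative counts rather than the cohort's actual numbers.

from scipy import stats

events, person_years = 250, 7800.0  # illustrative counts
rate = 100.0 * events / person_years

# Exact Poisson 95% CI for the rate via the chi-square relationship.
lo = 100.0 * stats.chi2.ppf(0.025, 2 * events) / (2.0 * person_years)
hi = 100.0 * stats.chi2.ppf(0.975, 2 * (events + 1)) / (2.0 * person_years)
print(rate, (lo, hi))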
Kong, Melissa H; Shaw, Linda K; O'Connor, Christopher; Califf, Robert M; Blazing, Michael A; Al-Khatib, Sana M
2010-07-01
Although no clinical trial data exist on the optimal management of atrial fibrillation (AF) in patients with diastolic heart failure, it has been hypothesized that rhythm-control is more advantageous than rate-control due to the dependence of these patients' left ventricular filling on atrial contraction. We aimed to determine whether patients with AF and heart failure with preserved ejection fraction (EF) survive longer with rhythm versus rate-control strategy. The Duke Cardiovascular Disease Database was queried to identify patients with EF > 50%, heart failure symptoms and AF between January 1,1995 and June 30, 2005. We compared baseline characteristics and survival of patients managed with rate- versus rhythm-control strategies. Using a 60-day landmark view, Kaplan-Meier curves were generated and results were adjusted for baseline differences using Cox proportional hazards modeling. Three hundred eighty-two patients met the inclusion criteria (285 treated with rate-control and 97 treated with rhythm-control). The 1-, 3-, and 5-year survival rates were 93.2%, 69.3%, and 56.8%, respectively in rate-controlled patients and 94.8%, 78.0%, and 59.9%, respectively in rhythm-controlled patients (P > 0.10). After adjustments for baseline differences, no significant difference in mortality was detected (hazard ratio for rhythm-control vs rate-control = 0.696, 95% CI 0.453-1.07, P = 0.098). Based on our observational data, rhythm-control seems to offer no survival advantage over rate-control in patients with heart failure and preserved EF. Randomized clinical trials are needed to verify these findings and examine the effect of each strategy on stroke risk, heart failure decompensation, and quality of life.
Cyclical absenteeism among private sector, public sector and self-employed workers.
Pfeifer, Christian
2013-03-01
This research note analyzes differences in the number of absent working days and doctor visits and in their cyclicality between private sector, public sector and self-employed workers. For this purpose, I used large-scale German survey data for the years 1995 to 2007 to estimate random effects negative binomial (count data) models. The main findings are as follows. (i) Public sector workers have on average more absent working days than private sector and self-employed workers. Self-employed workers have fewer absent working days and doctor visits than dependent employed workers. (ii) The regional unemployment rate is on average negatively correlated with the number of absent working days among private and public sector workers as well as among self-employed men. The correlations between regional unemployment rate and doctor visits are only significantly negative among private sector workers. Copyright © 2012 John Wiley & Sons, Ltd.
Bio-Ecology of the Louse, Upupicola upupae, Infesting the Common Hoopoe, Upupa epops
Agarwal, G. P; Ahmad, Aftab; Rashmi, Archna; Arya, Gaurav; Bansal, Nayanci; Saxena, A.K.
2011-01-01
The population characteristics of the louse, Upupicola upupae (Shrank) (Mallophaga: Philopteridae: Ischnocera), infesting the Common Hoopoe, Upupa epops L. (Aves: Upupiformes), were recorded during 2007–08 in District Rampur, Uttar Pradesh, India. The pattern of frequency distribution of the louse conformed to the negative binomial model. The lice and their nits were reared in vitro at 35 ± 1°C and 75–82% RH on a feather diet. The data obtained were used to construct a life table and to determine the intrinsic rate of natural increase (0.035 female/day); the net reproductive rate was 3.67 female eggs/female, the generation time was 37 days, and the doubling time of the population was 19 days. The chaetotaxy of the three nymphal instars has also been noted to record their diagnostic characteristics. Information on egg morphology and antennal sensilla is also presented. PMID:21861650
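The reported life-table quantities are mutually consistent, as a quick check shows: with net reproductive rate R0 and generation time T, the intrinsic rate of natural increase is r = ln(R0)/T and the doubling time is ln(2)/r.

import math

R0, T = 3.67, 37.0          # female eggs/female; generation time in days
r = math.log(R0) / T        # about 0.035 per day, matching the abstract
print(r, math.log(2) / r)   # doubling time of roughly 19-20 days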
Limits on rock strength under high confinement
NASA Astrophysics Data System (ADS)
Renshaw, Carl E.; Schulson, Erland M.
2007-06-01
Understanding of deep earthquake source mechanisms requires knowledge of failure processes active under high confinement. Under low confinement the compressive strength of rock is well known to be limited by frictional sliding along stress-concentrating flaws. Under higher confinement strength is usually assumed limited by power-law creep associated with the movement of dislocations. In a review of existing experimental data, we find that when the confinement is high enough to suppress frictional sliding, rock strength increases as a power-law function only up to a critical normalized strain rate. Within the regime where frictional sliding is suppressed and the normalized strain rate is below the critical rate, both globally distributed ductile flow and localized brittle-like failure are observed. When frictional sliding is suppressed and the normalized strain rate is above the critical rate, failure is always localized in a brittle-like manner at a stress that is independent of the degree of confinement. Within the high-confinement, high-strain rate regime, the similarity in normalized failure strengths across a variety of rock types and minerals precludes both transformational faulting and dehydration embrittlement as strength-limiting mechanisms. The magnitude of the normalized failure strength corresponding to the transition to the high-confinement, high-strain rate regime and the observed weak dependence of failure strength on strain rate within this regime are consistent with a localized Peierls-type strength-limiting mechanism. At the highest strain rates the normalized strengths approach the theoretical limit for crystalline materials. Near-theoretical strengths have previously been observed only in nano- and micro-scale regions of materials that are effectively defect-free. Results are summarized in a new deformation mechanism map revealing that when confinement and strain rate are sufficient, strengths approaching the theoretical limit can be achieved in cm-scale sized samples of rocks rich in defects. Thus, non-frictional failure processes must be considered when interpreting rock deformation data collected under high confinement and low temperature. Further, even at higher temperatures the load-bearing ability of crustal rocks under high confinement may not be limited by a frictional process under typical geologic strain rates.
Adams, Rachel Sayko; Larson, Mary Jo; Corrigan, John D.; Ritter, Grant A.; Williams, Thomas V.
2013-01-01
This study used the 2008 Department of Defense Survey of Health Related Behaviors among Active Duty Military Personnel to determine whether traumatic brain injury (TBI) is associated with past year drinking-related consequences. The study sample included currently-drinking personnel who had a combat deployment in the past year and were home for ≥6 months (N = 3,350). Negative binomial regression models were used to assess the incidence rate ratios of consequences, by TBI-level. Experiencing a TBI with a loss of consciousness >20 minutes was significantly associated with consequences independent of demographics, combat exposure, posttraumatic stress disorder, and binge drinking. The study’s limitations are noted. PMID:23869456
Liu, L; Peng, D B; Liu, Y; Deng, W N; Liu, Y L; Li, J J
2001-05-01
To study changes in DNA content in rat kidney cells and their relationship with the postmortem interval (PMI), seven parameters of the cell nucleus, including area and integrated optical density, were chosen, and the changes in DNA content in the kidney cells of 15 rats were determined at different intervals between 0 and 48 h postmortem with an automated TV-image analysis system. The degradation rate of nuclear DNA shows a definite relationship to the early PMI (within 48 h) in rats, yielding a binomial regression equation. Determining the quantity of nuclear DNA should be an objective and accurate way to estimate the PMI.
On a Stochastic Failure Model under Random Shocks
NASA Astrophysics Data System (ADS)
Cha, Ji Hwan
2013-02-01
In most conventional settings, the events caused by an external shock are initiated at the moments of its occurrence. In this paper, we study a new class of shock models, where each shock from a nonhomogeneous Poisson process can trigger a failure of a system not immediately, as in classical extreme shock models, but with a delay of some random time. We derive the corresponding survival and failure rate functions. Furthermore, we study the limiting behaviour of the failure rate function where it is applicable.
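A Monte Carlo sketch of this construction is straightforward: simulate the nonhomogeneous Poisson shock process by thinning, attach an independent random delay to each shock, and take the earliest triggered time as the system failure time. The intensity function and delay distribution below are illustrative assumptions, not those of the paper.

import numpy as np

rng = np.random.default_rng(5)

def nhpp_times(rate_fn, rate_max, horizon):
    """Arrival times of a nonhomogeneous Poisson process by thinning;
    rate_max must dominate rate_fn on [0, horizon]."""
    t, out = 0.0, []
    while True:
        t += rng.exponential(1.0 / rate_max)
        if t > horizon:
            return np.array(out)
        if rng.random() < rate_fn(t) / rate_max:
            out.append(t)

def failure_time(horizon=100.0):
    shocks = nhpp_times(lambda t: 0.05 + 0.001 * t, 0.15, horizon)
    if shocks.size == 0:
        return np.inf
    delays = rng.exponential(10.0, size=shocks.size)  # random trigger delays
    return np.min(shocks + delays)

samples = np.array([failure_time() for _ in range(10_000)])
print(np.mean(samples <= 50.0))  # empirical failure probability by t = 50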
Relation between lowered colloid osmotic pressure, respiratory failure, and death.
Tonnesen, A S; Gabel, J C; McLeavey, C A
1977-01-01
Plasma colloid osmotic pressure was measured each day in 84 intensive care unit patients. Probit analysis demonstrated a direct relationship between colloid osmotic pressure (COP) and survival. The COP associated with a 50% survival rate was 15.0 torr. COP was higher in survivors than in nonsurvivors without respiratory failure and in patients who recovered from respiratory failure. We conclude that lowered COP is associated with an elevated mortality rate. However, the relationship to death is not explained by the relationship to respiratory failure.
Hardcastle, Thomas J
2016-01-15
High-throughput data are now commonplace in biological research. Rapidly changing technologies and applications mean that novel methods for detecting differential behaviour that account for a 'large P, small n' setting are required at an increasing rate. The development of such methods is, in general, being done on an ad hoc basis, requiring further development cycles and a lack of standardization between analyses. We present here a generalized method for identifying differential behaviour within high-throughput biological data through empirical Bayesian methods. This approach is based on our baySeq algorithm for identification of differential expression in RNA-seq data based on a negative binomial distribution, and in paired data based on a beta-binomial distribution. Here we show how the same empirical Bayesian approach can be applied to any parametric distribution, removing the need for lengthy development of novel methods for differently distributed data. Comparisons with existing methods developed to address specific problems in high-throughput biological data show that these generic methods can achieve equivalent or better performance. A number of enhancements to the basic algorithm are also presented to increase flexibility and reduce computational costs. The methods are implemented in the R baySeq (v2) package, available on Bioconductor http://www.bioconductor.org/packages/release/bioc/html/baySeq.html. tjh48@cam.ac.uk Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Kakudate, Naoki; Yokoyama, Yoko; Sumida, Futoshi; Matsumoto, Yuki; Gordan, Valeria V; Gilbert, Gregg H
2017-02-01
The objectives of this study were to: (1) examine differences in the use of dental clinical practice guidelines among Japanese dentists, and (2) identify characteristics associated with the number of guidelines used by participating dentists. We conducted a cross-sectional study consisting of a questionnaire survey in Japan between July 2014 and May 2015. The study queried dentists working in outpatient dental practices who are affiliated with the Dental Practice-Based Research Network Japan (n = 148). They were asked whether they have used each of 15 Japanese dental clinical guidelines. Associations between the number of guidelines used by participants and specific characteristics were analysed via negative binomial regression analysis. The mean number of guidelines used by participating dentists was 2.5 ± 2.9 [standard deviation (SD)]. Rate of use of guidelines showed substantial variation, from 5% to 34% among dentists. The proportion of dentists that used guidelines was the highest among oral medicine specialists, who had the highest proportion for 10 of 15 guidelines. Negative binomial regression analysis identified three factors significantly associated with the number of guidelines used: 'years since graduation from dental school', 'specialty practice' and 'practice busyness'. These results suggest that the use of clinical practice guidelines by Japanese dentists may still be inadequate. Training in the use of the guidelines could be given to dental students as undergraduate education and to young clinicians as continuing education. © 2016 John Wiley & Sons, Ltd.
Burkness, Eric C; Hutchison, W D
2009-10-01
Populations of cabbage looper, Trichoplusia ni (Lepidoptera: Noctuidae), were sampled in experimental plots and commercial fields of cabbage (Brassica spp.) in Minnesota during 1998-1999 as part of a larger effort to implement an integrated pest management program. Using a resampling approach and Wald's sequential probability ratio test, sampling plans with different sampling parameters were evaluated using independent presence/absence and enumerative data. Evaluations and comparisons of the different sampling plans were made based on the operating characteristic and average sample number functions generated for each plan and through the use of a decision probability matrix. Values for upper and lower decision boundaries, sequential error rates (alpha, beta), and tally threshold were modified to determine parameter influence on the operating characteristic and average sample number functions. The following parameters resulted in the most desirable operating characteristic and average sample number functions: action threshold of 0.1 proportion of plants infested, tally threshold of 1, alpha = beta = 0.1, upper boundary of 0.15, lower boundary of 0.05, and resampling with replacement. We found that sampling parameters can be modified and evaluated using resampling software to achieve desirable operating characteristic and average sample number functions. Moreover, management of T. ni by using binomial sequential sampling should provide a good balance between cost and reliability by minimizing sample size and maintaining a high level of correct decisions (>95%) to treat or not treat.
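With the parameter values reported above (lower boundary 0.05, upper boundary 0.15, alpha = beta = 0.1), the stop lines of Wald's sequential probability ratio test can be computed directly; this reproduces the standard SPRT boundaries only, not the authors' resampling evaluation.

import math

p0, p1, alpha, beta = 0.05, 0.15, 0.10, 0.10
g = math.log(p1 / p0) + math.log((1 - p0) / (1 - p1))
s = math.log((1 - p0) / (1 - p1)) / g  # common slope of both decision lines
h0 = math.log((1 - alpha) / beta) / g  # intercept of the "do not treat" line
h1 = math.log((1 - beta) / alpha) / g  # intercept of the "treat" line

# After n plants, stop and do not treat if the infested count d <= s*n - h0;
# stop and treat if d >= s*n + h1; otherwise keep sampling.
for n in (10, 20, 50):
    print(n, s * n - h0, s * n + h1)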
Linear algebra of the permutation invariant Crow-Kimura model of prebiotic evolution.
Bratus, Alexander S; Novozhilov, Artem S; Semenov, Yuri S
2014-10-01
A particular case of the famous quasispecies model - the Crow-Kimura model with a permutation invariant fitness landscape - is investigated. Using the fact that the mutation matrix in the case of a permutation invariant fitness landscape has a special tridiagonal form, a change of the basis is suggested such that in the new coordinates a number of analytical results can be obtained. In particular, using the eigenvectors of the mutation matrix as the new basis, we show that the quasispecies distribution approaches a binomial one and give simple estimates for the speed of convergence. Another consequence of the suggested approach is a parametric solution to the system of equations determining the quasispecies. Using this parametric solution we show that our approach leads to exact asymptotic results in some cases, which are not covered by the existing methods. In particular, we are able to present not only the limit behavior of the leading eigenvalue (mean population fitness), but also the exact formulas for the limit quasispecies eigenvector for special cases. For instance, this eigenvector has a geometric distribution in the case of the classical single peaked fitness landscape. On the biological side, we propose a mathematical definition, based on the closeness of the quasispecies to the binomial distribution, which can be used as an operational definition of the notorious error threshold. Using this definition, we suggest two approximate formulas to estimate the critical mutation rate after which the quasispecies delocalization occurs. Copyright © 2014 Elsevier Inc. All rights reserved.
Failure analysis and modeling of a VAXcluster system
NASA Technical Reports Server (NTRS)
Tang, Dong; Iyer, Ravishankar K.; Subramani, Sujatha S.
1990-01-01
This paper discusses the results of a measurement-based analysis of real error data collected from a DEC VAXcluster multicomputer system. In addition to evaluating basic system dependability characteristics such as error and failure distributions and hazard rates for both individual machines and for the VAXcluster, reward models were developed to analyze the impact of failures on the system as a whole. The results show that more than 46 percent of all failures were due to errors in shared resources. This is despite the fact that these errors have a recovery probability greater than 0.99. The hazard rate calculations show that not only errors, but also failures occur in bursts. Approximately 40 percent of all failures occur in bursts and involved multiple machines. This result indicates that correlated failures are significant. Analysis of rewards shows that software errors have the lowest reward (0.05 vs 0.74 for disk errors). The expected reward rate (reliability measure) of the VAXcluster drops to 0.5 in 18 hours for the 7-out-of-7 model and in 80 days for the 3-out-of-7 model.
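Under an independence assumption, the k-out-of-7 structure used in these reward models reduces to a binomial tail probability; the per-machine reliability in this sketch is an assumed exponential, not a value estimated from the VAXcluster data.

import math
from scipy import stats

def k_of_n(k, n, r):
    """Probability that at least k of n independent machines are up,
    each with reliability r."""
    return float(stats.binom.sf(k - 1, n, r))

r = math.exp(-0.01 * 24)  # machine survives 24 h at an assumed 0.01/h rate
print(k_of_n(7, 7, r), k_of_n(3, 7, r))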
Historic and Current Launcher Success Rates
NASA Technical Reports Server (NTRS)
Rust, Randy
2002-01-01
This presentation reviews historic and current space launcher success rates from all nations with a mature launcher industry. Data from the 1950s through the present day are reviewed for possible trends, such as when in the launch timeline a failure occurred, which stages had the highest failure rate, overall launcher reliability, a decade-by-decade look at launcher reliability, when in a launcher's history failures occurred, and the reliability of United States human-rated launchers. This information is useful in determining where launcher reliability can be improved and where additional measures for crew survival (i.e., crew escape systems) will have the greatest emphasis.
Labranche, D; Mestre-Fernandes, C; Delahaye, F; Sanchez, S
2016-11-01
Heart failure is a public health problem affecting one million French patients, who are particularly prone to rehospitalisation for this chronic pathology. A specific healthcare network was created to take care of patients with heart failure directly at home. This healthcare network (named VISage) provides specific monitoring adapted to heart failure. The aim of this study was to evaluate the impact of the healthcare network on the rehospitalisation rate of patients with heart failure. We conducted a retrospective cohort study with patients' hospital files from the CH Vienne. Patients who were included in our healthcare network (VISage) were screened. The primary endpoint was the 30-day, 6-month, and 1-year rehospitalisation rate for heart failure before and after using the healthcare network. One hundred and four patients with comorbidities were included between February 2009 and November 2015. A significant reduction of the rehospitalisation rate for heart failure was observed before versus after the network: 0.65 (±0.52) vs. 0.17 (±0.43) at 30 days, 1.17 (±0.74) vs. 0.42 (±0.71) at 6 months, and 1.35 (±0.95) vs. 0.47 (±0.74) at 1 year (P<0.0001). Results were also significant for the global rehospitalisation rate. No significant differences were shown in hospital length of stay. Coordinated healthcare at home by a specific network is beneficial for the rehospitalisation rate in the elderly. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
Local Failure in Resected N1 Lung Cancer: Implications for Adjuvant Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Higgins, Kristin A., E-mail: kristin.higgins@duke.edu; Chino, Junzo P.; Berry, Mark
2012-06-01
Purpose: To evaluate actuarial rates of local failure in patients with pathologic N1 non-small-cell lung cancer and to identify clinical and pathologic factors associated with an increased risk of local failure after resection. Methods and Materials: All patients who underwent surgery for non-small-cell lung cancer with pathologically confirmed N1 disease at Duke University Medical Center from 1995-2008 were identified. Patients receiving any preoperative therapy or postoperative radiotherapy or with positive surgical margins were excluded. Local failure was defined as disease recurrence within the ipsilateral hilum, mediastinum, or bronchial stump/staple line. Actuarial rates of local failure were calculated with the Kaplan-Meier method. A Cox multivariate analysis was used to identify factors independently associated with a higher risk of local recurrence. Results: Among 1,559 patients who underwent surgery during the time interval, 198 met the inclusion criteria. Of these patients, 50 (25%) received adjuvant chemotherapy. Actuarial (5-year) rates of local failure, distant failure, and overall survival were 40%, 55%, and 33%, respectively. On multivariate analysis, factors associated with an increased risk of local failure included a video-assisted thoracoscopic surgery approach (hazard ratio [HR], 2.5; p = 0.01), visceral pleural invasion (HR, 2.1; p = 0.04), and increasing number of positive N1 lymph nodes (HR, 1.3 per involved lymph node; p = 0.02). Chemotherapy was associated with a trend toward decreased risk of local failure that was not statistically significant (HR, 0.61; p = 0.2). Conclusions: Actuarial rates of local failure in pN1 disease are high. Further investigation of conformal postoperative radiotherapy may be warranted.
Selecting a distributional assumption for modelling relative densities of benthic macroinvertebrates
Gray, B.R.
2005-01-01
The selection of a distributional assumption suitable for modelling macroinvertebrate density data is typically challenging. Macroinvertebrate data often exhibit substantially larger variances than expected under a standard count assumption, that of the Poisson distribution. Such overdispersion may derive from multiple sources, including heterogeneity of habitat (historically and spatially), differing life histories for organisms collected within a single collection in space and time, and autocorrelation. Taken to extreme, heterogeneity of habitat may be argued to explain the frequent large proportions of zero observations in macroinvertebrate data. Sampling locations may consist of habitats defined qualitatively as either suitable or unsuitable. The former category may yield random or stochastic zeroes and the latter structural zeroes. Heterogeneity among counts may be accommodated by treating the count mean itself as a random variable, while extra zeroes may be accommodated using zero-modified count assumptions, including zero-inflated and two-stage (or hurdle) approaches. These and linear assumptions (following log- and square root-transformations) were evaluated using 9 years of mayfly density data from a 52 km, ninth-order reach of the Upper Mississippi River (n = 959). The data exhibited substantial overdispersion relative to that expected under a Poisson assumption (i.e. variance:mean ratio = 23 ≫ 1), and 43% of the sampling locations yielded zero mayflies. Based on the Akaike Information Criterion (AIC), count models were improved most by treating the count mean as a random variable (via a Poisson-gamma distributional assumption) and secondarily by zero modification (i.e. improvements in AIC values = 9184 units and 47-48 units, respectively). Zeroes were underestimated by the Poisson, log-transform and square root-transform models, slightly by the standard negative binomial model but not by the zero-modified models (61%, 24%, 32%, 7%, and 0%, respectively). However, the zero-modified Poisson models underestimated small counts (1 ≤ y ≤ 4) and overestimated intermediate counts (7 ≤ y ≤ 23). Counts greater than zero were estimated well by zero-modified negative binomial models, while counts greater than one were also estimated well by the standard negative binomial model. Based on AIC and percent zero estimation criteria, the two-stage and zero-inflated models performed similarly. The above inferences were largely confirmed when the models were used to predict values from a separate, evaluation data set (n = 110). An exception was that, using the evaluation data set, the standard negative binomial model appeared superior to its zero-modified counterparts using the AIC (but not percent zero criteria). This and other evidence suggest that a negative binomial distributional assumption should be routinely considered when modelling benthic macroinvertebrate data from low flow environments. Whether negative binomial models should themselves be routinely examined for extra zeroes requires, from a statistical perspective, more investigation. However, this question may best be answered by ecological arguments that may be specific to the sampled species and locations. © 2004 Elsevier B.V. All rights reserved.
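As a sketch of this kind of distributional comparison, the following fits Poisson, negative binomial, and zero-inflated Poisson models to simulated zero-heavy counts and compares AIC values; the simulated data and single covariate stand in for the mayfly densities, and the zero-inflation part defaults to an intercept-only logit.

import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson
from statsmodels.discrete.discrete_model import NegativeBinomial, Poisson

rng = np.random.default_rng(4)
n = 500
X = sm.add_constant(rng.normal(size=(n, 1)))
lam = np.exp(0.5 + 0.8 * X[:, 1])
y = np.where(rng.random(n) < 0.4, 0, rng.poisson(lam))  # extra (structural) zeros

for Model in (Poisson, NegativeBinomial, ZeroInflatedPoisson):
    res = Model(y, X).fit(disp=0)
    print(Model.__name__, res.aic)  # lower AIC indicates the better fit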
Molecular Dynamics Modeling of PPTA Crystals in Aramid Fibers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mercer, Brian Scott
2016-05-19
In this work, molecular dynamics modeling is used to study the mechanical properties of PPTA crystallites, which are the fundamental microstructural building blocks of polymer aramid fibers such as Kevlar. Particular focus is given to constant strain rate axial loading simulations of PPTA crystallites, which is motivated by the rate-dependent mechanical properties observed in some experiments with aramid fibers. In order to accommodate the covalent bond rupture that occurs in loading a crystallite to failure, the reactive bond order force field ReaxFF is employed to conduct the simulations. Two major topics are addressed. The first is the general behavior of PPTA crystallites under strain rate loading. Constant strain rate loading simulations of crystalline PPTA reveal that the crystal failure strain increases with increasing strain rate, while the modulus is not affected by the strain rate. Increasing temperature lowers both the modulus and the failure strain. The simulations also identify the C-N bond connecting the aromatic rings as the weakest primary bond along the backbone of the PPTA chain. The effect of chain-end defects on PPTA micromechanics is explored, and it is found that the presence of a chain-end defect transfers load to the adjacent chains in the hydrogen-bonded sheet in which the defect resides, but does not influence the behavior of any other chains in the crystal. Chain-end defects are found to lower the strength of the crystal when clustered together, inducing bond failure via stress concentrations arising from the load transfer to bonds in adjacent chains near the defect site. The second topic addressed is the nature of primary and secondary bond failure in crystalline PPTA. Failure of both types of bonds is found to be stochastic in nature and driven by thermal fluctuations of the bonds within the crystal. A model is proposed which uses reliability theory to model bonds under constant strain rate loading as components with time-dependent failure rate functions. The model is shown to work well for predicting the onset of primary backbone bond failure, as well as the onset of secondary bond failure via chain slippage for the case of isolated non-interacting chain-end defects.
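The reliability-theory framing in the second part can be illustrated with a time-dependent failure-rate sketch: a bond loaded at a constant strain rate has a hazard that grows with accumulated strain, and its failure probability follows from the cumulative hazard. The exponential load dependence and all parameter values are assumptions for illustration, not fitted ReaxFF results.

import numpy as np

def failure_cdf(t, lam0, gamma, strain_rate):
    """P(bond has failed by time t) for a hazard lambda(t) that grows
    exponentially with strain: lambda(t) = lam0 * exp(gamma*strain_rate*t)."""
    a = gamma * strain_rate
    cum_hazard = lam0 * (np.exp(a * t) - 1.0) / a
    return 1.0 - np.exp(-cum_hazard)

t = np.linspace(0.0, 10.0, 6)
print(failure_cdf(t, lam0=1e-4, gamma=2.0, strain_rate=0.5))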
Katzman, Lee R; Hoover, Caroline K; Khalifa, Yousuf M; Jeng, Bennie H
2015-11-01
To evaluate the accuracy of eye bank-prepared precut donor corneas over time by comparing cut-failure rates and corneal thickness measurements in 2010 and 2013. A total of 2511 human corneas cut by a technician-operated mechanical microkeratome intended for endothelial keratoplasty were evaluated prospectively at one large eye bank facility in 2010 and in 2013. The endothelium was evaluated by slit lamp, and specular microscopy both before and after cutting was performed. Graft thickness as measured by pachymetry and/or optical coherence tomography was collected to assess the accuracy of the cut tissue. Cut-failure rates were compared between normal donor tissue and tissue with significant preexisting scarring. The combined cut-failure rate in 2010 and 2013 was 2.3% (23/1000) and 1.6% (24/1511), respectively (P = 0.23). The cut-failure rate among normal tissue in 2010 and 2013 was 2.0% (19/927) and 1.4% (19/1400), respectively (P = 0.24). The cut-failure rate among previously scarred tissue in 2010 and 2013 was 5.5% (4/73) and 4.5% (5/111), respectively (P = 0.74). The mean surgeon-requested graft thickness was 144.7 μm (range 100-150, SD 13.6) and 127.2 μm (range 75-150, SD 25.2) in 2010 and 2013, respectively (P < 0.0001). The mean deviation from target graft thickness was 21.3 μm (SD 16.3) and 13.6 μm (SD 12.5) in 2010 and 2013, respectively (P < 0.0001). From 2010 to 2013, the combined cut-failure rates trended toward improvement, while the accuracy of graft thickness improved. This study suggests that the accuracy and success rates of tissue preparation for endothelial keratoplasty improve with experience and volume.
Risk factors for in-hospital post-hip fracture mortality.
Frost, Steven A; Nguyen, Nguyen D; Black, Deborah A; Eisman, John A; Nguyen, Tuan V
2011-09-01
Approximately 10% of hip fracture patients die during hospitalization; however, it is not clear which risk factors contribute to the excess mortality. This study sought to examine risk factors for, and to develop a prognostic model for predicting, in-hospital mortality among hip fracture patients. We studied outcomes among 410 men and 1094 women with a hip fracture who were admitted to a major teaching hospital in Sydney (Australia) between 1997 and 2007. Clinical data, including concomitant illnesses, were obtained from inpatient data. The primary outcome of the study was in-hospital mortality regardless of length of stay. A log-binomial regression model was used to identify risk factors for in-hospital mortality. Using the identified risk factors, prognostic nomograms were developed for predicting an individual's short-term risk of mortality. The median duration of hospitalization was 9 days. During hospitalization, the risk of mortality was higher in men (9%) than in women (4%). After adjusting for multiple risk factors, increased risk of in-hospital mortality was associated with advancing age (rate ratio [RR] for each 10-year increase in age: 1.91; 95% confidence interval [CI]: 1.47 to 2.49), male sex (RR 2.13; 95% CI: 1.41 to 3.22), and the presence of comorbid conditions on admission (RR for one or more comorbid conditions vs. none: 2.30; 95% CI: 1.52 to 3.48). Specifically, the risk of mortality was increased in patients with pre-existing congestive heart failure (RR 3.02; 95% CI: 1.65 to 5.54) and liver disease (RR 4.75; 95% CI: 1.87 to 12.1). These factors collectively accounted for 69% of the risk for in-hospital mortality. A nomogram was developed from these risk factors to individualize the risk of in-hospital death following a hip fracture. The area under the receiver operating characteristic curve of the final model containing age, sex, and comorbid conditions was 0.76. These data suggest that among hip fracture patients, advancing age, male sex, and pre-existing concomitant diseases such as congestive heart failure and liver disease were the main risk factors for in-hospital mortality. The nomogram developed from this study can be used to convey useful prognostic information to help guide treatment decisions. Copyright © 2011 Elsevier Inc. All rights reserved.
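The abstract above names a log-binomial regression as the tool for estimating rate ratios. A minimal sketch of that model class follows, using statsmodels' GLM with a binomial family and log link on simulated data; the covariates, effect sizes, and data frame are invented for illustration and are not the study's.

```python
# Hedged sketch of a log-binomial model: binomial family + log link, so
# exponentiated coefficients are risk/rate ratios rather than odds ratios.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 1504
df = pd.DataFrame({
    "age10": rng.normal(8.0, 1.0, n),        # age in decades (hypothetical)
    "male": rng.integers(0, 2, n),
    "comorbid": rng.integers(0, 2, n),
})
risk = np.clip(0.005 * np.exp(0.65 * (df["age10"] - 8.0)
                              + 0.75 * df["male"] + 0.85 * df["comorbid"]), 0.0, 1.0)
df["died"] = (rng.random(n) < risk).astype(int)

fit = smf.glm("died ~ age10 + male + comorbid", data=df,
              family=sm.families.Binomial(link=sm.families.links.Log())).fit()
print(np.exp(fit.params))                    # risk ratios per covariate
```

Log-binomial models can fail to converge when fitted probabilities approach 1; the low baseline risk here keeps the sketch well-behaved.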
The impact of daily temperature on renal disease incidence: an ecological study.
Borg, Matthew; Bi, Peng; Nitschke, Monika; Williams, Susan; McDonald, Stephen
2017-10-27
Extremely high temperatures over many consecutive days have been linked to an increase in renal disease in several cities. This is becoming increasingly relevant as heatwaves become longer, more intense, and more frequent with climate change. This study aimed to extend the known relationship between daily temperature and kidney disease to the incidence of eight temperature-prone specific renal disease categories - total renal disease, urolithiasis, renal failure, acute kidney injury (AKI), chronic kidney disease (CKD), urinary tract infections (UTIs), lower urinary tract infections (LUTIs) and pyelonephritis. Daily data were acquired for maximum, minimum and average temperature over the period of 1 July 2003 to 31 March 2014 during the warm season (October to March) in Adelaide, South Australia. Data for daily admissions to all metropolitan hospitals for renal disease, including 83,519 emergency department admissions and 42,957 inpatient admissions, were also obtained. Renal outcomes were analyzed using time-stratified negative binomial regression models, with the results aggregated by day. Incidence rate ratios (IRR) and 95% confidence intervals (CI) were estimated for associations between the number of admissions and daily temperature. Increases in daily temperature per 1 °C were associated with an increased incidence for all renal disease categories except pyelonephritis. Minimum temperature was associated with the greatest increase in renal disease, followed by average temperature and then maximum temperature. A 1 °C increase in daily minimum temperature was associated with an increase in daily emergency department admissions for AKI (IRR 1.037, 95% CI: 1.026-1.048), renal failure (IRR 1.030, 95% CI: 1.022-1.039), CKD (IRR 1.017, 95% CI: 1.001-1.033), urolithiasis (IRR 1.015, 95% CI: 1.010-1.020), total renal disease (IRR 1.009, 95% CI: 1.006-1.011), UTIs (IRR 1.004, 95% CI: 1.000-1.007) and LUTIs (IRR 1.003, 95% CI: 1.000-1.006). An increased frequency of renal disease, including urolithiasis, acute kidney injury and urinary tract infections, is predicted with increasing temperatures from climate change. These results have clinical and public health implications for the management of renal diseases and demand tailored health services. Future research is warranted to analyze individual renal diseases with more comprehensive information regarding renal risk factors, and to examine mortality for specific renal diseases.
NASA Technical Reports Server (NTRS)
Kennedy, Barbara J.
2004-01-01
The purpose of this study is to compare the current Space Shuttle Ground Support Equipment (GSE) infrastructure with the proposed GSE infrastructure upgrade modification. The methodology includes analyzing the first prototype installation equipment at Launch Pad B, called the "Pathfinder". The study begins by comparing the failure rate of the current components associated with the Hardware Interface Module (HIM) at the Kennedy Space Center to the failure rate of the new Pathfinder components. Quantitative data were gathered specifically on HIM components and on the Pad B Hypergolic Fuel facility and Hypergolic Oxidizer facility areas, which have the upgraded Pathfinder equipment installed. The proposed upgrades include utilizing industrial control modules, software, and a fiber optic network. The results of this study provide evidence that there is a significant difference in the failure rates of the two studied infrastructure equipment components. There is also evidence that the support staff for each infrastructure system is not equal. A recommendation to continue with future upgrades is based on a significant reduction of failures in the newly installed ground system components.
Mechanisms, predictors, and trends of electrical failure of Riata leads.
Cheung, Jim W; Al-Kazaz, Mohamed; Thomas, George; Liu, Christopher F; Ip, James E; Bender, Seth R; Siddiqi, Faisal K; Markowitz, Steven M; Lerman, Bruce B
2013-10-01
Riata and Riata ST implantable cardioverter-defibrillator (ICD) leads have been shown to be prone to structural and electrical failure. To determine predictors, mechanisms, and temporal patterns of Riata/ST lead electrical failure. All 314 patients who underwent Riata/ST lead implantation at our institution with greater than or equal to 90 days of follow-up were studied. Kaplan-Meier analysis of lead survival was performed. Results from the returned-product analysis of explanted leads with electrical lead failure were recorded. During a median follow-up of 4.1 years, the Riata lead electrical failure rate was 6.6%. The rate of externalized conductors among failed leads was 57%. The engineering analysis of 10 explanted leads revealed 5 (50%) leads with electrical failure owing to breach of the ethylene tetrafluoroethylene conductor coating. Female gender (hazard ratio 2.7; 95% confidence interval 1.1-6.7; P = .04) and age (hazard ratio 0.95; 95% confidence interval 0.92-0.97; P < .001) were multivariate predictors of lead failure. Using log-log analysis, we noted that the rate of Riata lead failure initially increased with time as a power law with an exponent of 2.1, but leads surviving past 4 years had a linear pattern of lead failure with an exponent of 1.0. Younger age and female gender are independent predictors of Riata lead failure. Loss of integrity of conductor cables with ethylene tetrafluoroethylene coating is an important mode of electrical failure of the Riata lead. Further study of Riata lead failure trends is warranted to guide lead management. © 2013 Heart Rhythm Society. All rights reserved.
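For readers unfamiliar with the Kaplan-Meier analysis named above, here is a minimal sketch using the lifelines library; the follow-up times and failure indicator are simulated placeholders loosely calibrated to the cohort size and failure rate in the abstract, not the study's data.

```python
# Hedged sketch: Kaplan-Meier estimate of ICD-lead survival on synthetic data.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(0)
n = 314                                       # cohort size from the abstract
followup_years = rng.uniform(0.25, 8.0, n)    # hypothetical follow-up times
failed = rng.random(n) < 0.066                # ~6.6% electrical failures

kmf = KaplanMeierFitter()
kmf.fit(durations=followup_years, event_observed=failed, label="Riata/ST")
print(kmf.survival_function_.tail())          # estimated survival curve
```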
Kang, T W; Lee, M W; Hye, M J; Song, K D; Lim, S; Rhim, H; Lim, H K; Cha, D I
2014-12-01
To evaluate the technical feasibility of artificial ascites formation using an angiosheath before percutaneous radiofrequency ablation (RFA) for hepatic tumours and to determine predictive factors affecting the technical failure of artificial ascites formation. This retrospective study was approved by the institutional review board. One hundred and thirteen patients underwent percutaneous RFA of hepatic tumours after trying to make artificial ascites using an angiosheath to avoid collateral thermal damage. The technical success rate of making artificial ascites using an angiosheath and conversion rate to other techniques after initial failure of making artificial ascites were evaluated. The technical success rate for RFA was assessed. In addition, potential factors associated with technical failure including previous history of transcatheter arterial chemoembolization (TACE) or RFA, type of abdominal surgery, and adjacent perihepatic structures were reviewed. Predictive factors for the technical failure of artificial ascites formation were analysed using multivariate analysis. The technical success rates of artificial ascites formation by angiosheath and that of RFA were 84.1% (95/113) and 97.3% (110/113), respectively. The conversion rate to other techniques after the failure of artificial ascites formation using an angiosheath was 15.9% (18/113). Previous hepatic resection was the sole independent predictive factor affecting the technical failure of artificial ascites formation (p<0.001, odds ratio = 29.03, 95% confidence interval: 4.56-184.69). Making artificial ascites for RFA of hepatic tumours using an angiosheath was technically feasible in most cases. However, history of hepatic resection was a significant predictive factor affecting the technical failure of artificial ascites formation. Copyright © 2014 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
Failure rate of inferior alveolar nerve block among dental students and interns
AlHindi, Maryam; Rashed, Bayan; AlOtaibi, Noura
2016-01-01
Objectives: To report the failure rate of inferior alveolar nerve block (IANB) among dental students and interns, to identify causes of failure, to investigate awareness of different IANB techniques, and to report IANB-associated complications. Methods: A 3-page questionnaire containing 13 questions was distributed to a random sample of 350 third- to fifth-year students and interns at the College of Dentistry, King Saud University, Riyadh, Saudi Arabia in January 2011. It included demographic questions (age, gender, and academic level) and questions on IANB failure frequency and reasons, actions taken to overcome the failure, and awareness of different anesthetic techniques, supplementary techniques, and complications. Results: Of the 350 distributed questionnaires, 238 were returned (68% response rate). Most (85.7%) of the surveyed sample had experienced IANB failure once or twice. The participants attributed the failures most commonly (66.45%) to anatomical variations. The most common alternative technique used was intraligamentary injection (57.1%), although 42.8% of the sample never attempted any alternatives. A large portion of the sample stated that they either lacked both knowledge of and training in other techniques (44.9%), or that they had knowledge of them but not enough training to perform them (45.8%). Conclusion: To decrease IANB failure rates for dental students and interns, their knowledge of landmarks and anatomical variations, and their training in alternatives to IANB such as the Gow-Gates and Akinosi techniques, should be enhanced both theoretically and clinically in the dental curriculum. PMID:26739980
Abraham, William T
2013-06-01
Heart failure represents a major public health concern, associated with high rates of morbidity and mortality. A particular focus of contemporary heart failure management is reduction of hospital admission and readmission rates. While optimal medical therapy favourably impacts the natural history of the disease, devices such as cardiac resynchronization therapy devices and implantable cardioverter defibrillators have added incremental value in improving heart failure outcomes. These devices also enable remote patient monitoring via device-based diagnostics. Device-based measurement of physiological parameters, such as intrathoracic impedance and heart rate variability, provide a means to assess risk of worsening heart failure and the possibility of future hospitalization. Beyond this capability, implantable haemodynamic monitors have the potential to direct day-to-day management of heart failure patients to significantly reduce hospitalization rates. The use of a pulmonary artery pressure measurement system has been shown to significantly reduce the risk of heart failure hospitalization in a large randomized controlled study, the CardioMEMS Heart Sensor Allows Monitoring of Pressure to Improve Outcomes in NYHA Class III Heart Failure Patients (CHAMPION) trial. Observations from a pilot study also support the potential use of a left atrial pressure monitoring system and physician-directed patient self-management paradigm; these observations are under further investigation in the ongoing LAPTOP-HF trial. All these devices depend upon high-intensity remote monitoring for successful detection of parameter deviations and for directing and following therapy.
Nordgren, Lena; Söderlund, Anne
2015-01-01
Younger people with heart failure often experience poor self-rated health. Furthermore, poor self-rated health is associated with long-term sick leave and disability pension. Socio-demographic factors affect the ability to return to work. However, little is known about people on sick leave due to heart failure. The aim of this study was to investigate associations between self-rated health, mood, socio-demographic factors, sick leave compensation, encounters with healthcare professionals and social insurance officers, and self-estimated ability to return to work for people on sick leave due to heart failure. This population-based investigation had a cross-sectional design. Data were collected in Sweden in 2012 from two official registries and from a postal questionnaire. In total, 590 subjects, aged 23-67, responded (response rate 45.8%). Descriptive statistics, correlation analyses (Spearman bivariate analysis), and logistic regression analyses were used to investigate associations. Poor self-rated health was strongly associated with full sick leave compensation (OR = 4.1, p < .001). Self-rated health was moderately associated with low income (OR = 0.6, p = .003). Good self-rated health was strongly associated with positive encounters with healthcare professionals (OR = 3.0, p = .022) and with the impact of positive encounters with healthcare professionals on self-estimated ability to return to work (OR = 3.3, p < .001). People with heart failure are sick-listed for long periods of time and to a great extent receive disability pension. Not being able to work entails reduced quality of life. Positive encounters with healthcare professionals and social insurance officers can be supportive when people with heart failure struggle to remain in working life.
Scaling of coupled dilatancy-diffusion processes in space and time
NASA Astrophysics Data System (ADS)
Main, I. G.; Bell, A. F.; Meredith, P. G.; Brantut, N.; Heap, M.
2012-04-01
Coupled dilatancy-diffusion processes resulting from microscopically brittle damage due to precursory cracking have been observed in the laboratory and suggested as a mechanism for earthquake precursors. One reason precursors have proven elusive may be the scaling in space: recent geodetic and seismic data place strong limits on the spatial extent of the nucleation zone for recent earthquakes. Another may be the scaling in time: recent laboratory results on axi-symmetric samples show both a systematic decrease in circumferential extensional strain at failure and a delayed and sharper acceleration of the acoustic emission event rate as strain rate is decreased. Here we examine the scaling of such processes in time from laboratory to field conditions using brittle creep (constant stress loading) tests to failure, in an attempt to bridge part of the strain rate gap to natural conditions, and discuss the implications for forecasting the failure time. Dilatancy rate is strongly correlated with strain rate, and decreases to zero in the steady-state creep phase at strain rates around 10⁻⁹ s⁻¹ for a basalt from Mount Etna. The data are well described by a creep model based on the linear superposition of transient (decelerating) and accelerating micro-crack growth due to stress corrosion. The model produces good fits to the failure time in retrospect using the accelerating acoustic emission event rate, but in prospective tests on synthetic data with the same properties we find failure-time forecasting is subject to systematic epistemic and aleatory uncertainties that degrade predictability. The next stage is to use the technology developed to attempt failure forecasting in real time, using live streamed data and a public web-based portal to quantify the prospective forecast quality under such controlled laboratory conditions.
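The retrospective failure-time forecasting described above can be illustrated with the classic inverse-rate (Voight-style) method: fit a line to the reciprocal of the accelerating event rate and extrapolate to zero. This is a hedged sketch on synthetic acoustic-emission data, not the authors' creep model.

```python
# Minimal inverse-rate failure forecast on synthetic accelerating event rates.
import numpy as np

t_fail_true = 100.0
t = np.linspace(50.0, 95.0, 30)
rate = 200.0 / (t_fail_true - t)            # accelerating AE rate (synthetic)
rate *= np.random.default_rng(5).lognormal(0.0, 0.1, t.size)  # noise

inv = 1.0 / rate
slope, intercept = np.polyfit(t, inv, 1)    # linear fit to the inverse rate
t_forecast = -intercept / slope             # time at which 1/rate reaches zero
print(f"forecast failure time = {t_forecast:.1f} (true {t_fail_true})")
```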
Syndromic surveillance for health information system failures: a feasibility study.
Ong, Mei-Sing; Magrabi, Farah; Coiera, Enrico
2013-05-01
To explore the applicability of a syndromic surveillance method to the early detection of health information technology (HIT) system failures. A syndromic surveillance system was developed to monitor a laboratory information system at a tertiary hospital. Four indices were monitored: (1) total laboratory records being created; (2) total records with missing results; (3) average serum potassium results; and (4) total duplicated tests on a patient. The goal was to detect HIT system failures causing: data loss at the record level; data loss at the field level; erroneous data; and unintended duplication of data. Time-series models of the indices were constructed, and statistical process control charts were used to detect unexpected behaviors. The ability of the models to detect HIT system failures was evaluated using simulated failures, each lasting for 24 h, with error rates ranging from 1% to 35%. In detecting data loss at the record level, the model achieved a sensitivity of 0.26 when the simulated error rate was 1%, while maintaining a specificity of 0.98. Detection performance improved with increasing error rates, achieving a perfect sensitivity when the error rate was 35%. In the detection of missing results, erroneous serum potassium results and unintended repetition of tests, perfect sensitivity was attained when the error rate was as small as 5%. Decreasing the error rate to 1% resulted in a drop in sensitivity to 0.65-0.85. Syndromic surveillance methods can potentially be applied to monitor HIT systems, to facilitate the early detection of failures.
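A minimal sketch of the kind of statistical process control described above: establish control limits for one surveillance index from a baseline period, then flag days that fall outside them. The Poisson baseline, 3-sigma limits, and simulated 35% record-loss failure are illustrative assumptions, not the authors' time-series models.

```python
# Hedged sketch: Shewhart-style control chart for a daily record-count index.
import numpy as np

rng = np.random.default_rng(1)
baseline = rng.poisson(lam=500, size=90)          # 90 normal days of counts
center = baseline.mean()
sigma = baseline.std(ddof=1)
lo, hi = center - 3 * sigma, center + 3 * sigma   # 3-sigma control limits

def flag_days(daily_counts):
    """Return indices of days whose count falls outside the control limits."""
    counts = np.asarray(daily_counts)
    return np.where((counts < lo) | (counts > hi))[0]

# A simulated failure losing 35% of records on the last day should be flagged.
week = np.append(rng.poisson(500, 6), int(500 * 0.65))
print(flag_days(week))                            # expect the last index
```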
Shetty, Amith L; Shankar Raju, Savitha Banagar; Hermiz, Arsalan; Vaghasiya, Milan; Vukasovic, Matthew
2015-02-01
Discharge-stream emergency short-stay units (ESSU) improve ED and hospital efficiency. The age of patients and the time of hospital presentation have been shown to correlate with increasing complexity of care. We aimed to determine whether an age and time cut-off could be derived to subsequently improve short-stay unit success rates. We conducted a retrospective audit of 6703 (5522 included) patients admitted to our discharge-stream short-stay unit. Patients were classified as appropriate or inappropriate admissions, and deemed successes if discharged out of the unit within 24 h and failures if they needed inpatient admission into the hospital. We calculated short-stay unit length of stay for patients in each of these groups. A 15% failure rate was deemed an acceptable key performance indicator (KPI) for our unit. There were 197 out of 4621 (4.3%, 95% CI 3.7-4.9%) patients up to the age of 70 who failed admission to the ESSU, compared with 67 out of 901 (7.4%, 95% CI 5.9-9.3%, P < 0.01) patients over the age of 70, reflecting an increased failure rate in the geriatric population. When grouped according to time of admission to the ESSU (in-office 06.00-22.00 hours vs. out-of-office 22.00-06.00 hours), no significant difference in discharge-failure rates (4.7% vs. 5.2%, P = 0.46) was noted. Patients >70 years of age have higher rates of failure after admission to a discharge-stream ESSU. However, in appropriately selected discharge-stream patients, no age group or time band of presentation was associated with a failure rate beyond the stipulated KPI. © 2014 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.
Rosinger, Asher Y; Lawman, Hannah G; Akinbami, Lara J; Ogden, Cynthia L
2016-12-01
Adequate water intake is critical to physiologic and cognitive functioning. Although water requirements increase with body size, it remains unclear whether weight status modifies the relation between water intake and hydration status. We examined how the association between water intake and urine osmolality, which is a hydration biomarker, varied by weight status. NHANES cross-sectional data (2009-2012) were analyzed in 9601 nonpregnant adults aged ≥20 y who did not have kidney failure. Weight status was categorized with the use of body mass index on the basis of measured height and weight (underweight or normal weight, overweight, and obesity). Urine osmolality was determined with the use of freezing-point depression osmometry. Hypohydration was classified according to the following age-dependent formula: ≥831 mOsm/kg - [3.4 × (age - 20 y)]. Total water intake was determined with the use of a 24-h dietary recall and was dichotomized as adequate or low on the basis of the Institute of Medicine's adequate intake recommendations for men and women (men: ≥3.7 or <3.7 L; nonlactating women: ≥2.7 or <2.7 L; lactating women: ≥3.8 or <3.8 L for adequate or low intakes, respectively). We tested interactions and conducted linear and log-binomial regressions. Total water intake (P = 0.002), urine osmolality (P < 0.001), and hypohydration prevalence (P < 0.001) all increased with higher weight status. Interactions between weight status and water intake status were significant in linear (P = 0.005) and log-binomial (P = 0.015) models, which were then stratified. The prevalence ratio of hypohydration between subjects with adequate water intake and those with low water intake was 0.56 (95% CI: 0.43, 0.73) in adults who were underweight or normal weight, 0.67 (95% CI: 0.57, 0.79) in adults who were overweight, and 0.78 (95% CI: 0.70, 0.88) in adults who were obese. On a population level, obesity modifies the association between water intake and hydration status. © 2016 American Society for Nutrition.
Forecasting overhaul or replacement intervals based on estimated system failure intensity
NASA Astrophysics Data System (ADS)
Gannon, James M.
1994-12-01
System reliability can be expressed in terms of the pattern of failure events over time. Assuming a nonhomogeneous Poisson process and a Weibull intensity function for complex repairable system failures, the degree of system deterioration can be approximated. Maximum likelihood estimators (MLEs) for the system Rate of Occurrence of Failure (ROCOF) function are presented. Evaluating the integral of the ROCOF over annual usage intervals yields the expected number of annual system failures. By associating a cost of failure with the expected number of failures, budget and program policy decisions can be made based on expected future maintenance costs. Monte Carlo simulation is used to estimate the range and the distribution of the net present value and internal rate of return of alternative cash flows, based on the distributions of the cost inputs and the confidence intervals of the MLEs.
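The Weibull-intensity NHPP calculation outlined above reduces to a closed form: with intensity u(t) = (beta/eta)(t/eta)^(beta-1), the expected number of failures on [t0, t1] is (t1/eta)^beta - (t0/eta)^beta. A short sketch follows, with hypothetical parameter values standing in for the report's MLEs.

```python
# Hedged sketch: expected annual failures from a power-law NHPP (Weibull ROCOF).
import numpy as np

def rocof(t, beta, eta):
    """Weibull intensity of a power-law NHPP: u(t) = (beta/eta)*(t/eta)**(beta-1)."""
    return (beta / eta) * (t / eta) ** (beta - 1)

def expected_failures(t0, t1, beta, eta):
    """Integral of the ROCOF over [t0, t1]: (t1/eta)**beta - (t0/eta)**beta."""
    return (t1 / eta) ** beta - (t0 / eta) ** beta

beta, eta = 1.8, 3.5        # beta > 1 indicates deterioration (hypothetical MLEs)
for year in range(5):
    n = expected_failures(year, year + 1, beta, eta)
    print(f"year {year + 1}: expected failures = {n:.2f}")
```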
NASA Technical Reports Server (NTRS)
Kalinowski, Kevin F.; Tucker, George E.; Moralez, Ernesto, III
2006-01-01
Engineering development and qualification of a Research Flight Control System (RFCS) for the Rotorcraft Aircrew Systems Concepts Airborne Laboratory (RASCAL) JUH-60A has motivated the development of a pilot rating scale for evaluating failure transients in fly-by-wire flight control systems. The RASCAL RFCS includes a highly-reliable, dual-channel Servo Control Unit (SCU) to command and monitor the performance of the fly-by-wire actuators and protect against the effects of erroneous commands from the flexible, but single-thread Flight Control Computer. During the design phase of the RFCS, two piloted simulations were conducted on the Ames Research Center Vertical Motion Simulator (VMS) to help define the required performance characteristics of the safety monitoring algorithms in the SCU. Simulated failures, including hard-over and slow-over commands, were injected into the command path, and the aircraft response and safety monitor performance were evaluated. A subjective Failure/Recovery Rating (F/RR) scale was developed as a means of quantifying the effects of the injected failures on the aircraft state and the degree of pilot effort required to safely recover the aircraft. A brief evaluation of the rating scale was also conducted on the Army/NASA CH-47B variable stability helicopter to confirm that the rating scale was likely to be equally applicable to in-flight evaluations. Following the initial research flight qualification of the RFCS in 2002, a flight test effort was begun to validate the performance of the safety monitors and to validate their design for the safe conduct of research flight testing. Simulated failures were injected into the SCU, and the F/RR scale was applied to assess the results. The results validate the performance of the monitors, and indicate that the Failure/Recovery Rating scale is a very useful tool for evaluating failure transients in fly-by-wire flight control systems.
A new kid on the block: The Memory Validity Profile (MVP) in children with neurological conditions.
Brooks, Brian L; Fay-McClymont, Taryn B; MacAllister, William S; Vasserman, Marsha; Sherman, Elisabeth M S
2018-06-06
Determining the validity of obtained data is an inherent part of a neuropsychological assessment. The purpose of this study was to investigate the failure rate of the Memory Validity Profile (MVP) in a large clinical sample of children and adolescents with neurological diagnoses. Data were obtained from 261 consecutive patients (mean age = 12.0, SD = 3.9, range = 5-19) who were referred for a neuropsychological assessment in a tertiary care pediatric hospital and were administered the MVP. In this sample, 4.6% of youth failed the MVP. Mean administration time for the MVP was 7.4 min, although time to complete was not associated with failure rates. Failure rates remained relatively consistent at approximately 5% across age ranges, diagnoses, and psychomotor processing speed abilities. Having very low, below-normal, or above-normal intellectual abilities did not alter the failure rate on the MVP. However, those with intellectual disability (i.e., IQ < 70) had a higher failure rate of 12% on the MVP Total Score, but only 6% on the MVP Visual portion. Failure on the MVP was associated with lower scores on memory tests. This study provides support for using the MVP in children as young as 5 years with neurological diagnoses.
Effect of Strain Rate on Joint Strength and Failure Mode of Lead-Free Solder Joints
NASA Astrophysics Data System (ADS)
Lin, Jian; Lei, Yongping; Fu, Hanguang; Guo, Fu
2018-03-01
In surface mount technology, the Sn-3.0Ag-0.5Cu solder joint has a shorter impact lifetime than a traditional lead-tin solder joint. In order to improve the impact property of SnAgCu lead-free solder joints and identify the effect of silver content on tensile strength and impact property, impact experiments were conducted at various strain rates on three selected SnAgCu based solder joints. It was found that joint failure mainly occurred in the solder material with large plastic deformation under low strain rate, while joint failure occurred at the brittle intermetallic compound layer without any plastic deformation at a high strain rate. Joint strength increased with the silver content in SnAgCu alloys in static tensile tests, while the impact property of the solder joint decreased with increasing silver content. When the strain rate was low, plastic deformation occurred with failure and the tensile strength of the Sn-3.0Ag-0.5Cu solder joint was higher than that of Sn-0.3Ag-0.7Cu; when the strain rate was high, joint failure mainly occurred at the brittle interface layer and the Sn-0.3Ag-0.7Cu solder joint had a better impact resistance with a thinner intermetallic compound layer.
Siegel, Michael; Ross, Craig S; King, Charles
2014-12-01
Determining the relationship between gun ownership levels and firearm homicide rates is critical to inform public health policy. Previous research has shown that state-level gun ownership, as measured by a widely used proxy, is positively associated with firearm homicide rates. A newly developed proxy measure that incorporates the hunting license rate in addition to the proportion of firearm suicides correlates more highly with state-level gun ownership. To corroborate previous research, we used this new proxy to estimate the association of state-level gun ownership with total, firearm, and non-firearm homicides. Using state-specific data for the years 1981-2010, we modelled these rates as a function of gun ownership level, controlling for potential confounding factors. We used a negative binomial regression model and accounted for clustering of observations among states. We found that state-level gun ownership, as measured by the new proxy, is significantly associated with firearm and total homicides but not with non-firearm homicides. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Brien, M.H.; Coon, D.M.
Time-dependent failure at elevated temperatures currently governs the service life of oxynitride glass-joined silicon nitride. Creep, devitrification, stress-aided oxidation-controlled slow crack growth, and viscous cavitation-controlled failure are examined as possible controlling mechanisms. Creep deformation failure is observed above 1000°C. Fractographic evidence indicates cavity formation and growth below 1000°C. Auger electron spectroscopy verified that the oxidation rate of the joining glass is governed by the oxygen supply rate. Time-to-failure data are compared with those predicted using the Tsai and Raj, and Raj and Dang, viscous cavitation models. It is concluded that viscous relaxation and isolated cavity growth control the rate of failure in oxynitride glass-filled silicon nitride joints below 1000°C. Several possible methods are also proposed for increasing the service lives of these joints.
De Maria, Elia; Borghi, Ambra; Bonetti, Lorenzo; Fontana, Pier Luigi; Cappelli, Stefano
2017-02-16
Conductor externalization and insulation failure are frequent complications with the recalled St. Jude Medical Riata implantable cardioverter-defibrillator (ICD) leads. Conductor externalization is a "unique" failure mechanism: Cables externalize through the insulation ("inside-out" abrasion) and appear outside the lead body. Recently, single reports described a similar failure also for Biotronik leads. Moreover, some studies reported a high rate of electrical dysfunction (not only insulation failure) with Biotronik Linox leads and a reduced survival rate in comparison with the competitors. In this paper we describe the case of a patient with a Biotronik Kentrox ICD lead presenting with signs of insulation failure and conductor externalization at fluoroscopy. Due to the high risk of extraction we decided to implant a new lead, abandoning the damaged one; lead reimplant was uneventful. Subsequently, we review currently available literature about Biotronik Kentrox and Linox ICD lead failure and in particular externalized conductors. Some single-center studies and a non-prospective registry reported a survival rate between 88% and 91% at 5 years for Linox leads, significantly worse than that of other manufacturers. However, the preliminary results of two ongoing multicenter, prospective registries (GALAXY and CELESTIAL) showed 96% survival rate at 5 years after implant, well within industry standards. Ongoing data collection is needed to confirm longer-term performance of this family of ICD leads.
Narayan, Kailash; van Dyk, Sylvia; Bernshaw, David; Rajasooriyar, Chrishanthi; Kondalsamy-Chennakesavan, Srinivas
2009-08-01
To compare patterns of failure, late toxicities, and survival in locally advanced cervical cancer patients treated with either low-dose-rate (LDR) or conformal high-dose-rate (HDRc) brachytherapy as a part of curative radiotherapy. A retrospective comparative study of 217 advanced cervix cancer patients was conducted; 90 of these patients received LDR and 127 received HDRc brachytherapy. All patients were staged using International Federation of Gynecology and Obstetrics (FIGO) rules, had pretreatment magnetic resonance imaging (MRI), and were treated with concurrent cisplatin chemoradiotherapy. Both groups were matched for FIGO stage, MRI tumor volume, and uterine invasion status. Local and pelvic failure rates were similar in both groups (12-13% and 14%, respectively). Abdominal and systemic failures in the LDR group were 21% and 24%, whereas the corresponding failures in the HDRc group were 20% and 24%. Sixty-eight percent (87/127) of patients treated with HDRc remained asymptomatic, whereas 42% (38/90) of patients were asymptomatic with respect to bowel and bladder symptoms after treatment with LDR. The 5-year overall survival rate was 60% (SE = 4%). The 5-year failure-free survival rate was 55% (SE = 3%). There was no significant difference between the groups. Image-guided HDRc planning led to a large decrease in late radiation effects. Patterns of failure and survival were similar in patients treated with either LDR or HDRc.
Analyzing hospitalization data: potential limitations of Poisson regression.
Weaver, Colin G; Ravani, Pietro; Oliver, Matthew J; Austin, Peter C; Quinn, Robert R
2015-08-01
Poisson regression is commonly used to analyze hospitalization data when outcomes are expressed as counts (e.g. number of days in hospital). However, data often violate the assumptions on which Poisson regression is based. More appropriate extensions of this model, while available, are rarely used. We compared hospitalization data between 206 patients treated with hemodialysis (HD) and 107 treated with peritoneal dialysis (PD) using Poisson regression and compared results from standard Poisson regression with those obtained using three other approaches for modeling count data: negative binomial (NB) regression, zero-inflated Poisson (ZIP) regression and zero-inflated negative binomial (ZINB) regression. We examined the appropriateness of each model and compared the results obtained with each approach. During a mean 1.9 years of follow-up, 183 of 313 patients (58%) were never hospitalized (indicating an excess of 'zeros'). The data also displayed overdispersion (variance greater than mean), violating another assumption of the Poisson model. Using four criteria, we determined that the NB and ZINB models performed best. According to these two models, patients treated with HD experienced similar hospitalization rates as those receiving PD {NB rate ratio (RR): 1.04 [bootstrapped 95% confidence interval (CI): 0.49-2.20]; ZINB summary RR: 1.21 (bootstrapped 95% CI 0.60-2.46)}. Poisson and ZIP models fit the data poorly and had much larger point estimates than the NB and ZINB models [Poisson RR: 1.93 (bootstrapped 95% CI 0.88-4.23); ZIP summary RR: 1.84 (bootstrapped 95% CI 0.88-3.84)]. We found substantially different results when modeling hospitalization data, depending on the approach used. Our results argue strongly for a sound model selection process and improved reporting around statistical methods used for modeling count data. © The Author 2015. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
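A hedged sketch of the model comparison described above, fitting Poisson, negative binomial, and zero-inflated models to a simulated over-dispersed, zero-heavy count outcome with statsmodels; the data and the PD/HD covariate are placeholders, and a lower AIC would point to the better-fitting model, as the NB and ZINB models did in the study.

```python
# Minimal count-model comparison on synthetic hospitalization-style data.
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import (
    ZeroInflatedNegativeBinomialP, ZeroInflatedPoisson)

rng = np.random.default_rng(2)
n = 313
hd = rng.integers(0, 2, n).astype(float)       # 0 = PD, 1 = HD (hypothetical)
X = sm.add_constant(hd)
mu = np.exp(0.6 + 0.05 * hd)                   # mean count if "at risk"
raw = rng.negative_binomial(1.0, 1.0 / (1.0 + mu))
y = np.where(rng.random(n) < 0.58, 0, raw)     # ~58% never hospitalized

for name, model in [
    ("Poisson", sm.Poisson(y, X)),
    ("NB", sm.NegativeBinomial(y, X)),
    ("ZIP", ZeroInflatedPoisson(y, X)),
    ("ZINB", ZeroInflatedNegativeBinomialP(y, X)),
]:
    res = model.fit(disp=0)
    print(f"{name:7s} AIC = {res.aic:8.1f}")   # lower AIC = better fit
```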
Chan, Ta-Chien; Teng, Yung-Chu; Hwang, Jing-Shiang
2015-02-21
Emerging novel influenza outbreaks have increasingly been a threat to the public and a major concern of public health departments. Real-time data in seamless surveillance systems such as health insurance claims data for influenza-like illnesses (ILI) are ready for analysis, making it highly desirable to develop practical techniques to analyze such readymade data for outbreak detection so that the public can receive timely influenza epidemic warnings. This study proposes a simple and effective approach to analyze area-based health insurance claims data including outpatient and emergency department (ED) visits for early detection of any aberrations of ILI. The health insurance claims data during 2004-2009 from a national health insurance research database were used for developing early detection methods. The proposed approach fitted the daily new ILI visits and monitored the Pearson residuals directly for aberration detection. First, negative binomial regression was used for both outpatient and ED visits to adjust for potentially influential factors such as holidays, weekends, seasons, temporal dependence and temperature. Second, if the Pearson residuals exceeded 1.96, aberration signals were issued. The empirical validation of the model was done in 2008 and 2009. In addition, we designed a simulation study to compare the time of outbreak detection, non-detection probability and false alarm rate between the proposed method and modified CUSUM. The model successfully detected the aberrations of 2009 pandemic (H1N1) influenza virus in northern, central and southern Taiwan. The proposed approach was more sensitive in identifying aberrations in ED visits than those in outpatient visits. Simulation studies demonstrated that the proposed approach could detect the aberrations earlier, and with lower non-detection probability and mean false alarm rate in detecting aberrations compared to modified CUSUM methods. The proposed simple approach was able to filter out temporal trends, adjust for temperature, and issue warning signals for the first wave of the influenza epidemic in a timely and accurate manner.
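A minimal sketch of the surveillance scheme described above: fit a negative binomial regression to daily visit counts with a few adjustment covariates, then issue an aberration signal whenever the Pearson residual exceeds 1.96. The covariates and data are simplified placeholders, not the insurance claims data.

```python
# Hedged sketch: NB regression on daily counts with Pearson-residual alerts.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
days = pd.date_range("2008-01-01", periods=365, freq="D")
df = pd.DataFrame({
    "weekend": (days.dayofweek >= 5).astype(int),
    "temp": 24 + 6 * np.sin(2 * np.pi * np.arange(365) / 365),
})
mu = np.exp(4.0 + 0.3 * df["weekend"] - 0.02 * df["temp"])
df["visits"] = rng.poisson(mu * rng.gamma(5.0, 0.2, 365))  # over-dispersed

fit = smf.glm("visits ~ weekend + temp", data=df,
              family=sm.families.NegativeBinomial(alpha=0.2)).fit()
alerts = days[np.asarray(fit.resid_pearson) > 1.96]  # flagged aberration days
print(f"{len(alerts)} alert days")
```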
SU-E-T-495: Neutron Induced Electronics Failure Rate Analysis for a Single Room Proton Accelerator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knutson, N; DeWees, T; Klein, E
2014-06-01
Purpose: To determine the failure rate, as a function of neutron dose, of the range modulator's servo motor controller system (SMCS) while shielded with borated polyethylene (BPE) and unshielded in a single-room proton accelerator. Methods: Two experimental setups were constructed using two servo motor controllers and two motors. Each SMCS was placed 30 cm from the end of the plugged proton accelerator applicator. The motor was then turned on and observed from outside the vault while being irradiated to known neutron doses determined from bubble detector measurements. Any time the motor deviated from the programmed motion, a failure was recorded along with the delivered dose. The experiment was repeated using 9 cm of BPE shielding surrounding the SMCS. Results: Ten SMCS failures were recorded in each experiment. The dose per monitor unit was 0.0211 mSv/MU for the unshielded SMCS and 0.0144 mSv/MU for the shielded SMCS. The mean dose to produce a failure was 63.5 ± 58.3 mSv for the unshielded SMCS versus 17.0 ± 12.2 mSv for the shielded. The mean number of MUs between failures was 2297 ± 1891 MU for the unshielded SMCS and 2122 ± 1523 MU for the shielded. A Wilcoxon signed-rank test showed that the doses between failures were significantly different (P = 0.044), while the numbers of MUs between failures were not (P = 1.000). Statistical analysis determined that a SMCS neutron dose of 5.3 mSv produces a 5% chance of failure. Depending on the workload and location of the SMCS, this failure rate could impede clinical workflow. Conclusion: BPE shielding was shown not to reduce the average failure rate of the SMCS, and relocation of the system outside the accelerator vault was required to lower the failure rate enough to avoid impeding clinical workflow.
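The paired comparison reported above can be reproduced in outline with scipy's Wilcoxon signed-rank test; the two dose arrays below are hypothetical stand-ins for the ten recorded failures per configuration, not the measured values.

```python
# Hedged sketch: paired non-parametric comparison of dose-between-failure data.
import numpy as np
from scipy.stats import wilcoxon

dose_unshielded = np.array([12.0, 25.0, 170.0, 40.0, 55.0, 90.0, 31.0, 68.0, 110.0, 34.0])
dose_shielded = np.array([8.0, 14.0, 30.0, 11.0, 22.0, 35.0, 9.0, 16.0, 13.0, 12.0])

stat, p = wilcoxon(dose_unshielded, dose_shielded)   # paired by failure index
print(f"W = {stat:.1f}, P = {p:.3f}")                # small P: doses differ
```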
Comparison of mode of failure between primary and revision total knee arthroplasties.
Liang, H; Bae, J K; Park, C H; Kim, K I; Bae, D K; Song, S J
2018-04-01
Cognizance of common reasons for failure in primary and revision TKA, together with their time course, facilitates prevention. However, there have been few reports specifically comparing modes of failure for primary vs. revision TKA using a single prosthesis. The goal of the study was to compare the survival rates, modes of failure, and time periods associated with each mode of failure of primary vs. revision TKA. We hypothesized that the survival rates, modes of failure, time period for each mode of failure, and risk factors would differ between primary and revision TKA. Data from a consecutive cohort comprising 1606 knees (1174 patients) of primary TKA patients, and 258 knees (224 patients) of revision TKA patients, in all of whom surgery involved a P.F.C.® prosthesis (Depuy, Johnson & Johnson, Warsaw, IN), were retrospectively reviewed. The mean follow-up periods of primary and revision TKAs were 9.2 and 9.8 years, respectively. The average 10- and 15-year survival rates were 96.7% (95% CI, ±0.7%) and 85.4% (95% CI, ±2.0%) for primary TKA, and 91.4% (95% CI, ±2.5%) and 80.5% (95% CI, ±4.5%) for revision TKA. Common modes of failure included polyethylene wear, loosening, and infection. The most common mode of failure was polyethylene wear in primary TKA, and infection in revision TKA. The mean periods (i.e., latencies) of polyethylene wear and loosening did not differ between primary and revision TKAs, but the mean period of infection was significantly longer for revision TKA (1.2 vs. 4.8 years, P = 0.003). Survival rates decreased with time, particularly more than 10 years post-surgery, for both primary and revision TKAs. Continuous efforts are required to prevent and detect the various modes of failure during long-term follow-up. Greater attention is necessary to detect late infection-induced failure following revision TKA. Case-control study, Level III. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
Microcircuit Device Reliability Memory/Digital LSI
1982-01-01
has been performed. Each failure event record reveals the particular device and test characteristics, as well as associated stress values and other data. The residual is given by s = log₁₀(λ₀/λₚ), where λ₀ is the observed failure rate and λₚ is the predicted failure rate. Values of s are then plotted (with some number of failures per point). Some "funnelling" in Figure 17 shows this, although there is a fair amount of scatter.
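Assuming the residual reconstructed above is s = log10(lambda_observed / lambda_predicted), a tiny sketch of its computation follows; the failure-rate pairs are hypothetical.

```python
# Hedged sketch: observed-vs-predicted failure-rate residuals on a log10 scale.
import numpy as np

lam_obs = np.array([2.0e-6, 5.0e-7, 1.3e-6])    # observed failure rates (per hour)
lam_pred = np.array([1.0e-6, 1.0e-6, 1.0e-6])   # predicted failure rates
s = np.log10(lam_obs / lam_pred)                # s > 0: prediction too optimistic
print(np.round(s, 3))
```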
Pogorzelska, Monika; Stone, Patricia W.; Larson, Elaine L.
2012-01-01
Background The study objective is to describe infection control policies aimed at multidrug-resistant organisms (MDRO) in California hospitals and assess the relationship among these policies, structural characteristics, and rates of methicillin-resistant Staphylococcus aureus (MRSA) or vancomycin-resistant Enterococcus (VRE) bloodstream infections and Clostridium difficile infections. Methods Data on infection control policies, structural characteristics, and MDRO rates were collected through a 2010 survey of California infection control departments. Bivariate and multivariable Poisson and negative binomial regressions were conducted. Results One hundred eighty hospitals provided data (response rate, 54%). Targeted MRSA screening upon admission was reported by the majority of hospitals (87%). The majority of hospitals implemented contact precautions for confirmed MDRO and C difficile patients; presumptive isolation/contact precautions for patients with pending screens were less frequently implemented. Few infection control policies were associated with lower MDRO rates. Hospitals with a certified infection control director had significantly lower rates of MRSA bloodstream infections (P < .05). Conclusion Although most California hospitals are involved in activities to decrease MDRO, there is variation in specific activities utilized with the most focus placed on MRSA. This study highlights the importance of certification and its significant impact on infection rates. Additional research is needed to confirm these findings. PMID:22381222
Cundill, Bonnie; Alexander, Neal; Bethony, Jeff M; Diemert, David; Pullan, Rachel L; Brooker, Simon
2011-09-01
This study quantifies the rate and intensity of re-infection with human hookworm and Schistosoma mansoni infection 12 months following successful treatment, and investigates the influence of socio-economic, geographical and environmental factors. A longitudinal study of 642 individuals aged over 5 years was conducted in Minas Gerais State, Brazil from June 2004 to March 2006. Risk factors were assessed using interval censored regression for the rate and negative binomial regression for intensity. The crude rate and intensity of hookworm re-infection was 0·21 per year (95% confidence interval (CI) 0·15-0·29) and 70·9 epg (95% CI 47·2-106·6). For S. mansoni the rate was 0·06 per year (95% CI 0·03-0·10) and intensity 6·51 epg (95% CI 3·82-11·11). Rate and intensity of re-infection with hookworm were highest among males and positively associated with previous infection status, absence of a toilet and house structure. Rate and intensity of S. mansoni re-infection were associated with previous infection status as well as geographical, environmental and socio-economic factors. The implications of findings for the design of anti-helminth vaccine trials are discussed.
24 CFR 902.62 - Failure to submit data.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 24 Housing and Urban Development 4 2013-04-01 2013-04-01 false Failure to submit data. 902.62... DEVELOPMENT PUBLIC HOUSING ASSESSMENT SYSTEM PHAS Scoring § 902.62 Failure to submit data. (a) Failure to... receive a presumptive rating of failure for its unaudited information and shall receive zero points for...
24 CFR 902.62 - Failure to submit data.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 24 Housing and Urban Development 4 2012-04-01 2012-04-01 false Failure to submit data. 902.62... DEVELOPMENT PUBLIC HOUSING ASSESSMENT SYSTEM PHAS Scoring § 902.62 Failure to submit data. (a) Failure to... receive a presumptive rating of failure for its unaudited information and shall receive zero points for...
24 CFR 902.62 - Failure to submit data.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 24 Housing and Urban Development 4 2014-04-01 2014-04-01 false Failure to submit data. 902.62... DEVELOPMENT PUBLIC HOUSING ASSESSMENT SYSTEM PHAS Scoring § 902.62 Failure to submit data. (a) Failure to... receive a presumptive rating of failure for its unaudited information and shall receive zero points for...
Mehra, R K; Dhingra, V K; Nish, Aggarwal; Vashist, R P
2008-10-01
To analyse the treatment outcome of Cat I smear-positive relapse and failure cases and their fate when treated with the Cat II regimen under RNTCP. All Cat I smear-positive relapse and failure TB patients treated with the Category II regimen from 1994 to 2005 in a chest clinic of Delhi were analysed in this retrospective study. The re-treatment outcome data for relapse and failure cases of Cat I when treated with the Cat II regimen were reviewed. The study population included 5576 patients registered as Cat I sputum-positive cases in the Gulabi Bagh chest clinic from 1994 to 2005. A total of 190 (3.4%) failed on the Cat I regimen. Further, out of 4905 (87.9%) successfully treated Cat I patients, 442 (9%) presented as relapses. The treatment success rates for relapse and failure cases of Cat I when subsequently treated with the Cat II regimen were 76.4% and 48.8%, respectively, with a significantly higher failure rate (27.6%) among Cat I failures subsequently treated with the Cat II regimen. The failure cases of Cat I subsequently treated with Cat II were observed to have significantly lower success rates (p < 0.05) compared with relapse cases. A reappraisal of the Cat II re-treatment regimen for failure cases among Cat I is suggested.
1982-06-01
Polytechnic Institute, June 1982. Stochastic Availability of a Repairable System with an Age- and Maintenance-Dependent Failure Rate, by Jack-Kang Chan. Report No. Poly EE/CS 82-004. Contents: 1.1 Concepts of System Availability; 1.2 Maintenance and Failure Rate; 1.3 Summary. Chapter 2, System Model: 2.1 A Repairable System with Maintenance
NASA Astrophysics Data System (ADS)
Suhir, E.
2014-05-01
The well-known and widely used experimental reliability "passport" of a mass-manufactured electronic or photonic product — the bathtub curve — reflects the combined contribution of the statistics-related and reliability-physics (physics-of-failure)-related processes. As time progresses, the first process results in a decreasing failure rate, while the second process, associated with material aging and degradation, leads to an increased failure rate. An attempt has been made in this analysis to assess the level of the reliability-physics-related aging process from the available bathtub curve (diagram). It is assumed that the products of interest underwent burn-in testing and therefore the obtained bathtub curve does not contain the infant mortality portion. It has also been assumed that the two random processes in question are statistically independent, and that the failure rate of the physical process can be obtained by deducting the theoretically assessed statistical failure rate from the bathtub curve ordinates. In the numerical example carried out, the Rayleigh distribution was used for the statistical failure rate, for the sake of a relatively simple illustration. The developed methodology can be used in reliability-physics evaluations when there is a need to better understand the roles of the statistics-related and reliability-physics-related irreversible random processes in reliability evaluations. Future work should include investigations of how powerful and flexible methods and approaches of statistical mechanics can be effectively employed, in addition to reliability-physics techniques, to model the operational reliability of electronic and photonic products.
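A sketch of the proposed deduction, assuming a Rayleigh statistical failure rate with hazard h(t) = t/sigma²: subtract it from the observed post-burn-in bathtub ordinates to expose the aging-related (reliability-physics) component. The bathtub data and scale parameter are synthetic placeholders.

```python
# Hedged sketch: bathtub-curve decomposition with an assumed Rayleigh hazard.
import numpy as np

t = np.linspace(0.1, 10.0, 50)        # service time, arbitrary units
bathtub = 0.05 + 0.002 * t**2         # observed post-burn-in failure rate (synthetic)
sigma = 30.0                          # assumed Rayleigh scale parameter
statistical = t / sigma**2            # Rayleigh hazard: h(t) = t / sigma**2
physics = bathtub - statistical       # residual aging-related failure rate
print(np.round(physics[:5], 4))
```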
Generalization of multifractal theory within quantum calculus
NASA Astrophysics Data System (ADS)
Olemskoi, A.; Shuda, I.; Borisyuk, V.
2010-03-01
On the basis of the deformed series in quantum calculus, we generalize the partition function and the mass exponent of a multifractal, as well as the average of a random variable distributed over a self-similar set. For the partition function, such expansion is shown to be determined by binomial-type combinations of the Tsallis entropies related to manifold deformations, while the mass exponent expansion generalizes the known relation τq=Dq(q-1). We find the equation for the set of averages related to ordinary, escort, and generalized probabilities in terms of the deformed expansion as well. Multifractals related to the Cantor binomial set, exchange currency series, and porous-surface condensates are considered as examples.
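To make the relation τq = Dq(q-1) concrete, the sketch below computes the mass exponents and generalized dimensions of the standard binomial (Cantor-type) multifractal mentioned above, for which τ(q) = -log2(p^q + (1-p)^q); the weight p is an illustrative choice.

```python
# Hedged sketch: tau(q) and D_q for the binomial multifractal measure.
import numpy as np

p = 0.3                                   # binomial measure weight (assumption)
q = np.linspace(-5.0, 5.0, 11)
tau = -np.log2(p**q + (1.0 - p) ** q)     # mass exponents tau(q)
D1 = -(p * np.log2(p) + (1.0 - p) * np.log2(1.0 - p))  # information dimension
with np.errstate(divide="ignore", invalid="ignore"):
    Dq = np.where(np.isclose(q, 1.0), D1, tau / (q - 1.0))  # D_q = tau/(q-1)
for qi, di in zip(q, Dq):
    print(f"q = {qi:+.1f}   D_q = {di:.4f}")
```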
Using real options analysis to support strategic management decisions
NASA Astrophysics Data System (ADS)
Kabaivanov, Stanimir; Markovska, Veneta; Milev, Mariyan
2013-12-01
Decision making is a complex process that requires taking into consideration multiple heterogeneous sources of uncertainty. Standard valuation and financial analysis techniques often fail to properly account for all these sources of risk, as well as for all sources of additional flexibility. In this paper we explore applications of a modified binomial tree method for real options analysis (ROA) in an effort to improve the decision making process. Typical use cases of real options are analyzed, with an elaborate study of the applications and advantages that company management can derive from them. Numerical results based on extending the simple binomial tree approach to multiple sources of uncertainty are provided to demonstrate the improvement effects on management decisions.
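A hedged sketch of the binomial-tree building block behind the method above: a plain Cox-Ross-Rubinstein tree valuing an option to expand a project under a single source of uncertainty. The multi-uncertainty extension the paper describes is not reproduced here, and all numbers are illustrative.

```python
# Minimal CRR binomial tree for a real option to expand a project.
import numpy as np

def real_option_value(V0, K, r, sigma, T, steps, expand_factor=1.5):
    """Value a project with the option to pay K and scale it by expand_factor."""
    dt = T / steps
    u = np.exp(sigma * np.sqrt(dt))              # up factor
    d = 1.0 / u                                  # down factor
    p = (np.exp(r * dt) - d) / (u - d)           # risk-neutral probability
    j = np.arange(steps + 1)
    V = V0 * u**j * d ** (steps - j)             # terminal project values
    payoff = np.maximum(V, expand_factor * V - K)  # expand only if worthwhile
    for _ in range(steps):                       # backward induction
        payoff = np.exp(-r * dt) * (p * payoff[1:] + (1 - p) * payoff[:-1])
    return payoff[0]

base = 100.0
flex = real_option_value(V0=base, K=60.0, r=0.05, sigma=0.35, T=3.0, steps=200)
print(f"value of expansion flexibility = {flex - base:.2f}")
```

Because the discounted expected terminal project value equals V0 under the risk-neutral measure, the printed difference isolates the value added by the flexibility itself.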
Distribution pattern of phthirapterans infesting certain common Indian birds.
Saxena, A K; Kumar, Sandeep; Gupta, Nidhi; Mitra, J D; Ali, S A; Srivastava, Roshni
2007-08-01
The prevalence and frequency distribution patterns of 10 phthirapteran species infesting house sparrows, Indian parakeets, common mynas, and white-breasted kingfishers were recorded in the district of Rampur, India, during 2004-05. The sample mean abundances, mean intensities, ranges of infestation, variance-to-mean ratios, values of the exponent of the negative binomial distribution, and the indices of discrepancy were also computed. The frequency distribution patterns of all phthirapteran species were skewed, but the observed frequencies did not correspond to the negative binomial distribution. The adult-to-nymph ratios varied in different species from 1:0.53 to 1:1.25. Sex ratios of different phthirapteran species ranged from 1:1.10 to 1:1.65 and were female biased.
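The aggregation statistics reported above can be illustrated in a few lines: the variance-to-mean ratio and a moment estimate of the negative binomial exponent k, computed for hypothetical per-host louse counts.

```python
# Hedged sketch: aggregation statistics for per-host parasite counts.
import numpy as np

counts = np.array([0, 0, 0, 1, 0, 2, 0, 0, 7, 0, 1, 0, 0, 23, 3, 0, 0, 5, 0, 1])
m = counts.mean()
s2 = counts.var(ddof=1)
vmr = s2 / m                    # > 1 indicates aggregated (overdispersed) counts
k = m**2 / (s2 - m)             # moment estimator of the NB exponent k
print(f"mean = {m:.2f}, variance/mean = {vmr:.2f}, k = {k:.3f}")
```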
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jenkins, M.G.; Kohles, S.S.; Stevens, T.L.
1996-12-31
Duality of failure mechanisms (slow crack growth from pre-existing defects versus cumulative creep damage) is examined in a silicon nitride advanced ceramic recently tested at elevated temperatures. Static (constant stress over time), dynamic (monotonically increasing stress over time), and cyclic (fluctuating stress over time) fatigue behaviors were evaluated in tension in ambient air at temperatures of 1150, 1260, and 1370°C for a hot-isostatically pressed monolithic β-silicon nitride. At 1150°C, all three types of fatigue results showed a similar failure mechanism of slow crack growth (SCG). At 1260 and 1370°C the failure mechanism was more complex. Failure under static fatigue was dominated by the accumulation of creep damage via diffusion-controlled cavities. In dynamic fatigue, failure occurred by SCG at high stress rates (>10^-2 MPa/s) and by creep damage at low stress rates (≤10^-2 MPa/s). For cyclic fatigue, such rate effects influenced the stress rupture results, in which times to failure were greater for dynamic and cyclic fatigue than for static fatigue. Elucidation of failure mechanisms is necessary for accurate prediction of long-term survivability and reliability of structural ceramics.
Duque, Juan C; Tabbara, Marwan; Martinez, Laisel; Cardona, Jose; Vazquez-Padron, Roberto I; Salman, Loay H
2017-01-01
The arteriovenous fistula (AVF) is the preferred hemodialysis access type because it has better patency rates and fewer complications than other access types. However, primary failure remains a common problem impeding AVF maturation and adding to patients' morbidity and mortality. Juxta-anastomotic (or inflow) stenosis is the most common reason leading to primary failure, and percutaneous transluminal angioplasty continues to be the gold-standard treatment with excellent success rates. Intimal hyperplasia (IH) has been traditionally blamed as the main pathophysiologic culprit, but new evidence raises doubts regarding the contribution of IH alone to primary failure. We report a 64-year-old man with a 2-stage brachiobasilic AVF that was complicated by failure 4 months after creation. An angiogram showed multiple juxta-anastomotic and midfistula stenotic lesions. Percutaneous transluminal angioplasty was successful in assisting maturation and subsequently cannulating the AVF for hemodialysis treatment. We failed to identify the underlying cause of stenosis because biopsy specimens from fistula tissue obtained at the time of transposition revealed no occlusive IH. This case emphasizes the need for additional research on factors contributing to AVF failure besides IH and highlights the need for more therapeutic options to reduce AVF failure rate. Copyright © 2016 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.
Material properties of rat middle cerebral arteries at high strain rates.
Bell, E David; Converse, Matthew; Mao, Haojie; Unnikrishnan, Ginu; Reifman, Jaques; Monson, Kenneth L
2018-03-19
Traumatic brain injury (TBI), resulting from either impact- or non-impact blast-related mechanisms, is a devastating cause of death and disability. The cerebral blood vessels, which provide critical support for brain tissue in both health and disease, are commonly injured in TBI. However, little is known about how vessels respond to traumatic loading, particularly at rates relevant to blast. To better understand vessel responses to trauma, the objective of this project was to characterize the high-rate response of passive cerebral arteries. Rat middle cerebral arteries were isolated and subjected to high-rate deformation in the axial direction. Vessels were perfused at physiological pressures and stretched to failure at strain rates ranging from approximately 100 to 1300 s^-1. Although both in vivo stiffness and failure stress increased significantly with strain rate, failure stretch did not depend on rate.
Li, Duo-Jie; Li, Hong-Wei; He, Bin; Wang, Geng-Ming; Cai, Han-Fei; Duan, Shi-Miao; Liu, Jing-Jing; Zhang, Ya-Jun; Cui, Zhen; Jiang, Hao
2016-01-01
To retrospectively analyze the patterns of failure and the treatment effects of involved-field irradiation (IFI) in patients treated for locally advanced esophageal squamous cell carcinoma (ESCC), and to determine whether IFI is practicable in these patients. A total of 79 patients with locally advanced ESCC underwent three-dimensional conformal radiotherapy (3D-CRT) or intensity-modulated radiotherapy (IMRT) using IFI or elective nodal irradiation (ENI) according to the target volume. The patterns of failure were defined as local/regional, in-field, out-of-field regional lymph node (LN), and distant failure. With a median follow-up of 32.0 months, failures were observed in 66 (83.6%) patients. The cumulative incidences of local/regional failure (55.8 vs 52.8%) and in-field regional lymph node failure (25.6 vs 19.4%) showed no statistically significant difference between the IFI and ENI groups (p=0.526 and 0.215, respectively). An out-of-field nodal relapse rate of only 7.0% was seen in the IFI group. Three-year survival rates for the ENI and IFI groups were 22.2 and 18.6%, respectively (p=0.240), and 3-year distant metastasis rates were 27.8 and 32.6%, respectively (p=0.180). The lung V10, V20, V30 and mean lung dose of the ENI group were greater than those of the IFI group, and the differences in mean lung dose and V10 were statistically significant. The patterns of failure and survival rates in the IFI group were similar to those in the ENI group; regional recurrence and distant metastasis are the main causes of treatment failure. IFI is feasible for locally advanced ESCC. Further investigation is needed to increase local control and decrease distant metastasis in these patients.
Does an inter-flaw length control the accuracy of rupture forecasting in geological materials?
NASA Astrophysics Data System (ADS)
Vasseur, Jérémie; Wadsworth, Fabian B.; Heap, Michael J.; Main, Ian G.; Lavallée, Yan; Dingwell, Donald B.
2017-10-01
Multi-scale failure of porous materials is an important phenomenon in nature and in material physics - from controlled laboratory tests to rockbursts, landslides, volcanic eruptions and earthquakes. A key unsolved research question is how to accurately forecast the time of system-sized catastrophic failure, based on observations of precursory events such as acoustic emissions (AE) in laboratory samples, or, on a larger scale, small earthquakes. Until now, the length scale associated with precursory events has not been well quantified, resulting in forecasting tools that are often unreliable. Here we test the hypothesis that the accuracy of the forecast failure time depends on the inter-flaw distance in the starting material. We use new experimental datasets for the deformation of porous materials to infer the critical crack length at failure from a static damage mechanics model. The style of acceleration of AE rate prior to failure, and the accuracy of forecast failure time, both depend on whether the cracks can span the inter-flaw length or not. A smooth inverse power-law acceleration of AE rate to failure, and an accurate forecast, occurs when the cracks are sufficiently long to bridge pore spaces. When this is not the case, the predicted failure time is much less accurate and failure is preceded by an exponential AE rate trend. Finally, we provide a quantitative and pragmatic correction for the systematic error in the forecast failure time, valid for structurally isotropic porous materials, which could be tested against larger-scale natural failure events, with suitable scaling for the relevant inter-flaw distances.
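One common way to turn the "smooth inverse power-law acceleration of AE rate" described above into a forecast is the classical inverse-rate method: for a power-law exponent near 2, 1/rate declines linearly and extrapolates to zero at the failure time. A sketch with made-up AE rates; this is the generic method, not necessarily the paper's exact estimator:

```python
import numpy as np

# Hypothetical AE event rates (events/s) observed at times t before failure.
t = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0, 60.0])
rate = np.array([2.0, 2.5, 3.3, 4.6, 7.1, 12.5, 33.3])

# For an inverse power-law acceleration with exponent ~2, 1/rate decays
# linearly and hits zero at the failure time (the "inverse-rate" forecast).
inv = 1.0 / rate
slope, intercept = np.polyfit(t, inv, 1)
t_failure = -intercept / slope
print(f"forecast failure time: {t_failure:.1f} s")
```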
Analysis of the Factors Affecting Surgical Success of Implants Placed in Iranian Warfare Victims
Jafarian, Mohammad; Bayat, Mohammad; Pakravan, Amir-Hossein; Emadi, Naghmeh
2016-01-01
Objective The aim was to evaluate the survival time and success rates of dental implants in warfare victims and factors that affect implant success. Subjects and Methods This retrospective study involved 250 Iranian warfare victims who received dental implants from 2003 to 2013. Patients' demographic characteristics, as well as the brand, diameter, length, location and failure rate of the implants, were retrieved from patients' dental records and radiographs. The associations between these data and the survival rate were analyzed. Statistical analysis was carried out with χ2 and log-rank tests. Results Overall, out of the 1,533 dental implants, 61 (4%) failed. The maxillary canine area had the highest failure rate [9 of 132 implants (6.8%)], while the mandibular incisor region had the least number of failures [3 of 147 implants (2.0%)] and the longest survival time (approximately 3,182 days). Maxillary canine areas had the shortest survival (about 2,996 days). The longest survival time was observed in implants with 11 mm length (3,179.72 ± 30.139 days) and 3.75-4 mm diameter (3,131.161 ± 35.96 days), and the shortest survival was found in implants with 11.5 mm length (2,317.79 ± 18.71 days) and 6.5 mm diameter (2,241.45 ± 182.21 days). Moreover, implants with 10 mm length (10.7%) and 5.5-6 mm diameter (22.2%) had the highest failure rates; however, the lowest failure rates occurred when the implants were 11.5 mm in length (1.9%) and 3-3.5 mm in diameter (3.1%). Conclusions The brand, length and diameter of implants affected the survival time, failure rate and time to failure. The location of the implant was not statistically significant regarding the mentioned factors, although it has clinical significance. PMID:27322534
Doggrell, Sheila Anne; Schaffer, Sally
2016-02-01
To reduce nursing shortages, accelerated nursing programs are available for domestic and international students. However, the withdrawal and failure rates from these programs may differ from those of traditional programs. The main aim of our study was to improve the retention and experience of accelerated nursing students. The academic background, age, withdrawal and failure rates of the accelerated and traditional students were determined. Data from 2009 and 2010 were collected prior to intervention. In an attempt to reduce the withdrawal of accelerated students, we set up an intervention, which was available to all students. The assessment of the intervention was a pre-post-test design with non-equivalent groups (the traditional and the accelerated students). The elements of the intervention were a) a formative website activity on some basic concepts in anatomy, physiology and pharmacology, b) a workshop addressing study skills and online resources, and c) resource lectures in anatomy/physiology and microbiology. The formative website and workshop were evaluated using questionnaires. The accelerated nursing students were five years older than the traditional students (p < 0.0001). The withdrawal rates from a pharmacology course were higher for accelerated nursing students than for traditional students who had undertaken first-year courses in anatomy and physiology (p = 0.04 in 2010). The withdrawing students were predominantly domestic students with non-university qualifications or equivalent experience. The failure rates were also higher for this group, compared to the traditional students (p = 0.05 in 2009 and 0.03 in 2010). In contrast, the withdrawal rates for the international and domestic graduate accelerated students were very low. After the intervention, the withdrawal and failure rates in pharmacology for domestic accelerated students with non-university qualifications were not significantly different from those of traditional students. The accelerated international and domestic graduate nursing students have low withdrawal rates and high success rates in a pharmacology course. However, domestic students with non-university qualifications have higher withdrawal and failure rates than other nursing students and may be underprepared for university study in pharmacology in nursing programs. The introduction of the intervention was associated with reduced withdrawal and failure rates for these students in the pharmacology course.
Duan, Jun; Han, Xiaoli; Bai, Linfu; Zhou, Lintong; Huang, Shicong
2017-02-01
To develop and validate a scale using variables easily obtained at the bedside for prediction of failure of noninvasive ventilation (NIV) in hypoxemic patients. The test cohort comprised 449 patients with hypoxemia who were receiving NIV. This cohort was used to develop a scale that considers heart rate, acidosis, consciousness, oxygenation, and respiratory rate (referred to as the HACOR scale) to predict NIV failure, defined as need for intubation after NIV intervention. The highest possible score was 25 points. To validate the scale, a separate group of 358 hypoxemic patients were enrolled in the validation cohort. The failure rate of NIV was 47.8% and 39.4% in the test and validation cohorts, respectively. In the test cohort, patients with NIV failure had higher HACOR scores at initiation and after 1, 12, 24, and 48 h of NIV than those with successful NIV. At 1 h of NIV the area under the receiver operating characteristic curve was 0.88, showing good predictive power for NIV failure. Using 5 points as the cutoff value, the sensitivity, specificity, positive predictive value, negative predictive value, and diagnostic accuracy for NIV failure were 72.6, 90.2, 87.2, 78.1, and 81.8%, respectively. These results were confirmed in the validation cohort. Moreover, the diagnostic accuracy for NIV failure exceeded 80% in subgroups classified by diagnosis, age, or disease severity and also at 1, 12, 24, and 48 h of NIV. Among patients with NIV failure with a HACOR score of >5 at 1 h of NIV, hospital mortality was lower in those who received intubation at ≤12 h of NIV than in those intubated later [58/88 (66%) vs. 138/175 (79%); p = 0.03]. The HACOR scale variables are easily obtained at the bedside. The scale appears to be an effective way of predicting NIV failure in hypoxemic patients. Early intubation in high-risk patients may reduce hospital mortality.
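The cutoff diagnostics quoted above come from an ordinary 2x2 table. A minimal sketch, assuming "score above 5" flags predicted NIV failure; the toy scores and outcomes are invented:

```python
import numpy as np

def cutoff_metrics(scores, failed, cutoff=5):
    """Sensitivity, specificity, PPV, NPV, and accuracy of 'score > cutoff'
    as a predictor of NIV failure."""
    scores = np.asarray(scores)
    failed = np.asarray(failed, dtype=bool)
    pred = scores > cutoff
    tp = np.sum(pred & failed)
    tn = np.sum(~pred & ~failed)
    fp = np.sum(pred & ~failed)
    fn = np.sum(~pred & failed)
    return dict(sens=tp / (tp + fn), spec=tn / (tn + fp),
                ppv=tp / (tp + fp), npv=tn / (tn + fn),
                acc=(tp + tn) / len(failed))

# Toy data: HACOR-like scores at 1 h of NIV and observed failure (1 = failed).
print(cutoff_metrics([2, 7, 4, 9, 6, 1, 8, 3], [0, 1, 0, 1, 1, 0, 1, 0]))
```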
Schmeida, Mary; Savrin, Ronald A
2012-01-01
Heart failure readmission among the elderly is frequent and costly to both the patient and the Medicare trust fund. In this study, the authors explore the factors that are associated with states having heart failure readmission rates that are higher than the U.S. national rate. The setting was acute inpatient hospitals. State-level data from all 50 states and multivariate regression analysis were used. The dependent variable, Heart Failure 30-day Readmission Worse than U.S. Rate, is based on adult Medicare Fee-for-Service patients hospitalized with a primary discharge diagnosis of heart failure and for whom a subsequent inpatient readmission occurred within 30 days of their last discharge. One key variable, a higher proportion of state residents speaking a primary language other than English at home, was significantly associated with a decreased probability of a state ranking "worse" on heart failure 30-day readmission. In contrast, states with a higher median income, more total days of care per 1,000 Medicare enrollees, and a greater percentage of Medicare enrollees with prescription drug coverage have a greater probability of ranking "worse" than the U.S. national rate on heart failure 30-day readmission. Case management interventions targeting health literacy may be more effective than other factors in improving state-level hospital status on heart failure 30-day readmission. Factors such as total days of care per 1,000 Medicare enrollees and improving patient access to postdischarge medication(s) may not be as important as literacy. Interventions aimed at preventing disparities should consider higher-income population groups as vulnerable to readmission.
Optimal Measurement Interval for Emergency Department Crowding Estimation Tools.
Wang, Hao; Ojha, Rohit P; Robinson, Richard D; Jackson, Bradford E; Shaikh, Sajid A; Cowden, Chad D; Shyamanand, Rath; Leuck, JoAnna; Schrader, Chet D; Zenarosa, Nestor R
2017-11-01
Emergency department (ED) crowding is a barrier to timely care. Several crowding estimation tools have been developed to facilitate early identification of and intervention for crowding. Nevertheless, the ideal frequency for measuring ED crowding with these tools is unclear. Short intervals may be resource intensive, whereas long ones may not be suitable for early identification. Therefore, we aimed to assess whether outcomes vary by measurement interval for 4 crowding estimation tools. Our eligible population included all patients between July 1, 2015, and June 30, 2016, who were admitted to the JPS Health Network ED, which serves an urban population. We generated 1-, 2-, 3-, and 4-hour ED crowding scores for each patient, using 4 crowding estimation tools (National Emergency Department Overcrowding Scale [NEDOCS]; Severely Overcrowded, Overcrowded, and Not Overcrowded Estimation Tool [SONET]; Emergency Department Work Index [EDWIN]; and ED Occupancy Rate). Our outcomes of interest included ED length of stay (minutes) and left without being seen or eloped within 4 hours. We used accelerated failure time models to estimate interval-specific time ratios and corresponding 95% confidence limits for length of stay, in which the 1-hour interval was the reference. In addition, we used binomial regression with a log link to estimate risk ratios (RRs) and corresponding confidence limits for left without being seen. Our study population comprised 117,442 patients. The time ratios for length of stay were similar across intervals for each crowding estimation tool (time ratio = 1.37 to 1.30 for NEDOCS, 1.44 to 1.37 for SONET, 1.32 to 1.27 for EDWIN, and 1.28 to 1.23 for ED Occupancy Rate). The RRs for left without being seen were also similar across intervals for each tool (RR = 2.92 to 2.56 for NEDOCS, 3.61 to 3.36 for SONET, 2.65 to 2.40 for EDWIN, and 2.44 to 2.14 for ED Occupancy Rate). Our findings suggest limited variation in length of stay or left without being seen between intervals (1 to 4 hours) regardless of which of the 4 crowding estimation tools was used. Consequently, 4 hours may be a reasonable interval for assessing crowding with these tools, which could substantially reduce the burden on ED personnel by requiring less frequent assessment of crowding. Copyright © 2017 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.
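Risk ratios for a binary outcome such as left without being seen can be estimated with binomial regression using a log link, as the abstract describes. A hedged sketch on synthetic data; the variable names and effect size are invented, and statsmodels is one possible implementation:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Toy data: left-without-being-seen outcome vs. an "overcrowded" indicator.
rng = np.random.default_rng(0)
n = 5000
overcrowded = rng.integers(0, 2, n)
p = np.where(overcrowded == 1, 0.06, 0.02)   # assumed true risks (RR = 3)
lwbs = rng.binomial(1, p)

X = sm.add_constant(pd.DataFrame({"overcrowded": overcrowded}))
model = sm.GLM(lwbs, X,
               family=sm.families.Binomial(link=sm.families.links.Log()))
res = model.fit()
print(np.exp(res.params["overcrowded"]))          # risk ratio estimate
print(np.exp(res.conf_int().loc["overcrowded"]))  # 95% confidence limits
```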
P-Hacking in Orthopaedic Literature: A Twist to the Tail.
Bin Abd Razak, Hamid Rahmatullah; Ang, Jin-Guang Ernest; Attal, Hersh; Howe, Tet-Sen; Allen, John Carson
2016-10-19
"P-hacking" occurs when researchers preferentially select data or statistical analyses until nonsignificant results become significant. We wanted to evaluate if the phenomenon of p-hacking was evident in orthopaedic literature. We text-mined through all articles published in three top orthopaedic journals in 2015. For anonymity, we cipher-coded the three journals. We included all studies that reported a single p value to answer their main hypothesis. These p values were then charted and frequency graphs were generated to illustrate any evidence of p-hacking. Binomial tests were employed to look for evidence of evidential value and significance of p-hacking. Frequency plots for all three journals revealed evidence of p-hacking. Binomial tests for all three journals were significant for evidence of evidential value (p < 0.0001 for all). However, the binomial test for p-hacking was significant only for one journal (p = 0.0092). P-hacking is an evolving phenomenon that threatens to jeopardize the evidence-based practice of medicine. Although our results show that there is good evidential value for orthopaedic literature published in our top journals, there is some evidence of p-hacking of which authors and readers should be wary. Copyright © 2016 by The Journal of Bone and Joint Surgery, Incorporated.
Chang, Yu-Wei; Tsong, Yi; Zhao, Zhigen
2017-01-01
Assessing equivalence or similarity has drawn much attention recently as many drug products have lost or will lose their patents in the next few years, especially certain best-selling biologics. To claim equivalence between the test treatment and the reference treatment when assay sensitivity is well established from historical data, one has to demonstrate both superiority of the test treatment over placebo and equivalence between the test treatment and the reference treatment. Thus, there is urgency for practitioners to derive a practical way to calculate sample size for a three-arm equivalence trial. The primary endpoints of a clinical trial may not always be continuous, but may be discrete. In this paper, the authors derive the power function and discuss the sample size requirement for a three-arm equivalence trial with Poisson and negative binomial clinical endpoints. In addition, the authors examine the effect of the dispersion parameter on the power and the sample size by varying its coefficient from small to large. In extensive numerical studies, the authors demonstrate that the required sample size depends heavily on the dispersion parameter. Therefore, misusing a Poisson model for negative binomial data may easily lose up to 20% of power, depending on the value of the dispersion parameter.
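A two-arm simplification showing, by Monte Carlo, how the dispersion parameter drives the power (and hence the required sample size) for a negative binomial endpoint. Everything here (means, rate ratio, sample sizes) is an invented illustration, not the paper's derived power function:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

def nb_sample(mu, alpha, size):
    """Negative binomial counts via the gamma-Poisson mixture
    (variance = mu + alpha * mu**2)."""
    lam = rng.gamma(shape=1.0 / alpha, scale=alpha * mu, size=size)
    return rng.poisson(lam)

def power(n_per_arm, alpha_disp, mu_ref=2.0, rate_ratio=0.75, n_sim=400):
    """Monte Carlo power of the Wald test for the treatment effect in a
    negative binomial GLM (two-arm simplification)."""
    x = np.repeat([0.0, 1.0], n_per_arm)
    X = sm.add_constant(x)
    fam = sm.families.NegativeBinomial(alpha=alpha_disp)
    hits = 0
    for _ in range(n_sim):
        y = np.concatenate([
            nb_sample(mu_ref, alpha_disp, n_per_arm),
            nb_sample(mu_ref * rate_ratio, alpha_disp, n_per_arm)])
        hits += sm.GLM(y, X, family=fam).fit().pvalues[1] < 0.05
    return hits / n_sim

for a in (0.25, 1.0, 2.0):   # increasing dispersion erodes power
    print(f"alpha = {a}: power ~ {power(150, a):.2f}")
```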
Statistical tests to compare motif count exceptionalities
Robin, Stéphane; Schbath, Sophie; Vandewalle, Vincent
2007-01-01
Background Finding over- or under-represented motifs in biological sequences is now a common task in genomics. Thanks to p-value calculation for motif counts, exceptional motifs are identified and represent candidate functional motifs. The present work addresses the related question of comparing the exceptionality of one motif in two different sequences. Just comparing the motif count p-values in each sequence is indeed not sufficient to decide if this motif is significantly more exceptional in one sequence compared to the other one. A statistical test is required. Results We develop and analyze two statistical tests, an exact binomial one and an asymptotic likelihood ratio test, to decide whether the exceptionality of a given motif is equivalent or significantly different in two sequences of interest. For that purpose, motif occurrences are modeled by Poisson processes, with special care for overlapping motifs. Both tests can take the sequence compositions into account. As an illustration, we compare the octamer exceptionalities in the Escherichia coli K-12 backbone versus variable strain-specific loops. Conclusion The exact binomial test is particularly adapted for small counts. For large counts, we advise using the likelihood ratio test, which is asymptotic but strongly correlated with the exact binomial test and very simple to use. PMID:17346349
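The simplest version of the exact binomial test described above conditions on the total count: under equal occurrence rates in the two sequences, the count in sequence 1 is binomial with success probability proportional to its length. A sketch that ignores the composition and overlap corrections the paper handles:

```python
from scipy.stats import binomtest

# Motif occurrences and sequence lengths (hypothetical numbers).
n1, len1 = 52, 4_600_000     # e.g. backbone
n2, len2 = 31, 1_200_000     # e.g. strain-specific loops

# Conditionally on the total count, and under equal occurrence rates, the
# count in sequence 1 is Binomial(n1 + n2, len1 / (len1 + len2)).
p0 = len1 / (len1 + len2)
result = binomtest(n1, n1 + n2, p=p0, alternative="two-sided")
print(result.pvalue)    # small value -> rates differ between the sequences
```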
Cocco, Arturo; Serra, Giuseppe; Lentini, Andrea; Deliperi, Salvatore; Delrio, Gavino
2015-09-01
The within- and between-plant distribution of the tomato leafminer, Tuta absoluta (Meyrick), was investigated in order to define action thresholds based on leaf infestation and to propose enumerative and binomial sequential sampling plans for pest management applications in protected crops. The pest spatial distribution was aggregated between plants, and median leaves were the most suitable sample to evaluate the pest density. Action thresholds of 36 and 48%, 43 and 56% and 60 and 73% infested leaves, corresponding to economic thresholds of 1 and 3% damaged fruits, were defined for tomato cultivars with big, medium and small fruits respectively. Green's method was a more suitable enumerative sampling plan as it required a lower sampling effort. Binomial sampling plans needed lower average sample sizes than enumerative plans to make a treatment decision, with probabilities of error of <0.10. The enumerative sampling plan required 87 or 343 leaves to estimate the population density in extensive or intensive ecological studies respectively. Binomial plans would be more practical and efficient for control purposes, needing average sample sizes of 17, 20 and 14 leaves to take a pest management decision in order to avoid fruit damage higher than 1% in cultivars with big, medium and small fruits respectively. © 2014 Society of Chemical Industry.
Syndromic surveillance for health information system failures: a feasibility study
Ong, Mei-Sing; Magrabi, Farah; Coiera, Enrico
2013-01-01
Objective To explore the applicability of a syndromic surveillance method to the early detection of health information technology (HIT) system failures. Methods A syndromic surveillance system was developed to monitor a laboratory information system at a tertiary hospital. Four indices were monitored: (1) total laboratory records being created; (2) total records with missing results; (3) average serum potassium results; and (4) total duplicated tests on a patient. The goal was to detect HIT system failures causing: data loss at the record level; data loss at the field level; erroneous data; and unintended duplication of data. Time-series models of the indices were constructed, and statistical process control charts were used to detect unexpected behaviors. The ability of the models to detect HIT system failures was evaluated using simulated failures, each lasting for 24 h, with error rates ranging from 1% to 35%. Results In detecting data loss at the record level, the model achieved a sensitivity of 0.26 when the simulated error rate was 1%, while maintaining a specificity of 0.98. Detection performance improved with increasing error rates, achieving a perfect sensitivity when the error rate was 35%. In the detection of missing results, erroneous serum potassium results and unintended repetition of tests, perfect sensitivity was attained when the error rate was as small as 5%. Decreasing the error rate to 1% resulted in a drop in sensitivity to 0.65–0.85. Conclusions Syndromic surveillance methods can potentially be applied to monitor HIT systems, to facilitate the early detection of failures. PMID:23184193
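A stripped-down stand-in for the monitoring idea above: a Poisson c-chart on hourly record counts, with 3-sigma control limits estimated from a training window. The paper fits full time-series models; this stationary toy version only illustrates the alarm logic:

```python
import numpy as np

# Hypothetical hourly counts of laboratory records created.
rng = np.random.default_rng(2)
baseline = rng.poisson(200, size=24 * 14)       # two weeks of training data
mu = baseline.mean()
ucl = mu + 3 * np.sqrt(mu)                      # upper control limit
lcl = max(mu - 3 * np.sqrt(mu), 0)              # lower control limit

monitored = rng.poisson(200, size=24)
monitored[10:14] = rng.poisson(120, size=4)     # simulated record-loss failure

alarms = np.where((monitored > ucl) | (monitored < lcl))[0]
print(f"control limits: [{lcl:.1f}, {ucl:.1f}], alarm hours: {alarms}")
```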
Patterns of recurrence after trimodality therapy for esophageal cancer.
Dorth, Jennifer A; Pura, John A; Palta, Manisha; Willett, Christopher G; Uronis, Hope E; D'Amico, Thomas A; Czito, Brian G
2014-07-15
Patterns of failure after neoadjuvant chemoradiotherapy and surgery for esophageal cancer are poorly defined. All patients in the current study were treated with trimodality therapy for nonmetastatic esophageal cancer from 1995 to 2009. Locoregional failure included lymph node failure (NF), anastomotic failure, or both. Abdominal paraaortic failure (PAF) was defined as disease recurrence at or below the superior mesenteric artery. Among 155 patients, the primary tumor location was the upper/middle esophagus in 18%, the lower esophagus in 32%, and the gastroesophageal junction in 50% (adenocarcinoma in 79% and squamous cell carcinoma in 21%) of patients. Staging methods included endoscopic ultrasound (73%), computed tomography (46%), and positron emission tomography/computed tomography (54%). Approximately 40% of patients had American Joint Committee on Cancer stage II disease and 60% had stage III disease. The median follow-up was 1.3 years. The 2-year locoregional control, event-free survival, and overall survival rates were 86%, 36%, and 48%, respectively. The 2-year NF rate was 14%, the isolated NF rate was 3%, and the anastomotic failure rate was 6%. The 2-year PAF rate was 9% and the isolated PAF rate was 5%. PAF was found to be increased among patients with gastroesophageal junction tumors (12% vs 6%), especially for the subset with ≥ 2 clinically involved lymph nodes at the time of diagnosis (19% vs 4%). Few patients experience isolated NF or PAF as their first disease recurrence. Therefore, it is unlikely that targeting additional regional lymph node basins with radiotherapy would significantly improve clinical outcomes. © 2014 American Cancer Society.
[Assessment of medical management of heart failure at National Hospital Blaise COMPAORE].
Kambiré, Y; Konaté, L; Diallo, I; Millogo, G R C; Kologo, K J; Tougouma, J B; Samadoulougou, A K; Zabsonré, P
2018-05-09
The aim of this study was to assess the quality of medical management of heart failure at the National Hospital Blaise Compaoré against international guidelines. A retrospective study was performed including consecutive patients admitted for echocardiographically documented heart failure from October 2012 to March 2015 in the Medicine and Medical Specialties Department of the National Hospital Blaise Compaoré, with a minimum follow-up of six weeks. Data analysis was performed with SPSS 20.0 software. Eighty-four patients, with a mean age of 57.61±18.24 years, were included. Acute heart failure was present in 84.5% of patients, with impaired left ventricular systolic function in 77.4%. The prescription rates of the main drugs for heart failure of any type were 88.1% for loop diuretics, 77.1% for angiotensin-converting enzyme inhibitors/angiotensin receptor blockers, and 65.5% for beta-blockers. Among patients with systolic dysfunction, 84.62% received angiotensin-converting enzyme inhibitors/angiotensin receptor blockers and 75.38% beta-blockers. Exercise rehabilitation was under way in 10.7% of patients. The death rate was 16.7% and the hospital readmission rate 16.7%. The prescription rate of major heart failure drugs is satisfactory. Cardiac rehabilitation should be developed. Copyright © 2018 Elsevier Masson SAS. All rights reserved.
Skarupskiene, Inga; Kuzminskis, Vytautas; Ziginskiene, Edita
2007-01-01
The aim of this study was to determine the frequency, etiology, and outcomes of acute renal failure. We retrospectively collected data on all patients (n=1653) who received renal replacement therapy for acute renal failure at the Kaunas University of Medicine Hospital during 1995-2006. The number of patients with acute renal failure increased ninefold during the 11-year period. The mean age of patients was 59.76±17.52 years and increased from 44.97±17.1 years in 1995 to 62.84±16.49 years in 2006. The most common causes of acute renal failure were renal (n=646, 39%), prerenal (n=380, 23%), and obstructive (n=145, 9%). Renal replacement therapy was discontinued because of recovery of renal function in 49.9% of cases. The overall hospital mortality rate was 45.1%. Renal function did not recover in 6.7% of patients. The mortality rate over the 11-year period varied from 37.8 to 57.5%. The highest mortality rates were in the neurosurgical (62.3%) and cardiac surgical (61.8%) intensive care units. A high mortality rate (more than 50%) was found in the groups of patients whose acute renal failure was caused by hepatorenal syndrome, shock, sepsis, and reduced cardiac output.
PREDICE score as a predictor of 90 days mortality in patients with heart failure
NASA Astrophysics Data System (ADS)
Purba, D. P. S.; Hasan, R.
2018-03-01
Hospitalization in chronic heart failure patients is associated with high mortality and morbidity rates. The 90-day post-discharge period following hospitalization in heart failure patients, known as the vulnerable phase, carries a high risk of poor outcomes. Identification of high-risk individuals through prognostic evaluation is intended to allow closer and more intensive follow-up, decreasing the morbidity and mortality of heart failure. To determine whether the PREDICE score can predict mortality within 90 days in patients with heart failure, we conducted an observational cohort study in patients with heart failure who were hospitalized due to worsening chronic heart failure. Patients were followed up for up to 90 days after initial evaluation, with death as the primary endpoint. We found a statistically significant difference in PREDICE score between the survival and mortality groups (p=0.001; 84%, 95% CI: 60.9%-97.4%). In conclusion, the PREDICE score has a good ability to predict mortality within 90 days in patients with heart failure.
Implantable Hemodynamic Monitoring for Heart Failure Patients.
Abraham, William T; Perl, Leor
2017-07-18
Rates of heart failure hospitalization remain unacceptably high. Such hospitalizations are associated with substantial patient, caregiver, and economic costs. Randomized controlled trials of noninvasive telemedical systems have failed to demonstrate reduced rates of hospitalization. The failure of these technologies may be due to the limitations of the signals measured. Intracardiac and pulmonary artery pressure-guided management has become a focus of hospitalization reduction in heart failure. Early studies using implantable hemodynamic monitors demonstrated the potential of pressure-based heart failure management, whereas subsequent studies confirmed the clinical utility of this approach. One large pivotal trial proved the safety and efficacy of pulmonary artery pressure-guided heart failure management, showing a marked reduction in heart failure hospitalizations in patients randomized to active pressure-guided management. "Next-generation" implantable hemodynamic monitors are in development, and novel approaches for the use of this data promise to expand the use of pressure-guided heart failure management. Copyright © 2017 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Motyka, P.
1983-01-01
A methodology is developed and applied for quantitatively analyzing the reliability of a dual, fail-operational redundant strapdown inertial measurement unit (RSDIMU). A Markov evaluation model is defined in terms of the operational states of the RSDIMU to predict system reliability. A 27-state model is defined based upon a candidate redundancy management system which can detect and isolate a spectrum of failure magnitudes. The results of parametric studies are presented which show the effect on reliability of the gyro failure rate, both the gyro and accelerometer failure rates together, false alarms, probability of failure detection, probability of failure isolation, probability of damage effects, and mission time. A technique is developed and evaluated for generating dynamic thresholds for detecting and isolating failures of the dual, separated IMU. Special emphasis is given to the detection of multiple, nonconcurrent failures. Digital simulation time histories are presented which show the thresholds obtained and their effectiveness in detecting and isolating sensor failures.
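A toy illustration of the Markov-evaluation idea with a 3-state model (not the paper's 27-state RSDIMU model): build a generator matrix from assumed failure and coverage rates and obtain state probabilities via the matrix exponential.

```python
import numpy as np
from scipy.linalg import expm

# Toy 3-state Markov model: 0 = fully operational, 1 = one sensor failed
# (detected and isolated), 2 = system failure. Rates are per hour and
# purely illustrative.
lam = 1e-4          # assumed sensor failure rate
cov = 0.95          # assumed probability a failure is detected and isolated
Q = np.array([
    [-lam,  cov * lam,  (1 - cov) * lam],
    [0.0,  -lam,         lam           ],
    [0.0,   0.0,         0.0           ],
])  # generator matrix; each row sums to zero

p0 = np.array([1.0, 0.0, 0.0])       # start fully operational
for t in (10.0, 100.0, 1000.0):
    p_t = p0 @ expm(Q * t)           # state probabilities at time t
    print(f"t = {t:6.0f} h, reliability = {1.0 - p_t[2]:.6f}")
```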
Miller, Preston R; Chang, Michael C; Hoth, J Jason; Mowery, Nathan T; Hildreth, Amy N; Martin, R Shayn; Holmes, James H; Meredith, J Wayne; Requarth, Jay A
2014-04-01
Nonoperative management (NOM) of blunt splenic injury is well accepted. Substantial failure rates in higher injury grades remain common, with one large study reporting rates of 19.6%, 33.3%, and 75% for grades III, IV, and V, respectively. Retrospective data show angiography and embolization can increase salvage rates in these severe injuries. We developed a protocol requiring referral of all blunt splenic injuries, grades III to V, without indication for immediate operation for angiography and embolization. We hypothesized that angiography and embolization of high-grade blunt splenic injury would reduce NOM failure rates in this population. This was a prospective study at our Level I trauma center as part of a performance-improvement project. Demographics, injury characteristics, and outcomes were compared with historic controls. The protocol required all stable patients with grade III to V splenic injuries be referred for angiography and embolization. In historic controls, referral was based on surgeon preference. From January 1, 2010 to December 31, 2012, there were 168 patients with grades III to V spleen injuries admitted; NOM was undertaken in 113 (67%) patients. The protocol was followed in 97 patients, with a failure rate of 5%. Failure rate in the 16 protocol deviations was 25% (p = 0.02). Historic controls from January 1, 2007 to December 31, 2009 were compared with the protocol group. One hundred and fifty-three patients with grade III to V injuries were admitted during this period, 80 (52%) patients underwent attempted NOM. Failure rate was significantly higher than for the protocol group (15%, p = 0.04). Use of a protocol requiring angiography and embolization for all high-grade spleen injuries slated for NOM leads to a significantly decreased failure rate. We recommend angiography and embolization as an adjunct to NOM for all grade III to V splenic injuries. Copyright © 2014 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
Ceramic capacitor insulation resistance failures accelerated by low voltage
NASA Technical Reports Server (NTRS)
Brennan, T. F.
1978-01-01
Ceramic capacitors failed insulation resistance testing at less than one-tenth their rated voltage. Many failures recovered as the voltage was increased. Comprehensive failure analysis techniques, some of which are unprecedented, were used to examine these failures. It was determined that there was more than one failure mechanism, and the results indicate a need for special additional screening.
Matrix Dominated Failure of Fiber-Reinforced Composite Laminates Under Static and Dynamic Loading
NASA Astrophysics Data System (ADS)
Schaefer, Joseph Daniel
Hierarchical material systems provide the unique opportunity to connect material knowledge to solving specific design challenges. Representing the quickest-growing class of hierarchical materials in use, fiber-reinforced polymer composites (FRPCs) offer superior strength- and stiffness-to-weight ratios, damage tolerance, and decreasing production costs compared to metals and alloys. However, the implementation of FRPCs has historically been hampered by inadequate knowledge of material failure behavior, due to incomplete verification of recent computational constitutive models and improper (or non-existent) experimental validation, which has severely slowed creation and development. As noted by the recent Materials Genome Initiative and the Worldwide Failure Exercise, current state-of-the-art qualification programs endure a 20-year gap between material conceptualization and implementation due to the lack of effective partnership between computational coding (simulation) and experimental characterization. Qualification processes are primarily experiment driven; the anisotropic nature of composites predisposes matrix-dominant properties to be sensitive to strain rate, which necessitates extensive testing. To decrease the qualification time, a framework that practically combines theoretical prediction of material failure with limited experimental validation is required. In this work, the Northwestern Failure Theory (NU Theory) for composite lamina is presented as the theoretical basis from which the failure of unidirectional and multidirectional composite laminates is investigated. From an initial experimental characterization of basic lamina properties, the NU Theory is employed to predict the matrix-dependent failure of composites under any state of biaxial stress at strain rates from quasi-static to 1000 s^-1. It was found that the number of experiments required to characterize the strain-rate-dependent failure of a new composite material was reduced by an order of magnitude, and the resulting strain-rate dependence was applicable to a large class of materials. The presented framework provides engineers with the capability to quickly identify fiber and matrix combinations for a given application and determine the failure behavior over the range of practical loading cases. The failure-mode-based NU Theory may be especially useful when partnered with computational approaches (which often employ micromechanics to determine constituent and constitutive response) to provide accurate validation of the matrix-dominated failure modes experienced by laminates during progressive failure.
Dahabreh, Issa J; Trikalinos, Thomas A; Lau, Joseph; Schmid, Christopher H
2017-03-01
To compare statistical methods for meta-analysis of sensitivity and specificity of medical tests (e.g., diagnostic or screening tests). We constructed a database of PubMed-indexed meta-analyses of test performance from which 2 × 2 tables for each included study could be extracted. We reanalyzed the data using univariate and bivariate random effects models fit with inverse variance and maximum likelihood methods. Analyses were performed using both normal and binomial likelihoods to describe within-study variability. The bivariate model using the binomial likelihood was also fit using a fully Bayesian approach. We use two worked examples (thoracic computerized tomography to detect aortic injury and rapid prescreening of Papanicolaou smears to detect cytological abnormalities) to highlight that different meta-analysis approaches can produce different results. We also present results from reanalysis of 308 meta-analyses of sensitivity and specificity. Models using the normal approximation produced sensitivity and specificity estimates closer to 50% and smaller standard errors compared to models using the binomial likelihood; absolute differences of 5% or greater were observed in 12% and 5% of meta-analyses for sensitivity and specificity, respectively. Results from univariate and bivariate random effects models were similar, regardless of estimation method. Maximum likelihood and Bayesian methods produced almost identical summary estimates under the bivariate model; however, Bayesian analyses indicated greater uncertainty around those estimates. Bivariate models produced imprecise estimates of the between-study correlation of sensitivity and specificity. Differences between methods were larger with an increasing proportion of studies that were small or required a continuity correction. The binomial likelihood should be used to model within-study variability. Univariate and bivariate models give similar estimates of the marginal distributions for sensitivity and specificity. Bayesian methods fully quantify uncertainty, and their ability to incorporate external evidence may be useful for imprecisely estimated parameters. Copyright © 2017 Elsevier Inc. All rights reserved.
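A tiny simulation of the paper's central contrast: pooling logit-transformed proportions under the normal approximation (with continuity correction) versus pooling on the exact binomial likelihood, here reduced to the common-effect case. With high sensitivity and small studies, the normal approximation is pulled toward 50%:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate 10 small studies of a test with true sensitivity 0.95.
n = rng.integers(15, 30, size=10)      # diseased subjects per study
tp = rng.binomial(n, 0.95)             # true positives per study

# (a) Inverse-variance pooling of logit proportions with a 0.5 continuity
#     correction (the "normal likelihood" approach).
p_cc = (tp + 0.5) / (n + 1.0)
logit = np.log(p_cc / (1 - p_cc))
w = 1.0 / (1.0 / (tp + 0.5) + 1.0 / (n - tp + 0.5))   # inverse variances
pooled_normal = 1 / (1 + np.exp(-(w * logit).sum() / w.sum()))

# (b) Pooling on the exact binomial likelihood; in the common-effect case
#     this is just the overall proportion.
pooled_binomial = tp.sum() / n.sum()

print(f"normal approximation: {pooled_normal:.3f}")
print(f"binomial likelihood:  {pooled_binomial:.3f}")
```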
Shirazi, Mohammadali; Dhavala, Soma Sekhar; Lord, Dominique; Geedipally, Srinivas Reddy
2017-10-01
Safety analysts usually use post-modeling methods, such as Goodness-of-Fit statistics or the Likelihood Ratio Test, to decide between two or more competing distributions or models. Such metrics require all competing distributions to be fitted to the data before any comparisons can be made. Given the continuous growth in the number of newly introduced statistical distributions, choosing the best one using such post-modeling methods is not a trivial task, in addition to all the theoretical or numerical issues the analyst may face during the analysis. Furthermore, and most importantly, these measures or tests do not provide any intuition into why a specific distribution (or model) is preferred over another (Goodness-of-Logic). This paper addresses these issues by proposing a methodology for designing Model Selection heuristics based on the characteristics of the data, in terms of descriptive summary statistics, before fitting the models. The proposed methodology employs two analytic tools: (1) Monte Carlo simulations and (2) machine learning classifiers, to design easy heuristics to predict the label of the 'most-likely-true' distribution for analyzing data. The proposed methodology was applied to investigate when the recently introduced Negative Binomial Lindley (NB-L) distribution is preferred over the Negative Binomial (NB) distribution. Heuristics were designed to select the 'most-likely-true' distribution between these two distributions, given a set of prescribed summary statistics of the data. The proposed heuristics were successfully compared against classical tests for several real or observed datasets. Not only are they easy to use and free of any post-modeling inputs, but, using these heuristics, the analyst can also gain useful insight into why the NB-L is preferred over the NB, or vice versa, when modeling data. Copyright © 2017 Elsevier Ltd. All rights reserved.
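A skeleton of the proposed pipeline: simulate labeled samples from candidate count distributions, summarize each sample with descriptive statistics, and train a classifier whose rules act as the heuristic. Because sampling the NB-Lindley is beyond this sketch, Poisson vs negative binomial stand in for the NB vs NB-L pair:

```python
import numpy as np
from scipy import stats
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(4)

def summaries(y):
    """Summary statistics of a count sample used as classifier features."""
    return [y.mean(), y.var() / y.mean(), stats.skew(y), np.mean(y == 0)]

# Monte Carlo training sets from two candidate count distributions.
X, labels = [], []
for _ in range(2000):
    mu = rng.uniform(0.5, 10.0)
    if rng.random() < 0.5:
        y, lab = rng.poisson(mu, 200), "Poisson"
    else:
        lam = rng.gamma(2.0, mu / 2.0, 200)   # gamma-Poisson mixture => NB
        y, lab = rng.poisson(lam), "NB"
    X.append(summaries(y))
    labels.append(lab)

# A shallow tree yields human-readable decision rules (the "heuristic").
clf = DecisionTreeClassifier(max_depth=2).fit(X, labels)
print(export_text(clf, feature_names=["mean", "VMR", "skew", "p_zero"]))
```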
Modeling avian abundance from replicated counts using binomial mixture models
Kery, Marc; Royle, J. Andrew; Schmid, Hans
2005-01-01
Abundance estimation in ecology is usually accomplished by capture–recapture, removal, or distance sampling methods. These may be hard to implement at large spatial scales. In contrast, binomial mixture models enable abundance estimation without individual identification, based simply on temporally and spatially replicated counts. Here, we evaluate mixture models using data from the national breeding bird monitoring program in Switzerland, where some 250 1-km2 quadrats are surveyed using the territory mapping method three times during each breeding season. We chose eight species with contrasting distribution (wide–narrow), abundance (high–low), and detectability (easy–difficult). Abundance was modeled as a random effect with a Poisson or negative binomial distribution, with mean affected by forest cover, elevation, and route length. Detectability was a logit-linear function of survey date, survey date-by-elevation, and sampling effort (time per transect unit). Resulting covariate effects and parameter estimates were consistent with expectations. Detectability per territory (for three surveys) ranged from 0.66 to 0.94 (mean 0.84) for easy species, and from 0.16 to 0.83 (mean 0.53) for difficult species, depended on survey effort for two easy and all four difficult species, and changed seasonally for three easy and three difficult species. Abundance was positively related to route length in three high-abundance and one low-abundance (one easy and three difficult) species, and increased with forest cover in five forest species, decreased for two nonforest species, and was unaffected for a generalist species. Abundance estimates under the most parsimonious mixture models were between 1.1 and 8.9 (median 1.8) times greater than estimates based on territory mapping; hence, three surveys were insufficient to detect all territories for each species. We conclude that binomial mixture models are an important new approach for estimating abundance corrected for detectability when only repeated-count data are available. Future developments envisioned include estimation of trend, occupancy, and total regional abundance.
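The core of the binomial mixture model is a likelihood that marginalizes the latent abundance N out of repeated counts. A minimal sketch without covariates (a Poisson prior on N and constant detection p; the paper's models add covariates on both):

```python
import numpy as np
from scipy import stats, optimize

def nmix_nll(params, Y, n_max=100):
    """Negative log-likelihood of a binomial mixture (N-mixture) model:
    N_i ~ Poisson(lam), y_it | N_i ~ Binomial(N_i, p), N_i marginalized."""
    lam = np.exp(params[0])                      # enforce lam > 0
    p = 1.0 / (1.0 + np.exp(-params[1]))         # enforce 0 < p < 1
    ll = 0.0
    for y in Y:                                  # one row of counts per site
        N = np.arange(y.max(), n_max + 1)        # feasible abundances
        site_lik = stats.poisson.pmf(N, lam)
        for yt in y:                             # repeated surveys
            site_lik = site_lik * stats.binom.pmf(yt, N, p)
        ll += np.log(site_lik.sum() + 1e-300)    # guard against underflow
    return -ll

# Simulated data: 150 sites, 3 surveys, lambda = 5, detection p = 0.4.
rng = np.random.default_rng(5)
N_true = rng.poisson(5.0, 150)
Y = rng.binomial(N_true[:, None], 0.4, size=(150, 3))

fit = optimize.minimize(nmix_nll, x0=[0.0, 0.0], args=(Y,), method="Nelder-Mead")
lam_hat = np.exp(fit.x[0])
p_hat = 1.0 / (1.0 + np.exp(-fit.x[1]))
print(f"lambda_hat = {lam_hat:.2f}, p_hat = {p_hat:.2f}")
```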
Kuo, Lindsay E; Kaufman, Elinore; Hoffman, Rebecca L; Pascual, Jose L; Martin, Niels D; Kelz, Rachel R; Holena, Daniel N
2017-03-01
Failure-to-rescue is defined as the conditional probability of death after a complication, and the failure-to-rescue rate reflects a center's ability to successfully "rescue" patients after complications. The validity of the failure-to-rescue rate as a quality measure is dependent on the preventability of death and the appropriateness of this measure for use in the trauma population is untested. We sought to evaluate the relationship between preventability and failure-to-rescue in trauma. All adjudications from a mortality review panel at an academic level I trauma center from 2005-2015 were merged with registry data for the same time period. The preventability of each death was determined by panel consensus as part of peer review. Failure-to-rescue deaths were defined as those occurring after any registry-defined complication. Univariate and multivariate logistic regression models between failure-to-rescue status and preventability were constructed and time to death was examined using survival time analyses. Of 26,557 patients, 2,735 (10.5%) had a complication, of whom 359 died for a failure-to-rescue rate of 13.2%. Of failure-to-rescue deaths, 272 (75.6%) were judged to be non-preventable, 65 (18.1%) were judged potentially preventable, and 22 (6.1%) were judged to be preventable by peer review. After adjusting for other patient factors, there remained a strong association between failure-to-rescue status and potentially preventable (odds ratio 2.32, 95% confidence interval, 1.47-3.66) and preventable (odds ratio 14.84, 95% confidence interval, 3.30-66.71) judgment. Despite a strong association between failure-to-rescue status and preventability adjudication, only a minority of deaths meeting the definition of failure to rescue were judged to be preventable or potentially preventable. Revision of the failure-to-rescue metric before use in trauma care benchmarking is warranted. Copyright © 2016 Elsevier Inc. All rights reserved.
Shan, Zhi; Wade, Kelly R; Schollum, Meredith L; Robertson, Peter A; Thambyah, Ashvin; Broom, Neil D
2017-10-01
Part I of this study explored mechanisms of disc failure in a complex posture incorporating physiological amounts of flexion and shear at a loading rate considerably lower than likely to occur in a typical in vivo manual handling situation. Given the strain-rate-dependent mechanical properties of the heavily hydrated disc, loading rate will likely influence the mechanisms of disc failure. Part II investigates the mechanisms of failure in healthy discs subjected to surprise-rate compression while held in the same complex posture. Thirty-seven motion segments from 13 healthy mature ovine lumbar spines were compressed in a complex posture intended to simulate the situation arising when bending and twisting while lifting a heavy object, at a displacement rate of 400 mm/min. Seven of the 37 samples reached the predetermined displacement prior to a reduction in load and were classified as early-stage failures, providing insight into initial areas of disc disruption. Both groups of damaged discs were then analysed microstructurally using light microscopy. The average failure load under high-rate complex loading was 6.96 kN (SD 1.48 kN), statistically significantly lower than for low-rate complex loading [8.42 kN (SD 1.22 kN)]. Also, unlike simple flexion or low-rate complex loading, direct radial ruptures and non-continuous mid-wall tearing in the posterior and posterolateral regions were commonly accompanied by disruption extending to the lateral and anterior disc. This study has again shown that multiple modes of damage are common when compressing a segment in a complex posture, and that the load-bearing ability, already less than in a neutral or flexed posture, is further compromised by high-rate complex loading.
Patterns of gun deaths across US counties 1999-2013.
Kalesan, Bindu; Galea, Sandro
2017-05-01
We examined the socio-demographic distribution of gun deaths across 3143 counties in the 50 United States to understand the spatial patterns and correlates of high and low gun deaths. We used aggregate counts of gun deaths and population in all counties from 1999 to 2013 from the Centers for Disease Control and Prevention's Wide-ranging Online Data for Epidemiologic Research (WONDER). We characterized four distinct levels of gun death rates (relatively safe, unsafe, violent, and extremely violent counties) based on quartiles of 15-year county-specific gun death rates per 100,000, and used negative binomial regression models allowing clustering by state to calculate incidence rate ratios and 95% confidence intervals (95% CIs). Most states had at least one violent or extremely violent county. Extremely violent gun counties were mostly rural, poor, predominantly minority, and had high unemployment and homicide rates. Overall, homicide rate was significantly associated with gun deaths (incidence rate ratio = 1.08, 95% CI = 1.06-1.09). In relatively safe counties, this risk was 1.09 (95% CI = 1.05-1.13), and in extremely violent gun counties it was 1.03 (95% CI = 1.03-1.04). There are broad differences in gun death rates across the United States, representing different levels of gun death rates in each state with distinct socio-demographic profiles. Copyright © 2017 Elsevier Inc. All rights reserved.
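A sketch of the kind of model described above: negative binomial regression of county gun deaths with a log-population offset and state-clustered standard errors, on invented data. The covariate set, dispersion value, and effect size are placeholders, and statsmodels is one possible implementation:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Toy county-level data (hypothetical): gun deaths, population, homicide
# rate, and state membership for clustering.
rng = np.random.default_rng(6)
n = 600
df = pd.DataFrame({
    "state": rng.integers(0, 50, n),
    "pop": rng.integers(5_000, 500_000, n),
    "homicide_rate": rng.gamma(2.0, 2.0, n),
})
true_rate = 1e-4 * np.exp(0.08 * df["homicide_rate"])
df["gun_deaths"] = rng.poisson(rng.gamma(2.0, (df["pop"] * true_rate) / 2.0))

res = smf.glm("gun_deaths ~ homicide_rate", data=df,
              family=sm.families.NegativeBinomial(alpha=0.5),
              offset=np.log(df["pop"])).fit(
                  cov_type="cluster", cov_kwds={"groups": df["state"]})
irr = np.exp(res.params["homicide_rate"])
print(f"IRR per unit homicide rate = {irr:.3f}")
print(np.exp(res.conf_int().loc["homicide_rate"]))   # 95% CI
```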
Failure mechanisms of fibrin-based surgical tissue adhesives
NASA Astrophysics Data System (ADS)
Sierra, David Hugh
A series of studies was performed to investigate the potential impact of heterogeneity in the matrix of multiple-component fibrin-based tissue adhesives upon their mechanical and biomechanical properties both in vivo and in vitro. Investigations into the failure mechanisms by stereological techniques demonstrated that heterogeneity could be measured quantitatively and that the variation in heterogeneity could be altered both by the means of component mixing and delivery and by the formulation of the sealant. Ex vivo tensile adhesive strength was found to be inversely proportional to the amount of heterogeneity. In contrast, in vivo tensile wound-closure strength was found to be relatively unaffected by the degree of heterogeneity, while in vivo parenchymal organ hemostasis in rabbits was found to be affected: greater heterogeneity appeared to correlate with an increase in hemostasis time and the amount of sealant necessary to effect hemostasis. Tensile testing of the bulk sealant showed that mechanical parameters were proportional to fibrin concentration and that the physical characteristics of the failure supported a ductile mechanism. Strain hardening as a function of percentage of strain and strain rate was observed for both concentrations, and syneresis was observed at low strain rates for the lower fibrin concentration. Blister testing demonstrated that burst pressure and failure energy were proportional to fibrin concentration and decreased with increasing flow rate. The higher fibrin concentration demonstrated predominantly compact-morphology debonds with cohesive failure loci, demonstrating shear or viscous failure in a viscoelastic rubbery adhesive. The lower fibrin concentration sealant exhibited predominantly fractal-morphology debonds with cohesive failure loci, supporting an elastoviscous material condition. The failure mechanism for these was hypothesized and shown to be flow-induced ductile fracture. Based on these findings, the failure mechanism was stochastic in nature because the mean failure energy and burst pressure values were not predictive of locus and morphology. Instead, flow rate and fibrin concentration showed the most predictive value, with the outcome best described as a probability distribution rather than a specific deterministic outcome.
NASA Astrophysics Data System (ADS)
Zhang, Z.; Cho, H. M.; Platnick, S. E.; Meyer, K.; Lebsock, M. D.
2014-12-01
The cloud optical thickness (τ) and droplet effective radius (re) are two key cloud parameters retrieved by MODIS (Moderate Resolution Imaging Spectroradiometer). These MODIS cloud products are widely used in a broad range of earth system science applications. In this paper, we present a comprehensive analysis of the failed cloud τ and/or re retrievals for liquid-phase clouds over ocean in the Collection 6 MODIS cloud product. The main findings from this study are summarized as follows: MODIS retrieval failure rates for marine boundary layer (MBL) clouds have a strong dependence on the spectral combination used for retrieval (e.g., 0.86 + 2.1 µm vs. 0.86 + 3.7 µm) and the cloud morphology (i.e., "good" pixels vs. partly cloudy (PCL) pixels). Combining all clear-sky-restoral (CSR) categories (CSR = 0, 1, and 3), the 0.86 + 2.1 µm and 0.86 + 3.7 µm spectral combinations have overall failure rates of about 20% and 12%, respectively. The PCL pixels (CSR = 1 and 3) have significantly higher failure rates and contribute more to the total failure population than the "good" (CSR = 0) pixels. The majority of the failed retrievals are caused by the re too large failure, which explains about 85% and 70% of the failed 0.86 + 2.1 µm and 0.86 + 3.7 µm retrievals, respectively. The remaining failures are due either to the re too small failure or to τ retrieval failure. The geographical distribution of failure rates has a significant dependence on cloud regime: lower over the coastal stratocumulus cloud regime and higher over the broken trade-wind cumulus cloud regime over open oceans. Enhanced retrieval failure rates are found when MBL clouds have high sub-pixel inhomogeneity, or are located at special Sun-satellite viewing geometries, such as sunglint, large viewing or solar zenith angles, or cloudbow and glory angles, or are subject to cloud masking, cloud overlapping, and/or cloud phase retrieval issues. About 80% of the failed retrievals can be attributed to one or more of the potential reasons mentioned above. Collocated radar reflectivity observations from CloudSat suggest that the remaining 20% are unlikely to be retrieval artifacts, but reflect true cloud microphysics, i.e., the true re is either very small or very large.
Construction of moment-matching multinomial lattices using Vandermonde matrices and Gröbner bases
NASA Astrophysics Data System (ADS)
Lundengård, Karl; Ogutu, Carolyne; Silvestrov, Sergei; Ni, Ying; Weke, Patrick
2017-01-01
In order to describe and analyze the quantitative behavior of stochastic processes, such as the process followed by a financial asset, various discretization methods are used. One such set of methods is lattice models, where a time interval is divided into equal time steps and the rate of change for the process is restricted to a particular set of values in each time step. The well-known binomial and trinomial models are the most commonly used in applications, although several kinds of higher order models have also been examined. Here we will examine various ways of designing higher order lattice schemes with different node placements in order to guarantee moment matching with the process.
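For fixed node placements, the moment-matching conditions form a linear Vandermonde system in the branch probabilities. The sketch below solves that system for an illustrative trinomial step matching a Brownian increment; the node spacing, drift, and volatility are assumptions, and the paper's Gröbner-basis treatment of the general case is not reproduced.

```python
import numpy as np

def lattice_probabilities(nodes, moments):
    """Solve the Vandermonde system V p = m for branch probabilities,
    where V[k, i] = nodes[i]**k and m[k] is the target k-th raw moment."""
    V = np.vander(nodes, increasing=True).T
    return np.linalg.solve(V, moments)

# Trinomial step matching a Brownian increment: drift mu*dt, variance sigma^2*dt.
mu, sigma, dt = 0.05, 0.2, 1.0 / 252
a = sigma * np.sqrt(3.0 * dt)                  # illustrative node spacing
nodes = np.array([-a, 0.0, a])
m = np.array([1.0, mu * dt, sigma**2 * dt + (mu * dt) ** 2])  # E[1], E[X], E[X^2]
p = lattice_probabilities(nodes, m)
print(p, p.sum())                               # nonnegative, sums to 1
```

With this spacing the middle branch carries probability near 2/3, a standard choice; higher order multinomial schemes simply enlarge the node set and the list of matched moments.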
Factors affecting road mortality of white-tailed deer in eastern South Dakota
Grovenburg, Troy W.; Jenks, Jonathan A.; Klaver, Robert W.; Monteith, Kevin L.; Galster, Dwight H.; Schauer, Ron J.; Morlock, Wilbert W.; Delger, Joshua A.
2008-01-01
White-tailed deer (Odocoileus virginianus) mortalities (n = 4,433) caused by collisions with automobiles during 2003 were modeled in 35 counties in eastern South Dakota. Seventeen independent variables and 5 independent variable interactions were evaluated to explain deer mortalities. A negative binomial regression model (Ln Y = 1.25 – 0.12 [percentage tree coverage] + 0.0002 [county area] + 5.39 [county hunter success rate] + 0.0023 [vehicle proxy 96–104 km/hr roads]; model deviance = 33.43, χ2 = 27.53, df = 27) was chosen using a combination of a priori model selection and AICc. Management options include use of the model to predict road mortalities and an increase in the number of hunting licenses, which could result in fewer deer-vehicle collisions (DVCs).
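As a rough illustration of how such a fitted log-linear model yields county-level predictions, the sketch below evaluates the abstract's reported coefficients for a hypothetical county; all covariate values are invented and follow the bracketed labels only loosely.

```python
import numpy as np

# Evaluate the abstract's fitted negative binomial mean for a hypothetical
# county; every input value below is made up for illustration.
def predicted_mortalities(tree_pct, area, hunter_success, vehicle_proxy):
    ln_y = (1.25 - 0.12 * tree_pct + 0.0002 * area
            + 5.39 * hunter_success + 0.0023 * vehicle_proxy)
    return np.exp(ln_y)

print(predicted_mortalities(tree_pct=5.0, area=2000.0,
                            hunter_success=0.5, vehicle_proxy=500.0))
```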
NASA Astrophysics Data System (ADS)
Beach, Shaun E.; Semkow, Thomas M.; Remling, David J.; Bradt, Clayton J.
2017-07-01
We have developed accessible methods to demonstrate fundamental statistics in several phenomena, in the context of teaching electronic signal processing in a physics-based college-level curriculum. A relationship between the exponential time-interval distribution and Poisson counting distribution for a Markov process with constant rate is derived in a novel way and demonstrated using nuclear counting. Negative binomial statistics is demonstrated as a model for overdispersion and justified by the effect of electronic noise in nuclear counting. The statistics of digital packets on a computer network are shown to be compatible with the fractal-point stochastic process leading to a power-law as well as generalized inverse Gaussian density distributions of time intervals between packets.
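A minimal simulation of the two demonstrations described above, using synthetic draws in place of nuclear-counting data: exponential waiting times with constant rate produce Poisson counts (variance close to the mean), while letting the rate itself fluctuate (a gamma-mixed Poisson, i.e., negative binomial) produces overdispersion.

```python
import numpy as np

rng = np.random.default_rng(0)
rate, T, trials = 5.0, 1.0, 20000

# Constant-rate Markov process: exponential waiting times imply Poisson counts.
counts = np.empty(trials, dtype=int)
for i in range(trials):
    t, n = 0.0, 0
    while (t := t + rng.exponential(1.0 / rate)) <= T:
        n += 1
    counts[i] = n
print(counts.mean(), counts.var())     # both ~ rate*T for a Poisson process

# Overdispersion: let the rate itself fluctuate (gamma-mixed Poisson is
# negative binomial), as electronic noise would in nuclear counting.
mixed = rng.poisson(rng.gamma(shape=2.0, scale=rate * T / 2.0, size=trials))
print(mixed.mean(), mixed.var())       # variance now exceeds the mean
```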
Prediction of failure pressure and leak rate of stress corrosion.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Majumdar, S.; Kasza, K.; Park, J. Y.
2002-06-24
An ''equivalent rectangular crack'' approach was employed to predict rupture pressures and leak rates through laboratory generated stress corrosion cracks and steam generator tubes removed from the McGuire Nuclear Station. Specimen flaws were sized by post-test fractography in addition to a pre-test advanced eddy current technique. The predicted and observed test data on rupture and leak rate are compared. In general, the test failure pressures and leak rates are closer to those predicted on the basis of fractography than on nondestructive evaluation (NDE). However, the predictions based on NDE results are encouraging, particularly because they have the potential to determine a more detailed geometry of ligamented cracks, from which failure pressure and leak rate can be more accurately predicted.
Assuring reliability program effectiveness.
NASA Technical Reports Server (NTRS)
Ball, L. W.
1973-01-01
An attempt is made to provide simple identification and description of techniques that have proved to be most useful either in developing a new product or in improving reliability of an established product. The first reliability task is obtaining and organizing parts failure rate data. Other tasks are parts screening, tabulation of general failure rates, preventive maintenance, prediction of new product reliability, and statistical demonstration of achieved reliability. Five principal tasks for improving reliability involve the physics of failure research, derating of internal stresses, control of external stresses, functional redundancy, and failure effects control. A final task is the training and motivation of reliability specialist engineers.
Importance of teamwork, communication and culture on failure-to-rescue in the elderly.
Ghaferi, A A; Dimick, J B
2016-01-01
Surgical mortality increases significantly with age. Wide variations in mortality rates across hospitals suggest potential levers for improvement. Failure-to-rescue has been posited as a potential mechanism underlying these differences. A review was undertaken of the literature evaluating surgery, mortality, failure-to-rescue and the elderly. This was followed by a review of ongoing studies and unpublished work aiming to understand better the mechanisms underlying variations in surgical mortality in elderly patients. Multiple hospital macro-system factors, such as nurse staffing, available hospital technology and teaching status, are associated with differences in failure-to-rescue rates. There is emerging literature regarding important micro-system factors associated with failure-to-rescue. These are grouped into three broad categories: hospital resources, attitudes and behaviours. Ongoing work to produce interventions to reduce variations in failure-to-rescue rates include a focus on teamwork, communication and safety culture. Researchers are using novel mixed-methods approaches and theories adapted from organizational studies in high-reliability organizations in an effort to improve the care of elderly surgical patients. Although elderly surgical patients experience failure-to-rescue events at much higher rates than their younger counterparts, patient-level effects do not sufficiently explain these differences. Increased attention to the role of organizational dynamics in hospitals' ability to rescue these high-risk patients will establish high-yield interventions aimed at improving patient safety. © 2015 BJS Society Ltd Published by John Wiley & Sons Ltd.
Failure rate analysis of Goddard Space Flight Center spacecraft performance during orbital life
NASA Technical Reports Server (NTRS)
Norris, H. P.; Timmins, A. R.
1976-01-01
Space life performance data on 57 Goddard Space Flight Center spacecraft are analyzed from the standpoint of determining an appropriate reliability model and the associated reliability parameters. Data from published NASA reports, which cover the space performance of GSFC spacecraft launched in the 1960-1970 decade, form the basis of the analyses. The results of the analyses show that the time distribution of 449 malfunctions, of which 248 were classified as failures (not necessarily catastrophic), follow a reliability growth pattern that can be described with either the Duane model or a Weibull distribution. The advantages of both mathematical models are used in order to: identify space failure rates, observe chronological trends, and compare failure rates with those experienced during the prelaunch environmental tests of the flight model spacecraft.
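A sketch of the Duane-model side of such an analysis, assuming illustrative failure times rather than the GSFC data: cumulative failures follow N(t) ≈ a·t^b, so a straight-line fit on log-log axes recovers the growth parameters.

```python
import numpy as np

# Duane reliability-growth model: cumulative failures N(t) ~ a * t^b, i.e. a
# straight line on log-log axes. The failure times below are illustrative.
failure_times = np.array([5.0, 12, 30, 70, 160, 400, 900])   # hours in orbit
N = np.arange(1, failure_times.size + 1)                      # cumulative count
b, log_a = np.polyfit(np.log(failure_times), np.log(N), 1)
print(f"N(t) ~= {np.exp(log_a):.3f} * t^{b:.3f}")
# b < 1 means the instantaneous failure rate a*b*t^(b-1) declines with time,
# the reliability-growth pattern the study reports.
```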
Firearm Ownership and Violent Crime in the U.S.: An Ecologic Study.
Monuteaux, Michael C; Lee, Lois K; Hemenway, David; Mannix, Rebekah; Fleegler, Eric W
2015-08-01
Although some view the ownership of firearms as a deterrent to crime, the relationship between population-level firearm ownership rates and violent criminal perpetration is unclear. The purpose of this study is to test the association between state-level firearm ownership and violent crime. State-level rates of household firearm ownership and annual rates of criminal acts from 2001, 2002, and 2004 were analyzed in 2014. Firearm ownership rates were taken from a national survey and crime data were taken from the Federal Bureau of Investigation Uniform Crime Reports. Rates of criminal behavior were estimated as a function of household gun ownership using negative binomial regression models, controlling for several demographic factors. Higher levels of firearm ownership were associated with higher levels of firearm assault and firearm robbery. There was also a significant association between firearm ownership and firearm homicide, as well as overall homicide. The findings do not support the hypothesis that higher population firearm ownership rates reduce firearm-associated criminal perpetration. On the contrary, evidence shows that states with higher levels of firearm ownership have an increased risk for violent crimes perpetrated with a firearm. Public health stakeholders should consider the outcomes associated with private firearm ownership. Copyright © 2015 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
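For readers unfamiliar with the method, here is a minimal negative binomial regression in the same spirit, fit to synthetic state-level data; the ownership-crime relationship and dispersion below are invented, not the study's estimates.

```python
import numpy as np
import statsmodels.api as sm

# Illustrative negative binomial regression of a crime count on a state-level
# ownership rate, on synthetic data (not the study's FBI or survey data).
rng = np.random.default_rng(1)
n = 200
ownership = rng.uniform(0.2, 0.6, n)
mu = np.exp(1.0 + 2.0 * ownership)               # true log-linear mean
y = rng.negative_binomial(5, 5.0 / (5.0 + mu))   # overdispersed counts
X = sm.add_constant(ownership)
fit = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=0.2)).fit()
print(fit.params)                    # intercept and ownership effect
print(np.exp(fit.params[1] * 0.1))   # rate ratio per 10-point ownership increase
```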
Neutron-induced single event burnout in high voltage electronics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Normand, E.; Wert, J.L.; Oberg, D.L.
Energetic neutrons with an atmospheric neutron spectrum, which were demonstrated to induce single event burnout in power MOSFETs, have been shown to induce burnout in high voltage (>3,000V) electronics when operated at voltages as low as 50% of rated voltage. The laboratory failure rates correlate well with field failure rates measured in Europe.
1989-08-18
conditions, strain rate, geometry, manufacturing variables, microstructure, surface conditions, and alloy contamination. Examples of service failures are ... depends on the ductility of the material, strain rate and stress concentration. The macroscopic appearances of two ductile overstress fractures are shown ... distribution of nucleation sites, stress orientation, temperature, ductility and strain rate. The size of the dimples is controlled by the size, number and
Kissinger, Patricia J; White, Scott; Manhart, Lisa E; Schwebke, Jane; Taylor, Stephanie N; Mena, Leandro; Khosropour, Christine M; Wilcox, Larissa; Schmidt, Norine; Martin, David H
2016-10-01
Three recent prospective studies have suggested that the 1-g dose of azithromycin for Chlamydia trachomatis (Ct) was less effective than expected, reporting a wide range of treatment failure rates (5.8%-22.6%). Reasons for the disparate results could be attributed to geographic or methodological differences. The purpose of this study was to reexamine the studies and attempt to harmonize methodologies to reduce misclassification as a result of false positives from early test-of-cure (TOC) or reinfection as a result of sexual exposure rather than treatment failure. Men who had sex with women, who received 1-g azithromycin under directly observed therapy for presumptive treatment of nongonococcal urethritis with confirmed Ct were included. Baseline screening was performed on urethral swabs or urine, and TOC screening was performed on urine using nucleic acid amplification tests. Posttreatment vaginal sexual exposure was elicited at TOC. Data from the 3 studies were obtained and reanalyzed. Rates of Ct retest positive were examined for all cases, and a sensitivity analysis was conducted to either reclassify potential false positives/reinfections as negative or remove them from the analysis. The crude treatment failure rate was 12.8% (31/242). The rate when potential false positives/reinfections were reclassified as negative was 6.2% (15/242) or when these were excluded from analysis was 10.9% (15/138). In these samples of men who have sex with women with Ct-related nongonococcal urethritis, azithromycin treatment failure was between 6.2% and 12.8%. This range of failure is lower than previously published but higher than the desired World Health Organization's target chlamydia treatment failure rate of < 5%.
Strauss, Eric J; Verma, Nikhil N; Salata, Michael J; McGill, Kevin C; Klifto, Christopher; Nicholson, Gregory P; Cole, Brian J; Romeo, Anthony A
2014-03-01
The current study evaluated the outcomes of biologic resurfacing of the glenoid using a lateral meniscus allograft or human acellular dermal tissue matrix at intermediate-term follow-up. Forty-five patients (mean age, 42.2 years) underwent biologic resurfacing of the glenoid, and 41 were available for follow-up at a mean of 2.8 years. Lateral meniscal allograft resurfacing was used in 31 patients and human acellular dermal tissue matrix interposition in 10. Postoperative range of motion and clinical outcomes were assessed at the final follow-up. The overall clinical failure rate was 51.2%. The lateral meniscal allograft cohort had a failure rate of 45.2%, with a mean time to failure of 3.4 years. Human acellular dermal tissue matrix interposition had a failure rate of 70.0%, with a mean time to failure of 2.2 years. Overall, significant improvements were seen compared with baseline with respect to the visual analog pain score (3.0 vs. 6.3), American Shoulder and Elbow Surgeons score (62.0 vs. 36.8), and Simple Shoulder Test score (7.0 vs. 4.0). Significant improvements were seen for forward elevation (106° to 138°) and external rotation (31° to 51°). Despite significant improvements compared with baseline values, biologic resurfacing of the glenoid resulted in a high rate of clinical failure at intermediate follow-up. Our results suggest that biologic resurfacing of the glenoid may have a minimal and as yet undefined role in the management of glenohumeral arthritis in the young active patient over more traditional methods of hemiarthroplasty or total shoulder arthroplasty. Copyright © 2014 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Mosby, Inc. All rights reserved.
Generator exchange is associated with an increased rate of Sprint Fidelis lead failure.
Lovelock, Joshua D; Patel, Ayesha; Mengistu, Andenet; Hoskins, Michael; El-Chami, Mikhael; Lloyd, Michael S; Leon, Angel; DeLurgio, David; Langberg, Jonathan J
2012-10-01
The Medtronic Sprint Fidelis defibrillator lead is at an increased risk for failure and was recalled in October 2007. Approximately 268,000 leads were implanted, and more than 100,000 patients still have active Fidelis leads. A number of studies have examined the rate and clinical predictors of lead failure, but none has addressed the effect of an implantable cardioverter-defibrillator (ICD) generator exchange on subsequent lead failure. Although the manufacturer asserts that "Sprint Fidelis performance after device change-out is similar to lead performance without device change-out," published data are lacking. To assess the effect of ICD generator exchange on the rate of Fidelis lead failure, a chart review was conducted in patients who underwent implantation of a Fidelis lead. Patients with a functioning Fidelis lead at generator exchange were compared with controls with leads implanted for a comparable amount of time who did not undergo ICD replacement. A total of 1366 patients received a Fidelis lead prior to the recall, of whom 479 were still actively followed. Seventy-two patients with a functioning lead underwent generator exchange without lead replacement. Following generator replacement, 15 leads failed; 60% of these failures occurred within 3 months of the exchange. Generator exchange increased the rate of lead failure compared with matched controls (20.8% vs 2.54%; P < .001). Generator exchange is associated with a higher than expected rate of Fidelis lead failure, often within 3 months. The risk-benefit ratio of Fidelis lead replacement at the time of generator exchange may be greater than appreciated. Copyright © 2012 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.
The iothalamate clearance in cats with experimentally induced renal failure.
Ohashi, F; Kuroda, K; Shimada, T; Shimada, Y; Ota, M
1996-08-01
Plasma iothalamate (IOT) disappearance rates were measured after a single injection of IOT (113.8 mg/kg, IV) in cats with experimentally induced renal failure. The disappearance curves fitted well to a one-compartment model. The mean plasma disappearance rate of IOT in the cats with induced renal failure (2.16 +/- 0.240 x 10(-3) micrograms/ml/min) was markedly lower than that of clinically healthy cats (4.10 +/- 1.00 x 10(-3) micrograms/ml/min). These results demonstrate that IOT clearance is suitable for evaluating renal function in cats.
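A sketch of a one-compartment fit of this kind, on synthetic samples: concentration decays as C(t) = C0·e^(-kt), so the disappearance rate is the slope of ln C against time.

```python
import numpy as np

# One-compartment kinetics on synthetic data (not the cat measurements):
# C(t) = C0 * exp(-k*t), so ln C is linear in time.
rng = np.random.default_rng(4)
t = np.array([10.0, 30, 60, 120, 240])                    # minutes post-dose
C = 80.0 * np.exp(-0.012 * t) * np.exp(rng.normal(0, 0.02, t.size))
slope, intercept = np.polyfit(t, np.log(C), 1)
print(f"disappearance rate k = {-slope:.4f} per min, C0 = {np.exp(intercept):.1f}")
```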
A heart failure initiative to reduce the length of stay and readmission rates.
White, Sabrina Marie; Hill, Alethea
2014-01-01
The purpose of this pilot was to improve multidisciplinary coordination of care and patient education and foster self-management behaviors. The primary and secondary outcomes achieved from this pilot were a decrease in the 30-day readmission rate and in heart failure length of stay. The primary practice site was an inpatient medical-surgical nursing unit. The length of stay decreased from 6.05 to 4.42 days for heart failure diagnostic-related group 291 as a result of utilizing the model. The length of stay decreased from 3.9 to 3.09 days, which was also less than the national average of 3.8036 days, for diagnostic-related group 292. In addition, the readmission rate decreased from 23.1% prior to January 2013 to 12.9%. Implementation of standards of care coordination can decrease length of stay and readmission rate and improve self-management. Implementation of evidence-based heart failure guidelines, improved interdisciplinary coordination of care, patient education, self-management skills, and transitional care at the time of discharge improved overall heart failure outcome measures. Utilizing the longitudinal model of care to transition patients to home aided in evaluating social support, resource allocation and utilization, access to care postdischarge, and interdisciplinary coordination of care. The collaboration between disciplines improved continuity of care, patient compliance with their discharge regimen, and adequate discharge follow-up.
Calvert, George T; Cummings, Judd E; Bowles, Austin J; Jones, Kevin B; Wurtz, L Daniel; Randall, R Lor
2014-03-01
Aseptic failure of massive endoprostheses used in the reconstruction of major skeletal defects remains a major clinical problem. Fixation using compressive osseointegration was developed as an alternative to cemented and traditional press-fit fixation in an effort to decrease aseptic failure rates. The purpose of this study was to answer the following questions: (1) What is the survivorship of this technique at minimum 2-year followup? (2) Were patient demographic variables (age, sex) or anatomic location associated with implant failure? (3) Were there any prosthesis-related variables (eg, spindle size) associated with failure? (4) Was there a discernible learning curve associated with the use of the new device as defined by a difference in failure rate early in the series versus later on? The first 50 cases using compressive osseointegration fixation from two tertiary referral centers were retrospectively studied. Rates of component removal for any reason and for aseptic failure were calculated. Demographic, surgical, and oncologic factors were analyzed using regression analysis to assess for association with implant failure. Minimum followup was 2 years with a mean of 66 months. Median age at the time of surgery was 14.5 years. A total of 15 (30%) implants were removed for any reason. Of these revisions, seven (14%) were the result of aseptic failure. Five of the seven aseptic failures occurred at less than 1 year (average, 8.3 months), and none occurred beyond 17 months. With the limited numbers available, no demographic, surgical, or prosthesis-related factors correlated with failure. Most aseptic failures of compressive osseointegration occurred early. Longer followup is needed to determine if this technique is superior to other forms of fixation.
Care management for low-risk patients with heart failure: a randomized, controlled trial.
DeBusk, Robert Frank; Miller, Nancy Houston; Parker, Kathleen Marie; Bandura, Albert; Kraemer, Helena Chmura; Cher, Daniel Joseph; West, Jeffrey Alan; Fowler, Michael Bruce; Greenwald, George
2004-10-19
Nurse care management programs for patients with chronic illness have been shown to be safe and effective. To determine whether a telephone-mediated nurse care management program for heart failure reduced the rate of rehospitalization for heart failure and for all causes over a 1-year period. Randomized, controlled trial of usual care with nurse management versus usual care alone in patients hospitalized for heart failure from May 1998 through October 2001. 5 northern California hospitals in a large health maintenance organization. Of 2786 patients screened, 462 met clinical criteria for heart failure and were randomly assigned (228 to intervention and 234 to usual care). Nurse care management provided structured telephone surveillance and treatment for heart failure and coordination of patients' care with primary care physicians. Time to first rehospitalization for heart failure or for any cause and time to a combined end point of first rehospitalization, emergency department visit, or death. At 1 year, half of the patients had been rehospitalized at least once and 11% had died. Only one third of rehospitalizations were for heart failure. The rate of first rehospitalization for heart failure was similar in both groups (proportional hazard, 0.85 [95% CI, 0.46 to 1.57]). The rate of all-cause rehospitalization was similar (proportional hazard, 0.98 [CI, 0.76 to 1.27]). The findings of this study, conducted in a single health care system, may not be generalizable to other health care systems. The overall effect of the intervention was minor. Among patients with heart failure at low risk on the basis of sociodemographic and medical attributes, nurse care management did not statistically significantly reduce rehospitalizations for heart failure or for any cause. Such programs may be less effective for patients at low risk than those at high risk.
Lah, Soowhan; Wilson, Emily L; Beesley, Sarah; Sagy, Iftach; Orme, James; Novack, Victor; Brown, Samuel M
2018-01-09
The Center for Medicare and Medicaid Services (CMS) and the Hospital Quality Alliance began collecting and reporting United States hospital performance in the treatment of pneumonia and heart failure in 2008. Whether the utilization of hospice might affect CMS-reported mortality and readmission rates is not known. Hospice utilization (mean days on hospice per decedent) for 2012 from the Dartmouth Atlas (a project of the Dartmouth Institute that reports a variety of public health and policy-related statistics) was merged with hospital-level 30-day mortality and readmission rates for pneumonia and heart failure from CMS. The association between hospice use and outcomes was analyzed with multivariate quantile regression controlling for quality of care metrics, acute care bed availability, regional variability and other measures. 2196 hospitals reported data to both CMS and the Dartmouth Atlas in 2012. Higher rates of hospice utilization were associated with lower rates of 30-day mortality and readmission for pneumonia but not for heart failure. Higher quality of care was associated with lower rates of mortality for both pneumonia and heart failure. Greater acute care bed availability was associated with increased readmission rates for both conditions (p < 0.05 for all). Higher rates of hospice utilization were associated with lower rates of 30-day mortality and readmission for pneumonia as reported by CMS. While causality is not established, it is possible that hospice referrals might directly affect CMS outcome metrics. Further clarification of the relationship between hospice referral patterns and publicly reported CMS outcomes appears warranted.
NASA Technical Reports Server (NTRS)
Bloomquist, C. E.; Kallmeyer, R. H.
1972-01-01
Field failure rates and confidence factors are presented for 88 identifiable components of the ground support equipment at the John F. Kennedy Space Center. For most of these, supplementary information regarding failure mode and cause is tabulated. Complete reliability assessments are included for three systems, eight subsystems, and nine generic piece-part classifications. Procedures for updating or augmenting the reliability results are also included.
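One standard way to attach confidence factors to such field failure rates is the chi-square bound for a constant-rate (exponential) failure process; the sketch below uses invented counts and hours, and the report's own procedure may differ in detail.

```python
from scipy.stats import chi2

# Classical two-sided chi-square bounds on a constant failure rate, given k
# failures in T cumulative operating hours (numbers are illustrative).
k, T, conf = 7, 12_000.0, 0.90
alpha = 1.0 - conf
lower = chi2.ppf(alpha / 2, 2 * k) / (2 * T)
upper = chi2.ppf(1 - alpha / 2, 2 * (k + 1)) / (2 * T)
print(f"point = {k / T:.2e}/hr, {conf:.0%} bounds = ({lower:.2e}, {upper:.2e})")
```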
Study of the Progressive Failure of Composites under Axial Loading with Varying Strain Rates
2011-12-01
a. Waddoups, Eisenmann, and Kaminski Failure Theory ... b. Whitney-Nuismer Failure Theory ... W: width (m); WEK: Waddoups, Eisenmann, and Kaminski failure theory; x: coordinate measured from center of notch perpendicular to direction of ... comprised of differing assumptions, effort, and knowledge of material properties. a. Waddoups, Eisenmann, and Kaminski Failure Theory: One of the
Failure Pressure and Leak Rate of Steam Generator Tubes With Stress Corrosion Cracks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Majumdar, S.; Kasza, K.; Park, J.Y.
2002-07-01
This paper illustrates the use of an 'equivalent rectangular crack' approach to predict leak rates through laboratory generated stress corrosion cracks. A comparison between predicted and observed test data on rupture and leak rate from laboratory generated stress corrosion cracks is provided. Specimen flaws were sized by post-test fractography in addition to a pre-test advanced eddy current technique. The test failure pressures and leak rates are shown to be closer to those predicted on the basis of fractography than on NDE. However, the predictions based on NDE results are encouraging, particularly because they have the potential to determine a more detailed geometry of ligamentous cracks, from which more accurate predictions of failure pressure and leak rate can be made in the future.
Small area estimation for estimating the number of infant mortality in West Java, Indonesia
NASA Astrophysics Data System (ADS)
Anggreyani, Arie; Indahwati; Kurnia, Anang
2016-02-01
Demographic and Health Survey Indonesia (DHSI) is a nationally designed survey to provide information regarding birth rates, mortality rates, family planning, and health. DHSI was conducted by BPS in cooperation with the National Population and Family Planning Institution (BKKBN), the Indonesia Ministry of Health (KEMENKES), and USAID. Based on the publication of DHSI 2012, the infant mortality rate for the five-year period before the survey was 32 per 1000 live births. In this paper, Small Area Estimation (SAE) is used to estimate the number of infant mortalities in districts of West Java. SAE is a special model of Generalized Linear Mixed Models (GLMM). In this case, the incidence of infant mortality follows a Poisson distribution, which carries an equidispersion assumption. The methods used to handle overdispersion are the negative binomial and quasi-likelihood models. Based on the results of the analysis, the quasi-likelihood model is the best model to overcome the overdispersion problem. The small area estimation used the basic area-level model. Mean square error (MSE) based on a resampling method is used to measure the accuracy of the small area estimates.
NASA Astrophysics Data System (ADS)
Leptokaropoulos, Konstantinos; Staszek, Monika; Lasocki, Stanisław; Martínez-Garzón, Patricia; Kwiatek, Grzegorz
2018-02-01
The Geysers geothermal field located in California, USA, is the largest geothermal site in the world, operating since the 1960s. We here investigate and quantify the correlation between temporal seismicity evolution and variation of the injection data by examining time-series with specified statistical tools (a binomial test to investigate significant rate changes, cross correlation between seismic and injection data, and b-value variation analysis). To do so, we utilize seismicity and operational data associated with two injection wells (Prati-9 and Prati-29) covering a time period of approximately 7 yr (from November 2007 to August 2014). The seismicity is found to be significantly positively correlated with the injection rate. The maximum correlation occurs with a seismic response delay of ˜2 weeks following injection operations. These results are very stable even after accounting for hypocentral uncertainties by applying a vertical shift of the event foci of up to 300 m. Our analysis also indicates time variations of the b-value, which exhibits a significant positive correlation with injection rates.
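A minimal version of the lag analysis on synthetic series; the delay is built in at 2 weeks to mirror the reported result, and the actual injection and seismicity data are not reproduced.

```python
import numpy as np

# Lagged correlation between weekly injection volume and event counts; the
# series are synthetic, constructed with a 2-week response delay.
rng = np.random.default_rng(2)
weeks = 360
injection = rng.gamma(5.0, 1.0, weeks)
seismicity = np.roll(injection, 2) + rng.normal(0.0, 0.5, weeks)
zi = (injection - injection.mean()) / injection.std()
zs = (seismicity - seismicity.mean()) / seismicity.std()
lags = range(0, 9)
cc = [np.mean(zi[: weeks - lag] * zs[lag:]) for lag in lags]
print("best lag (weeks):", int(np.argmax(cc)))   # expect 2
```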
Physical activity, body functions and disability among middle-aged and older Spanish adults.
Caron, Alexandre; Ayala, Alba; Damián, Javier; Rodriguez-Blazquez, Carmen; Almazán, Javier; Castellote, Juan Manuel; Comin, Magdalena; Forjaz, Maria João; de Pedro, Jesús
2017-07-18
Physical activity (PA) is a health determinant among middle-aged and older adults. In contrast, poor health is expected to have a negative impact on PA. This study sought to assess to what extent specific International Classification of Functioning, Disability and Health (ICF) health components were associated with PA among older adults. We used a sample of 864 persons aged ≥50 years, positively screened for disability or cognition in a cross-sectional community survey in Spain. Weekly energy expenditure during PA was measured with the Yale Physical Activity Survey (YPAS) scale. The associations between body function impairment, health conditions or World Health Organization Disability Assessment Schedule (WHODAS 2.0) disability scores and energy expenditure were quantified using negative-binomial regression, and expressed in terms of adjusted mean ratios (aMRs). Mean energy expenditure was 4542 Kcal/week. A lower weekly energy expenditure was associated with: severe/extreme impairment of mental functions, aMR 0.38, 95% confidence interval, CI (0.21-0.68), and neuromusculoskeletal and movement functions, aMR 0.50 (0.35-0.72); WHODAS 2.0 disability, aMR 0.55 (0.34-0.91); dementia, aMR 0.45 (0.31-0.66); and heart failure, aMR 0.54 (0.34-0.87). In contrast, people with arthritis/osteoarthritis had a higher energy expenditure, aMR 1.27 (1.07-1.51). Our results suggest that there is a strong relationship between selected body function impairments, mainly mental, and PA. Although more research is needed to fully understand causal relationships, strategies to improve PA among the elderly may require targeting mental, neuromusculoskeletal and movement functions, disability determinants (including barriers), and specific approaches for persons with dementia or heart failure.
Ghushchyan, Vahram; Nair, Kavita V; Page, Robert L
2015-01-01
The objective of this study was to determine the direct and indirect costs of acute coronary syndromes (ACS) alone and with common cardiovascular comorbidities. A retrospective analysis was conducted using the Medical Expenditure Panel Survey from 1998 to 2009. Four mutually exclusive cohorts were evaluated: ACS only, ACS with atrial fibrillation (AF), ACS with heart failure (HF), and ACS with both conditions. Direct costs were calculated for all-cause and cardiovascular-related health care resource utilization. Indirect costs were determined from productivity losses from missed days of work. Regression analysis was developed for each outcome controlling for age, US census region, insurance coverage, sex, race, ethnicity, education attainment, family income, and comorbidity burden. A negative binomial regression model was used for health care utilization variables. A Tobit model was utilized for health care costs and productivity loss variables. Total health care costs were greatest for those with ACS and both AF and HF ($38,484±5,191) followed by ACS with HF ($32,871±2,853), ACS with AF ($25,192±2,253), and ACS only ($17,954±563). Compared with the ACS only cohort, the mean all-cause adjusted health care costs associated with ACS with AF, ACS with HF, and ACS with AF and HF were $5,073 (95% confidence interval [CI] 719-9,427), $11,297 (95% CI 5,610-16,985), and $15,761 (95% CI 4,784-26,738) higher, respectively. Average wage losses associated with ACS with and without AF and/or HF amounted to $5,266 (95% CI -7,765, -2,767), when compared with patients without these conditions. ACS imposes a significant economic burden at both the individual and society level, particularly when with comorbid AF and HF.
Tavakoli, J; Costi, J J
2018-04-15
While few studies have improved our understanding of the composition and organization of elastic fibres in the inter-lamellar matrix (ILM), its clinical relevance is not fully understood. Moreover, no studies have measured the direct tensile and shear failure and viscoelastic properties of the ILM. Therefore, the aim of this study was, for the first time, to measure the viscoelastic and failure properties of the ILM in both the tension and shear directions of loading. Using an ovine model, isolated ILM samples were stretched to 40% of their initial length at three strain rates of 0.1% s⁻¹ (slow), 1% s⁻¹ (medium) and 10% s⁻¹ (fast), and a ramp test to failure was performed at a strain rate of 10% s⁻¹. The findings from this study identified that the stiffness of the ILM was significantly larger, and energy absorption significantly smaller, at faster strain rates compared to slower strain rates, and that the viscoelastic and failure properties were not significantly different under tension and shear loading. We found a strain rate dependent response of the ILM during dynamic loading, particularly at the fastest rate. The ILM demonstrated a significantly higher capability for energy absorption at slow strain rates compared to medium and fast strain rates. A significant increase in modulus was found in both loading directions and at all strain rates, with a trend of larger modulus in tension and at faster strain rates. The finding of no significant difference in failure properties between the two loading directions is consistent with our previous ultra-structural studies, which revealed a well-organized (±45°) elastic fibre orientation in the ILM. The results from this study can be used to develop and validate finite element models of the AF at the tissue scale, as well as to provide new strategies for fabricating tissue engineered scaffolds. Significance: while few studies have improved our understanding of the composition and organization of elastic fibres in the ILM of the annulus in the disc, no studies have measured its direct mechanical failure and viscoelastic properties. The findings identified that the stiffness of the ILM was significantly larger, and energy absorption significantly smaller, at faster strain rates compared to slower strain rates. The failure properties of the ILM were not significantly different under tension and shear. Copyright © 2018 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
On mobile wireless ad hoc IP video transports
NASA Astrophysics Data System (ADS)
Kazantzidis, Matheos
2006-05-01
Multimedia transports in wireless, ad-hoc, multi-hop or mobile networks must be capable of obtaining information about the network and adaptively tuning sending and encoding parameters to the network response. Obtaining meaningful metrics to guide a stable congestion control mechanism in the transport (i.e., one that is passive, simple, end-to-end and network technology independent) is a complex problem. Equally difficult is obtaining a reliable QoS metric that agrees with user perception in a client/server or distributed environment. Existing metrics, objective or subjective, are commonly applied before or after transmission to test or report on it, and require access to both original and transmitted frames. In this paper, we propose that an efficient and successful video delivery and the optimization of overall network QoS require innovation in a) a direct measurement of available and bottleneck capacity for congestion control and b) a meaningful subjective QoS metric that is dynamically reported to the video sender. Once these are in place, a binomial (stable, fair and TCP-friendly) algorithm can be used to determine the sending rate and other packet video parameters. An adaptive MPEG codec can then continually test and fit its parameters and its temporal-spatial data-error control balance using the perceived QoS dynamic feedback. We suggest a new measurement based on a packet dispersion technique that is independent of underlying network mechanisms. We then present a binomial control based on direct measurements. We implement a QoS metric that is known to agree with user perception (MPQM) in a client/server, distributed environment by using predetermined table lookups and characterization of video content.
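The binomial congestion-control family referenced here generalizes AIMD: the window grows by alpha/w^k per RTT and shrinks by beta·w^l on loss, with k + l = 1 giving TCP-friendly members. A sketch with illustrative parameters and a Bernoulli loss process, not the paper's measurement-driven control:

```python
import random

# Binomial congestion control (Bansal-Balakrishnan family): increase by
# alpha / w^k per RTT, decrease by beta * w^l on loss. k = l = 0.5 is the
# TCP-friendly SQRT member; all parameters here are illustrative.
def binomial_update(w, loss, k=0.5, l=0.5, alpha=1.0, beta=0.5):
    return max(1.0, w - beta * w**l) if loss else w + alpha / w**k

random.seed(3)
w = 10.0
for _ in range(500):                      # 500 simulated RTTs, 2% loss rate
    w = binomial_update(w, loss=(random.random() < 0.02))
print(f"steady-state window ~= {w:.1f} packets")
```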
Association between month of birth and melanoma risk: fact or fiction?
Fiessler, Cornelia; Pfahlberg, Annette B; Keller, Andrea K; Radespiel-Tröger, Martin; Uter, Wolfgang; Gefeller, Olaf
2017-04-01
Evidence on the effect of ultraviolet radiation (UVR) exposure in infancy on melanoma risk in later life is scarce. Three recent studies suggest that people born in spring carry a higher melanoma risk. Our study aimed at verifying whether such a seasonal pattern of melanoma risk actually exists. Data from the population-based Cancer Registry Bavaria (CRB) on the birth months of 28 374 incident melanoma cases between 2002 and 2012 were analysed and compared with data from the Bavarian State Office for Statistics and Data Processing on the birth month distribution in the Bavarian population. Crude and adjusted analyses using negative binomial regression models were performed in the total study group and supplemented by several subgroup analyses. In the crude analysis, the birth months March-May were over-represented among melanoma cases. Negative binomial regression models adjusted only for sex and birth year revealed a seasonal association between melanoma risk and birth month, with 13-21% higher relative incidence rates for March, April and May compared with the reference December. However, after additionally adjusting for the birth month distribution of the Bavarian population, these risk estimates decreased markedly and no association with birth month remained. Similar results emerged in all subgroup analyses. Our large registry-based study provides no evidence that people born in spring carry a higher risk of developing melanoma in later life and thus lends no support to the hypothesis of higher UVR susceptibility during the first months of life. © The Author 2016; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association
Carvalho, Vitor Oliveira; Guimarães, Guilherme Veiga; Bocchi, Edimar Alcides
2008-01-01
BACKGROUND The relationship between the percentage of oxygen consumption reserve and percentage of heart rate reserve in heart failure patients either on non-optimized or off beta-blocker therapy is known to be unreliable. The aim of this study was to evaluate the relationship between the percentage of oxygen consumption reserve and percentage of heart rate reserve in heart failure patients receiving optimized and non-optimized beta-blocker treatment during a treadmill cardiopulmonary exercise test. METHODS A total of 27 sedentary heart failure patients (86% male, 50±12 years) on optimized beta-blocker therapy with a left ventricle ejection fraction of 33±8% and 35 sedentary non-optimized heart failure patients (75% male, 47±10 years) with a left ventricle ejection fraction of 30±10% underwent the treadmill cardiopulmonary exercise test (Naughton protocol). Resting and peak effort values of both the percentage of oxygen consumption reserve and percentage of heart rate reserve were, by definition, 0 and 100, respectively. RESULTS The heart rate slope for the non-optimized group was derived from the points 0.949±0.088 (0 intercept) and 1.055±0.128 (1 intercept), p<0.0001. The heart rate slope for the optimized group was derived from the points 1.026±0.108 (0 intercept) and 1.012±0.108 (1 intercept), p=0.47. Linear regression plots of the heart rate slope for each patient in the non-optimized and optimized groups revealed a slope of 0.986 (almost perfect) for the optimized group, whereas the slope for the non-optimized group was 0.030 (far from the perfect value of 1). CONCLUSION The relationship between the percentage of oxygen consumption reserve and percentage of heart rate reserve in patients on optimized beta-blocker therapy was reliable, but this relationship was unreliable in non-optimized heart failure patients. PMID:19060991
Quality of care and investment in property, plant, and equipment in hospitals.
Levitt, S W
1994-02-01
This study explores the relationship between quality of care and investment in property, plant, and equipment (PPE) in hospitals. Hospitals' investment in PPE was derived from audited financial statements for the fiscal years 1984-1989. Peer Review Organization (PRO) Generic Quality Screen (GQS) reviews and confirmed failures between April 1989 and September 1990 were obtained from the Massachusetts PRO. Weighted least squares regression models used PRO GQS confirmed failure rates as the dependent variable, and investment in PPE as the key explanatory variable. Investment in PPE was standardized, summed by the hospital over the six years, and divided by the hospital's average number of beds in that period. The number of PRO reviewed cases with one or more GQS confirmed failures was divided by the total number of cases reviewed to create confirmed failure rates. Investment in PPE in Massachusetts hospitals is correlated with GQS confirmed failure rates. A financial variable, investment in PPE, predicts certain dimensions of quality of care in hospitals.
Exact tests using two correlated binomial variables in contemporary cancer clinical trials.
Yu, Jihnhee; Kepner, James L; Iyer, Renuka
2009-12-01
New therapy strategies for the treatment of cancer are rapidly emerging because of recent technology advances in genetics and molecular biology. Although newer targeted therapies can improve survival without measurable changes in tumor size, clinical trial conduct has remained nearly unchanged. When potentially efficacious therapies are tested, current clinical trial design and analysis methods may not be suitable for detecting therapeutic effects. We propose an exact method with respect to testing cytostatic cancer treatment using correlated bivariate binomial random variables to simultaneously assess two primary outcomes. The method is easy to implement. It does not increase the sample size over that of the univariate exact test and in most cases reduces the sample size required. Sample size calculations are provided for selected designs.
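A sketch of one way such an exact joint test can be computed, by enumerating the multinomial outcomes of the four per-patient response cells; the null cell probabilities, sample size, and rejection region below are illustrative, not the paper's exact procedure.

```python
from itertools import product
from math import comb

# Exact joint test for two correlated binary endpoints: each of n patients
# falls in one of four cells (responds on both, A only, B only, neither)
# with null probabilities p11, p10, p01, p00. The p-value sums the exact
# multinomial probability of outcomes at least as extreme in both margins.
def exact_joint_pvalue(n, p11, p10, p01, p00, xa_obs, xb_obs):
    pval = 0.0
    for n11, n10, n01 in product(range(n + 1), repeat=3):
        n00 = n - n11 - n10 - n01
        if n00 < 0:
            continue
        xa, xb = n11 + n10, n11 + n01                # marginal successes
        if xa >= xa_obs and xb >= xb_obs:
            pval += (comb(n, n11) * comb(n - n11, n10)
                     * comb(n - n11 - n10, n01)
                     * p11**n11 * p10**n10 * p01**n01 * p00**n00)
    return pval

print(exact_joint_pvalue(20, 0.10, 0.15, 0.15, 0.60, xa_obs=9, xb_obs=9))
```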
Why Do Medial Unicompartmental Knee Arthroplasties Fail Today?
van der List, Jelle P; Zuiderbaan, Hendrik A; Pearle, Andrew D
2016-05-01
Failure rates are higher in medial unicompartmental knee arthroplasty (UKA) than in total knee arthroplasty. To improve these failure rates, it is important to understand why medial UKAs fail. Because individual studies lack the power to show failure modes, a systematic review was performed to assess medial UKA failure modes. Furthermore, we compared cohort studies with registry-based studies, early with midterm and late failures, and fixed-bearing with mobile-bearing implants. The databases of PubMed, EMBASE, and Cochrane and annual registries were searched for medial UKA failures. Studies were included when they reported >25 failures or when they reported early (<5 years), midterm (5-10 years), or late failures (>10 years). Thirty-seven cohort studies (4 level II studies and 33 level III studies) and 2 registry-based studies were included. A total of 3967 overall failures, 388 time-dependent failures, and 1305 implant design failures were identified. Aseptic loosening (36%) and osteoarthritis (OA) progression (20%) were the most common failure modes. Aseptic loosening (26%) was the most common early failure mode, whereas OA progression was more commonly seen in midterm and late failures (38% and 40%, respectively). Polyethylene wear (12%) and instability (12%) were more common in fixed-bearing implants, whereas pain (14%) and bearing dislocation (11%) were more common in mobile-bearing implants. This level III systematic review identified aseptic loosening and OA progression as the major failure modes. Aseptic loosening was the main failure mode in early years and in mobile-bearing implants, whereas OA progression caused most failures in late years and in fixed-bearing implants. Copyright © 2016 Elsevier Inc. All rights reserved.
Tucker, Adam; Warnock, Michael; McDonald, Sinead; Cusick, Laurence; Foster, Andrew P
2018-04-01
Cephalomedullary nail (CMN) failure is a rare entity following hip fracture treatment. However, it poses significant challenges for revision surgery, both mechanically and biologically. Nail failure rates have been reported at < 2%; however, no published studies have reported revision surgery procedures and their respective outcomes. We present a regional experience, with outcomes, of the revision options. We identified 20 fatigued CMNs that underwent four different revision procedures. Mean age was 73 ± 15.24 years, with a 3:1 female preponderance and a median ASA grade of 3. Post-operative CMN radiographs demonstrated that a significant number of fractures were fixed in varus, with reductions in neck-shaft angles post-operatively. A "poor" quality of reduction resulted in significantly earlier nail failure compared to "adequate" and "good" (p = 0.027). The mean tip-apex distance (TAD) was 23.2 ± 8.3 mm, and an adequate TAD with three-point fixation was seen in only 35% of cases. Mean time to failure was 401.0 ± 237.2 days, with a mean age at failure of 74.0 ± 14.8 years. Options after failure included a revision CMN, a proximal femoral locking plate (PFLP), long-stem or restoration arthroplasty, or a femoral endoprosthesis. Barthel Functional Index scores showed no significant difference at 3 and 12 months post-operatively, nor any difference between treatment groups. Mean 12-month mortality was 30%, akin to a primary hip fracture mortality risk according to NICE guidelines. Mortality rates were lowest in revision nails. Subsequent revision rates were higher in the PFLP group. There is no reported evidence on the best surgical technique for managing the failed CMN, and no clear functional benefit among the options above. Good surgical technique at the time of primary CMN surgery is critical in minimising fatigue failure. After revision, overall mortality rates were equivalent to reported primary hip fracture mortality rates. Further multicentre evaluations are required to assess which technique conveys the best functional outcomes without compromising 12-month mortality rates.
NASA Technical Reports Server (NTRS)
Peeples, Steven
2015-01-01
A three degree of freedom (DOF) spherical actuator is proposed that will replace functions requiring three single-DOF actuators in robotic manipulators, providing space and weight savings while reducing the overall failure rate. Exploration satellites, Space Station payload manipulators, and rovers requiring pan, tilt, and rotate movements need an actuator for each function. Not only does each actuator introduce additional failure modes and require bulky mechanical gimbals, each contains many moving parts, decreasing mean time to failure. Spherical motors perform all three actuation functions, i.e., three DOF, with only one moving part. Given a standard three-actuator system whose actuators have the same failure rate as a spherical motor, the three-actuator system is three times as likely to fail. The Jet Propulsion Laboratory reliability studies of NASA robotic spacecraft have shown that mechanical hardware/mechanism failures are more frequent and more likely to significantly affect mission success than are electronic failures. Unfortunately, previously designed spherical motors have been unable to provide the performance needed by space missions. This inadequacy is also why they are unavailable commercially. An improved patentable spherically actuated motor (SAM) is proposed to provide the performance and versatility required by NASA missions.
Effect of bladder augmentation on VP shunt failure rates in spina bifida.
Gonzalez, Dani O; Cooper, Jennifer N; McLeod, Daryl J
2017-12-11
Most patients with spina bifida require ventriculoperitoneal (VP) shunt placement. Some also require bladder augmentation, which may increase the risk of VP shunt malfunction and/or failure. The aim of this study was to assess whether bladder augmentation affects the rate of VP shunt failure in this population. Using the Pediatric Health Information System, we studied patients with spina bifida born between 1992 and 2014 who underwent VP shunt placement. Using conditional logistic regression, we compared age- and hospital-matched patients who did and did not undergo a bladder augmentation to determine their difference in rates of VP shunt failure. There were 4192 patients with spina bifida who underwent both surgical closure and VP shunt placement. Of these, 203 patients with bladder augmentation could be matched to 593 patients without bladder augmentation. VP shunt failure occurred within 2 years in 7.7% of patients, the majority of whom were in the group who underwent bladder augmentation (87%). After adjusting for confounders, undergoing bladder augmentation was independently associated with VP shunt failure (HR: 33.5, 95% CI: 13.15-85.44, p < 0.001). Bladder augmentation appears to be associated with VP shunt failure. Additional studies are necessary to better define this relationship and identify risk-reduction techniques.
Gaytán, Paul; Yáñez, Jorge; Sánchez, Filiberto; Soberón, Xavier
2001-01-01
We describe here a method to generate combinatorial libraries of oligonucleotides mutated at the codon-level, with control of the mutagenesis rate so as to create predictable binomial distributions of mutants. The method allows enrichment of the libraries with single, double or larger multiplicity of amino acid replacements by appropriate choice of the mutagenesis rate, depending on the concentration of synthetic precursors. The method makes use of two sets of deoxynucleoside-phosphoramidites bearing orthogonal protecting groups [4,4′-dimethoxytrityl (DMT) and 9-fluorenylmethoxycarbonyl (Fmoc)] in the 5′ hydroxyl. These phosphoramidites are divergently combined during automated synthesis in such a way that wild-type codons are assembled with commercial DMT-deoxynucleoside-methyl-phosphoramidites while mutant codons are assembled with Fmoc-deoxynucleoside-methyl-phosphoramidites in an NNG/C fashion in a single synthesis column. This method is easily automated and suitable for low mutagenesis rates and large windows, such as those required for directed evolution and alanine scanning. Through the assembly of three oligonucleotide libraries at different mutagenesis rates, followed by cloning at the polylinker region of plasmid pUC18 and sequencing of 129 clones, we concluded that the method performs essentially as intended. PMID:11160911
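Under the stated design, the number of mutated codons per clone follows a binomial distribution whose rate is set by the precursor ratio; a minimal sketch with an assumed window size and mutagenesis rate:

```python
from math import comb

# Expected mutant multiplicity in a library window: with W codons mutated
# independently at per-codon rate q (set by the DMT/Fmoc precursor ratio),
# the number of mutated codons per clone is Binomial(W, q). W and q below
# are illustrative.
W, q = 10, 0.15
for k in range(5):
    print(f"P({k} mutated codons) = {comb(W, k) * q**k * (1 - q)**(W - k):.4f}")
```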
Westreich, Daniel; Becker-Dreps, Sylvia; Adair, Linda S.; Sandler, Robert S.; Sarkar, Rajiv; Kattula, Deepthi; Ward, Honorine D.; Meshnick, Steven R.; Kang, Gagandeep
2015-01-01
Background: Antibiotic treatment of childhood illnesses is common in India. In addition to contributing to antimicrobial resistance, antibiotics might result in increased susceptibility to diarrhea through interactions with the gastrointestinal microbiota. Breast milk, which enriches the microbiota early in life, may increase the resilience of the microbiota against perturbations by antibiotics. Methods: In a prospective observational cohort study, we assessed whether antibiotic exposures from birth to 6 months affected rates of diarrhea up to age 3 years among 465 children from Vellore, India. Adjusting for treatment indicators, we modeled diarrheal rates among children exposed and unexposed to antibiotics using negative binomial regression. We further assessed whether the effect of antibiotics on diarrheal rates was modified by exclusive breastfeeding at 6 months. Results: More than half of the children (n = 267, 57.4%) were given at least one course of antibiotics in the first 6 months of life. The adjusted relative incidence rate of diarrhea was 33% higher among children who received antibiotics under 6 months of age compared with those who did not (incidence rate ratio: 1.33, 95% confidence interval: 1.12, 1.57). Children who were exclusively breastfed until 6 months of age did not have increased diarrheal rates following antibiotic use. Conclusions: Antibiotic exposures early in life were associated with increased rates of diarrhea in early childhood. Exclusive breastfeeding might protect against this negative impact. PMID:25742244
Ramirez, Marizen; Bedford, Ronald; Wu, Hongqian; Harland, Karisa; Cavanaugh, Joseph E; Peek-Asa, Corinne
2016-09-01
To evaluate the effectiveness of roadway policies for lighting and marking of farm equipment in reducing crashes in Illinois, Iowa, Kansas, Minnesota, Missouri, Nebraska, North Dakota, South Dakota and Wisconsin. In this ecological study, state policies on lighting and marking of farm equipment were scored for compliance with standards of the American Society of Agricultural and Biological Engineers (ASABE). Using generalized estimating equations negative binomial models, we estimated the relationships between lighting and marking scores, and farm equipment crash rates, per 100 000 farm operations. A total of 7083 crashes involving farm equipment was reported from 2005 to 2010 in the Upper Midwest and Great Plains. As the state lighting and marking score increased by 5 units, crash rates reduced by 17% (rate ratio=0.83; 95% CI 0.78 to 0.88). Lighting-only (rate ratio=0.48; 95% CI 0.45 to 0.51) and marking-only policies (rate ratio=0.89; 95% CI 0.83 to 0.96) were each associated with reduced crash rates. Aligning lighting and marking policies with ASABE standards may effectively reduce crash rates involving farm equipment. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
State Firearm Legislation and Nonfatal Firearm Injuries.
Simonetti, Joseph A; Rowhani-Rahbar, Ali; Mills, Brianna; Young, Bessie; Rivara, Frederick P
2015-08-01
We investigated whether stricter state-level firearm legislation was associated with lower hospital discharge rates for nonfatal firearm injuries. We estimated discharge rates for hospitalized and emergency department-treated nonfatal firearm injuries in 18 states in 2010 and used negative binomial regression to determine whether strength of state firearm legislation was independently associated with total nonfatal firearm injury discharge rates. We identified 26 744 discharges for nonfatal firearm injuries. The overall age-adjusted discharge rate was 19.0 per 100 000 person-years (state range = 3.3-36.6), including 7.9 and 11.1 discharges per 100 000 for hospitalized and emergency department-treated injuries, respectively. In models adjusting for differences in state sociodemographic characteristics and economic conditions, states in the strictest tertile of legislative strength had lower discharge rates for total (incidence rate ratio [IRR] = 0.60; 95% confidence interval [CI] = 0.44, 0.82), assault-related (IRR = 0.58; 95% CI = 0.34, 0.99), self-inflicted (IRR = 0.18; 95% CI = 0.14, 0.24), and unintentional (IRR = 0.53; 95% CI = 0.34, 0.84) nonfatal firearm injuries. There is significant variation in state-level hospital discharge rates for nonfatal firearm injuries, and stricter state firearm legislation is associated with lower discharge rates for such injuries.
Sparke, Claire; Moon, Lynelle; Green, Frances; Mathew, Tim; Cass, Alan; Chadban, Steve; Chapman, Jeremy; Hoy, Wendy; McDonald, Stephen
2013-03-01
To date, incidence data for kidney failure in Australia have been available only for those who start renal replacement therapy (RRT). Information about the total incidence of kidney failure, including non-RRT-treated cases, is important to help understand the burden of kidney failure in the community and the characteristics of patients who die without receiving treatment. Data linkage study of national observational data sets. All incident treated cases recorded in the Australia and New Zealand Dialysis and Transplant Registry (ANZDATA) probabilistically linked to incident untreated kidney failure cases derived from national death registration data for 2003-2007. Age, sex, and year. Kidney failure, a combination of incident RRT or death attributed to kidney failure (without RRT). Total incidence of kidney failure (treated and untreated) and treatment rates. There were 21,370 incident cases of kidney failure in 2003-2007. The incidence rate was 20.9/100,000 population (95% CI, 18.3-24.0) and was significantly higher among older people and males (26.1/100,000 population; 95% CI, 22.5-30.0) compared with females (17.0/100,000 population; 95% CI, 14.9-19.2). There were similar numbers of treated (10,949) and untreated (10,421) cases, but treatment rates were influenced highly by age. More than 90% of cases in all age groups between 5 and 60 years were treated, but this percentage decreased sharply for older people; only 4% of cases in persons 85 years or older were treated (ORs for no treatment of 155 [95% CI, 118-204] for men ≥80 years and 400 [95% CI, 301-531] for women ≥80 years compared with women who were <50 years). Cross-sectional design, reliance on accurate coding of kidney failure in death registration data. Almost all Australians who develop kidney failure at younger than 60 years receive RRT, but treatment rates decrease substantially above that age. Copyright © 2013 National Kidney Foundation, Inc. All rights reserved.
Factors associated with dental implant survival: a 4-year retrospective analysis.
Zupnik, Jamie; Kim, Soo-woo; Ravens, Daniel; Karimbux, Nadeem; Guze, Kevin
2011-10-01
Dental implants are a predictable treatment option for replacing missing teeth and have strong survival and success outcomes. However, previous research has identified a wide array of potential risk factors that may contribute to dental implant failure. The objective of this study was to determine whether implant survival rates were affected by known risk factors and risk indicators that may have contributed to implant failures. The secondary outcome measures were whether the level of expertise of the periodontal residents affected success rates and how the rate of implant success at the Harvard School of Dental Medicine (HSDM) compared to published standards. A retrospective chart review was performed of patients at the HSDM who had one of two types of rough-surface implants (group A or B) placed by periodontology residents from 2003 to 2006. Demographic, health, and implant data were collected and analyzed by multimodel analyses to determine failure rates and any factors that may have increased the likelihood of an implant failure. The study cohort included 341 dental implants. The odds ratio for an implant failure was most clearly elevated for diabetes (2.59), implant surface group B (7.84), and males (4.01). There was no significant difference regarding resident experience. The success rate for HSDM periodontology residents was 96.48% during the 4-year study period. This study demonstrates that implant success rates at HSDM fell within accepted published standards, confirmed previously identified risk factors for failure, and suggests that other acknowledged risk factors can potentially be controlled for. Furthermore, the level of experience of the periodontology resident did not have an impact on survival outcomes.
Srinivasan, Murali; Vazquez, Lydia; Rieder, Philippe; Moraguez, Osvaldo; Bernard, Jean-Pierre; Belser, Urs C
2014-05-01
The aim of this review was to test the hypothesis that 6 mm micro-rough short Straumann(®) implants provide predictable survival rates and to verify that most failures are early failures. A PubMed and hand search was performed to identify studies involving micro-rough 6-mm-short implants published between January 1987 and August 2011. Studies were included that (i) involved Straumann(®) 6 mm implants placed in human jaws, (ii) provided data on the survival rate, (iii) mentioned the time of failure, and (iv) reported a minimum follow-up period of 12 months following placement. A meta-analysis was performed on the extracted data. From a total of 842 publications that were screened, 12 methodologically sound articles qualified for inclusion in the statistical evaluation. A total of 690 Straumann(®) 6-mm-short implants were evaluated in the reviewed studies (total: placed 690, failed 25; maxilla: placed 266, failed 14; mandible: placed 364, failed 5; follow-up period: 1-8 years). A meta-analysis was performed on the calculated early cumulative survival rates (CSR%). The pooled early CSR% calculated in this meta-analysis was 93.7%, whereas the overall survival rates in the maxilla and mandible were 94.7% and 98.6%, respectively. Implant failures observed were predominantly early failures (76%). This meta-analysis provides robust evidence that micro-rough 6-mm-short dental implants are a predictable treatment option, providing favorable survival rates. The failures encountered with 6-mm-short implants were predominantly early, and survival in the mandible was slightly superior. © 2013 John Wiley & Sons A/S. Published by Blackwell Publishing Ltd.
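The jaw-specific survival figures above follow directly from the reported placed/failed counts; a minimal sketch reproducing them (the pooled early CSR of 93.7% is a study-weighted meta-analytic estimate, so it differs from the crude overall ratio):

```python
# Survival rate from the placed/failed counts reported in the review
def survival_rate(placed, failed):
    return 100.0 * (placed - failed) / placed

print(f"maxilla:  {survival_rate(266, 14):.1f}%")   # 94.7%
print(f"mandible: {survival_rate(364, 5):.1f}%")    # 98.6%
print(f"overall (crude): {survival_rate(690, 25):.1f}%")
```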
Dental implants in medically complex patients-a retrospective study.
Manor, Yifat; Simon, Roy; Haim, Doron; Garfunkel, Adi; Moses, Ofer
2017-03-01
Dental implant insertion for oral rehabilitation is a worldwide procedure for healthy and medically compromised patients. The impact of systemic disease risks on the outcome of implant therapy is unclear, since there are few if any published randomized controlled trials (RCTs). The objective of this study was to investigate the rate of complications and failures following dental implantation in medically compromised patients in order to elucidate risk factors and prevent them. A retrospective cohort study was conducted from the files of patients treated with dental implants between 2008 and 2014. The study group consisted of medically complex patients, while the control group consisted of healthy patients. Preoperative, intraoperative, and postoperative clinical details were retrieved from the patients' files. The survival rate and the success rate of the dental implants were evaluated clinically and radiographically. A total of 204 patients (1003 dental implants) were included: 93 patients with 528 dental implants in the study group and 111 patients with 475 dental implants in the control group. No significant differences were found between the groups regarding implant failures or complications. The failure rate of dental implants was 11.8% in the study group and 16.2% in the control group (P = 0.04). Patients with a higher number of implants (mean 6.8) had more failures than patients with a lower number of implants (mean 4.2), regardless of their health status (P < 0.01). We found similar rates of failure and complications of dental implantation in medically complex patients and in healthy patients. Medically complex patients can undergo dental implantation.
Induction of labor in elderly nulliparous women.
Hadar, Eran; Hiersch, Liran; Ashwal, Eran; Chen, Rony; Wiznitzer, Arnon; Gabbay-Benziv, Rinat
2017-09-01
Maternal age is an important consideration in antenatal care, labor, and delivery. We aimed to evaluate induction of labor (IoL) failure rates among elderly nulliparous women. We conducted a retrospective analysis of all nulliparous women at 34 + 0 to 41 + 6 weeks undergoing cervical ripening by prostaglandin E2 (PGE2) vaginal insert. The study group included elderly (≥35 years) nulliparous women and the control group non-elderly (<35 years) nulliparous women. The primary outcome was IoL failure rate and the secondary outcome was cesarean delivery rate. Outcomes were compared between the groups by univariate analysis followed by regression analysis to adjust results for potential confounders. Of 537 women undergoing IoL, 69 (12.8%) were elderly. The univariate analysis demonstrated no difference in IoL failure rate (26.5% versus 34.8%, p = 0.502) between groups. However, elderly nulliparous women had higher rates of cesarean delivery (36.2% versus 21.4%, p = 0.009). This difference was no longer significant after adjustment for maternal body mass index, indication for delivery, birth weight, and gestational age at delivery. Among nulliparous women, older maternal age is not associated with higher rates of IoL failure or cesarean delivery.
A Random Variable Transformation Process.
ERIC Educational Resources Information Center
Scheuermann, Larry
1989-01-01
Provides a short BASIC program, RANVAR, which generates random variates for various theoretical probability distributions. The seven variates include: uniform, exponential, normal, binomial, Poisson, Pascal, and triangular. (MVL)
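A modern equivalent of RANVAR's seven generators, sketched with NumPy (the distribution parameters are illustrative only; Pascal is generated as the negative binomial):

```python
import numpy as np

rng = np.random.default_rng(42)

# Modern equivalents of RANVAR's seven variates (parameters are illustrative)
samples = {
    "uniform":     rng.uniform(0.0, 1.0, 5),
    "exponential": rng.exponential(scale=2.0, size=5),
    "normal":      rng.normal(loc=0.0, scale=1.0, size=5),
    "binomial":    rng.binomial(n=10, p=0.3, size=5),
    "poisson":     rng.poisson(lam=4.0, size=5),
    "pascal":      rng.negative_binomial(n=3, p=0.4, size=5),
    "triangular":  rng.triangular(left=0.0, mode=0.5, right=1.0, size=5),
}
for name, vals in samples.items():
    print(f"{name:12s} {vals}")
```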
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Brett Emery Trabun; Gamage, Thoshitha Thanushka; Bakken, David Edward
This disclosure describes, in part, a system management component and failure detection component for use in a power grid data network to identify anomalies within the network and systematically adjust the quality of service of data published by publishers and subscribed to by subscribers within the network. In one implementation, subscribers may identify a desired data rate, a minimum acceptable data rate, desired latency, minimum acceptable latency, and a priority for each subscription. The failure detection component may identify an anomaly within the network and a source of the anomaly. Based on the identified anomaly, data rates and/or data paths may be adjusted in real-time to ensure that the power grid data network does not become overloaded and/or fail.
Quantifying Appropriate De-rating of SiC MOSFETs Subject to Cosmic Rays
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chatty, Kiran
Terrestrial Cosmic Radiation (TCR) is known to cause failures in high-voltage Si devices, resulting in de-rating of the maximum reverse blocking voltage. In this work, a test setup was developed and unaccelerated TCR testing was performed on 1200V Si IGBTs, 1200V SiC MOSFETs, and 1200V SiC Schottky diodes. Failures due to TCR were generated on 1200V Si IGBTs at reverse voltages from 900V to 1175V. The Si IGBTs investigated in this work will need to be operated at a maximum voltage of 800V to achieve a Failure in Time (FIT) rate of 100. No failures were observed on 1200V SiC MOSFETs and Schottky diodes after testing at 1200V for over 1.5 years, demonstrating low FIT rates compared to Si IGBTs. 1200V SiC Schottky diodes were fabricated in this program and the packaged devices were used in the TCR testing.
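With zero observed failures, a FIT claim reduces to an upper confidence bound on the failure rate. A minimal sketch of the standard chi-square bound (the device count and confidence level are hypothetical; the abstract does not report sample sizes):

```python
from scipy.stats import chi2

def fit_upper_bound(device_hours, failures=0, confidence=0.60):
    """One-sided upper confidence bound on the failure rate, in FIT
    (failures per 1e9 device-hours), via the chi-square method."""
    lam = chi2.ppf(confidence, 2 * failures + 2) / (2.0 * device_hours)
    return lam * 1e9

# Hypothetical illustration: 100 devices on test for 1.5 years, zero failures
hours = 100 * 1.5 * 365 * 24
print(f"{fit_upper_bound(hours):.0f} FIT at 60% confidence")
```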
Predictions of High Strain Rate Failure Modes in Layered Aluminum Composites
NASA Astrophysics Data System (ADS)
Khanikar, Prasenjit; Zikry, M. A.
2014-01-01
A dislocation density-based crystalline plasticity formulation, specialized finite-element techniques, and rational crystallographic orientation relations were used to predict and characterize the failure modes associated with the high strain rate behavior of aluminum layered composites. Two alloy layers, a high strength alloy, aluminum 2195, and an aluminum alloy 2139, with high toughness, were modeled with representative microstructures that included precipitates, dispersed particles, and different grain boundary distributions. Different layer arrangements were investigated for high strain rate applications, and the optimal arrangement placed the high toughness 2139 layer on the bottom, which provided extensive shear strain localization, and the high strength 2195 layer on the top for strength resistance. The layer thickness of the bottom high toughness layer also affected the bending behavior of the roll-bonded interface and the potential delamination of the layers. Shear strain localization, dynamic cracking, and delamination are the mutually competing failure mechanisms for the layered metallic composite, and control of these failure modes can be used to optimize behavior for high strain rate applications.
Analysis of Failures of High Speed Shaft Bearing System in a Wind Turbine
NASA Astrophysics Data System (ADS)
Wasilczuk, Michał; Gawarkiewicz, Rafał; Bastian, Bartosz
2018-01-01
During the operation of wind turbines with a gearbox of traditional configuration, consisting of one planetary stage and two helical stages, a high failure rate of high-speed shaft bearings is observed. Such a high failure frequency is not reflected in the results of standard calculations of bearing durability and most probably can be attributed to an atypical failure mechanism. The authors studied problems in 1.5 MW wind turbines at one Polish wind farm. The analysis showed that the problems of high failure rate are commonly met all over the world and that the statistics for the analysed turbines were very similar. After studying the potential failure mechanism and its possible causes, a modification of the existing bearing system was proposed. Various options, with different bearing types, were investigated and examined for expected durability increase, the extent of necessary gearbox modifications, and the possibility of solving existing operational problems.
Predictability of Landslide Timing From Quasi-Periodic Precursory Earthquakes
NASA Astrophysics Data System (ADS)
Bell, Andrew F.
2018-02-01
Accelerating rates of geophysical signals are observed before a range of material failure phenomena. They provide insights into the physical processes controlling failure and the basis for failure forecasts. However, examples of accelerating seismicity before landslides are rare, and their behavior and forecasting potential are largely unknown. Here I use a Bayesian methodology to apply a novel gamma point process model to investigate a sequence of quasiperiodic repeating earthquakes preceding a large landslide at Nuugaatsiaq in Greenland in June 2017. The evolution in earthquake rate is best explained by an inverse power law increase with time toward failure, as predicted by material failure theory. However, the commonly accepted power law exponent value of 1.0 is inconsistent with the data. Instead, the mean posterior value of 0.71 indicates a particularly rapid acceleration toward failure and suggests that only relatively short warning times may be possible for similar landslides in future.
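The inverse power-law acceleration underlies the classic failure forecast method: a suitable power of the inverse event rate declines linearly to zero at the failure time. A minimal sketch on synthetic, noiseless data (the rate constant and time values are hypothetical; the exponent 0.71 is the posterior mean reported above):

```python
import numpy as np

# Hypothetical event-rate observations approaching failure at t_f = 10.0
t_f, p, k = 10.0, 0.71, 50.0
t = np.linspace(0.0, 9.5, 40)
rate = k * (t_f - t) ** (-p)          # inverse power-law acceleration

# For p = 1 the inverse rate falls linearly and its zero-crossing
# estimates the failure time; for p != 1, rate**(-1/p) is still linear
# in t and intercepts zero at t_f.
inv = rate ** (-1.0 / p)
slope, intercept = np.polyfit(t, inv, 1)
print("forecast failure time:", -intercept / slope)   # ~10.0
```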
Quantitative method of medication system interface evaluation.
Pingenot, Alleene Anne; Shanteau, James; Pingenot, James D F
2007-01-01
The objective of this study was to develop a quantitative method of evaluating the user interface for medication system software. A detailed task analysis provided a description of user goals and essential activity. A structural fault analysis was used to develop a detailed description of the system interface. Nurses experienced with the system under evaluation provided estimates of failure rates for each point in this simplified fault tree. Means of the estimated failure rates provided quantitative data for the fault analysis. The authors note that, although failures of steps in the program were frequent, participants reported numerous methods of working around these failures, so that overall system failure was rare. However, frequent process failure can affect the time required for processing medications, making a system inefficient. This method of interface analysis, called the Software Efficiency Evaluation and Fault Identification Method, provides quantitative information with which prototypes can be compared and problems within an interface identified.
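A fault tree of this kind combines the per-step failure estimates through AND/OR gates. A minimal sketch under an independence assumption (the gate structure and probabilities here are hypothetical, not the study's actual tree):

```python
# Minimal fault-tree gate combinators, assuming independent basic events
def p_and(*probs):      # AND gate: failure requires every input to fail
    out = 1.0
    for p in probs:
        out *= p
    return out

def p_or(*probs):       # OR gate: any single input failing suffices
    out = 1.0
    for p in probs:
        out *= (1.0 - p)
    return 1.0 - out

# e.g. a task fails if either of two steps fails, each needing two slips
step1 = p_and(0.30, 0.10)
step2 = p_and(0.20, 0.05)
print(f"top-event probability: {p_or(step1, step2):.3f}")
```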
NASA Technical Reports Server (NTRS)
Feldstein, J. F.
1977-01-01
Failure data from 16 commercial spacecraft were analyzed to evaluate failure trends, reliability growth, and effectiveness of tests. It was shown that the test programs were highly effective in ensuring a high level of in-orbit reliability. There was only a single catastrophic problem in 44 years of in-orbit operation on 12 spacecraft. The results also indicate that in-orbit failure rates are highly correlated with unit and systems test failure rates. The data suggest that test effectiveness estimates can be used to guide the content of a test program to ensure that in-orbit reliability goals are achieved.
Failure rate of single dose methotrexate in management of ectopic pregnancy.
Sendy, Feras; AlShehri, Eman; AlAjmi, Amani; Bamanie, Elham; Appani, Surekha; Shams, Taghreed
2015-01-01
Background. One of the treatment modalities for ectopic pregnancy is methotrexate. The purpose of this study is to identify the failure rate of methotrexate in treating patients with ectopic pregnancy as well as the risk factors leading to treatment failure. Methods. A retrospective chart review of 225 patients who received methotrexate as a primary management option for ectopic pregnancy. Failure of a single dose of methotrexate was defined as a drop in β-hCG level of 14% or less by the seventh day after administration of methotrexate. Results. 225 patients received methotrexate. Most of the patients (151, 67%) received methotrexate dosed at 50 mg × body surface area. A single dose of methotrexate was successful in 72% (162/225) of the patients. 28% (63/225) were labeled as failures of a single dose of methotrexate because of a suboptimal drop in β-hCG. 63% (40/63) of failures received a second dose of methotrexate, and 37% (23/63) underwent surgical treatment. Among patients who received an initial dose of methotrexate, 71% had moderate or severe pain, and 58% had an ectopic mass larger than 4 cm on ultrasound. Conclusion. Liberal use of medical treatment of ectopic pregnancy results in a 72% success rate.
NASA Astrophysics Data System (ADS)
Cutiongco, Eric C.; Chung, Yip-Wah
1994-07-01
A method for predicting scuffing failure based on the competitive kinetics of oxide formation and removal has been developed and applied to the sliding of AISI 52100 steel on steel with poly-alpha-olefin as the lubricant. Oxide formation rates were determined using static oxidation tests on coupons of 52100 steel covered with poly-alpha-olefin at temperatures of 140 C to 250 C. Oxide removal rates were determined at different combinations of initial average nominal contact pressures (950 MPa to 1578 MPa) and sliding velocities (0.4 m/s to 1.8 m/s) using a ball-on-disk vacuum tribotester. The nominal asperity flash temperatures generated during the wear tests were calculated, and the temperatures corresponding to the intersection of the Arrhenius plots of oxide formation and removal rates were determined and taken as the critical failure temperatures. The pressure-velocity failure transition diagram was constructed by plotting the critical failure temperatures along isotherms of average nominal asperity flash temperatures calculated at different combinations of contact stress and sliding speed. The predicted failure transition curve agreed well with experimental scuffing data.
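The intersection of two Arrhenius lines has a closed form, so the critical temperature can be computed directly. A minimal sketch (the pre-exponential factors and activation energies are hypothetical, chosen only so the answer lands in the 140 C to 250 C test range reported above):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

# Hypothetical Arrhenius parameters (pre-exponential factor, activation energy)
A_form, E_form = 1.0e3, 40e3   # oxide formation
A_rem,  E_rem  = 1.0e7, 80e3   # oxide removal at a given load/speed

# Setting A_f*exp(-E_f/RT) = A_r*exp(-E_r/RT) and solving for T gives
# T* = (E_r - E_f) / (R * ln(A_r / A_f))
T_crit = (E_rem - E_form) / (R * math.log(A_rem / A_form))
print(f"critical failure temperature: {T_crit - 273.15:.0f} C")
```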
The Oxford unicompartmental knee fails at a high rate in a high-volume knee practice.
Schroer, William C; Barnes, C Lowry; Diesfeld, Paul; LeMarr, Angela; Ingrassia, Rachel; Morton, Diane J; Reedy, Mary
2013-11-01
The Oxford knee is a unicompartmental implant featuring a mobile-bearing polyethylene component with excellent long-term survivorship results reported by the implant developers and early adopters. By contrast, other studies have reported higher revision rates in large academic practices and in national registries. Registry data have shown increased failure with this implant especially by lower-volume surgeons and institutions. In the setting of a high-volume knee arthroplasty practice, we sought to determine (1) the failure rate of the Oxford unicompartmental knee implant using a failure definition for aseptic loosening that combined clinical features, plain radiographs, and scintigraphy, and (2) whether increased experience with this implant would decrease failure rate, if there is a learning curve effect. Eighty-three Oxford knee prostheses were implanted between September 2005 and July 2008 by the principal investigator. Radiographic and clinical data were available for review for all cases. A failed knee was defined as having recurrent pain after an earlier period of recovery from surgery, progressive radiolucent lines compared with initial postoperative radiographs, and a bone scan showing an isolated area of uptake limited to the area of the replaced compartment. Eleven knees in this series failed (13%); Kaplan-Meier survivorship was 86.5% (95% CI, 78.0%-95.0%) at 5 years. Failure occurrences were distributed evenly over the course of the study period. No learning curve effect was identified. Based on these findings, including a high failure rate of the Oxford knee implant and the absence of any discernible learning curve effect, the principal investigator no longer uses this implant.
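Survivorship figures like the 86.5% above come from the Kaplan-Meier product-limit estimator. A minimal hand-rolled sketch (the follow-up times and event flags are hypothetical, not the study's data; ties are broken arbitrarily):

```python
import numpy as np

def kaplan_meier(times, events):
    """Product-limit survival estimate at the last observed time;
    events=1 marks a failure, events=0 a censored observation."""
    order = np.argsort(times)
    times, events = np.asarray(times)[order], np.asarray(events)[order]
    at_risk = len(times)
    surv = 1.0
    for d in events:
        if d:
            surv *= 1.0 - 1.0 / at_risk
        at_risk -= 1
    return surv

# Hypothetical follow-up (years) for a small implant series
times  = [0.5, 1.2, 2.0, 3.1, 4.0, 5.0, 5.0, 5.0]
events = [1,   0,   1,   0,   1,   0,   0,   0  ]
print(f"5-year survivorship: {kaplan_meier(times, events):.3f}")
```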
NASA Technical Reports Server (NTRS)
Vitali, Roberto; Lutomski, Michael G.
2004-01-01
The National Aeronautics and Space Administration's (NASA) International Space Station (ISS) Program uses Probabilistic Risk Assessment (PRA) as part of its Continuous Risk Management Process. It is used as a decision and management support tool not only to quantify risk for specific conditions but, more importantly, to compare different operational and management options, determine the lowest-risk option, and provide rationale for management decisions. This paper presents the derivation of the probability distributions used to quantify the failure rates and the probability of failures of the basic events employed in the PRA model of the ISS. The paper shows how a Bayesian approach was used with different sources of data, including the actual ISS on-orbit failures, to enhance confidence in the results of the PRA. As time progresses and more meaningful data are gathered from on-orbit failures, an increasingly accurate failure rate probability distribution for the basic events of the ISS PRA model can be obtained. The ISS PRA has been developed by mapping the ISS critical systems, such as propulsion, thermal control, or power generation, into event sequence diagrams and fault trees. The lowest level of indenture of the fault trees was the orbital replacement unit (ORU). The ORU level was chosen consistent with the level of statistically meaningful data that could be obtained from the aerospace industry and from experts in the field. For example, data were gathered for the solenoid valves present in the propulsion system of the ISS. However, valves themselves are composed of parts, and the individual failure of these parts was not accounted for in the PRA model. In other words, the failure of a spring within a valve was considered a failure of the valve itself.
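For a Poisson failure process, the Bayesian update described above is commonly done with a conjugate gamma prior on the failure rate. A minimal sketch (the prior parameters and the failure/exposure figures are hypothetical, not the actual ISS data):

```python
from scipy.stats import gamma

# Conjugate gamma-Poisson update of a failure rate (per hour).
alpha0, beta0 = 2.0, 1.0e5   # hypothetical prior from generic industry data
failures, hours = 3, 2.0e5   # hypothetical on-orbit failures and exposure

alpha, beta = alpha0 + failures, beta0 + hours   # posterior parameters
post = gamma(a=alpha, scale=1.0 / beta)
print("posterior mean rate:", post.mean())
print("90% credible interval:", post.interval(0.90))
```

As more on-orbit exposure accumulates, the likelihood term dominates the prior, which is exactly the "increasingly accurate failure rate distribution" behavior the abstract describes.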
Hu, Mary Y.; Maroo, Seema; Kyne, Lorraine; Cloud, Jeffrey; Tummala, Sanjeev; Katchar, Kianoosh; Dreisbach, Valley; Noddin, Laura; Kelly, Ciarán P.
2009-01-01
Background & Aims Recent studies of C. difficile infection (CDI) indicate a dramatic increase in metronidazole failure. The aims of this study were to compare current and historical rates of metronidazole failure and identify risk factors for metronidazole failure. Methods 89 patients with CDI in 2004–2006 were followed for 60 days. Data were compared to a historical cohort of 63 CDI patients studied prospectively in 1998. Metronidazole failure was defined as persistent diarrhea after 10 days of therapy or a change of therapy to vancomycin. Stool samples were analyzed for the presence of NAP-1 strain. Results Metronidazole failure rates were 35% in both the 1998 and 2004–2006 cohorts. There was no difference in the median time to resolution of diarrhea (8 vs. 5 days, p = 0.52) or the proportion with more than 10 days of diarrhea (35% vs. 29%, p = 0.51). Risk factors for metronidazole failure included recent cephalosporin use (OR 32, 95% CI 5–219), CDI on admission (OR 23, 95% CI 3–156), and transfer from another hospital (OR 11, 95% CI 2–72). The frequency of NAP-1 infection in patients with and without metronidazole failure was similar (26% vs. 21%, p = 0.67). Conclusions We found no difference in metronidazole failure rates in 1998 and 2004–2006. Patients with recent cephalosporin use, CDI on admission, and transfer from another hospital were more likely to fail metronidazole and may benefit from early aggressive therapy. Infection with the epidemic NAP-1 strain was not associated with metronidazole failure in endemic CDI. PMID:19081526
Samuel, Susan M; Palacios-Derflingher, Luz; Tonelli, Marcello; Manns, Braden; Crowshoe, Lynden; Ahmed, Sofia B; Jun, Min; Saad, Nathalie; Hemmelgarn, Brenda R
2014-02-04
Despite a low prevalence of chronic kidney disease (estimated glomerular filtration rate [GFR]<60 mL/min per 1.73 m2), First Nations people have high rates of kidney failure requiring chronic dialysis or kidney transplantation. We sought to examine whether the presence and severity of albuminuria contributes to the progression of chronic kidney disease to kidney failure among First Nations people. We identified all adult residents of Alberta (age≥18 yr) for whom an outpatient serum creatinine measurement was available from May 1, 2002, to Mar. 31, 2008. We determined albuminuria using urine dipsticks and categorized results as normal (i.e., no albuminuria), mild, heavy or unmeasured. Our primary outcome was progression to kidney failure (defined as the need for chronic dialysis or kidney transplantation, or a sustained doubling of serum creatinine levels). We calculated rates of progression to kidney failure by First Nations status, by estimated GFR and by albuminuria category. We determined the relative hazard of progression to kidney failure for First Nations compared with non-First Nations participants by level of albuminuria and estimated GFR. Of the 1 816 824 participants we identified, 48 669 (2.7%) were First Nations. First Nations people were less likely to have normal albuminuria compared with non-First Nations people (38.7% v. 56.4%). Rates of progression to kidney failure were consistently 2- to 3-fold higher among First Nations people than among non-First Nations people, across all levels of albuminuria and estimated GFRs. Compared with non-First Nations people, First Nations people with an estimated GFR of 15.0-29.9 mL/min per 1.73 m2 had the highest risk of progression to kidney failure, with similar hazard ratios for those with normal and heavy albuminuria. Albuminuria confers a similar risk of progression to kidney failure for First Nations and non-First Nations people.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Menikoff, Ralph
SURFplus is a reactive burn model for high explosives aimed at modelling shock initiation and propagation of detonation waves. It utilizes the SURF model for the fast hot-spot reaction plus a slow reaction for the energy released by carbon clustering. A feature of the SURF model is that there is a partial decoupling between burn rate parameters and detonation wave properties. Previously, parameters for PBX 9502 that control shock initiation had been calibrated to Pop plot data (distance-of-run to detonation as a function of shock pressure initiating the detonation). Here burn rate parameters for the high pressure regime are adjusted to fit the failure diameter and the limiting detonation speed just above the failure diameter. Simulated results are shown for an unconfined rate stick when the 9502 diameter is slightly above and slightly below the failure diameter. Just above the failure diameter, in the rest frame of the detonation wave, the front is sonic at the PBX/air interface. As a consequence, the lead shock in the neighborhood of the interface is supported by the detonation pressure in the interior of the explosive rather than the reaction immediately behind the front. In the interior, the sonic point occurs near the end of the fast hot-spot reaction. Consequently, the slow carbon clustering reaction cannot affect the failure diameter. Below the failure diameter, the radial extent of the detonation front decreases starting from the PBX/air interface. That is, the failure starts at the PBX boundary and propagates inward to the axis of the rate stick.
NASA Astrophysics Data System (ADS)
Telesman, J.; Smith, T. M.; Gabb, T. P.; Ring, A. J.
2018-06-01
Cyclic near-threshold fatigue crack growth (FCG) behavior of two disk superalloys was evaluated and shown to exhibit an unexpected sudden failure mode transition from a mostly transgranular failure mode at higher stress intensity factor ranges to an almost completely intergranular failure mode in the threshold regime. The change in failure modes was associated with a crossover of FCG resistance curves, in which the conditions that produced higher FCG rates in the Paris regime resulted in lower FCG rates and increased ΔKth values in the threshold region. High-resolution scanning and transmission electron microscopy were used to carefully characterize the crack tips at these near-threshold conditions. Formation of stable Al-oxide followed by Cr-oxide and Ti-oxides was found to occur at the crack tip prior to formation of unstable oxides. To contrast with the threshold failure mode regime, a quantitative assessment of the role the intergranular failure mode plays in cyclic FCG behavior in the Paris regime was also performed. It was demonstrated that even a very limited intergranular failure content dominates the FCG response under mixed mode failure conditions.
Yield and failure criteria for composite materials under static and dynamic loading
Daniel, Isaac M.
2015-12-23
To facilitate and accelerate the process of introducing, evaluating, and adopting new material systems, it is important to develop and establish comprehensive and effective procedures for characterization, modeling, and failure prediction of structural laminates based on the properties of the constituent materials, e.g., fibers, matrix, and the single ply or lamina. A new failure theory, the Northwestern (NU-Daniel) theory, has been proposed for predicting lamina yielding and failure under multi-axial states of stress including strain rate effects. It is primarily applicable to matrix-dominated interfiber/interlaminar failures. It is based on micromechanical failure mechanisms but is expressed in terms of easily measured macroscopic lamina stiffness and strength properties. It is presented in the form of a master failure envelope incorporating strain rate effects. The theory was further adapted and extended to the prediction of in situ first ply yielding and failure (FPY and FPF) and progressive failure of multi-directional laminates under static and dynamic loadings. The significance of this theory is that it allows for rapid screening of new composite materials without very extensive testing and offers easily implemented design tools.
The Effects of ELDRS at Ultra-Low Dose Rates
NASA Technical Reports Server (NTRS)
Chen, Dakai; Forney, James; Carts, Martin; Phan, Anthony; Pease, Ronald; Kruckmeyer, Kirby; Cox, Stephen; LaBel, Kenneth; Burns, Samuel; Albarian, Rafi;
2011-01-01
We present results on the effects of ELDRS at dose rates of 10, 5, 1, and 0.5 mrad(Si)/s for a variety of radiation hardened and commercial devices. We observed low dose rate enhancement below 10 mrad(Si)/s in several different parts. The magnitudes of the dose rate effects vary. The TL750L, a commercial voltage regulator, showed dose rate dependence in the functional failures, with initial failures occurring after 10 krad(Si) for the parts irradiated at 0.5 mrad(Si)/s. The RH1021 showed an increase in low dose rate enhancement of 2x at 5 mrad(Si)/s relative to 8 mrad(Si)/s and high dose rate, and parametric failure after 100 krad(Si). Additionally, the ELDRS-free devices, such as the LM158 and LM117, showed evidence of dose rate sensitivity in parametric degradation. Several other parts also displayed dose rate enhancement, with relatively lower degradation up to approximately 15 to 20 krad(Si). The magnitude of the dose rate enhancement will likely increase in significance at higher total dose levels.
DiBona, G F; Sawin, L L
2001-08-01
Sympathetic nerve activity, including that in the kidney, is increased in heart failure with increased plasma concentrations of norepinephrine and the vasoconstrictor cotransmitter neuropeptide Y (NPY). We examined the contribution of NPY to sympathetically mediated alterations in kidney function in normal and heart failure rats. Heart failure rats were created by left coronary ligation and myocardial infarction. In anesthetized normal rats, the NPY Y(1) receptor antagonist, H 409/22, at two doses, had no effect on heart rate, arterial pressure, or renal hemodynamic and excretory function. In conscious severe heart failure rats, high-dose H 409/22 decreased mean arterial pressure by 8 +/- 2 mm Hg but had no effect in normal and mild heart failure rats. During graded frequency renal sympathetic nerve stimulation (0 to 10 Hz), high-dose H 409/22 attenuated the decreases in renal blood flow only at 10 Hz (-36% +/- 5%, P <.05) in normal rats but did so at both 4 (-29% +/- 4%, P <.05) and 10 Hz (-33% +/- 5%, P <.05) in heart failure rats. The glomerular filtration rate, urinary flow rate, and sodium excretion responses to renal sympathetic nerve stimulation were not affected by high-dose H 409/22 in either normal or heart failure rats. NPY does not participate in the regulation of kidney function and arterial pressure in normal conscious or anesthetized rats. When sympathetic nervous system activity is increased, as in heart failure and intense renal sympathetic nerve stimulation, respectively, a small contribution of NPY to maintenance of arterial pressure and to sympathetic renal vasoconstrictor responses may be identified.
Destructive Single-Event Failures in Diodes
NASA Technical Reports Server (NTRS)
Casey, Megan C.; Gigliuto, Robert A.; Lauenstein, Jean-Marie; Wilcox, Edward P.; Kim, Hak; Chen, Dakai; Phan, Anthony M.; LaBel, Kenneth A.
2013-01-01
In this summary, we have shown that diodes are susceptible to destructive single-event effects and that these failures occur along the guard ring. By determining the last passing voltages, a safe operating area can be derived. By derating from those values, rather than from the rated voltage, as is currently done with power MOSFETs, we can work to ensure the safety of future missions. However, there are still open questions about these failures. Are they limited to a single manufacturer, a small number of manufacturers, or all of them? Is there a threshold rated voltage that must be exceeded to see these failures? With future work, we hope to answer these questions. In the full paper, laser results will also be presented to verify that failures occur only along the guard ring.
Telerehabilitation for patients with heart failure.
Tousignant, Michel; Mampuya, Warner Mbuila
2015-02-01
Heart failure is a chronic and progressive condition that is associated with high morbidity and mortality rates. Even though cardiac rehabilitation (CR) has been shown to be beneficial to heart failure patients, only a very small proportion of them will actually be referred and eventually participate. The low participation rate is due in part to accessibility and travel difficulties. Telerehabilitation is a new approach in the rehabilitation field that allows patients to receive a complete rehabilitation program at home in a safe manner and under adequate supervision. We believe that by increasing accessibility to CR, telerehabilitation programs will significantly improve heart failure patients' functional capacity and quality of life. However, it is crucial to provide policy makers with evidence-based data on cardiac telerehabilitation if we want to see its successful implementation in heart failure patients.
Heart-rate variability depression in porcine peritonitis-induced sepsis without organ failure.
Jarkovska, Dagmar; Valesova, Lenka; Chvojka, Jiri; Benes, Jan; Danihel, Vojtech; Sviglerova, Jitka; Nalos, Lukas; Matejovic, Martin; Stengl, Milan
2017-05-01
Depression of heart-rate variability (HRV) in conditions of systemic inflammation has been shown in both patients and experimental animal models and HRV has been suggested as an early indicator of sepsis. The sensitivity of HRV-derived parameters to the severity of sepsis, however, remains unclear. In this study we modified the clinically relevant porcine model of peritonitis-induced sepsis in order to avoid the development of organ failure and to test the sensitivity of HRV to such non-severe conditions. In 11 anesthetized, mechanically ventilated and instrumented domestic pigs of both sexes, sepsis was induced by fecal peritonitis. The dose of feces was adjusted and antibiotic therapy was administered to avoid multiorgan failure. Experimental subjects were screened for 40 h from the induction of sepsis. In all septic animals, sepsis with hyperdynamic circulation and increased plasma levels of inflammatory mediators developed within 12 h from the induction of peritonitis. The sepsis did not progress to multiorgan failure and there was no spontaneous death during the experiment despite a modest requirement for vasopressor therapy in most animals (9/11). A pronounced reduction of HRV and elevation of heart rate developed quickly (within 5 h, time constant of 1.97 ± 0.80 h for HRV parameter TINN) upon the induction of sepsis and were maintained throughout the experiment. The frequency domain analysis revealed a decrease in the high-frequency component. The reduction of HRV parameters and elevation of heart rate preceded sepsis-associated hemodynamic changes by several hours (time constant of 11.28 ± 2.07 h for systemic vascular resistance decline). A pronounced and fast reduction of HRV occurred in the setting of a moderate experimental porcine sepsis without organ failure. Inhibition of parasympathetic cardiac signaling probably represents the main mechanism of HRV reduction in sepsis. The sensitivity of HRV to systemic inflammation may allow early detection of a moderate sepsis without organ failure. Impact statement A pronounced and fast reduction of heart-rate variability occurred in the setting of a moderate experimental porcine sepsis without organ failure. Dominant reduction of heart-rate variability was found in the high-frequency band indicating inhibition of parasympathetic cardiac signaling as the main mechanism of heart-rate variability reduction. The sensitivity of heart-rate variability to systemic inflammation may contribute to an early detection of moderate sepsis without organ failure.
A Longitudinal Study on Human Outdoor Decomposition in Central Texas.
Suckling, Joanna K; Spradley, M Katherine; Godde, Kanya
2016-01-01
The development of a methodology that estimates the postmortem interval (PMI) from stages of decomposition is a goal for which forensic practitioners strive. A proposed equation (Megyesi et al. 2005) that utilizes total body score (TBS) and accumulated degree days (ADD) was tested using longitudinal data collected from human remains donated to the Forensic Anthropology Research Facility (FARF) at Texas State University-San Marcos. Exact binomial tests were used to examine how often the equation successfully predicted ADD. Statistically significant differences were found between ADD estimated by the equation and the observed value for decomposition stage. Differences remained significant after carnivore-scavenged donations were removed from the analysis. The equation's low success rates in predicting ADD from TBS, together with its wide standard errors, demonstrate the need to re-evaluate the use of this equation and methodology for PMI estimation in different environments; rather, multivariate methods and equations should be derived that are environmentally specific. © 2015 American Academy of Forensic Sciences.
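A minimal sketch of an exact binomial test of this kind with SciPy (the success counts and the nominal 95% coverage target are hypothetical, not the study's data):

```python
from scipy.stats import binomtest

# Hypothetical counts: how often the equation's prediction interval
# bracketed the observed ADD, tested against nominal 95% coverage
captured, trials = 40, 60
result = binomtest(captured, trials, p=0.95, alternative="less")
print(result.pvalue)   # a small p-value flags significantly low coverage
```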
A maximum entropy fracture model for low and high strain-rate fracture in Tin-Silver-Copper alloys
NASA Astrophysics Data System (ADS)
Chan, Dennis K.
SnAgCu solder alloys exhibit significant rate-dependent constitutive behavior. Solder joints made of these alloys exhibit failure modes that are also rate-dependent. Solder joints are an integral part of microelectronic packages and are subjected to a wide variety of loading conditions, ranging from thermo-mechanical fatigue to impact loading. Consequently, there is a need for a non-empirical, rate-dependent failure theory that is able to accurately predict fracture in these solder joints. In the present thesis, various failure models are first reviewed. However, these models are typically empirical or are not valid for solder joints due to limiting assumptions such as elastic behavior. Here, the development and validation of a maximum entropy fracture model (MEFM) valid for low strain-rate fracture in SnAgCu solders is presented. To this end, work on characterizing SnAgCu solder behavior at low strain rates, using a specially designed tester to estimate parameters for constitutive models, is presented. Next, the maximum entropy fracture model is reviewed. This failure model uses a single damage accumulation parameter and relates the risk of fracture to accumulated inelastic dissipation. A methodology is presented to extract this model parameter through a custom-built microscale mechanical tester for Sn3.8Ag0.7Cu solder. This single parameter is used to numerically simulate fracture in two solder joints with entirely different geometries. The simulations are compared to experimentally observed fracture in these same packages. Following the simulations of fracture at low strain rate, the constitutive behavior of solder alloys across nine decades of strain rates, obtained through MTS compression tests and split-Hopkinson bar tests, is presented. Preliminary work on using orthogonal machining as a novel technique of material characterization at high strain rates is also presented. The resultant data from the MTS compression and split-Hopkinson bar testers are used to demonstrate the localization of stress at the interface of solder joints at high strain rates. The MEFM is further extended to predict failure in brittle materials. Such an extension allows for fracture prediction within intermetallic compounds (IMCs) in solder joints. It has been experimentally observed that the failure mode shifts from bulk solder to the IMC layer with increasing loading rates. The extension of the MEFM allows for prediction of the fracture mode within the solder joint under different loading conditions. A fracture model capable of predicting failure modes at higher strain rates is necessary, as mobile electronics are becoming ubiquitous. Mobile devices are prone to being dropped, which can induce loading rates within solder joints that are much larger than those experienced under thermo-mechanical fatigue. A range of possible damage accumulation parameters for Cu6Sn5 is determined for the MEFM. A value within the aforementioned range is used to demonstrate the increasing likelihood of IMC fracture in solder joints at larger loading rates. The thesis concludes with remarks about ongoing work, which includes determining a more accurate damage accumulation parameter for the Cu6Sn5 IMC and using machining as a technique for extracting failure parameters for the MEFM.
Failure analysis and modeling of a multicomputer system. M.S. Thesis
NASA Technical Reports Server (NTRS)
Subramani, Sujatha Srinivasan
1990-01-01
This thesis describes the results of an extensive measurement-based analysis of real error data collected from a 7-machine DEC VaxCluster multicomputer system. In addition to evaluating basic system error and failure characteristics, we develop reward models to analyze the impact of failures and errors on the system. The results show that, although 98 percent of errors in the shared resources recover, they result in 48 percent of all system failures. The analysis of rewards shows that the expected reward rate for the VaxCluster decreases to 0.5 in 100 days for a 3-out-of-7 model, which is well over 100 times that for a 7-out-of-7 model. A comparison of the reward rates for a range of k-out-of-n models indicates that the maximum increase in reward rate (0.25) occurs in going from the 6-out-of-7 model to the 5-out-of-7 model. The analysis also shows that software errors have the lowest reward (0.2 vs. 0.91 for network errors). The large loss in reward rate for software errors is due to the fact that a large proportion (94 percent) of software errors lead to failure. In comparison, the high reward rate for network errors is due to fast recovery from a majority of these errors (median recovery duration is 0 seconds).
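Under an independence assumption, the k-out-of-n comparisons above reduce to a binomial tail probability over machine states. A minimal sketch (the per-machine availability value is hypothetical, not derived from the VaxCluster data):

```python
from math import comb

def k_of_n_availability(k, n, p):
    """Probability that at least k of n machines are up,
    assuming independent per-machine availability p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical per-machine availability for a 7-node cluster
p = 0.95
for k in (7, 6, 5, 3):
    print(f"{k}-out-of-7: {k_of_n_availability(k, 7, p):.4f}")
```

Loosening k from 7 toward 3 shows the same diminishing-returns pattern the analysis reports, with the largest jump occurring between adjacent models near the top.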