Sample records for random-effects Poisson

  1. Analysis of overdispersed count data by mixtures of Poisson variables and Poisson processes.

    PubMed

    Hougaard, P; Lee, M L; Whitmore, G A

    1997-12-01

Count data often show overdispersion compared to the Poisson distribution. Overdispersion is typically modeled by a random effect for the mean, based on the gamma distribution, leading to the negative binomial distribution for the count. This paper considers a larger family of mixture distributions, including the inverse Gaussian mixture distribution. It is demonstrated that this mixture gives a significantly better fit for a data set on the frequency of epileptic seizures. The same approach can be used to generate counting processes from Poisson processes, where the rate or the time is random. A random rate corresponds to variation between patients, whereas a random time corresponds to variation within patients.
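As a brief illustration of the gamma-mixture mechanism described above (parameter values are arbitrary, not from the paper), drawing each count's rate from a gamma distribution yields negative binomial counts whose variance exceeds the mean:

```python
import numpy as np

rng = np.random.default_rng(0)

# Gamma-mixed Poisson: each count's rate is a gamma draw, so the
# marginal count distribution is negative binomial (overdispersed).
shape, scale = 2.0, 1.5        # illustrative gamma parameters
n = 200_000

rates = rng.gamma(shape, scale, size=n)   # between-subject rate variation
counts = rng.poisson(rates)               # conditional Poisson counts

mean = counts.mean()
var = counts.var()

# Theory: mean = shape*scale = 3.0, variance = mean*(1 + scale) = 7.5
print(mean, var)
```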

  2. Application of Poisson random effect models for highway network screening.

    PubMed

    Jiang, Ximiao; Abdel-Aty, Mohamed; Alamili, Samer

    2014-02-01

In recent years, Bayesian random effect models that account for the temporal and spatial correlations of crash data have become popular in traffic safety research. This study employs random effect Poisson Log-Normal models for crash risk hotspot identification. Both the temporal and spatial correlations of crash data were considered. Potential for Safety Improvement (PSI) was adopted as a measure of the crash risk. Using the fatal and injury crashes that occurred on urban 4-lane divided arterials from 2006 to 2009 in the Central Florida area, the random effect approaches were compared to the traditional Empirical Bayesian (EB) method and the conventional Bayesian Poisson Log-Normal model. A series of method examination tests were conducted to evaluate the performance of the different approaches. These tests include the previously developed site consistency test, method consistency test, total rank difference test, and the modified total score test, as well as the newly proposed total safety performance measure difference test. Results show that the Bayesian Poisson model accounting for both temporal and spatial random effects (PTSRE) outperforms the model with only a temporal random effect, and both are superior to the conventional Poisson Log-Normal model (PLN) and the EB model in fitting the crash data. Additionally, the method evaluation tests indicate that the PTSRE model is significantly superior to the PLN model and the EB model in consistently identifying hotspots during successive time periods. The results suggest that the PTSRE model is a superior alternative for road site crash risk hotspot identification. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. Compositions, Random Sums and Continued Random Fractions of Poisson and Fractional Poisson Processes

    NASA Astrophysics Data System (ADS)

    Orsingher, Enzo; Polito, Federico

    2012-08-01

In this paper we consider the relation between random sums and compositions of different processes. In particular, for independent Poisson processes N_α(t), N_β(t), t > 0, we have that N_α(N_β(t)) equals in distribution the random sum Σ_{j=1}^{N_β(t)} X_j, where the X_j are Poisson random variables. We present a series of similar cases, where the outer process is Poisson with different inner processes. We highlight generalisations of these results where the external process is infinitely divisible. A section of the paper concerns compositions of the form N_α(τ_k^ν), ν ∈ (0,1], where τ_k^ν is the inverse of the fractional Poisson process, and we show how these compositions can be represented as random sums. Furthermore we study compositions of the form Θ(N(t)), t > 0, which can be represented as random products. The last section is devoted to studying continued fractions of Cauchy random variables with a Poisson number of levels. We evaluate the exact distribution and derive the scale parameter in terms of ratios of Fibonacci numbers.
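The random-sum identity can be checked by simulation; a minimal sketch (the rates α, β and the time t below are arbitrary choices, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
alpha, beta, t, n = 2.0, 1.0, 3.0, 100_000

# Left side: the outer Poisson process evaluated at the random time N_beta(t).
# Given N_beta(t) = s, N_alpha(s) is Poisson(alpha * s).
inner = rng.poisson(beta * t, size=n)
left = rng.poisson(alpha * inner)

# Right side: a random sum of iid Poisson(alpha) variables,
# with a Poisson(beta * t) number of terms.
right = np.array([rng.poisson(alpha, size=k).sum()
                  for k in rng.poisson(beta * t, size=n)])

print(left.mean(), right.mean())   # both ≈ alpha * beta * t = 6
```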

  4. Toward negative Poisson's ratio composites: Investigation of the auxetic behavior of fibrous networks

    NASA Astrophysics Data System (ADS)

    Tatlier, Mehmet Seha

Random fibrous networks can be found among both natural and synthetic materials. Some of these random fibrous networks possess a negative Poisson's ratio and are commonly called auxetic materials. The governing mechanisms behind this counter-intuitive property in random networks are yet to be understood, and this kind of auxetic material remains widely under-explored. Most synthetic auxetic materials, however, suffer from low strength. This shortcoming can be rectified by developing high-strength auxetic composites. Embedding auxetic random fibrous networks in a polymer matrix is an attractive alternative route to the manufacture of auxetic composites; before such an approach can be developed, however, a methodology for designing fibrous networks with the desired negative Poisson's ratios must first be established. This requires an understanding of the factors which bring about negative Poisson's ratios in these materials. In this study, a numerical model is presented in order to investigate the auxetic behavior of compressed random fiber networks. Finite element analyses of three-dimensional stochastic fiber networks were performed to gain insight into the effects of parameters such as network anisotropy, network density, and degree of network compression on the out-of-plane Poisson's ratio and Young's modulus. The simulation results suggest that compression is the critical parameter that gives rise to a negative Poisson's ratio, while anisotropy significantly promotes the auxetic behavior. This model can be utilized to design fibrous auxetic materials and to evaluate the feasibility of developing auxetic composites by using auxetic fibrous networks as the reinforcing layer.

  5. Poisson process stimulation of an excitable membrane cable model.

    PubMed Central

    Goldfinger, M D

    1986-01-01

    The convergence of multiple inputs within a single-neuronal substrate is a common design feature of both peripheral and central nervous systems. Typically, the result of such convergence impinges upon an intracellularly contiguous axon, where it is encoded into a train of action potentials. The simplest representation of the result of convergence of multiple inputs is a Poisson process; a general representation of axonal excitability is the Hodgkin-Huxley/cable theory formalism. The present work addressed multiple input convergence upon an axon by applying Poisson process stimulation to the Hodgkin-Huxley axonal cable. The results showed that both absolute and relative refractory periods yielded in the axonal output a random but non-Poisson process. While smaller amplitude stimuli elicited a type of short-interval conditioning, larger amplitude stimuli elicited impulse trains approaching Poisson criteria except for the effects of refractoriness. These results were obtained for stimulus trains consisting of pulses of constant amplitude and constant or variable durations. By contrast, with or without stimulus pulse shape variability, the post-impulse conditional probability for impulse initiation in the steady-state was a Poisson-like process. For stimulus variability consisting of randomly smaller amplitudes or randomly longer durations, mean impulse frequency was attenuated or potentiated, respectively. Limitations and implications of these computations are discussed. PMID:3730505
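The effect of refractoriness on a Poisson input can be sketched without the Hodgkin-Huxley machinery: the toy model below simply deletes events falling within an absolute refractory period of the last accepted event, producing a random but non-Poisson output train (all parameter values are hypothetical, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)
rate, T, refractory = 50.0, 200.0, 0.01   # Hz, s, s (illustrative)

# Poisson input train: exponential inter-event intervals.
arrivals = np.cumsum(rng.exponential(1.0 / rate, size=int(rate * T * 2)))
arrivals = arrivals[arrivals < T]

# Output: an event is emitted only if the previous emitted event
# lies more than one refractory period in the past.
out, last = [], -np.inf
for t in arrivals:
    if t - last >= refractory:
        out.append(t)
        last = t

intervals = np.diff(out)
# No output interval can be shorter than the refractory period,
# so the output is random but fails the Poisson criteria.
print(len(arrivals), len(out), intervals.min())
```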

  6. Assessment of Poisson, probit and linear models for genetic analysis of presence and number of black spots in Corriedale sheep.

    PubMed

    Peñagaricano, F; Urioste, J I; Naya, H; de los Campos, G; Gianola, D

    2011-04-01

Black skin spots are associated with pigmented fibres in wool, an important quality fault. Our objective was to assess alternative models for genetic analysis of the presence (BINBS) and number (NUMBS) of black spots in Corriedale sheep. During 2002-08, 5624 records from 2839 animals in two flocks, aged 1 through 6 years, were taken at shearing. Four models were considered: linear and probit for BINBS and linear and Poisson for NUMBS. All models included flock-year and age as fixed effects and animal and permanent environmental effects as random. Models were fitted to the whole data set and were also compared based on their predictive ability in cross-validation. Estimates of heritability ranged from 0.154 to 0.230 for BINBS and 0.269 to 0.474 for NUMBS. For BINBS, the probit model fitted the data slightly better than the linear model. Predictions of random effects from these models were highly correlated, and both models exhibited similar predictive ability. For NUMBS, the Poisson model, with a residual term to account for overdispersion, performed better than the linear model in goodness of fit and predictive ability. Predictions of random effects from the Poisson model were more strongly correlated with those from the BINBS models than were those from the linear model. Overall, the use of probit or linear models for BINBS and of a Poisson model with a residual for NUMBS seems a reasonable choice for genetic selection purposes in Corriedale sheep. © 2010 Blackwell Verlag GmbH.

  7. Poisson's ratio of fiber-reinforced composites

    NASA Astrophysics Data System (ADS)

    Christiansson, Henrik; Helsing, Johan

    1996-05-01

    Poisson's ratio flow diagrams, that is, the Poisson's ratio versus the fiber fraction, are obtained numerically for hexagonal arrays of elastic circular fibers in an elastic matrix. High numerical accuracy is achieved through the use of an interface integral equation method. Questions concerning fixed point theorems and the validity of existing asymptotic relations are investigated and partially resolved. Our findings for the transverse effective Poisson's ratio, together with earlier results for random systems by other authors, make it possible to formulate a general statement for Poisson's ratio flow diagrams: For composites with circular fibers and where the phase Poisson's ratios are equal to 1/3, the system with the lowest stiffness ratio has the highest Poisson's ratio. For other choices of the elastic moduli for the phases, no simple statement can be made.

  8. Tailpulse signal generator

    DOEpatents

Baker, John [Walnut Creek, CA]; Archer, Daniel E [Knoxville, TN]; Luke, Stanley John [Pleasanton, CA]; Decman, Daniel J [Livermore, CA]; White, Gregory K [Livermore, CA]

    2009-06-23

A tailpulse signal generating/simulating apparatus, system, and method designed to produce electronic pulses which simulate tailpulses produced by a gamma radiation detector, including the pileup effect caused by the characteristic exponential decay of the detector pulses, and the random Poisson distribution of pulse timing for radioactive materials. A digital signal processor (DSP) is programmed and configured to produce digital values corresponding to pseudo-randomly selected pulse amplitudes and pseudo-randomly selected Poisson timing intervals of the tailpulses. Pulse amplitude values are exponentially decayed while outputting the digital value to a digital-to-analog converter (DAC). Pulse amplitudes of new pulses are then added to decaying pulses to simulate the pileup effect for enhanced realism in the simulation.
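A minimal software sketch of the scheme the patent describes, with entirely hypothetical parameters: Poisson-timed pulses with pseudo-random amplitudes are deposited on a sample grid and run through a single-pole exponential decay, so overlapping tails pile up:

```python
import numpy as np

rng = np.random.default_rng(3)

fs = 100_000            # samples per second (illustrative)
duration = 0.1          # seconds
tau = 100e-6            # tail decay time constant, seconds
mean_rate = 2_000       # pulses per second

n = int(fs * duration)
decay = np.exp(-1.0 / (fs * tau))   # per-sample decay factor

# Deposit pseudo-random amplitudes at Poisson-timed sample indices.
hits = np.zeros(n)
n_pulses = 0
t = rng.exponential(1.0 / mean_rate)
while t < duration:
    hits[int(t * fs)] += rng.uniform(0.5, 1.5)
    n_pulses += 1
    t += rng.exponential(1.0 / mean_rate)

# Exponential tail decay with additive pileup: new pulses add onto
# whatever residual tail level is still decaying.
signal = np.zeros(n)
level = 0.0
for i in range(n):
    level = level * decay + hits[i]
    signal[i] = level

print(n_pulses, signal.max())
```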

  9. Simulation of diffuse-charge capacitance in electric double layer capacitors

    NASA Astrophysics Data System (ADS)

    Sun, Ning; Gersappe, Dilip

    2017-01-01

We use a Lattice Boltzmann Model (LBM) in order to simulate diffuse-charge dynamics in Electric Double Layer Capacitors (EDLCs). Simulations are carried out for both the charge and the discharge processes on 2D systems of complex random electrode geometries (pure random, random spheres and random fibers). The steric effect of concentrated solutions is considered by using Modified Poisson-Nernst-Planck (MPNP) equations and compared with regular Poisson-Nernst-Planck (PNP) systems. The effects of electrode microstructures (electrode density, electrode filler morphology, filler size, etc.) on the net charge distribution and charge/discharge time are studied in detail. The influence of the applied potential during the discharging process is also discussed. Our studies show how electrode morphology can be used to tailor the properties of supercapacitors.

  10. A Poisson approach to the validation of failure time surrogate endpoints in individual patient data meta-analyses.

    PubMed

    Rotolo, Federico; Paoletti, Xavier; Burzykowski, Tomasz; Buyse, Marc; Michiels, Stefan

    2017-01-01

    Surrogate endpoints are often used in clinical trials instead of well-established hard endpoints for practical convenience. The meta-analytic approach relies on two measures of surrogacy: one at the individual level and one at the trial level. In the survival data setting, a two-step model based on copulas is commonly used. We present a new approach which employs a bivariate survival model with an individual random effect shared between the two endpoints and correlated treatment-by-trial interactions. We fit this model using auxiliary mixed Poisson models. We study via simulations the operating characteristics of this mixed Poisson approach as compared to the two-step copula approach. We illustrate the application of the methods on two individual patient data meta-analyses in gastric cancer, in the advanced setting (4069 patients from 20 randomized trials) and in the adjuvant setting (3288 patients from 14 randomized trials).

  11. An Intrinsic Algorithm for Parallel Poisson Disk Sampling on Arbitrary Surfaces.

    PubMed

    Ying, Xiang; Xin, Shi-Qing; Sun, Qian; He, Ying

    2013-03-08

Poisson disk sampling plays an important role in a variety of visual computing applications, due to its useful statistical property in distribution and the absence of aliasing artifacts. While many effective techniques have been proposed to generate Poisson disk distributions in Euclidean space, relatively little work has been reported on the surface counterpart. This paper presents an intrinsic algorithm for parallel Poisson disk sampling on arbitrary surfaces. We propose a new technique for parallelizing dart throwing. Rather than the conventional approaches that explicitly partition the spatial domain to generate the samples in parallel, our approach assigns each sample candidate a random and unique priority that is unbiased with regard to the distribution. Hence, multiple threads can process the candidates simultaneously and resolve conflicts by checking the given priority values. It is worth noting that our algorithm is accurate, as the generated Poisson disks are uniformly and randomly distributed without bias. Our method is intrinsic in that all the computations are based on the intrinsic metric and are independent of the embedding space. This intrinsic feature allows us to generate Poisson disk distributions on arbitrary surfaces. Furthermore, by manipulating the spatially varying density function, we can obtain adaptive sampling easily.
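The priority idea can be emulated serially: giving every candidate a random, unique priority and processing candidates in that order is equivalent to randomized dart throwing. A planar sketch (unit square rather than an arbitrary surface; radius and candidate count are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)
r, m = 0.1, 4000   # disk radius and number of candidates (illustrative)

# Candidates in the unit square, each with a random, unique priority;
# processing in priority order is the serial analogue of resolving
# conflicts by comparing priority values.
pts = rng.random((m, 2))
order = np.argsort(rng.random(m))

accepted = []
for i in order:
    p = pts[i]
    if all(np.hypot(*(p - q)) >= r for q in accepted):
        accepted.append(p)

accepted = np.array(accepted)
d = np.sqrt(((accepted[:, None] - accepted[None, :]) ** 2).sum(-1))
np.fill_diagonal(d, np.inf)
print(len(accepted), d.min())   # all pairwise distances respect the radius
```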

  12. Graphic Simulations of the Poisson Process.

    DTIC Science & Technology

    1982-10-01

In the superimposed mode, two Poisson processes are active, each with a different rate parameter (call them Type I and Type II, with respective rates L1 and L2). The value p is generated by the equation p = L1 / (L1 + L2).
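The superposition rule quoted in the abstract is easy to verify by simulation; the rates below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(5)
L1, L2, T = 3.0, 7.0, 10_000.0   # rates of the two processes, time horizon

# Simulate the two independent Poisson processes on [0, T] and superimpose
# them; the fraction of Type I events estimates p = L1 / (L1 + L2).
t1 = np.cumsum(rng.exponential(1 / L1, size=int(L1 * T * 1.5)))
t2 = np.cumsum(rng.exponential(1 / L2, size=int(L2 * T * 1.5)))
t1, t2 = t1[t1 < T], t2[t2 < T]

frac_type1 = len(t1) / (len(t1) + len(t2))
p = L1 / (L1 + L2)
print(frac_type1, p)   # fraction ≈ 0.3
```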

  13. Application of zero-inflated poisson mixed models in prognostic factors of hepatitis C.

    PubMed

    Akbarzadeh Baghban, Alireza; Pourhoseingholi, Asma; Zayeri, Farid; Jafari, Ali Akbar; Alavian, Seyed Moayed

    2013-01-01

In recent years, hepatitis C virus (HCV) infection has represented a major public health problem. Evaluation of risk factors is one of the solutions which help protect people from the infection. This study aims to employ zero-inflated Poisson mixed models to evaluate prognostic factors of hepatitis C. The data were collected from a longitudinal study during 2005-2010. First, a mixed Poisson regression (PR) model was fitted to the data. Then, a mixed zero-inflated Poisson model was fitted with compound Poisson random effects. For evaluating the performance of the proposed mixed model, the standard errors of the estimators were compared. The results obtained from the mixed PR model showed that genotype 3 and treatment protocol were statistically significant. Results of the zero-inflated Poisson mixed model showed that age, sex, genotypes 2 and 3, the treatment protocol, and having risk factors had significant effects on the viral load of HCV patients. Of these two models, the estimators of the zero-inflated Poisson mixed model had the smaller standard errors, indicating that the mixed zero-inflated Poisson model provided the better fit. The proposed model can capture serial dependence, additional overdispersion, and excess zeros in longitudinal count data.
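A minimal sketch of the zero-inflated Poisson mechanism (parameters are illustrative, not estimates from this study): a structural-zero component inflates the zero fraction above what a plain Poisson model predicts:

```python
import numpy as np

rng = np.random.default_rng(6)
pi_zero, lam, n = 0.3, 2.5, 100_000   # illustrative ZIP parameters

# ZIP: with probability pi_zero the count is a structural zero,
# otherwise it is an ordinary Poisson(lam) draw.
structural = rng.random(n) < pi_zero
counts = np.where(structural, 0, rng.poisson(lam, size=n))

zero_frac = (counts == 0).mean()
expected = pi_zero + (1 - pi_zero) * np.exp(-lam)
# ≈ 0.357 zeros, well above the plain-Poisson exp(-lam) ≈ 0.082
print(zero_frac, expected)
```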

  14. Full Bayes Poisson gamma, Poisson lognormal, and zero inflated random effects models: Comparing the precision of crash frequency estimates.

    PubMed

    Aguero-Valverde, Jonathan

    2013-01-01

In recent years, complex statistical modeling approaches have been proposed to handle the unobserved heterogeneity and the excess of zeros frequently found in crash data, including random effects and zero-inflated models. This research compares random effects, zero-inflated, and zero-inflated random effects models using a full Bayes hierarchical approach. The models are compared not just in terms of goodness-of-fit measures but also in terms of the precision of posterior crash frequency estimates, since the precision of these estimates is vital for ranking sites for engineering improvement. Fixed-over-time random effects models are also compared to independent-over-time random effects models. For the crash dataset being analyzed, it was found that once the random effects are included in the zero-inflated models, the probability of being in the zero state is drastically reduced, and the zero-inflated models degenerate to their non-zero-inflated counterparts. Also, by fixing the random effects over time, the fit of the models and the precision of the crash frequency estimates are significantly increased. It was found that the rankings of the fixed-over-time random effects models are very consistent with one another. In addition, the results show that by fixing the random effects over time, the standard errors of the crash frequency estimates are significantly reduced for the majority of the segments at the top of the ranking. Copyright © 2012 Elsevier Ltd. All rights reserved.

  15. Poisson statistics of PageRank probabilities of Twitter and Wikipedia networks

    NASA Astrophysics Data System (ADS)

    Frahm, Klaus M.; Shepelyansky, Dima L.

    2014-04-01

We use the methods of quantum chaos and Random Matrix Theory for analysis of statistical fluctuations of PageRank probabilities in directed networks. In this approach the effective energy levels are given by a logarithm of PageRank probability at a given node. After the standard energy level unfolding procedure we establish that the nearest spacing distribution of PageRank probabilities is described by the Poisson law typical for integrable quantum systems. Our studies are done for the Twitter network and three networks of Wikipedia editions in English, French and German. We argue that due to the absence of level repulsion the PageRank order of nearby nodes can be easily interchanged. The obtained Poisson law implies that the nearby PageRank probabilities fluctuate as random independent variables.

  16. Mixed-Poisson Point Process with Partially-Observed Covariates: Ecological Momentary Assessment of Smoking.

    PubMed

    Neustifter, Benjamin; Rathbun, Stephen L; Shiffman, Saul

    2012-01-01

    Ecological Momentary Assessment is an emerging method of data collection in behavioral research that may be used to capture the times of repeated behavioral events on electronic devices, and information on subjects' psychological states through the electronic administration of questionnaires at times selected from a probability-based design as well as the event times. A method for fitting a mixed Poisson point process model is proposed for the impact of partially-observed, time-varying covariates on the timing of repeated behavioral events. A random frailty is included in the point-process intensity to describe variation among subjects in baseline rates of event occurrence. Covariate coefficients are estimated using estimating equations constructed by replacing the integrated intensity in the Poisson score equations with a design-unbiased estimator. An estimator is also proposed for the variance of the random frailties. Our estimators are robust in the sense that no model assumptions are made regarding the distribution of the time-varying covariates or the distribution of the random effects. However, subject effects are estimated under gamma frailties using an approximate hierarchical likelihood. The proposed approach is illustrated using smoking data.

  17. Stationary and non-stationary occurrences of miniature end plate potentials are well described as stationary and non-stationary Poisson processes in the mollusc Navanax inermis.

    PubMed

    Cappell, M S; Spray, D C; Bennett, M V

    1988-06-28

Protractor muscles in the gastropod mollusc Navanax inermis exhibit typical spontaneous miniature end plate potentials with mean amplitude 1.71 +/- 1.19 (standard deviation) mV. The evoked end plate potential is quantized, with a quantum equal to the miniature end plate potential amplitude. When their rate is stationary, the occurrence of miniature end plate potentials is a random, Poisson process. When non-stationary, spontaneous miniature end plate potential occurrence is a non-stationary Poisson process, i.e. a Poisson process whose mean frequency changes with time. This extends the random Poisson model for miniature end plate potentials to the frequently observed non-stationary occurrence. Reported deviations from a Poisson process can sometimes be accounted for by the non-stationary Poisson process, and more complex models, such as clustered release, are not always needed.
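A non-stationary Poisson process with a time-varying mean frequency can be simulated by Lewis-Shedler thinning; the rate function below is hypothetical, chosen only to illustrate the idea:

```python
import numpy as np

rng = np.random.default_rng(7)
T, lam_max = 100.0, 5.0   # horizon and an upper bound on the rate

def rate(t):
    # Hypothetical slowly varying event frequency, always <= lam_max.
    return 2.0 + 1.5 * np.sin(2 * np.pi * t / 25.0)

# Thinning: simulate a homogeneous Poisson process at rate lam_max,
# then keep each candidate event with probability rate(t) / lam_max.
cand = np.cumsum(rng.exponential(1 / lam_max, size=int(lam_max * T * 2)))
cand = cand[cand < T]
events = cand[rng.random(len(cand)) < rate(cand) / lam_max]

# Expected count = integral of rate over [0, T]; the sine term
# averages out over whole periods, leaving 2.0 * T = 200.
print(len(events))
```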

  18. Effect of motivational interviewing on rates of early childhood caries: a randomized trial.

    PubMed

    Harrison, Rosamund; Benton, Tonya; Everson-Stewart, Siobhan; Weinstein, Phil

    2007-01-01

The purposes of this randomized controlled trial were to: (1) test motivational interviewing (MI) to prevent early childhood caries; and (2) use Poisson regression for data analysis. A total of 240 South Asian children 6 to 18 months old were enrolled and randomly assigned to either the MI or control condition. Children had a dental exam, and their mothers completed pretested instruments at baseline and 1 and 2 years postintervention. Other covariates that might explain outcomes over and above treatment differences were modeled using Poisson regression. Hazard ratios were produced. Analyses included all participants whenever possible. Poisson regression supported a protective effect of MI (hazard ratio [HR] = 0.54; 95% CI = 0.35-0.84); that is, the MI group had about a 46% lower rate of dmfs at 2 years than did control children. Similar treatment effect estimates were obtained from models that included, as alternative outcomes, ds, dms, and dmfs, including "white spot lesions." Exploratory analyses revealed that rates of dmfs were higher in children whose mothers had: (1) prechewed their food; (2) been raised in a rural environment; and (3) a higher family income (P<.05). A motivational interviewing-style intervention shows promise to promote preventive behaviors in mothers of young children at high risk for caries.

  19. Paretian Poisson Processes

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo; Klafter, Joseph

    2008-05-01

Many random populations can be modeled as a countable set of points scattered randomly on the positive half-line. The points may represent magnitudes of earthquakes and tornados, masses of stars, market values of public companies, etc. In this article we explore a specific class of such random populations we coin 'Paretian Poisson processes'. This class is elemental in statistical physics, connecting together, in a deep and fundamental way, diverse issues including: the Poisson distribution of the Law of Small Numbers; Paretian tail statistics; the Fréchet distribution of Extreme Value Theory; the one-sided Lévy distribution of the Central Limit Theorem; scale-invariance, renormalization and fractality; resilience to random perturbations.

  20. Simple, Efficient Estimators of Treatment Effects in Randomized Trials Using Generalized Linear Models to Leverage Baseline Variables

    PubMed Central

    Rosenblum, Michael; van der Laan, Mark J.

    2010-01-01

    Models, such as logistic regression and Poisson regression models, are often used to estimate treatment effects in randomized trials. These models leverage information in variables collected before randomization, in order to obtain more precise estimates of treatment effects. However, there is the danger that model misspecification will lead to bias. We show that certain easy to compute, model-based estimators are asymptotically unbiased even when the working model used is arbitrarily misspecified. Furthermore, these estimators are locally efficient. As a special case of our main result, we consider a simple Poisson working model containing only main terms; in this case, we prove the maximum likelihood estimate of the coefficient corresponding to the treatment variable is an asymptotically unbiased estimator of the marginal log rate ratio, even when the working model is arbitrarily misspecified. This is the log-linear analog of ANCOVA for linear models. Our results demonstrate one application of targeted maximum likelihood estimation. PMID:20628636

  1. Dynamics of a prey-predator system under Poisson white noise excitation

    NASA Astrophysics Data System (ADS)

    Pan, Shan-Shan; Zhu, Wei-Qiu

    2014-10-01

The classical Lotka-Volterra (LV) model is a well-known mathematical model for prey-predator ecosystems. In the present paper, the pulse-type version of the stochastic LV model, in which the effect of a random natural environment has been modeled as Poisson white noise, is investigated by using the stochastic averaging method. The averaged generalized Itô stochastic differential equation and Fokker-Planck-Kolmogorov (FPK) equation are derived for the prey-predator ecosystem driven by Poisson white noise. An approximate stationary solution for the averaged generalized FPK equation is obtained by using the perturbation method. The effect of the prey self-competition parameter ε²s on ecosystem behavior is evaluated. The analytical result is confirmed by corresponding Monte Carlo (MC) simulation.

  2. Effects of unstratified and centre-stratified randomization in multi-centre clinical trials.

    PubMed

    Anisimov, Vladimir V

    2011-01-01

    This paper deals with the analysis of randomization effects in multi-centre clinical trials. The two randomization schemes most often used in clinical trials are considered: unstratified and centre-stratified block-permuted randomization. The prediction of the number of patients randomized to different treatment arms in different regions during the recruitment period accounting for the stochastic nature of the recruitment and effects of multiple centres is investigated. A new analytic approach using a Poisson-gamma patient recruitment model (patients arrive at different centres according to Poisson processes with rates sampled from a gamma distributed population) and its further extensions is proposed. Closed-form expressions for corresponding distributions of the predicted number of the patients randomized in different regions are derived. In the case of two treatments, the properties of the total imbalance in the number of patients on treatment arms caused by using centre-stratified randomization are investigated and for a large number of centres a normal approximation of imbalance is proved. The impact of imbalance on the power of the study is considered. It is shown that the loss of statistical power is practically negligible and can be compensated by a minor increase in sample size. The influence of patient dropout is also investigated. The impact of randomization on predicted drug supply overage is discussed. Copyright © 2010 John Wiley & Sons, Ltd.
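The Poisson-gamma recruitment model can be sketched directly (shape, scale, and window length below are arbitrary): gamma variation in centre rates makes per-centre counts negative binomial, hence overdispersed:

```python
import numpy as np

rng = np.random.default_rng(8)
t = 12.0                 # recruitment window, e.g. months (illustrative)
shape, scale = 2.0, 1.0  # gamma population of centre recruitment rates

# Each centre recruits as a Poisson process whose rate is drawn from a
# gamma population; the per-centre count over [0, t] is then negative
# binomial. Simulate many centres to see the overdispersion.
rates = rng.gamma(shape, scale, size=200_000)
counts = rng.poisson(rates * t)

m, v = counts.mean(), counts.var()
# Theory: mean = shape*scale*t = 24, variance = mean*(1 + scale*t) = 312.
print(m, v)
```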

  3. Wigner surmises and the two-dimensional homogeneous Poisson point process.

    PubMed

    Sakhr, Jamal; Nieminen, John M

    2006-04-01

We derive a set of identities that relate the higher-order interpoint spacing statistics of the two-dimensional homogeneous Poisson point process to the Wigner surmises for the higher-order spacing distributions of eigenvalues from the three classical random matrix ensembles. We also report a remarkable identity that equates the second-nearest-neighbor spacing statistics of the points of the Poisson process and the nearest-neighbor spacing statistics of complex eigenvalues from Ginibre's ensemble of 2 × 2 complex non-Hermitian random matrices.
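The nearest-neighbour statistics of a planar Poisson process are easy to probe numerically. For intensity ρ, the nearest-neighbour distance has CDF 1 − exp(−ρπr²) and mean 1/(2√ρ); a quick check (point count is arbitrary, edge effects ignored):

```python
import numpy as np

rng = np.random.default_rng(9)
n = 2000   # points in the unit square, so intensity rho = n

pts = rng.random((n, 2))
d = np.sqrt(((pts[:, None] - pts[None, :]) ** 2).sum(-1))
np.fill_diagonal(d, np.inf)
nn = d.min(axis=1)   # nearest-neighbour distance of each point

# Theoretical mean nearest-neighbour distance: 1 / (2 * sqrt(rho)).
print(nn.mean(), 1 / (2 * np.sqrt(n)))
```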

  4. Hurdle models for multilevel zero-inflated data via h-likelihood.

    PubMed

    Molas, Marek; Lesaffre, Emmanuel

    2010-12-30

    Count data often exhibit overdispersion. One type of overdispersion arises when there is an excess of zeros in comparison with the standard Poisson distribution. Zero-inflated Poisson and hurdle models have been proposed to perform a valid likelihood-based analysis to account for the surplus of zeros. Further, data often arise in clustered, longitudinal or multiple-membership settings. The proper analysis needs to reflect the design of a study. Typically random effects are used to account for dependencies in the data. We examine the h-likelihood estimation and inference framework for hurdle models with random effects for complex designs. We extend the h-likelihood procedures to fit hurdle models, thereby extending h-likelihood to truncated distributions. Two applications of the methodology are presented. Copyright © 2010 John Wiley & Sons, Ltd.

  5. Deterministic multidimensional nonuniform gap sampling.

    PubMed

    Worley, Bradley; Powers, Robert

    2015-12-01

    Born from empirical observations in nonuniformly sampled multidimensional NMR data relating to gaps between sampled points, the Poisson-gap sampling method has enjoyed widespread use in biomolecular NMR. While the majority of nonuniform sampling schemes are fully randomly drawn from probability densities that vary over a Nyquist grid, the Poisson-gap scheme employs constrained random deviates to minimize the gaps between sampled grid points. We describe a deterministic gap sampling method, based on the average behavior of Poisson-gap sampling, which performs comparably to its random counterpart with the additional benefit of completely deterministic behavior. We also introduce a general algorithm for multidimensional nonuniform sampling based on a gap equation, and apply it to yield a deterministic sampling scheme that combines burst-mode sampling features with those of Poisson-gap schemes. Finally, we derive a relationship between stochastic gap equations and the expectation value of their sampling probability densities. Copyright © 2015 Elsevier Inc. All rights reserved.
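A simplified one-dimensional sketch of Poisson-gap sampling (the grid size, sample budget, and gap parameter below are all arbitrary): gaps between sampled grid points are Poisson deviates whose mean follows a sine ramp, pushing the larger gaps toward the end of the grid:

```python
import numpy as np

rng = np.random.default_rng(10)

def poisson_gap(grid, target, lam, rng):
    # Gaps are 1 + Poisson(lam * sin ramp): small gaps early in the
    # grid, larger gaps later, as in sine-weighted Poisson-gap schemes.
    idx, pos, k = [], 0, 0
    while pos < grid and len(idx) < target:
        idx.append(pos)
        gap = rng.poisson(lam * np.sin((k + 0.5) / target * np.pi / 2))
        pos += 1 + gap
        k += 1
    return np.array(idx)

idx = poisson_gap(256, 64, 4.0, rng)
print(len(idx), idx[:10])
```

In practice the gap parameter lam would be tuned (e.g. by bisection) so that the schedule yields exactly the target number of samples; that tuning loop is omitted here.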

  6. Effectiveness of preventive home visits in reducing the risk of falls in old age: a randomized controlled trial

    PubMed Central

    Luck, Tobias; Motzek, Tom; Luppa, Melanie; Matschinger, Herbert; Fleischer, Steffen; Sesselmann, Yves; Roling, Gudrun; Beutner, Katrin; König, Hans-Helmut; Behrens, Johann; Riedel-Heller, Steffi G

    2013-01-01

    Background Falls in older people are a major public health issue, but the underlying causes are complex. We sought to evaluate the effectiveness of preventive home visits as a multifactorial, individualized strategy to reduce falls in community-dwelling older people. Methods Data were derived from a prospective randomized controlled trial with follow-up examination after 18 months. Two hundred and thirty participants (≥80 years of age) with functional impairment were randomized to intervention and control groups. The intervention group received up to three preventive home visits including risk assessment, home counseling intervention, and a booster session. The control group received no preventive home visits. Structured interviews at baseline and follow-up provided information concerning falls in both study groups. Random-effects Poisson regression evaluated the effect of preventive home visits on the number of falls controlling for covariates. Results Random-effects Poisson regression showed a significant increase in the number of falls between baseline and follow-up in the control group (incidence rate ratio 1.96) and a significant decrease in the intervention group (incidence rate ratio 0.63) controlling for age, sex, family status, level of care, and impairment in activities of daily living. Conclusion Our results indicate that a preventive home visiting program can be effective in reducing falls in community-dwelling older people. PMID:23788832

  7. Mixed effect Poisson log-linear models for clinical and epidemiological sleep hypnogram data

    PubMed Central

    Swihart, Bruce J.; Caffo, Brian S.; Crainiceanu, Ciprian; Punjabi, Naresh M.

    2013-01-01

    Bayesian Poisson log-linear multilevel models scalable to epidemiological studies are proposed to investigate population variability in sleep state transition rates. Hierarchical random effects are used to account for pairings of subjects and repeated measures within those subjects, since comparing diseased with non-diseased subjects while minimizing bias is of central importance. Essentially, non-parametric piecewise constant hazards are estimated and smoothed, allowing for time-varying covariates and comparisons across segments of the night. The Bayesian Poisson regression is justified through a re-derivation of a classical algebraic likelihood equivalence between Poisson regression with a log(time) offset and survival regression assuming exponentially distributed survival times. This re-derivation allows a synthesis of two methods currently used to analyze sleep transition phenomena: stratified multi-state proportional hazards models and log-linear models with GEE for transition counts. An example data set from the Sleep Heart Health Study is analyzed. Supplementary material includes the analyzed data set as well as the code for a reproducible analysis. PMID:22241689
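    The algebraic likelihood equivalence invoked in this record — Poisson regression with a log(time) offset versus exponential survival regression — can be checked numerically. In this sketch the two log-likelihoods differ only by the constant sum of d*log(t), so they share the same maximizing rate; the data are invented for illustration.

```python
import math

def exp_surv_loglik(rate, times, events):
    """Exponential survival log-likelihood: sum of d*log(rate) - rate*t."""
    return sum(d * math.log(rate) - rate * t for t, d in zip(times, events))

def poisson_offset_loglik(rate, times, events):
    """Poisson log-likelihood for event indicators d with mean
    mu = rate * t, i.e. a log(t) offset; log(d!) = 0 for d in {0, 1}."""
    return sum(d * math.log(rate * t) - rate * t for t, d in zip(times, events))

times, events = [2.0, 1.0, 4.0], [1, 0, 1]
# the difference between the two log-likelihoods is constant in the rate
c1 = poisson_offset_loglik(0.3, times, events) - exp_surv_loglik(0.3, times, events)
c2 = poisson_offset_loglik(0.9, times, events) - exp_surv_loglik(0.9, times, events)
```

    Both likelihoods are maximized at rate = (number of events) / (total time), here 2/7, which is why the two formulations yield identical inference.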

  8. Nonparametric estimation of the heterogeneity of a random medium using compound Poisson process modeling of wave multiple scattering.

    PubMed

    Le Bihan, Nicolas; Margerin, Ludovic

    2009-07-01

    In this paper, we present a nonparametric method to estimate the heterogeneity of a random medium from the angular distribution of intensity of waves transmitted through a slab of random material. Our approach is based on the modeling of forward multiple scattering using compound Poisson processes on compact Lie groups. The estimation technique is validated through numerical simulations based on radiative transfer theory.

  9. Spatiotemporal hurdle models for zero-inflated count data: Exploring trends in emergency department visits.

    PubMed

    Neelon, Brian; Chang, Howard H; Ling, Qiang; Hastings, Nicole S

    2016-12-01

    Motivated by a study exploring spatiotemporal trends in emergency department use, we develop a class of two-part hurdle models for the analysis of zero-inflated areal count data. The models consist of two components-one for the probability of any emergency department use and one for the number of emergency department visits given use. Through a hierarchical structure, the models incorporate both patient- and region-level predictors, as well as spatially and temporally correlated random effects for each model component. The random effects are assigned multivariate conditionally autoregressive priors, which induce dependence between the components and provide spatial and temporal smoothing across adjacent spatial units and time periods, resulting in improved inferences. To accommodate potential overdispersion, we consider a range of parametric specifications for the positive counts, including truncated negative binomial and generalized Poisson distributions. We adopt a Bayesian inferential approach, and posterior computation is handled conveniently within standard Bayesian software. Our results indicate that the negative binomial and generalized Poisson hurdle models vastly outperform the Poisson hurdle model, demonstrating that overdispersed hurdle models provide a useful approach to analyzing zero-inflated spatiotemporal data. © The Author(s) 2014.

  10. Scaling the Poisson Distribution

    ERIC Educational Resources Information Center

    Farnsworth, David L.

    2014-01-01

    We derive the additive property of Poisson random variables directly from the probability mass function. An important application of the additive property to quality testing of computer chips is presented.
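    The additive property discussed in this record states that the sum of independent Poisson variables is Poisson with the rates added. It can be verified directly from the probability mass function by convolution, as in this minimal sketch:

```python
import math

def poisson_pmf(k, lam):
    """Poisson probability mass function."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

def convolve_pmf(lam1, lam2, k):
    """P(X + Y = k) for independent X ~ Poisson(lam1), Y ~ Poisson(lam2),
    computed by direct convolution of the two mass functions."""
    return sum(poisson_pmf(j, lam1) * poisson_pmf(k - j, lam2)
               for j in range(k + 1))
```

    Numerically, `convolve_pmf(1.5, 2.5, k)` agrees with `poisson_pmf(k, 4.0)` for every k, which is the additive property.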

  11. Stochastic analysis of three-dimensional flow in a bounded domain

    USGS Publications Warehouse

    Naff, R.L.; Vecchia, A.V.

    1986-01-01

    A commonly accepted first-order approximation of the equation for steady state flow in a fully saturated spatially random medium has the form of Poisson's equation. This form allows for the advantageous use of Green's functions to solve for the random output (hydraulic heads) in terms of a convolution over the random input (the logarithm of hydraulic conductivity). A solution for steady state three-dimensional flow in an aquifer bounded above and below is presented; consideration of these boundaries is made possible by use of Green's functions to solve Poisson's equation. Within the bounded domain the medium hydraulic conductivity is assumed to be a second-order stationary random process as represented by a simple three-dimensional covariance function. Upper and lower boundaries are taken to be no-flow boundaries; the mean flow vector lies entirely in the horizontal dimensions. The resulting hydraulic head covariance function exhibits nonstationary effects resulting from the imposition of boundary conditions. Comparisons are made with existing infinite domain solutions.

  12. Application of the Hotelling and ideal observers to detection and localization of exoplanets.

    PubMed

    Caucci, Luca; Barrett, Harrison H; Devaney, Nicholas; Rodríguez, Jeffrey J

    2007-12-01

    The ideal linear discriminant or Hotelling observer is widely used for detection tasks and image-quality assessment in medical imaging, but it has had little application in other imaging fields. We apply it to detection of planets outside of our solar system with long-exposure images obtained from ground-based or space-based telescopes. The statistical limitations in this problem include Poisson noise arising mainly from the host star, electronic noise in the image detector, randomness or uncertainty in the point-spread function (PSF) of the telescope, and possibly a random background. PSF randomness is reduced but not eliminated by the use of adaptive optics. We concentrate here on the effects of Poisson and electronic noise, but we also show how to extend the calculation to include a random PSF. For the case where the PSF is known exactly, we compare the Hotelling observer to other observers commonly used for planet detection; comparison is based on receiver operating characteristic (ROC) and localization ROC (LROC) curves.
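    For reference, the Hotelling observer named in this record is the linear discriminant whose template is the inverse covariance applied to the difference of the class means. A minimal two-pixel sketch with a closed-form 2x2 inverse (any numbers used with it are purely illustrative):

```python
def hotelling_template(cov, dmean):
    """Hotelling template w = K^(-1) (mean_signal - mean_background)
    for a 2x2 covariance matrix K, via the closed-form inverse."""
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = ((d / det, -b / det), (-c / det, a / det))
    return [inv[0][0] * dmean[0] + inv[0][1] * dmean[1],
            inv[1][0] * dmean[0] + inv[1][1] * dmean[1]]

def hotelling_snr2(cov, dmean):
    """Detectability of the Hotelling observer: SNR^2 = dmean^T K^(-1) dmean."""
    w = hotelling_template(cov, dmean)
    return dmean[0] * w[0] + dmean[1] * w[1]
```

    With an identity covariance the template reduces to the mean difference itself; correlated noise (off-diagonal terms in K) rotates and rescales it, which is what distinguishes the Hotelling observer from simple matched filtering.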

  13. Application of the Hotelling and ideal observers to detection and localization of exoplanets

    PubMed Central

    Caucci, Luca; Barrett, Harrison H.; Devaney, Nicholas; Rodríguez, Jeffrey J.

    2008-01-01

    The ideal linear discriminant or Hotelling observer is widely used for detection tasks and image-quality assessment in medical imaging, but it has had little application in other imaging fields. We apply it to detection of planets outside of our solar system with long-exposure images obtained from ground-based or space-based telescopes. The statistical limitations in this problem include Poisson noise arising mainly from the host star, electronic noise in the image detector, randomness or uncertainty in the point-spread function (PSF) of the telescope, and possibly a random background. PSF randomness is reduced but not eliminated by the use of adaptive optics. We concentrate here on the effects of Poisson and electronic noise, but we also show how to extend the calculation to include a random PSF. For the case where the PSF is known exactly, we compare the Hotelling observer to other observers commonly used for planet detection; comparison is based on receiver operating characteristic (ROC) and localization ROC (LROC) curves. PMID:18059905

  14. Bayesian random-effect model for predicting outcome fraught with heterogeneity--an illustration with episodes of 44 patients with intractable epilepsy.

    PubMed

    Yen, A M-F; Liou, H-H; Lin, H-L; Chen, T H-H

    2006-01-01

    The study aimed to develop a predictive model to deal with data fraught with heterogeneity that cannot be explained by sampling variation or measured covariates. The random-effect Poisson regression model was first proposed to deal with over-dispersion for data fraught with heterogeneity after making allowance for measured covariates. A Bayesian acyclic graphical model in conjunction with the Markov chain Monte Carlo (MCMC) technique was then applied to estimate the parameters of both the relevant covariates and the random effect. A predictive distribution was then generated to compare predicted with observed values for the Bayesian model with and without the random effect. Data from repeated measurement of episodes among 44 patients with intractable epilepsy were used as an illustration. Applying Poisson regression to the epilepsy data without taking heterogeneity into account yielded a large degree of heterogeneity (heterogeneity factor = 17.90, deviance = 1485, degrees of freedom (df) = 83). After taking the random effect into account, the heterogeneity factor was greatly reduced (heterogeneity factor = 0.52, deviance = 42.5, df = 81). The Pearson chi-square statistics comparing the expected seizure frequencies with the observed ones at two and three months were 34.27 (p = 1.00) for the model with the random effect and 1799.90 (p < 0.0001) for the model without it. The Bayesian acyclic graphical model using the MCMC method was demonstrated to have great potential for disease prediction when data show over-dispersion attributable either to correlated observations or to subject-to-subject variability.
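    The heterogeneity factor quoted in this record is conventionally the Poisson deviance divided by its residual degrees of freedom; values far above 1 signal over-dispersion, as in the 17.90 reported before the random effect was added. A minimal sketch with invented data:

```python
import math

def poisson_deviance(y, mu):
    """Poisson deviance 2*sum(y*log(y/mu) - (y - mu)); the y*log(y/mu)
    term is taken as 0 when y = 0."""
    dev = 0.0
    for yi, mi in zip(y, mu):
        dev += (yi * math.log(yi / mi) if yi > 0 else 0.0) - (yi - mi)
    return 2.0 * dev

def heterogeneity_factor(y, mu, n_params):
    """Deviance per residual degree of freedom; ~1 for a well-fitting
    Poisson model, >>1 under over-dispersion."""
    return poisson_deviance(y, mu) / (len(y) - n_params)
```

    For counts [0, 10, 0, 10] fitted by a common mean of 5, the factor is far above 1, flagging over-dispersion of the kind a random effect can absorb.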

  15. Probabilistic Estimation of Rare Random Collisions in 3 Space

    DTIC Science & Technology

    2009-03-01

    extended Poisson process as a feature of probability theory. With the bulk of research in extended Poisson processes going into parameter estimation, the...application of extended Poisson processes to spatial processes is largely untouched. Faddy performed a short study of spatial data, but overtly...the theory of extended Poisson processes. To date, the processes are limited in that the rates only depend on the number of arrivals at some time

  16. Study of non-Hodgkin's lymphoma mortality associated with industrial pollution in Spain, using Poisson models

    PubMed Central

    Ramis, Rebeca; Vidal, Enrique; García-Pérez, Javier; Lope, Virginia; Aragonés, Nuria; Pérez-Gómez, Beatriz; Pollán, Marina; López-Abente, Gonzalo

    2009-01-01

    Background Non-Hodgkin's lymphomas (NHLs) have been linked to proximity to industrial areas, but evidence regarding the health risk posed by residence near pollutant industries is very limited. The European Pollutant Emission Register (EPER) is a public register that furnishes valuable information on industries that release pollutants to air and water, along with their geographical location. This study sought to explore the relationship between NHL mortality in small areas in Spain and environmental exposure to pollutant emissions from EPER-registered industries, using three Poisson-regression-based mathematical models. Methods Observed cases were drawn from mortality registries in Spain for the period 1994–2003. Industries were grouped into the following sectors: energy; metal; mineral; organic chemicals; waste; paper; food; and use of solvents. Populations having an industry within a radius of 1, 1.5, or 2 kilometres from the municipal centroid were deemed to be exposed. Municipalities outside those radii were considered as reference populations. The relative risks (RRs) associated with proximity to pollutant industries were estimated using the following methods: Poisson Regression; mixed Poisson model with random provincial effect; and spatial autoregressive modelling (BYM model). Results Only proximity of paper industries to population centres (>2 km) could be associated with a greater risk of NHL mortality (mixed model: RR:1.24, 95% CI:1.09–1.42; BYM model: RR:1.21, 95% CI:1.01–1.45; Poisson model: RR:1.16, 95% CI:1.06–1.27). Spatial models yielded higher estimates. Conclusion The reported association between exposure to air pollution from the paper, pulp and board industry and NHL mortality is independent of the model used. Inclusion of spatial random effects terms in the risk estimate improves the study of associations between environmental exposures and mortality. 
The EPER could be of great utility when studying the effects of industrial pollution on the health of the population. PMID:19159450

  17. An intrinsic algorithm for parallel Poisson disk sampling on arbitrary surfaces.

    PubMed

    Ying, Xiang; Xin, Shi-Qing; Sun, Qian; He, Ying

    2013-09-01

    Poisson disk sampling has excellent spatial and spectral properties, and plays an important role in a variety of visual computing applications. Although many promising algorithms have been proposed for multidimensional sampling in Euclidean space, very few studies have been reported with regard to the problem of generating Poisson disks on surfaces due to the complicated nature of the surface. This paper presents an intrinsic algorithm for parallel Poisson disk sampling on arbitrary surfaces. In sharp contrast to the conventional parallel approaches, our method neither partitions the given surface into small patches nor uses any spatial data structure to maintain the voids in the sampling domain. Instead, our approach assigns each sample candidate a random and unique priority that is unbiased with regard to the distribution. Hence, multiple threads can process the candidates simultaneously and resolve conflicts by checking the given priority values. Our algorithm guarantees that the generated Poisson disks are uniformly and randomly distributed without bias. It is worth noting that our method is intrinsic and independent of the embedding space. This intrinsic feature allows us to generate Poisson disk patterns on arbitrary surfaces in R^n. To our knowledge, this is the first intrinsic, parallel, and accurate algorithm for surface Poisson disk sampling. Furthermore, by manipulating the spatially varying density function, we can obtain adaptive sampling easily.
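    As a point of contrast with the parallel, intrinsic method summarized above, the classical serial baseline in Euclidean space is dart throwing: accept a uniform candidate only if it satisfies the minimum-distance constraint against all accepted samples. A deliberately naive sketch (not the authors' algorithm):

```python
import random

def poisson_disk_dart(r, width=1.0, height=1.0, attempts=2000, seed=1):
    """Naive dart-throwing Poisson disk sampling in a rectangle: a
    candidate is accepted only if it lies at least r from every
    previously accepted sample (O(n) check per candidate)."""
    rng = random.Random(seed)
    pts = []
    for _ in range(attempts):
        c = (rng.uniform(0.0, width), rng.uniform(0.0, height))
        if all((c[0] - p[0]) ** 2 + (c[1] - p[1]) ** 2 >= r * r for p in pts):
            pts.append(c)
    return pts
```

    The serial conflict check over all accepted points is exactly what grid- or priority-based parallel schemes, such as the one in this paper, are designed to avoid.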

  18. On the fractal characterization of Paretian Poisson processes

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo I.; Sokolov, Igor M.

    2012-06-01

    Paretian Poisson processes are Poisson processes which are defined on the positive half-line, have maximal points, and are quantified by power-law intensities. Paretian Poisson processes are elemental in statistical physics, and are the bedrock of a host of power-law statistics ranging from Pareto's law to anomalous diffusion. In this paper we establish evenness-based fractal characterizations of Paretian Poisson processes. Considering an array of socioeconomic evenness-based measures of statistical heterogeneity, we show that: amongst the realm of Poisson processes which are defined on the positive half-line, and have maximal points, Paretian Poisson processes are the unique class of 'fractal processes' exhibiting scale-invariance. The results established in this paper are diametric to previous results asserting that the scale-invariance of Poisson processes-with respect to physical randomness-based measures of statistical heterogeneity-is characterized by exponential Poissonian intensities.

  19. Bayesian dynamic modeling of time series of dengue disease case counts.

    PubMed

    Martínez-Bello, Daniel Adyro; López-Quílez, Antonio; Torres-Prieto, Alexander

    2017-07-01

    The aim of this study is to model the association between weekly time series of dengue case counts and meteorological variables, in a high-incidence city of Colombia, applying Bayesian hierarchical dynamic generalized linear models over the period January 2008 to August 2015. Additionally, we evaluate the model's short-term performance for predicting dengue cases. The methodology employs dynamic Poisson log-link models including constant or time-varying coefficients for the meteorological variables. Calendar effects were modeled using constant or first- or second-order random walk time-varying coefficients. The meteorological variables were modeled using constant coefficients and first-order random walk time-varying coefficients. We applied Markov chain Monte Carlo simulations for parameter estimation, and the deviance information criterion (DIC) for model selection. We assessed the short-term predictive performance of the selected final model at several time points within the study period using the mean absolute percentage error. The results showed that the best model included first-order random walk time-varying coefficients for the calendar trend and for the meteorological variables. Besides the computational challenges, interpreting the results implies a complete analysis of the time series of dengue with respect to the parameter estimates of the meteorological effects. We found small values of the mean absolute percentage errors at one or two weeks out-of-sample predictions for most prediction points, associated with low volatility periods in the dengue counts. We discuss the advantages and limitations of the dynamic Poisson models for studying the association between time series of dengue disease and meteorological variables. 
The key conclusion of the study is that dynamic Poisson models account for the dynamic nature of the variables involved in the modeling of time series of dengue disease, producing useful models for decision-making in public health.

  20. Evaluation of the accuracy of the Rotating Parallel Ray Omnidirectional Integration for instantaneous pressure reconstruction from the measured pressure gradient

    NASA Astrophysics Data System (ADS)

    Moreto, Jose; Liu, Xiaofeng

    2017-11-01

    The accuracy of the Rotating Parallel Ray omnidirectional integration for pressure reconstruction from the measured pressure gradient (Liu et al., AIAA paper 2016-1049) is evaluated against both the Circular Virtual Boundary omnidirectional integration (Liu and Katz, 2006 and 2013) and the conventional Poisson equation approach. Dirichlet condition at one boundary point and Neumann condition at all other boundary points are applied to the Poisson solver. A direct numerical simulation database of isotropic turbulence flow (JHTDB), with a homogeneously distributed random noise added to the entire field of DNS pressure gradient, is used to assess the performance of the methods. The random noise, generated by the Matlab function Rand, has a magnitude varying randomly within the range of +/-40% of the maximum DNS pressure gradient. To account for the effect of the noise distribution pattern on the reconstructed pressure accuracy, a total of 1000 different noise distributions achieved by using different random number seeds are involved in the evaluation. Final results after averaging the 1000 realizations show that the error of the reconstructed pressure normalized by the DNS pressure variation range is 0.15 +/-0.07 for the Poisson equation approach, 0.028 +/-0.003 for the Circular Virtual Boundary method and 0.027 +/-0.003 for the Rotating Parallel Ray method, indicating the robustness of the Rotating Parallel Ray method in pressure reconstruction. Sponsor: The San Diego State University UGP program.

  1. A Spatial Poisson Hurdle Model for Exploring Geographic Variation in Emergency Department Visits

    PubMed Central

    Neelon, Brian; Ghosh, Pulak; Loebs, Patrick F.

    2012-01-01

    Summary We develop a spatial Poisson hurdle model to explore geographic variation in emergency department (ED) visits while accounting for zero inflation. The model consists of two components: a Bernoulli component that models the probability of any ED use (i.e., at least one ED visit per year), and a truncated Poisson component that models the number of ED visits given use. Together, these components address both the abundance of zeros and the right-skewed nature of the nonzero counts. The model has a hierarchical structure that incorporates patient- and area-level covariates, as well as spatially correlated random effects for each areal unit. Because regions with high rates of ED use are likely to have high expected counts among users, we model the spatial random effects via a bivariate conditionally autoregressive (CAR) prior, which introduces dependence between the components and provides spatial smoothing and sharing of information across neighboring regions. Using a simulation study, we show that modeling the between-component correlation reduces bias in parameter estimates. We adopt a Bayesian estimation approach, and the model can be fit using standard Bayesian software. We apply the model to a study of patient and neighborhood factors influencing emergency department use in Durham County, North Carolina. PMID:23543242
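    The two-part structure described in this summary — a Bernoulli component for any ED use and a zero-truncated Poisson for the number of visits given use — has a simple closed-form marginal pmf. In this sketch `pi_use` and `lam` stand in for the two component parameters, with the covariates and spatial random effects of the full model omitted:

```python
import math

def hurdle_pmf(k, pi_use, lam):
    """Poisson hurdle pmf: P(0) = 1 - pi_use; for k >= 1 the count
    follows a zero-truncated Poisson with rate lam."""
    if k == 0:
        return 1.0 - pi_use
    trunc = (math.exp(-lam) * lam ** k / math.factorial(k)) / (1.0 - math.exp(-lam))
    return pi_use * trunc
```

    In the full hierarchical model, both parameters would be linked to patient- and area-level covariates plus the CAR random effects, e.g. through logit(pi_use) and log(lam).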

  2. Do bacterial cell numbers follow a theoretical Poisson distribution? Comparison of experimentally obtained numbers of single cells with random number generation via computer simulation.

    PubMed

    Koyama, Kento; Hokunan, Hidekazu; Hasegawa, Mayumi; Kawamura, Shuso; Koseki, Shigenobu

    2016-12-01

    We investigated a bacterial sample preparation procedure for single-cell studies. In the present study, we examined whether single bacterial cells obtained via 10-fold dilution followed a theoretical Poisson distribution. Four serotypes of Salmonella enterica, three serotypes of enterohaemorrhagic Escherichia coli and one serotype of Listeria monocytogenes were used as sample bacteria. An inoculum of each serotype was prepared via a 10-fold dilution series to obtain bacterial cell counts with mean values of one or two. To determine whether the experimentally obtained bacterial cell counts follow a theoretical Poisson distribution, a likelihood ratio test was conducted between the experimentally obtained cell counts and a Poisson distribution whose parameter was estimated by maximum likelihood estimation (MLE). The bacterial cell counts of each serotype sufficiently followed a Poisson distribution. Furthermore, to examine the validity of the parameters of the Poisson distribution obtained from the experimental bacterial cell counts, we compared these with the parameters of a Poisson distribution that were estimated using random number generation via computer simulation. The Poisson distribution parameters experimentally obtained from bacterial cell counts were within the range of the parameters estimated using a computer simulation. These results demonstrate that the bacterial cell counts of each serotype obtained via 10-fold dilution followed a Poisson distribution. The fact that the frequency of bacterial cell counts follows a Poisson distribution at low numbers can be applied to single-cell studies with a few bacterial cells. In particular, the procedure presented in this study enables us to develop an inactivation model at the single-cell level that can estimate the variability of surviving bacterial numbers during the bacterial death process. Copyright © 2016 Elsevier Ltd. All rights reserved.
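    The likelihood ratio comparison used in this study can be sketched as a G-statistic between the observed frequencies of per-aliquot cell counts and a Poisson fitted by MLE (lambda equal to the sample mean); the counts below are invented for illustration:

```python
import math
from collections import Counter

def poisson_pmf(k, lam):
    """Poisson probability mass function."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

def lr_statistic(counts):
    """Likelihood-ratio (G) statistic comparing the empirical count
    distribution with a Poisson whose parameter is the MLE, i.e. the
    sample mean; larger values indicate a worse Poisson fit."""
    n = len(counts)
    lam = sum(counts) / n
    g = 0.0
    for k, obs in Counter(counts).items():
        expected = n * poisson_pmf(k, lam)
        g += 2.0 * obs * math.log(obs / expected)
    return g
```

    A sample stuck at its mean (no dispersion at all) yields a much larger G than a sample whose spread resembles a Poisson, illustrating that the test penalizes under- as well as over-dispersion.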

  3. Time distributions of solar energetic particle events: Are SEPEs really random?

    NASA Astrophysics Data System (ADS)

    Jiggens, P. T. A.; Gabriel, S. B.

    2009-10-01

    Solar energetic particle events (SEPEs) can exhibit flux increases of several orders of magnitude over background levels and have always been considered to be random in nature in statistical models with no dependence of any one event on the occurrence of previous events. We examine whether this assumption of randomness in time is correct. Engineering modeling of SEPEs is important to enable reliable and efficient design of both Earth-orbiting and interplanetary spacecraft and future manned missions to Mars and the Moon. All existing engineering models assume that the frequency of SEPEs follows a Poisson process. We present analysis of the event waiting times using alternative distributions described by Lévy and time-dependent Poisson processes and compared these with the usual Poisson distribution. The results show significant deviation from a Poisson process and indicate that the underlying physical processes might be more closely related to a Lévy-type process, suggesting that there is some inherent “memory” in the system. Inherent Poisson assumptions of stationarity and event independence are investigated, and it appears that they do not hold and can be dependent upon the event definition used. SEPEs appear to have some memory indicating that events are not completely random with activity levels varying even during solar active periods and are characterized by clusters of events. This could have significant ramifications for engineering models of the SEP environment, and it is recommended that current statistical engineering models of the SEP environment should be modified to incorporate long-term event dependency and short-term system memory.
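    A quick diagnostic for the Poisson (memoryless) assumption examined in this study is the coefficient of variation of inter-event waiting times: it is 1 for exponential waits, below 1 for quasi-regular arrivals, and above 1 for clustered, bursty arrivals of the kind the authors report. A sketch:

```python
def waiting_time_cv(event_times):
    """Coefficient of variation of inter-event waiting times; ~1 is
    consistent with a homogeneous Poisson process, >1 suggests
    clustering/memory, <1 quasi-regular arrivals."""
    waits = [t2 - t1 for t1, t2 in zip(event_times, event_times[1:])]
    mean = sum(waits) / len(waits)
    var = sum((w - mean) ** 2 for w in waits) / len(waits)
    return (var ** 0.5) / mean
```

    Perfectly regular arrivals give CV = 0, while event times grouped into bursts push CV well above 1.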

  4. Reduction of Poisson noise in measured time-resolved data for time-domain diffuse optical tomography.

    PubMed

    Okawa, S; Endo, Y; Hoshi, Y; Yamada, Y

    2012-01-01

    A method to reduce noise for time-domain diffuse optical tomography (DOT) is proposed. Poisson noise which contaminates time-resolved photon counting data is reduced by use of maximum a posteriori estimation. The noise-free data are modeled as a Markov random process, and the measured time-resolved data are assumed to be Poisson distributed random variables. The posterior probability of the occurrence of the noise-free data is formulated. By maximizing the probability, the noise-free data are estimated, and the Poisson noise is reduced as a result. The performance of the Poisson noise reduction is demonstrated in several experiments on image reconstruction for time-domain DOT. In simulations, the proposed method reduces the relative error between the noise-free and noisy data to about one thirtieth, and the reconstructed DOT image was smoothed by the proposed noise reduction. The variance of the reconstructed absorption coefficients decreased by 22% in a phantom experiment. The proposed noise reduction improves the quality of DOT, which can be applied to breast cancer screening and other clinical uses.

  5. Modeling Stochastic Variability in the Numbers of Surviving Salmonella enterica, Enterohemorrhagic Escherichia coli, and Listeria monocytogenes Cells at the Single-Cell Level in a Desiccated Environment

    PubMed Central

    Koyama, Kento; Hokunan, Hidekazu; Hasegawa, Mayumi; Kawamura, Shuso

    2016-01-01

    ABSTRACT Despite effective inactivation procedures, small numbers of bacterial cells may still remain in food samples. The risk that bacteria will survive these procedures has not been estimated precisely because deterministic models cannot be used to describe the uncertain behavior of bacterial populations. We used the Poisson distribution as a representative probability distribution to estimate the variability in bacterial numbers during the inactivation process. Strains of four serotypes of Salmonella enterica, three serotypes of enterohemorrhagic Escherichia coli, and one serotype of Listeria monocytogenes were evaluated for survival. We prepared bacterial cell numbers following a Poisson distribution (indicated by the parameter λ, which was equal to 2) and plated the cells in 96-well microplates, which were stored in a desiccated environment at 10% to 20% relative humidity and at 5, 15, and 25°C. The survival or death of the bacterial cells in each well was confirmed by adding tryptic soy broth as an enrichment culture. Changes in the Poisson distribution parameter during the inactivation process, which represent the variability in the numbers of surviving bacteria, were described by nonlinear regression with an exponential function based on a Weibull distribution. We also examined random changes in the number of surviving bacteria using a random number generator and computer simulations to determine whether the number of surviving bacteria followed a Poisson distribution during the bacterial death process by use of the Poisson process. For small initial cell numbers, more than 80% of the simulated distributions (λ = 2 or 10) followed a Poisson distribution. The results demonstrate that variability in the number of surviving bacteria can be described as a Poisson distribution by use of the model developed by use of the Poisson process. 
IMPORTANCE We developed a model to enable the quantitative assessment of bacterial survivors of inactivation procedures because the presence of even one bacterium can cause foodborne disease. The results demonstrate that the variability in the numbers of surviving bacteria was described as a Poisson distribution by use of the model developed by use of the Poisson process. Description of the number of surviving bacteria as a probability distribution rather than as the point estimates used in a deterministic approach can provide a more realistic estimation of risk. The probability model should be useful for estimating the quantitative risk of bacterial survival during inactivation. PMID:27940547
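    The finding that per-well survivor counts remain Poisson during inactivation is an instance of Poisson thinning: if the initial count is Poisson(lambda) and each cell survives independently with probability p, the survivor count is Poisson(lambda*p). A Monte-Carlo sketch with invented parameters:

```python
import math
import random

def poisson_deviate(lam, rng):
    """Poisson(lam) deviate via Knuth's multiplication method."""
    limit, k, prod = math.exp(-lam), 0, 1.0
    while True:
        k += 1
        prod *= rng.random()
        if prod <= limit:
            return k - 1

def simulate_survivors(lam, p_survive, n_rep=20000, seed=7):
    """Per-well survivor counts: Poisson(lam) initial cells per well,
    each surviving independently with probability p_survive."""
    rng = random.Random(seed)
    return [sum(1 for _ in range(poisson_deviate(lam, rng))
                if rng.random() < p_survive)
            for _ in range(n_rep)]

survivors = simulate_survivors(2.0, 0.3)
mean = sum(survivors) / len(survivors)
var = sum((s - mean) ** 2 for s in survivors) / len(survivors)
# thinning predicts mean and variance both close to lam * p = 0.6
```

    Mean approximately equal to variance is the Poisson signature; variance well above the mean would instead indicate cell-to-cell or well-to-well heterogeneity of the kind random-effect models capture.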

  6. Modeling Stochastic Variability in the Numbers of Surviving Salmonella enterica, Enterohemorrhagic Escherichia coli, and Listeria monocytogenes Cells at the Single-Cell Level in a Desiccated Environment.

    PubMed

    Koyama, Kento; Hokunan, Hidekazu; Hasegawa, Mayumi; Kawamura, Shuso; Koseki, Shigenobu

    2017-02-15

    Despite effective inactivation procedures, small numbers of bacterial cells may still remain in food samples. The risk that bacteria will survive these procedures has not been estimated precisely because deterministic models cannot be used to describe the uncertain behavior of bacterial populations. We used the Poisson distribution as a representative probability distribution to estimate the variability in bacterial numbers during the inactivation process. Strains of four serotypes of Salmonella enterica, three serotypes of enterohemorrhagic Escherichia coli, and one serotype of Listeria monocytogenes were evaluated for survival. We prepared bacterial cell numbers following a Poisson distribution (indicated by the parameter λ, which was equal to 2) and plated the cells in 96-well microplates, which were stored in a desiccated environment at 10% to 20% relative humidity and at 5, 15, and 25°C. The survival or death of the bacterial cells in each well was confirmed by adding tryptic soy broth as an enrichment culture. Changes in the Poisson distribution parameter during the inactivation process, which represent the variability in the numbers of surviving bacteria, were described by nonlinear regression with an exponential function based on a Weibull distribution. We also examined random changes in the number of surviving bacteria using a random number generator and computer simulations to determine whether the number of surviving bacteria followed a Poisson distribution during the bacterial death process by use of the Poisson process. For small initial cell numbers, more than 80% of the simulated distributions (λ = 2 or 10) followed a Poisson distribution. The results demonstrate that variability in the number of surviving bacteria can be described as a Poisson distribution by use of the model developed by use of the Poisson process. 
    We developed a model to enable the quantitative assessment of bacterial survivors of inactivation procedures because the presence of even one bacterium can cause foodborne disease. The results demonstrate that the variability in the numbers of surviving bacteria can be described by a Poisson distribution using the model developed on the basis of the Poisson process. Description of the number of surviving bacteria as a probability distribution rather than as the point estimates used in a deterministic approach can provide a more realistic estimation of risk. The probability model should be useful for estimating the quantitative risk of bacterial survival during inactivation. Copyright © 2017 Koyama et al.
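    The key probabilistic fact behind this record can be sketched in a few lines: if initial cell numbers per well are Poisson(λ) and each cell independently survives storage with probability p, the survivor counts remain Poisson with mean λp (binomial thinning of a Poisson count). The Weibull-type survival probability below uses made-up parameters (t, delta, kappa), chosen only for illustration, not taken from the paper.

```python
import math
import random

random.seed(42)

def poisson_sample(lam):
    """Knuth's algorithm for Poisson-distributed random integers."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

# Initial cell numbers per well: Poisson with lambda = 2, as in the study.
lam0 = 2.0
# Illustrative Weibull-type survival probability at storage time t
# (delta and kappa are hypothetical parameters, not from the paper).
t, delta, kappa = 5.0, 4.0, 0.8
p_survive = math.exp(-((t / delta) ** kappa))

survivors = []
for _ in range(20000):
    n0 = poisson_sample(lam0)
    # Each cell survives independently with probability p_survive
    # (binomial thinning of a Poisson count).
    survivors.append(sum(1 for _ in range(n0) if random.random() < p_survive))

mean = sum(survivors) / len(survivors)
var = sum((x - mean) ** 2 for x in survivors) / len(survivors)
# Thinning a Poisson(lam0) count leaves a Poisson(lam0 * p_survive) count,
# so the sample mean and variance should both be close to lam0 * p_survive.
print(round(mean, 3), round(var, 3), round(lam0 * p_survive, 3))
```

    The equality of mean and variance in the output is the Poisson signature that the paper's simulations check for.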

  7. Generalized master equations for non-Poisson dynamics on networks.

    PubMed

    Hoffmann, Till; Porter, Mason A; Lambiotte, Renaud

    2012-10-01

    The traditional way of studying temporal networks is to aggregate the dynamics of the edges to create a static weighted network. This implicitly assumes that the edges are governed by Poisson processes, which is not typically the case in empirical temporal networks. Accordingly, we examine the effects of non-Poisson inter-event statistics on the dynamics of edges, and we apply the concept of a generalized master equation to the study of continuous-time random walks on networks. We show that this equation reduces to the standard rate equations when the underlying process is Poissonian and that its stationary solution is determined by an effective transition matrix whose leading eigenvector is easy to calculate. We conduct numerical simulations and also derive analytical results for the stationary solution under the assumption that all edges have the same waiting-time distribution. We discuss the implications of our work for dynamical processes on temporal networks and for the construction of network diagnostics that take into account their nontrivial stochastic nature.
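    A minimal sketch of the Poissonian special case described here, on a hypothetical three-node weighted network (all weights illustrative): when every edge shares the same waiting-time distribution, the walker's stationary occupation is the leading left eigenvector of the embedded transition matrix, obtained below by power iteration and compared against simulated visit frequencies.

```python
import random

random.seed(1)

# Hypothetical directed weighted network; W[i][j] = weight of edge i -> j.
W = [[0.0, 2.0, 1.0],
     [1.0, 0.0, 3.0],
     [2.0, 1.0, 0.0]]
n = len(W)

# Row-stochastic transition matrix of the embedded jump chain.
T = [[w / sum(row) for w in row] for row in W]

# Leading left eigenvector of T by power iteration (stationary solution).
pi = [1.0 / n] * n
for _ in range(200):
    pi = [sum(pi[i] * T[i][j] for i in range(n)) for j in range(n)]
    s = sum(pi)
    pi = [p / s for p in pi]

# Simulate the walk; with identical waiting-time distributions on every
# edge, visit frequencies converge to the same stationary vector.
visits, node = [0] * n, 0
for _ in range(200000):
    visits[node] += 1
    r, cum = random.random(), 0.0
    for j in range(n):
        cum += T[node][j]
        if r < cum:
            node = j
            break

emp = [v / sum(visits) for v in visits]
print([round(p, 3) for p in pi], [round(e, 3) for e in emp])
```

    With heterogeneous, non-exponential waiting-time distributions this equivalence breaks down, which is the regime the generalized master equation is built to handle.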

  8. Generalized master equations for non-Poisson dynamics on networks

    NASA Astrophysics Data System (ADS)

    Hoffmann, Till; Porter, Mason A.; Lambiotte, Renaud

    2012-10-01

    The traditional way of studying temporal networks is to aggregate the dynamics of the edges to create a static weighted network. This implicitly assumes that the edges are governed by Poisson processes, which is not typically the case in empirical temporal networks. Accordingly, we examine the effects of non-Poisson inter-event statistics on the dynamics of edges, and we apply the concept of a generalized master equation to the study of continuous-time random walks on networks. We show that this equation reduces to the standard rate equations when the underlying process is Poissonian and that its stationary solution is determined by an effective transition matrix whose leading eigenvector is easy to calculate. We conduct numerical simulations and also derive analytical results for the stationary solution under the assumption that all edges have the same waiting-time distribution. We discuss the implications of our work for dynamical processes on temporal networks and for the construction of network diagnostics that take into account their nontrivial stochastic nature.

  9. Pareto genealogies arising from a Poisson branching evolution model with selection.

    PubMed

    Huillet, Thierry E

    2014-02-01

    We study a class of coalescents derived from a sampling procedure out of N i.i.d. Pareto(α) random variables, normalized by their sum, including β-size-biasing on total length effects (β < α). Depending on the range of α we derive the large-N limit coalescent structure, leading either to a discrete-time Poisson-Dirichlet (α, -β) Ξ-coalescent (α ∈ [0, 1)), or to a family of continuous-time Beta(2 - α, α - β) Λ-coalescents (α ∈ [1, 2)), or to the Kingman coalescent (α ≥ 2). We indicate that this class of coalescent processes (and their scaling limits) may be viewed as the genealogical processes of some forward-in-time evolving branching population models including selection effects. In such constant-size population models, the reproduction step, which is based on a fitness-dependent Poisson point process with scaling power-law(α) intensity, is coupled to a selection step consisting of sorting out the N fittest individuals issued from the reproduction step.
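    The phase transition in α can be glimpsed by sampling the normalized weights directly. A rough sketch (sample sizes and α values are illustrative): for α < 1 a single Pareto draw can carry a macroscopic share of the total, while for α ≥ 2 every normalized weight is negligible, which is the regime that leads to the Kingman coalescent.

```python
import random

random.seed(7)

def max_normalized_share(alpha, n=10000):
    """Draw n i.i.d. Pareto(alpha) variables, normalize by their sum,
    and return the largest normalized weight."""
    x = [random.paretovariate(alpha) for _ in range(n)]
    s = sum(x)
    return max(x) / s

# alpha < 1: infinite mean; one individual can dominate reproduction.
heavy = max_normalized_share(0.8)
# alpha >= 2: finite variance; weights become uniformly negligible
# (the Kingman-coalescent regime).
light = max_normalized_share(3.0)
print(round(heavy, 3), round(light, 5))
```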

  10. Estimating safety effects of pavement management factors utilizing Bayesian random effect models.

    PubMed

    Jiang, Ximiao; Huang, Baoshan; Zaretzki, Russell L; Richards, Stephen; Yan, Xuedong

    2013-01-01

    Previous studies of pavement management factors that relate to the occurrence of traffic-related crashes are rare. Traditional research has mostly employed summary statistics of bidirectional pavement quality measurements in extended longitudinal road segments over a long time period, which may cause a loss of important information and result in biased parameter estimates. The research presented in this article focuses on crash risk of roadways with overall fair to good pavement quality. Real-time and location-specific data were employed to estimate the effects of pavement management factors on the occurrence of crashes. This research is based on the crash data and corresponding pavement quality data for the Tennessee state route highways from 2004 to 2009. The potential temporal and spatial correlations among observations caused by unobserved factors were considered. Overall 6 models were built accounting for no correlation, temporal correlation only, and both the temporal and spatial correlations. These models included Poisson, negative binomial (NB), one random effect Poisson and negative binomial (OREP, ORENB), and two random effect Poisson and negative binomial (TREP, TRENB) models. The Bayesian method was employed to construct these models. The inference is based on the posterior distribution from the Markov chain Monte Carlo (MCMC) simulation. These models were compared using the deviance information criterion. Analysis of the posterior distribution of parameter coefficients indicates that the pavement management factors indexed by Present Serviceability Index (PSI) and Pavement Distress Index (PDI) had significant impacts on the occurrence of crashes, whereas the variable rutting depth was not significant. Among other factors, lane width, median width, type of terrain, and posted speed limit were significant in affecting crash frequency. 
The findings of this study indicate that a reduction in pavement roughness would reduce the likelihood of traffic-related crashes. Hence, maintaining a low level of pavement roughness is strongly suggested. In addition, the results suggested that the temporal correlation among observations was significant and that the ORENB model outperformed all other models.
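    The reason random effect terms matter for count models of crashes can be sketched with a Poisson-log-normal simulation (all parameter values illustrative, not from the study): a normal random effect added to the log mean inflates the variance of the counts above the mean, the overdispersion that a plain Poisson model cannot capture.

```python
import math
import random

random.seed(3)

def poisson_sample(lam):
    """Knuth's method; adequate for the small means used here."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

beta0 = 1.0    # fixed effect (illustrative)
sigma = 0.6    # std. dev. of the site-level random effect (illustrative)

counts = []
for _ in range(20000):
    u = random.gauss(0.0, sigma)            # site-level random effect
    counts.append(poisson_sample(math.exp(beta0 + u)))

mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)
# Poisson-log-normal: E[Y] = exp(beta0 + sigma^2 / 2), and Var[Y] > E[Y],
# i.e. the random effect induces overdispersion.
print(round(mean, 2), round(var, 2))
```

    The temporal and spatial random effects in the study play the same role, but are shared across repeated observations of a site, which is what induces the correlation structure.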

  11. Exploring the existence of a stayer population with mover-stayer counting process models: application to joint damage in psoriatic arthritis.

    PubMed

    Yiu, Sean; Farewell, Vernon T; Tom, Brian D M

    2017-08-01

    Many psoriatic arthritis patients do not progress to permanent joint damage in any of the 28 hand joints, even under prolonged follow-up. This has led several researchers to fit models that estimate the proportion of stayers (those who do not have the propensity to experience the event of interest) and to characterize the rate of developing damaged joints in the movers (those who have the propensity to experience the event of interest). However, the paper demonstrates that, when fitted to the same data, different choices of model for the movers can lead to widely varying conclusions about the stayer population, implying that, if interest lies in the stayer population, a single analysis should not generally be adopted. The aim of the paper is to provide greater understanding regarding estimation of a stayer population by comparing the inferences, performance and features of multiple fitted models to real and simulated data sets. The models for the movers are based on Poisson processes with patient level random effects and/or dynamic covariates, which are used to induce within-patient correlation, and observation level random effects are used to account for time varying unobserved heterogeneity. The gamma, inverse Gaussian and compound Poisson distributions are considered for the random effects.
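    The sensitivity of the stayer estimate to the mover model shows up already in a toy zero-inflated Poisson version (all numbers illustrative): the observed zero fraction P(0) = π + (1 − π)e^(−λ) can be matched exactly by different (π, λ) pairs, so the inferred stayer proportion depends on what the mover model assumes about event rates.

```python
import math

# Observed probability of "no events": P(0) = pi + (1 - pi) * exp(-lam),
# where pi is the stayer proportion and lam the movers' Poisson rate.
def p_zero(pi, lam):
    return pi + (1.0 - pi) * math.exp(-lam)

# One stayer proportion and mover rate...
pi_a, lam_a = 0.30, 1.0
p0 = p_zero(pi_a, lam_a)

# ...and a different stayer proportion reproducing the same zero fraction
# with a suitably adjusted mover rate:
pi_b = 0.20
lam_b = -math.log((p0 - pi_b) / (1.0 - pi_b))

print(round(p0, 4), round(p_zero(pi_b, lam_b), 4), round(lam_b, 3))
```

    With panel data, the non-zero counts constrain λ and partially break this confounding, but the mover model's distributional assumptions still drive the stayer estimate, which is the paper's point.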

  12. Final Scientific Report,

    DTIC Science & Technology

    1980-11-26

    and J.B. Thomas, "The Effect of a Memoryless Nonlinearity on the Spectrum of a Random Process," IEEE Transactions on Information Theory, Vol. IT-23, pp...Density Function from Measurements Corrupted by Poisson Noise," IEEE Transactions on Information Theory, Vol. IT-23, pp. 764-766, November 1977. H. Derin...pp. 243-249, December 1977. G.L. Wise and N.C. Gallagher, "On Spherically Invariant Random Processes," IEEE Transactions on Information Theory, Vol. IT

  13. Clustered mixed nonhomogeneous Poisson process spline models for the analysis of recurrent event panel data.

    PubMed

    Nielsen, J D; Dean, C B

    2008-09-01

    A flexible semiparametric model for analyzing longitudinal panel count data arising from mixtures is presented. Panel count data refers here to count data on recurrent events collected as the number of events that have occurred within specific follow-up periods. The model assumes that the counts for each subject are generated by mixtures of nonhomogeneous Poisson processes with smooth intensity functions modeled with penalized splines. Time-dependent covariate effects are also incorporated into the process intensity using splines. Discrete mixtures of these nonhomogeneous Poisson process spline models extract functional information from underlying clusters representing hidden subpopulations. The motivating application is an experiment to test the effectiveness of pheromones in disrupting the mating pattern of the cherry bark tortrix moth. Mature moths arise from hidden, but distinct, subpopulations and monitoring the subpopulation responses was of interest. Within-cluster random effects are used to account for correlation structures and heterogeneity common to this type of data. An estimating equation approach to inference requiring only low moment assumptions is developed and the finite sample properties of the proposed estimating functions are investigated empirically by simulation.
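    Counts from a nonhomogeneous Poisson process with a smooth intensity, the building block of this model, can be simulated by the standard Lewis-Shedler thinning algorithm. A minimal sketch with a hypothetical sinusoidal intensity (the paper's intensities are penalized splines; the rate function here is only illustrative):

```python
import math
import random

random.seed(11)

def nhpp_times(rate_fn, rate_max, t_end):
    """Simulate a nonhomogeneous Poisson process on [0, t_end] by
    thinning a homogeneous process of rate rate_max (Lewis-Shedler)."""
    times, t = [], 0.0
    while True:
        t += random.expovariate(rate_max)
        if t > t_end:
            return times
        # Accept a candidate event with probability rate(t) / rate_max.
        if random.random() < rate_fn(t) / rate_max:
            times.append(t)

# Illustrative smooth intensity; rate_max must dominate it everywhere.
rate = lambda t: 5.0 + 4.0 * math.sin(t)
t_end = 2.0 * math.pi

counts = [len(nhpp_times(rate, 9.0, t_end)) for _ in range(4000)]
mean = sum(counts) / len(counts)
# E[N] = integral of the intensity over [0, 2*pi] = 10*pi ~ 31.4.
print(round(mean, 1))
```

    Panel data arise when only the counts of such events within follow-up windows are recorded, rather than the event times themselves.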

  14. Statistical mapping of count survey data

    USGS Publications Warehouse

    Royle, J. Andrew; Link, W.A.; Sauer, J.R.; Scott, J. Michael; Heglund, Patricia J.; Morrison, Michael L.; Haufler, Jonathan B.; Wall, William A.

    2002-01-01

    We apply a Poisson mixed model to the problem of mapping (or predicting) bird relative abundance from counts collected from the North American Breeding Bird Survey (BBS). The model expresses the logarithm of the Poisson mean as a sum of a fixed term (which may depend on habitat variables) and a random effect which accounts for remaining unexplained variation. The random effect is assumed to be spatially correlated, thus providing a more general model than the traditional Poisson regression approach. Consequently, the model is capable of improved prediction when data are autocorrelated. Moreover, formulation of the mapping problem in terms of a statistical model facilitates a wide variety of inference problems which are cumbersome or even impossible using standard methods of mapping. For example, assessment of prediction uncertainty, including the formal comparison of predictions at different locations, or through time, using the model-based prediction variance is straightforward under the Poisson model (not so with many nominally model-free methods). Also, ecologists may generally be interested in quantifying the response of a species to particular habitat covariates or other landscape attributes. Proper accounting for the uncertainty in these estimated effects is crucially dependent on specification of a meaningful statistical model. Finally, the model may be used to aid in sampling design, by modifying the existing sampling plan in a manner which minimizes some variance-based criterion. Model fitting under this model is carried out using a simulation technique known as Markov Chain Monte Carlo. Application of the model is illustrated using Mourning Dove (Zenaida macroura) counts from Pennsylvania BBS routes. We produce both a model-based map depicting relative abundance, and the corresponding map of prediction uncertainty. We briefly address the issue of spatial sampling design under this model. 
Finally, we close with some discussion of mapping in relation to habitat structure. Although our models were fit in the absence of habitat information, the resulting predictions show a strong inverse relation with a map of forest cover in the state, as expected. Consequently, the results suggest that the correlated random effect in the model is broadly representing ecological variation, and that BBS data may be generally useful for studying bird-habitat relationships, even in the presence of observer errors and other widely recognized deficiencies of the BBS.

  15. Modeling laser velocimeter signals as triply stochastic Poisson processes

    NASA Technical Reports Server (NTRS)

    Mayo, W. T., Jr.

    1976-01-01

    Previous models of laser Doppler velocimeter (LDV) systems have not adequately described dual-scatter signals in a manner useful for analysis and simulation of low-level photon-limited signals. At low photon rates, an LDV signal at the output of a photomultiplier tube is a compound nonhomogeneous filtered Poisson process, whose intensity function is another (slower) Poisson process with the nonstationary rate and frequency parameters controlled by a random flow (slowest) process. In the present paper, generalized Poisson shot noise models are developed for low-level LDV signals. Theoretical results useful in detection error analysis and simulation are presented, along with measurements of burst amplitude statistics. Computer generated simulations illustrate the difference between Gaussian and Poisson models of low-level signals.
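    The mean behavior of a filtered Poisson process of the kind described here is governed by Campbell's theorem: the expected output is the event rate times the integral of the pulse response. A minimal sketch with an exponential pulse shape and illustrative rate and time constants (not taken from the paper):

```python
import math
import random

random.seed(5)

# Filtered Poisson process: pulses of shape h(s) = exp(-s / tau) arrive
# at Poisson times with rate lam; observe the summed response at time T.
lam, tau, T = 50.0, 0.02, 1.0   # illustrative values

def filtered_poisson_at_T():
    x, t = 0.0, 0.0
    while True:
        t += random.expovariate(lam)
        if t > T:
            return x
        x += math.exp(-(T - t) / tau)

samples = [filtered_poisson_at_T() for _ in range(20000)]
mean = sum(samples) / len(samples)
# Campbell's theorem: E[X] = lam * integral of h = lam * tau (for T >> tau).
print(round(mean, 3), lam * tau)
```

    The triply stochastic structure in the paper stacks two more layers on this: the rate lam is itself a (slower) Poisson process, whose parameters are in turn modulated by the random flow.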

  16. Three-dimensionally bonded spongy graphene material with super compressive elasticity and near-zero Poisson's ratio.

    PubMed

    Wu, Yingpeng; Yi, Ningbo; Huang, Lu; Zhang, Tengfei; Fang, Shaoli; Chang, Huicong; Li, Na; Oh, Jiyoung; Lee, Jae Ah; Kozlov, Mikhail; Chipara, Alin C; Terrones, Humberto; Xiao, Peishuang; Long, Guankui; Huang, Yi; Zhang, Fan; Zhang, Long; Lepró, Xavier; Haines, Carter; Lima, Márcio Dias; Lopez, Nestor Perea; Rajukumar, Lakshmy P; Elias, Ana L; Feng, Simin; Kim, Seon Jeong; Narayanan, N T; Ajayan, Pulickel M; Terrones, Mauricio; Aliev, Ali; Chu, Pengfei; Zhang, Zhong; Baughman, Ray H; Chen, Yongsheng

    2015-01-20

    It is a challenge to fabricate graphene bulk materials with properties arising from the nature of individual graphene sheets, and which assemble into monolithic three-dimensional structures. Here we report the scalable self-assembly of randomly oriented graphene sheets into additive-free, essentially homogeneous graphene sponge materials that provide a combination of both cork-like and rubber-like properties. These graphene sponges, with densities similar to that of air, display Poisson's ratios in all directions that are near-zero and largely strain-independent during reversible compression to giant strains. At the same time, they function as enthalpic rubbers, which can recover up to 98% compression in air and 90% in liquids, and operate between -196 and 900 °C. Furthermore, these sponges provide reversible liquid absorption for hundreds of cycles and then discharge it within seconds, while still providing an effective near-zero Poisson's ratio.

  17. Exploring the effects of roadway characteristics on the frequency and severity of head-on crashes: case studies from Malaysian federal roads.

    PubMed

    Hosseinpour, Mehdi; Yahaya, Ahmad Shukri; Sadullah, Ahmad Farhan

    2014-01-01

    Head-on crashes are among the most severe collision types and of great concern to road safety authorities, which justifies greater effort to reduce both the frequency and severity of this collision type. To this end, it is necessary to first identify factors associated with crash occurrence. This can be done by developing crash prediction models that relate crash outcomes to a set of contributing factors. This study intends to identify the factors affecting both the frequency and severity of head-on crashes that occurred on 448 segments of five federal roads in Malaysia. Data on road characteristics and crash history were collected on the study segments during a 4-year period between 2007 and 2010. The frequency of head-on crashes was fitted by developing and comparing seven count-data models including Poisson, standard negative binomial (NB), random-effect negative binomial, hurdle Poisson, hurdle negative binomial, zero-inflated Poisson, and zero-inflated negative binomial models. To model crash severity, a random-effect generalized ordered probit model (REGOPM) was used, given that a head-on crash had occurred. With respect to the crash frequency, the random-effect negative binomial (RENB) model was found to outperform the other models according to goodness-of-fit measures. Based on the results of the model, the variables horizontal curvature, terrain type, heavy-vehicle traffic, and access points were found to be positively related to the frequency of head-on crashes, while posted speed limit and shoulder width decreased the crash frequency. With regard to the crash severity, the results of REGOPM showed that horizontal curvature, paved shoulder width, terrain type, and side friction were associated with more severe crashes, whereas land use, access points, and presence of a median reduced the probability of severe crashes. Based on the results of this study, some potential countermeasures were proposed to minimize the risk of head-on crashes.
Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. Comparing effects of fire modeling methods on simulated fire patterns and succession: a case study in the Missouri Ozarks

    Treesearch

    Jian Yang; Hong S. He; Brian R. Sturtevant; Brian R. Miranda; Eric J. Gustafson

    2008-01-01

    We compared four fire spread simulation methods (completely random, dynamic percolation, size-based minimum travel time algorithm, and duration-based minimum travel time algorithm) and two fire occurrence simulation methods (Poisson fire frequency model and hierarchical fire frequency model) using a two-way factorial design. We examined these treatment effects on...

  19. A Random Variable Transformation Process.

    ERIC Educational Resources Information Center

    Scheuermann, Larry

    1989-01-01

    Provides a short BASIC program, RANVAR, which generates random variates for various theoretical probability distributions. The seven variates include: uniform, exponential, normal, binomial, Poisson, Pascal, and triangular. (MVL)
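    Two of the seven RANVAR-style generators can be sketched with the standard textbook methods (this is a from-scratch illustration in Python, not the original BASIC code): exponential variates by inverse-transform sampling and Poisson variates by Knuth's multiplicative method.

```python
import math
import random

random.seed(2)

def exponential_variate(rate):
    """Inverse-transform method: F^{-1}(u) = -ln(1 - u) / rate."""
    return -math.log(1.0 - random.random()) / rate

def poisson_variate(lam):
    """Knuth's method: multiply uniforms until the product drops
    below exp(-lam); the number of factors needed is Poisson(lam)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

n = 50000
exp_mean = sum(exponential_variate(2.0) for _ in range(n)) / n   # expect 0.5
poi_mean = sum(poisson_variate(3.0) for _ in range(n)) / n       # expect 3.0
print(round(exp_mean, 2), round(poi_mean, 2))
```

    The remaining variates in the article (uniform, normal, binomial, Pascal, triangular) follow from similar inverse-transform or composition tricks.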

  20. Bayesian dynamic modeling of time series of dengue disease case counts

    PubMed Central

    López-Quílez, Antonio; Torres-Prieto, Alexander

    2017-01-01

    The aim of this study is to model the association between weekly time series of dengue case counts and meteorological variables, in a high-incidence city of Colombia, applying Bayesian hierarchical dynamic generalized linear models over the period January 2008 to August 2015. Additionally, we evaluate the model’s short-term performance for predicting dengue cases. The methodology uses dynamic Poisson log-link models including constant or time-varying coefficients for the meteorological variables. Calendar effects were modeled using constant or first- or second-order random walk time-varying coefficients. The meteorological variables were modeled using constant coefficients and first-order random walk time-varying coefficients. We applied Markov Chain Monte Carlo simulations for parameter estimation, and the deviance information criterion (DIC) for model selection. We assessed the short-term predictive performance of the selected final model, at several time points within the study period, using the mean absolute percentage error. The results showed that the best model included first-order random walk time-varying coefficients for the calendar trend and for the meteorological variables. Besides the computational challenges, interpreting the results implies a complete analysis of the time series of dengue with respect to the parameter estimates of the meteorological effects. We found small values of the mean absolute percentage errors at one- or two-week out-of-sample predictions for most prediction points, associated with low-volatility periods in the dengue counts. We discuss the advantages and limitations of the dynamic Poisson models for studying the association between time series of dengue disease and meteorological variables.
The key conclusion of the study is that dynamic Poisson models account for the dynamic nature of the variables involved in the modeling of time series of dengue disease, producing useful models for decision-making in public health. PMID:28671941

  1. Zero-inflated count models for longitudinal measurements with heterogeneous random effects.

    PubMed

    Zhu, Huirong; Luo, Sheng; DeSantis, Stacia M

    2017-08-01

    Longitudinal zero-inflated count data arise frequently in substance use research when assessing the effects of behavioral and pharmacological interventions. Zero-inflated count models (e.g. zero-inflated Poisson or zero-inflated negative binomial) with random effects have been developed to analyze this type of data. In random effects zero-inflated count models, the random effects covariance matrix is typically assumed to be homogeneous (constant across subjects). However, in many situations this matrix may be heterogeneous (differ by measured covariates). In this paper, we extend zero-inflated count models to account for random effects heterogeneity by modeling their variance as a function of covariates. We show via simulation that ignoring intervention and covariate-specific heterogeneity can produce biased covariate and random effect estimates. Moreover, those biased estimates can be rectified by correctly modeling the random effects covariance structure. The methodological development is motivated by and applied to the Combined Pharmacotherapies and Behavioral Interventions for Alcohol Dependence (COMBINE) study, the largest clinical trial of alcohol dependence performed in the United States, with 1383 individuals.
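    The zero inflation these models target can be sketched directly (parameter values illustrative): a structural-zero component mixed with an ordinary Poisson count produces far more zeros than a Poisson with the same rate would predict.

```python
import math
import random

random.seed(9)

def poisson_variate(lam):
    """Knuth's multiplicative method for Poisson sampling."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def zip_variate(pi_zero, lam):
    """Zero-inflated Poisson: a structural zero with probability pi_zero,
    otherwise an ordinary Poisson count."""
    return 0 if random.random() < pi_zero else poisson_variate(lam)

pi_zero, lam, n = 0.4, 2.0, 30000   # illustrative parameters
sample = [zip_variate(pi_zero, lam) for _ in range(n)]
zero_frac = sum(1 for y in sample if y == 0) / n

# Expected zero fraction: pi + (1 - pi) * exp(-lam), well above exp(-lam),
# the zero probability a plain Poisson model would imply.
expected = pi_zero + (1 - pi_zero) * math.exp(-lam)
print(round(zero_frac, 3), round(expected, 3), round(math.exp(-lam), 3))
```

    The paper's extension additionally lets the variance of subject-level random effects on pi_zero and lam depend on covariates, rather than treating it as a single constant.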

  2. Differential expression analysis for RNAseq using Poisson mixed models

    PubMed Central

    Sun, Shiquan; Hood, Michelle; Scott, Laura; Peng, Qinke; Mukherjee, Sayan; Tung, Jenny

    2017-01-01

    Identifying differentially expressed (DE) genes from RNA sequencing (RNAseq) studies is among the most common analyses in genomics. However, RNAseq DE analysis presents several statistical and computational challenges, including over-dispersed read counts and, in some settings, sample non-independence. Previous count-based methods rely on simple hierarchical Poisson models (e.g. negative binomial) to model independent over-dispersion, but do not account for sample non-independence due to relatedness, population structure and/or hidden confounders. Here, we present a Poisson mixed model with two random effects terms that account for both independent over-dispersion and sample non-independence. We also develop a scalable sampling-based inference algorithm using a latent variable representation of the Poisson distribution. With simulations, we show that our method properly controls for type I error and is generally more powerful than other widely used approaches, except in small samples (n < 15) with other unfavorable properties (e.g. small effect sizes). We also apply our method to three real datasets that contain related individuals, population stratification or hidden confounders. Our results show that our method increases power in all three datasets compared to other approaches, though the power gain is smallest in the smallest sample (n = 6). Our method is implemented in MACAU, freely available at www.xzlab.org/software.html. PMID:28369632

  3. Weak convergence to isotropic complex symmetric α-stable random measure.

    PubMed

    Wang, Jun; Li, Yunmeng; Sang, Liheng

    2017-01-01

    In this paper, we prove that an isotropic complex symmetric α-stable random measure can be approximated by a complex process constructed from integrals based on a Poisson process with random intensity.

  4. An INAR(1) Negative Multinomial Regression Model for Longitudinal Count Data.

    ERIC Educational Resources Information Center

    Bockenholt, Ulf

    1999-01-01

    Discusses a regression model for the analysis of longitudinal count data in a panel study by adapting an integer-valued first-order autoregressive (INAR(1)) Poisson process to represent time-dependent correlation between counts. Derives a new negative multinomial distribution by combining INAR(1) representation with a random effects approach.…

  5. Itô and Stratonovich integrals on compound renewal processes: the normal/Poisson case

    NASA Astrophysics Data System (ADS)

    Germano, Guido; Politi, Mauro; Scalas, Enrico; Schilling, René L.

    2010-06-01

    Continuous-time random walks, or compound renewal processes, are pure-jump stochastic processes with several applications in insurance, finance, economics and physics. Based on heuristic considerations, a definition is given for stochastic integrals driven by continuous-time random walks, which includes the Itô and Stratonovich cases. It is then shown how the definition can be used to compute these two stochastic integrals by means of Monte Carlo simulations. Our example is based on the normal compound Poisson process, which in the diffusive limit converges to the Wiener process.
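    The Itô/Stratonovich distinction for a pure-jump path can be checked by direct Monte Carlo, as a rough sketch of the kind of computation the paper describes (rate and jump distribution illustrative): the left-endpoint (Itô) and midpoint (Stratonovich) sums of ∫X dX over a compound Poisson path differ by exactly half the path's quadratic variation.

```python
import random

random.seed(4)

# Compound Poisson path on [0, 1]: jump times at rate lam, normal jumps.
lam = 20.0
t, jumps = 0.0, []
while True:
    t += random.expovariate(lam)
    if t > 1.0:
        break
    jumps.append(random.gauss(0.0, 1.0))

# Path values at successive jumps: X increases by each jump size.
x = [0.0]
for j in jumps:
    x.append(x[-1] + j)

# Ito integral of X dX uses the left endpoint; Stratonovich the midpoint.
ito = sum(x[i] * (x[i + 1] - x[i]) for i in range(len(jumps)))
strat = sum(0.5 * (x[i] + x[i + 1]) * (x[i + 1] - x[i])
            for i in range(len(jumps)))
quad_var = sum(j * j for j in jumps)

# The two integrals differ by half the quadratic variation of the path.
print(round(strat - ito, 6), round(0.5 * quad_var, 6))
```

    For Brownian motion the analogous correction is the familiar deterministic t/2 term; for jump processes it remains random, carried entirely by the jumps.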

  6. Marginalized zero-altered models for longitudinal count data.

    PubMed

    Tabb, Loni Philip; Tchetgen, Eric J Tchetgen; Wellenius, Greg A; Coull, Brent A

    2016-10-01

    Count data often exhibit more zeros than predicted by common count distributions like the Poisson or negative binomial. In recent years, there has been considerable interest in methods for analyzing zero-inflated count data in longitudinal or other correlated data settings. A common approach has been to extend zero-inflated Poisson models to include random effects that account for correlation among observations. However, these models have been shown to have a few drawbacks, including interpretability of regression coefficients and numerical instability of fitting algorithms even when the data arise from the assumed model. To address these issues, we propose a model that parameterizes the marginal associations between the count outcome and the covariates as easily interpretable log relative rates, while including random effects to account for correlation among observations. One of the main advantages of this marginal model is that it allows a basis upon which we can directly compare the performance of standard methods that ignore zero inflation with that of a method that explicitly takes zero inflation into account. We present simulations of these various model formulations in terms of bias and variance estimation. Finally, we apply the proposed approach to analyze toxicological data of the effect of emissions on cardiac arrhythmias.

  7. Marginalized zero-altered models for longitudinal count data

    PubMed Central

    Tabb, Loni Philip; Tchetgen, Eric J. Tchetgen; Wellenius, Greg A.; Coull, Brent A.

    2015-01-01

    Count data often exhibit more zeros than predicted by common count distributions like the Poisson or negative binomial. In recent years, there has been considerable interest in methods for analyzing zero-inflated count data in longitudinal or other correlated data settings. A common approach has been to extend zero-inflated Poisson models to include random effects that account for correlation among observations. However, these models have been shown to have a few drawbacks, including interpretability of regression coefficients and numerical instability of fitting algorithms even when the data arise from the assumed model. To address these issues, we propose a model that parameterizes the marginal associations between the count outcome and the covariates as easily interpretable log relative rates, while including random effects to account for correlation among observations. One of the main advantages of this marginal model is that it allows a basis upon which we can directly compare the performance of standard methods that ignore zero inflation with that of a method that explicitly takes zero inflation into account. We present simulations of these various model formulations in terms of bias and variance estimation. Finally, we apply the proposed approach to analyze toxicological data of the effect of emissions on cardiac arrhythmias. PMID:27867423

  8. A stochastic model for stationary dynamics of prices in real estate markets. A case of random intensity for Poisson moments of prices changes

    NASA Astrophysics Data System (ADS)

    Rusakov, Oleg; Laskin, Michael

    2017-06-01

    We consider a stochastic model of price changes in real estate markets. We suppose that, in a book of prices, changes occur at the jump points of a Poisson process with random intensity, i.e., the moments of change follow a random process of the Cox process type. We calculate cumulative mathematical expectations and variances for the random intensity of this point process. When the random intensity process is a martingale, the cumulative variance grows linearly. We statistically process a number of observations of real estate prices and accept the hypothesis of linear growth of the estimates, for both the cumulative average and the cumulative variance, for both the input and output prices recorded in the book of prices.

  9. Application of spatial Poisson process models to air mass thunderstorm rainfall

    NASA Technical Reports Server (NTRS)

    Eagleson, P. S.; Fennessy, N. M.; Wang, Qinliang; Rodriguez-Iturbe, I.

    1987-01-01

    Eight years of summer storm rainfall observations from 93 stations in and around the 154 sq km Walnut Gulch catchment of the Agricultural Research Service, U.S. Department of Agriculture, in Arizona are processed to yield the total station depths of 428 storms. Statistical analysis of these random fields yields the first two moments, the spatial correlation and variance functions, and the spatial distribution of total rainfall for each storm. The absolute and relative worth of three Poisson models are evaluated by comparing their prediction of the spatial distribution of storm rainfall with observations from the second half of the sample. The effect of interstorm parameter variation is examined.

  10. Super-stable Poissonian structures

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo

    2012-10-01

    In this paper we characterize classes of Poisson processes whose statistical structures are super-stable. We consider a flow generated by a one-dimensional ordinary differential equation, and an ensemble of particles ‘surfing’ the flow. The particles start from random initial positions, and are propagated along the flow by stochastic ‘wave processes’ with general statistics and general cross correlations. Setting the initial positions to be Poisson processes, we characterize the classes of Poisson processes that render the particles’ positions—at all times, and invariantly with respect to the wave processes—statistically identical to their initial positions. These Poisson processes are termed ‘super-stable’ and facilitate the generalization of the notion of stationary distributions far beyond the realm of Markov dynamics.

  11. Estimating random errors due to shot noise in backscatter lidar observations.

    PubMed

    Liu, Zhaoyan; Hunt, William; Vaughan, Mark; Hostetler, Chris; McGill, Matthew; Powell, Kathleen; Winker, David; Hu, Yongxiang

    2006-06-20

    We discuss the estimation of random errors due to shot noise in backscatter lidar observations that use either photomultiplier tube (PMT) or avalanche photodiode (APD) detectors. The statistical characteristics of photodetection are reviewed, and photon count distributions of solar background signals and laser backscatter signals are examined using airborne lidar observations at 532 nm using a photon-counting mode APD. Both distributions appear to be Poisson, indicating that the arrival at the photodetector of photons for these signals is a Poisson stochastic process. For Poisson-distributed signals, a proportional, one-to-one relationship is known to exist between the mean of a distribution and its variance. Although the multiplied photocurrent no longer follows a strict Poisson distribution in analog-mode APD and PMT detectors, the proportionality still exists between the mean and the variance of the multiplied photocurrent. We make use of this relationship by introducing the noise scale factor (NSF), which quantifies the constant of proportionality that exists between the root mean square of the random noise in a measurement and the square root of the mean signal. Using the NSF to estimate random errors in lidar measurements due to shot noise provides a significant advantage over the conventional error estimation techniques, in that with the NSF, uncertainties can be reliably calculated from or for a single data sample. Methods for evaluating the NSF are presented. Algorithms to compute the NSF are developed for the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations lidar and tested using data from the Lidar In-space Technology Experiment.
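
The proportionality the NSF exploits can be demonstrated on synthetic data. The sketch below is our own illustration, not the CALIPSO algorithm: it estimates NSF = RMS noise / sqrt(mean signal) for ideal Poisson photon counts (NSF = 1) and for a gain-multiplied copy, where the NSF grows as the square root of the gain.

```python
import math
import random
import statistics

def noise_scale_factor(samples):
    """NSF: the RMS of the random noise divided by the square root of the
    mean signal. Shot-noise-limited (Poisson) data have NSF = 1."""
    return statistics.pstdev(samples) / math.sqrt(statistics.mean(samples))

rng = random.Random(0)

def poisson(lam):
    # Knuth's multiplication method; adequate for moderate lam
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

raw = [poisson(50.0) for _ in range(20000)]
nsf_raw = noise_scale_factor(raw)        # ≈ 1 for ideal photon counting
G = 4.0                                  # detector gain (illustrative value)
amplified = [G * x for x in raw]
nsf_amp = noise_scale_factor(amplified)  # ≈ sqrt(G): variance scales as G^2, mean as G
print(nsf_raw, nsf_amp)
```
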

  12. Estimating Random Errors Due to Shot Noise in Backscatter Lidar Observations

    NASA Technical Reports Server (NTRS)

    Liu, Zhaoyan; Hunt, William; Vaughan, Mark A.; Hostetler, Chris A.; McGill, Matthew J.; Powell, Kathy; Winker, David M.; Hu, Yongxiang

    2006-01-01

    In this paper, we discuss the estimation of random errors due to shot noise in backscatter lidar observations that use either photomultiplier tube (PMT) or avalanche photodiode (APD) detectors. The statistical characteristics of photodetection are reviewed, and photon count distributions of solar background signals and laser backscatter signals are examined using airborne lidar observations at 532 nm using a photon-counting mode APD. Both distributions appear to be Poisson, indicating that the arrival at the photodetector of photons for these signals is a Poisson stochastic process. For Poisson-distributed signals, a proportional, one-to-one relationship is known to exist between the mean of a distribution and its variance. Although the multiplied photocurrent no longer follows a strict Poisson distribution in analog-mode APD and PMT detectors, the proportionality still exists between the mean and the variance of the multiplied photocurrent. We make use of this relationship by introducing the noise scale factor (NSF), which quantifies the constant of proportionality that exists between the root-mean-square of the random noise in a measurement and the square root of the mean signal. Using the NSF to estimate random errors in lidar measurements due to shot noise provides a significant advantage over the conventional error estimation techniques, in that with the NSF uncertainties can be reliably calculated from/for a single data sample. Methods for evaluating the NSF are presented. Algorithms to compute the NSF are developed for the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) lidar and tested using data from the Lidar In-space Technology Experiment (LITE).

  13. The distribution of catchment coverage by stationary rainstorms

    NASA Technical Reports Server (NTRS)

    Eagleson, P. S.

    1984-01-01

    The occurrence of wetted rainstorm area within a catchment is modeled as a Poisson arrival process in which each storm is composed of stationary, nonoverlapping, independent random cell clusters whose centers are Poisson-distributed in space and whose areas are fractals. The two Poisson parameters and hence the first two moments of the wetted fraction are derived in terms of catchment average characteristics of the (observable) station precipitation. The model is used to estimate spatial properties of tropical air mass thunderstorms on six tropical catchments in the Sudan.

  14. A Bayesian approach to parameter and reliability estimation in the Poisson distribution.

    NASA Technical Reports Server (NTRS)

    Canavos, G. C.

    1972-01-01

    For life testing procedures, a Bayesian analysis is developed with respect to a random intensity parameter in the Poisson distribution. Bayes estimators are derived for the Poisson parameter and the reliability function based on uniform and gamma prior distributions of that parameter. A Monte Carlo procedure is implemented to make possible an empirical mean-squared error comparison between Bayes and existing minimum variance unbiased, as well as maximum likelihood, estimators. As expected, the Bayes estimators have mean-squared errors that are appreciably smaller than those of the other two.
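
The gamma-prior case of this analysis can be reproduced in a few lines. Under a Gamma(a, b) prior, the posterior given counts x1..xn is Gamma(a + Σx, b + n), so the Bayes estimator under squared-error loss is the posterior mean. The sketch below is illustrative, not the paper's code, and chooses a prior centred at the true rate, the situation where shrinkage pays off most clearly.

```python
import math
import random
import statistics

def mc_mse(lam_true=2.0, n=5, a=2.0, b=1.0, reps=5000, seed=3):
    """Monte Carlo MSE comparison of the Bayes estimator for a Poisson rate
    under a Gamma(a, b) prior (the posterior mean, optimal under squared-error
    loss) and the maximum likelihood estimator (the sample mean)."""
    rng = random.Random(seed)

    def poisson(lam):
        # Knuth's multiplication method
        L, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= L:
                return k
            k += 1

    se_bayes, se_mle = [], []
    for _ in range(reps):
        s = sum(poisson(lam_true) for _ in range(n))
        bayes = (a + s) / (b + n)       # mean of the Gamma(a + s, b + n) posterior
        mle = s / n
        se_bayes.append((bayes - lam_true) ** 2)
        se_mle.append((mle - lam_true) ** 2)
    return statistics.mean(se_bayes), statistics.mean(se_mle)

mse_bayes, mse_mle = mc_mse()
print(mse_bayes, mse_mle)  # the Bayes estimator shrinks toward the prior mean a/b
```
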

  15. A Conway-Maxwell-Poisson (CMP) model to address data dispersion on positron emission tomography.

    PubMed

    Santarelli, Maria Filomena; Della Latta, Daniele; Scipioni, Michele; Positano, Vincenzo; Landini, Luigi

    2016-10-01

    Positron emission tomography (PET) in medicine exploits the properties of positron-emitting unstable nuclei. The pairs of γ-rays emitted after annihilation are revealed by coincidence detectors and stored as projections in a sinogram. It is well known that radioactive decay follows a Poisson distribution; however, deviation from Poisson statistics occurs on PET projection data prior to reconstruction due to physical effects, measurement errors, correction of deadtime, scatter, and random coincidences. A model that describes the statistical behavior of measured and corrected PET data can aid in understanding the statistical nature of the data: it is a prerequisite to develop efficient reconstruction and processing methods and to reduce noise. The deviation from Poisson statistics in PET data could be described by the Conway-Maxwell-Poisson (CMP) distribution model, which is characterized by the centring parameter λ and the dispersion parameter ν, the latter quantifying the deviation from a Poisson distribution model. In particular, the parameter ν allows quantifying over-dispersion (ν<1) or under-dispersion (ν>1) of data. A simple and efficient method for λ and ν parameters estimation is introduced and assessed using Monte Carlo simulation for a wide range of activity values. The application of the method to simulated and experimental PET phantom data demonstrated that the CMP distribution parameters could detect deviation from the Poisson distribution both in raw and corrected PET data. It may be usefully implemented in image reconstruction algorithms and quantitative PET data analysis, especially in low counting emission data, as in dynamic PET data, where the method demonstrated the best accuracy. Copyright © 2016 Elsevier Ltd. All rights reserved.
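
The CMP density is P(X = k) ∝ λ^k/(k!)^ν, so its moments can be computed by truncated summation in log space. A minimal sketch follows; the truncation point and parameter values are our own choices, not from the paper.

```python
import math

def cmp_moments(lam, nu, kmax=200):
    """Mean and variance of the Conway-Maxwell-Poisson law
    P(X = k) ∝ lam**k / (k!)**nu, by truncated summation in log space."""
    logw = [k * math.log(lam) - nu * math.lgamma(k + 1) for k in range(kmax)]
    top = max(logw)
    w = [math.exp(lw - top) for lw in logw]       # stabilized unnormalized weights
    Z = sum(w)
    mean = sum(k * wk for k, wk in enumerate(w)) / Z
    ex2 = sum(k * k * wk for k, wk in enumerate(w)) / Z
    return mean, ex2 - mean * mean

m1, v1 = cmp_moments(5.0, 1.0)            # nu = 1 recovers the Poisson: var = mean
m_over, v_over = cmp_moments(5.0, 0.7)    # nu < 1: over-dispersed (var > mean)
m_under, v_under = cmp_moments(5.0, 1.5)  # nu > 1: under-dispersed (var < mean)
print(m1, v1)
```
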

  16. The Dependent Poisson Race Model and Modeling Dependence in Conjoint Choice Experiments

    ERIC Educational Resources Information Center

    Ruan, Shiling; MacEachern, Steven N.; Otter, Thomas; Dean, Angela M.

    2008-01-01

    Conjoint choice experiments are used widely in marketing to study consumer preferences amongst alternative products. We develop a class of choice models, belonging to the class of Poisson race models, that describe a "random utility" which lends itself to a process-based description of choice. The models incorporate a dependence structure which…

  17. Binomial leap methods for simulating stochastic chemical kinetics.

    PubMed

    Tian, Tianhai; Burrage, Kevin

    2004-12-01

    This paper discusses efficient simulation methods for stochastic chemical kinetics. Based on the tau-leap and midpoint tau-leap methods of Gillespie [D. T. Gillespie, J. Chem. Phys. 115, 1716 (2001)], binomial random variables are used in these leap methods rather than Poisson random variables. The motivation for this approach is to improve the efficiency of the Poisson leap methods by using larger stepsizes. Unlike Poisson random variables whose range of sample values is from zero to infinity, binomial random variables have a finite range of sample values. This probabilistic property has been used to restrict possible reaction numbers and to avoid negative molecular numbers in stochastic simulations when larger stepsize is used. In this approach a binomial random variable is defined for a single reaction channel in order to keep the reaction number of this channel below the numbers of molecules that undergo this reaction channel. A sampling technique is also designed for the total reaction number of a reactant species that undergoes two or more reaction channels. Samples for the total reaction number are not greater than the molecular number of this species. In addition, probability properties of the binomial random variables provide stepsize conditions for restricting reaction numbers in a chosen time interval. These stepsize conditions are important properties of robust leap control strategies. Numerical results indicate that the proposed binomial leap methods can be applied to a wide range of chemical reaction systems with very good accuracy and significant improvement on efficiency over existing approaches. (c) 2004 American Institute of Physics.
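
The core idea, keeping the number of firings below the number of available molecules, can be sketched for a single decay channel A → ∅. The reaction, rate, and step size below are illustrative, not taken from the paper.

```python
import random

def binomial_leap_decay(a0=100, c=0.5, tau=0.1, t_end=10.0, seed=7):
    """Tau-leaping for the decay reaction A -> 0 with propensity c*A, drawing
    the number of firings per step from Binomial(A, c*tau) rather than
    Poisson(c*A*tau), so the molecule count can never go negative."""
    rng = random.Random(seed)
    a, t = a0, 0.0
    while t < t_end and a > 0:
        p = min(1.0, c * tau)                        # per-molecule firing probability
        k = sum(rng.random() < p for _ in range(a))  # Binomial(a, p) draw
        a -= k                                       # k <= a by construction
        t += tau
    return a

a_final = binomial_leap_decay()
print(a_final)
```

A Poisson draw with mean c*a*tau could exceed a when the step size is large; the binomial draw makes that impossible by construction, which is exactly the property the leap-control strategies in the paper rely on.
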

  18. Poisson and negative binomial item count techniques for surveys with sensitive question.

    PubMed

    Tian, Guo-Liang; Tang, Man-Lai; Wu, Qin; Liu, Yin

    2017-04-01

    Although the item count technique is useful in surveys with sensitive questions, privacy of those respondents who possess the sensitive characteristic of interest may not be well protected due to a defect in its original design. In this article, we propose two new survey designs (namely the Poisson item count technique and negative binomial item count technique) which replace several independent Bernoulli random variables required by the original item count technique with a single Poisson or negative binomial random variable, respectively. The proposed models not only provide closed form variance estimate and confidence interval within [0, 1] for the sensitive proportion, but also simplify the survey design of the original item count technique. Most importantly, the new designs do not leak respondents' privacy. Empirical results show that the proposed techniques perform satisfactorily in the sense that they yield accurate parameter estimates and confidence intervals.
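
A minimal sketch of the Poisson item count design, with illustrative parameters of our own choosing: the control group reports a Poisson count, the treatment group reports the same count plus one if the respondent carries the sensitive trait, and the sensitive proportion is estimated by the difference in group means.

```python
import math
import random
import statistics

def simulate_poisson_ict(pi=0.3, lam=2.0, n=20000, seed=13):
    """Sketch of the Poisson item count technique: controls report X ~ Poisson(lam);
    the treatment group reports X + Z, where Z = 1 with probability pi (the
    sensitive trait). The difference in means estimates pi."""
    rng = random.Random(seed)

    def poisson(l):
        # Knuth's multiplication method
        L, k, p = math.exp(-l), 0, 1.0
        while True:
            p *= rng.random()
            if p <= L:
                return k
            k += 1

    control = [poisson(lam) for _ in range(n)]
    treatment = [poisson(lam) + (rng.random() < pi) for _ in range(n)]
    return statistics.mean(treatment) - statistics.mean(control)

pi_hat = simulate_poisson_ict()
print(pi_hat)  # difference-in-means estimate of the sensitive proportion
```
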

  19. Differential expression analysis for RNAseq using Poisson mixed models.

    PubMed

    Sun, Shiquan; Hood, Michelle; Scott, Laura; Peng, Qinke; Mukherjee, Sayan; Tung, Jenny; Zhou, Xiang

    2017-06-20

    Identifying differentially expressed (DE) genes from RNA sequencing (RNAseq) studies is among the most common analyses in genomics. However, RNAseq DE analysis presents several statistical and computational challenges, including over-dispersed read counts and, in some settings, sample non-independence. Previous count-based methods rely on simple hierarchical Poisson models (e.g. negative binomial) to model independent over-dispersion, but do not account for sample non-independence due to relatedness, population structure and/or hidden confounders. Here, we present a Poisson mixed model with two random effects terms that account for both independent over-dispersion and sample non-independence. We also develop a scalable sampling-based inference algorithm using a latent variable representation of the Poisson distribution. With simulations, we show that our method properly controls for type I error and is generally more powerful than other widely used approaches, except in small samples (n < 15) with other unfavorable properties (e.g. small effect sizes). We also apply our method to three real datasets that contain related individuals, population stratification or hidden confounders. Our results show that our method increases power in all three datasets compared to other approaches, though the power gain is smallest in the smallest sample (n = 6). Our method is implemented in MACAU, freely available at www.xzlab.org/software.html. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  20. Spacelab system analysis: The modified free access protocol: An access protocol for communication systems with periodic and Poisson traffic

    NASA Technical Reports Server (NTRS)

    Ingels, Frank; Owens, John; Daniel, Steven

    1989-01-01

    The protocol definition and terminal hardware for the modified free access protocol, a communications protocol similar to Ethernet, are developed. A MFA protocol simulator and a CSMA/CD math model are also developed. The protocol is tailored to communication systems where the total traffic may be divided into scheduled traffic and Poisson traffic. The scheduled traffic should occur on a periodic basis but may occur after a given event such as a request for data from a large number of stations. The Poisson traffic will include alarms and other random traffic. The purpose of the protocol is to guarantee that scheduled packets will be delivered without collision. This is required in many control and data collection systems. The protocol uses standard Ethernet hardware and software requiring minimum modifications to an existing system. The modification to the protocol only affects the Ethernet transmission privileges and does not affect the Ethernet receiver.

  1. The contribution of simple random sampling to observed variations in faecal egg counts.

    PubMed

    Torgerson, Paul R; Paul, Michaela; Lewis, Fraser I

    2012-09-10

    It has been over 100 years since the classical paper published by Gosset in 1907, under the pseudonym "Student", demonstrated that yeast cells suspended in a fluid and measured by a haemocytometer conformed to a Poisson process. Similarly, parasite eggs in a faecal suspension also conform to a Poisson process. Despite this, there are common misconceptions about how to analyse or interpret observations from the McMaster or similar quantitative parasitic diagnostic techniques, widely used for evaluating parasite eggs in faeces. The McMaster technique can easily be shown from a theoretical perspective to give variable results that inevitably arise from the random distribution of parasite eggs in a well mixed faecal sample. The Poisson processes that lead to this variability are described, and illustrative examples are given of the potentially large confidence intervals that can arise from observed faecal egg counts calculated from the observations on a McMaster slide. Attempts to modify the McMaster technique, or indeed other quantitative techniques, to ensure uniform egg counts are doomed to failure and belie ignorance of Poisson processes. A simple method to immediately identify excess variation/poor sampling from replicate counts is provided. Copyright © 2012 Elsevier B.V. All rights reserved.
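
The width of these confidence intervals is easy to reproduce. The sketch below computes the exact (Garwood) Poisson interval for a single slide count by bisection on the Poisson CDF; the count of 18 eggs and the multiplication factor of 50 eggs per gram are illustrative values, not taken from the paper.

```python
import math

def poisson_cdf(k, mu):
    """P(X <= k) for X ~ Poisson(mu), by direct summation."""
    term, total = math.exp(-mu), 0.0
    for i in range(k + 1):
        total += term
        term *= mu / (i + 1)
    return total

def exact_poisson_ci(x, alpha=0.05):
    """Exact (Garwood) two-sided confidence interval for the mean of a single
    observed Poisson count x, found by bisection on the Poisson CDF."""
    def bisect(f, lo, hi):               # f decreasing, f(lo) > 0 > f(hi)
        for _ in range(100):
            mid = 0.5 * (lo + hi)
            lo, hi = (mid, hi) if f(mid) > 0 else (lo, mid)
        return 0.5 * (lo + hi)

    upper = bisect(lambda mu: poisson_cdf(x, mu) - alpha / 2, float(x), 10.0 * x + 20.0)
    if x == 0:
        return 0.0, upper
    lower = bisect(lambda mu: alpha / 2 - (1.0 - poisson_cdf(x - 1, mu)), 1e-9, float(x))
    return lower, upper

# 18 eggs on a McMaster slide with a multiplication factor of 50 eggs/g:
lo, hi = exact_poisson_ci(18)
print(50 * lo, 50 * hi)  # the egg-per-gram interval is strikingly wide
```
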

  2. A random-censoring Poisson model for underreported data.

    PubMed

    de Oliveira, Guilherme Lopes; Loschi, Rosangela Helena; Assunção, Renato Martins

    2017-12-30

    A major challenge when monitoring risks in socially deprived areas of underdeveloped countries is that economic, epidemiological, and social data are typically underreported. Thus, statistical models that do not take the data quality into account will produce biased estimates. To deal with this problem, counts in suspected regions are usually approached as censored information. The censored Poisson model can be considered, but all censored regions must be precisely known a priori, which is not a reasonable assumption in most practical situations. We introduce the random-censoring Poisson model (RCPM) which accounts for the uncertainty about both the count and the data reporting processes. Consequently, for each region, we will be able to estimate the relative risk for the event of interest as well as the censoring probability. To facilitate the posterior sampling process, we propose a Markov chain Monte Carlo scheme based on the data augmentation technique. We run a simulation study comparing the proposed RCPM with two competing models. Different scenarios are considered. RCPM and censored Poisson model are applied to account for potential underreporting of early neonatal mortality counts in regions of Minas Gerais State, Brazil, where data quality is known to be poor. Copyright © 2017 John Wiley & Sons, Ltd.

  3. Systematic review of treatment modalities for gingival depigmentation: a random-effects Poisson regression analysis.

    PubMed

    Lin, Yi Hung; Tu, Yu Kang; Lu, Chun Tai; Chung, Wen Chen; Huang, Chiung Fang; Huang, Mao Suan; Lu, Hsein Kun

    2014-01-01

    Repigmentation variably occurs with different treatment methods in patients with gingival pigmentation. A systematic review was conducted of various treatment modalities for eliminating melanin pigmentation of the gingiva, comprising bur abrasion, scalpel surgery, cryosurgery, electrosurgery, gingival grafts, and laser techniques, to compare the recurrence rates (Rrs) of these treatment procedures. Electronic databases, including PubMed, Web of Science, Google, and Medline were comprehensively searched, and manual searches were conducted for studies published from January 1951 to June 2013. After applying inclusion and exclusion criteria, the final list of articles was reviewed in depth to achieve the objectives of this review. A Poisson regression was used to analyze the outcome of depigmentation using the various treatment methods. The systematic review was based mainly on case reports. In total, 61 eligible publications met the defined criteria. The various therapeutic procedures showed variable clinical results with a wide range of Rrs. A random-effects Poisson regression showed that cryosurgery (Rr = 0.32%), electrosurgery (Rr = 0.74%), and laser depigmentation (Rr = 1.16%) yielded superior results, whereas bur abrasion yielded the highest Rr (8.89%). Within the limit of the sampling level, the present evidence-based results show that cryosurgery exhibits the optimal predictability for depigmentation of the gingiva among all procedures examined, followed by electrosurgery and laser techniques. It is possible to treat melanin pigmentation of the gingiva with various methods and prevent repigmentation. Among those treatment modalities, cryosurgery, electrosurgery, and laser surgery appear to be the best choices for treating gingival pigmentation. © 2014 Wiley Periodicals, Inc.

  4. Possible Statistics of Two Coupled Random Fields: Application to Passive Scalar

    NASA Technical Reports Server (NTRS)

    Dubrulle, B.; He, Guo-Wei; Bushnell, Dennis M. (Technical Monitor)

    2000-01-01

    We use the relativity postulate of scale invariance to derive the similarity transformations between two coupled scale-invariant random fields at different scales. We find the equations leading to the scaling exponents. This formulation is applied to the case of passive scalars advected i) by a random Gaussian velocity field; and ii) by a turbulent velocity field. In the Gaussian case, we show that the passive scalar increments follow a log-Levy distribution generalizing Kraichnan's solution and, in an appropriate limit, a log-normal distribution. In the turbulent case, we show that when the velocity increments follow a log-Poisson statistics, the passive scalar increments follow a statistics close to log-Poisson. This result explains the experimental observations of Ruiz et al. about the temperature increments.

  5. Morphology and linear-elastic moduli of random network solids.

    PubMed

    Nachtrab, Susan; Kapfer, Sebastian C; Arns, Christoph H; Madadi, Mahyar; Mecke, Klaus; Schröder-Turk, Gerd E

    2011-06-17

    The effective linear-elastic moduli of disordered network solids are analyzed by voxel-based finite element calculations. We analyze network solids given by Poisson-Voronoi processes and by the structure of collagen fiber networks imaged by confocal microscopy. The solid volume fraction ϕ is varied by adjusting the fiber radius, while keeping the structural mesh or pore size of the underlying network fixed. For intermediate ϕ, the bulk and shear modulus are approximated by empirical power-laws K(ϕ) ∝ ϕ^n and G(ϕ) ∝ ϕ^m with n≈1.4 and m≈1.7. The exponents for the collagen and the Poisson-Voronoi network solids are similar, and are close to the values n=1.22 and m=2.11 found in a previous voxel-based finite element study of Poisson-Voronoi systems with different boundary conditions. However, the exponents of these empirical power-laws are at odds with the analytic values of n=1 and m=2, valid for low-density cellular structures in the limit of thin beams. We propose a functional form for K(ϕ) that models the cross-over from a power-law at low densities to a porous solid at high densities; a fit of the data to this functional form yields the asymptotic exponent n≈1.00, as expected. Further, both the intensity of the Poisson-Voronoi process and the collagen concentration in the samples, both of which alter the typical pore or mesh size, affect the effective moduli only by the resulting change of the solid volume fraction. These findings suggest that a network solid with the structure of the collagen networks can be modeled in quantitative agreement by a Poisson-Voronoi process. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Finite-range Coulomb gas models of banded random matrices and quantum kicked rotors

    NASA Astrophysics Data System (ADS)

    Pandey, Akhilesh; Kumar, Avanish; Puri, Sanjay

    2017-11-01

    Dyson demonstrated an equivalence between infinite-range Coulomb gas models and classical random matrix ensembles for the study of eigenvalue statistics. We introduce finite-range Coulomb gas (FRCG) models via a Brownian matrix process, and study them analytically and by Monte Carlo simulations. These models yield new universality classes, and provide a theoretical framework for the study of banded random matrices (BRMs) and quantum kicked rotors (QKRs). We demonstrate that, for a BRM of bandwidth b and a QKR of chaos parameter α, the appropriate FRCG model has the effective range d = b²/N = α²/N, for large N matrix dimensionality. As d increases, there is a transition from Poisson to classical random matrix statistics.

  7. Finite-range Coulomb gas models of banded random matrices and quantum kicked rotors.

    PubMed

    Pandey, Akhilesh; Kumar, Avanish; Puri, Sanjay

    2017-11-01

    Dyson demonstrated an equivalence between infinite-range Coulomb gas models and classical random matrix ensembles for the study of eigenvalue statistics. We introduce finite-range Coulomb gas (FRCG) models via a Brownian matrix process, and study them analytically and by Monte Carlo simulations. These models yield new universality classes, and provide a theoretical framework for the study of banded random matrices (BRMs) and quantum kicked rotors (QKRs). We demonstrate that, for a BRM of bandwidth b and a QKR of chaos parameter α, the appropriate FRCG model has the effective range d=b^{2}/N=α^{2}/N, for large N matrix dimensionality. As d increases, there is a transition from Poisson to classical random matrix statistics.

  8. A stochastic event-based continuous time step rainfall generator based on Poisson rectangular pulse and microcanonical random cascade models

    NASA Astrophysics Data System (ADS)

    Pohle, Ina; Niebisch, Michael; Zha, Tingting; Schümberg, Sabine; Müller, Hannes; Maurer, Thomas; Hinz, Christoph

    2017-04-01

    Rainfall variability within a storm is of major importance for fast hydrological processes, e.g. surface runoff, erosion and solute dissipation from surface soils. To investigate and simulate the impacts of within-storm variabilities on these processes, long time series of rainfall with high resolution are required. Yet, observed precipitation records of hourly or higher resolution are in most cases available only for a small number of stations and only for a few years. To obtain long time series of alternating rainfall events and interstorm periods while conserving the statistics of observed rainfall events, the Poisson model can be used. Multiplicative microcanonical random cascades have been widely applied to disaggregate rainfall time series from coarse to fine temporal resolution. We present a new coupling approach of the Poisson rectangular pulse model and the multiplicative microcanonical random cascade model that preserves the characteristics of rainfall events as well as inter-storm periods. In the first step, a Poisson rectangular pulse model is applied to generate discrete rainfall events (duration and mean intensity) and inter-storm periods (duration). The rainfall events are subsequently disaggregated to high-resolution time series (user-specified, e.g. 10 min resolution) by a multiplicative microcanonical random cascade model. One of the challenges of coupling these models is to parameterize the cascade model for the event durations generated by the Poisson model. In fact, the cascade model is best suited to downscale rainfall data with constant time step such as daily precipitation data. Without starting from a fixed time step duration (e.g. daily), the disaggregation of events requires some modifications of the multiplicative microcanonical random cascade model proposed by Olsson (1998): Firstly, the parameterization of the cascade model for events of different durations requires continuous functions for the probabilities of the multiplicative weights, which we implemented through sigmoid functions. Secondly, the branching of the first and last box is constrained to preserve the rainfall event durations generated by the Poisson rectangular pulse model. The event-based continuous time step rainfall generator has been developed and tested using 10 min and hourly rainfall data of four stations in North-Eastern Germany. The model performs well in comparison to observed rainfall in terms of event durations and mean event intensities as well as wet spell and dry spell durations. It is currently being tested using data from other stations across Germany and in different climate zones. Furthermore, the rainfall event generator is being applied in modelling approaches aimed at understanding the impact of rainfall variability on hydrological processes. Reference: Olsson, J.: Evaluation of a scaling cascade model for temporal rainfall disaggregation, Hydrology and Earth System Sciences, 2, 19-30, 1998.
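
The first stage of such a generator can be sketched as alternating exponential dry spells and rectangular storm pulses with exponential duration and mean intensity. All parameter values below are illustrative, and the cascade disaggregation stage is omitted.

```python
import random
import statistics

def poisson_rectangular_pulses(n_events, mean_dry=30.0, mean_dur=6.0,
                               mean_int=1.5, seed=11):
    """Poisson rectangular pulse stage: alternating exponential dry-spell
    durations (h) and rectangular storm pulses with exponential duration (h)
    and exponential mean intensity (mm/h)."""
    rng = random.Random(seed)
    t, events = 0.0, []
    for _ in range(n_events):
        t += rng.expovariate(1.0 / mean_dry)     # inter-storm (dry) period
        dur = rng.expovariate(1.0 / mean_dur)    # storm duration
        inten = rng.expovariate(1.0 / mean_int)  # mean storm intensity
        events.append((t, dur, inten))           # (start, duration, intensity)
        t += dur
    return events

events = poisson_rectangular_pulses(1000)
starts = [e[0] for e in events]
depths = [dur * inten for _, dur, inten in events]
print(statistics.mean(depths))  # E[depth] = mean_dur * mean_int = 9 mm
```

Each rectangular pulse would then be handed to the cascade model for within-event disaggregation.
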

  9. Variable- and Person-Centered Approaches to the Analysis of Early Adolescent Substance Use: Linking Peer, Family, and Intervention Effects with Developmental Trajectories

    ERIC Educational Resources Information Center

    Connell, Arin M.; Dishion, Thomas J.; Deater-Deckard, Kirby

    2006-01-01

    This 4-year study of 698 young adolescents examined the covariates of early onset substance use from Grade 6 through Grade 9. The youth were randomly assigned to a family-centered Adolescent Transitions Program (ATP) condition. Variable-centered (zero-inflated Poisson growth model) and person-centered (latent growth mixture model) approaches were…

  10. Formulation of the Multi-Hit Model With a Non-Poisson Distribution of Hits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vassiliev, Oleg N., E-mail: Oleg.Vassiliev@albertahealthservices.ca

    2012-07-15

    Purpose: We proposed a formulation of the multi-hit single-target model in which the Poisson distribution of hits was replaced by a combination of two distributions: one for the number of particles entering the target and one for the number of hits a particle entering the target produces. Such an approach reflects the fact that radiation damage is a result of two different random processes: particle emission by a radiation source and interaction of particles with matter inside the target. Methods and Materials: The Poisson distribution is well justified for the first of the two processes. The second distribution depends on how a hit is defined. To test our approach, we assumed that the second distribution was also a Poisson distribution. The two distributions combined resulted in a non-Poisson distribution. We tested the proposed model by comparing it with previously reported data for DNA single- and double-strand breaks induced by protons and electrons, for survival of a range of cell lines, and variation of the initial slopes of survival curves with radiation quality for heavy-ion beams. Results: Analysis of cell survival equations for this new model showed that they had realistic properties overall, such as the initial and high-dose slopes of survival curves, the shoulder, and relative biological effectiveness (RBE). In most cases tested, a better fit of survival curves was achieved with the new model than with the linear-quadratic model. The results also suggested that the proposed approach may extend the multi-hit model beyond its traditional role in analysis of survival curves to predicting effects of radiation quality and analysis of DNA strand breaks. Conclusions: Our model, although conceptually simple, performed well in all tests. The model was able to consistently fit data for both cell survival and DNA single- and double-strand breaks. It correctly predicted the dependence of radiation effects on parameters of radiation quality.
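
Combining a Poisson number of particles, each producing a Poisson number of hits, yields the over-dispersed "Poisson of Poissons" (Neyman type A) distribution, which is easy to verify by simulation. The parameter values below are illustrative, not from the paper.

```python
import math
import random
import statistics

def neyman_hits(mu, nu, n_samples, seed=5):
    """Total hit counts when the number of particles entering the target is
    Poisson(mu) and each particle independently produces Poisson(nu) hits
    (the compound Neyman type A distribution)."""
    rng = random.Random(seed)

    def poisson(lam):
        # Knuth's multiplication method
        L, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= L:
                return k
            k += 1

    return [sum(poisson(nu) for _ in range(poisson(mu))) for _ in range(n_samples)]

hits = neyman_hits(mu=4.0, nu=2.0, n_samples=20000)
m, v = statistics.mean(hits), statistics.pvariance(hits)
# mean = mu*nu = 8 but variance = mu*nu*(1 + nu) = 24: over-dispersed,
# unlike a single Poisson distribution, for which variance = mean.
print(m, v)
```
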

  11. Waiting-time distributions of magnetic discontinuities: clustering or Poisson process?

    PubMed

    Greco, A; Matthaeus, W H; Servidio, S; Dmitruk, P

    2009-10-01

    Using solar wind data from the Advanced Composition Explorer spacecraft, with the support of Hall magnetohydrodynamic simulations, the waiting-time distributions of magnetic discontinuities have been analyzed. A possible phenomenon of clusterization of these discontinuities is studied in detail. We perform a local Poisson's analysis in order to establish if these intermittent events are randomly distributed or not. Possible implications about the nature of solar wind discontinuities are discussed.
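
A quick way to probe clustering versus a Poisson process is the coefficient of variation of the waiting times, which equals 1 for a memoryless process. The sketch below applies it to synthetic event times; the burst model and rates are our own illustration, not the ACE data or the paper's local Poisson analysis.

```python
import random
import statistics

def waiting_time_cv(times):
    """Coefficient of variation of the waiting times between successive events:
    ≈ 1 for a memoryless (Poisson) process, > 1 when events cluster."""
    waits = [b - a for a, b in zip(times, times[1:])]
    return statistics.pstdev(waits) / statistics.mean(waits)

rng = random.Random(2)

# Poisson process: exponential inter-event times
t, poisson_times = 0.0, []
for _ in range(5000):
    t += rng.expovariate(1.0)
    poisson_times.append(t)

# Clustered process: rare bursts, each containing many closely spaced events
t, clustered_times = 0.0, []
for _ in range(500):
    t += rng.expovariate(0.1)        # long gap between bursts
    for _ in range(10):
        t += rng.expovariate(50.0)   # short gaps inside a burst
        clustered_times.append(t)

cv_poisson = waiting_time_cv(poisson_times)
cv_clustered = waiting_time_cv(clustered_times)
print(cv_poisson, cv_clustered)
```
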

  12. The Poisson Random Process. Applications of Probability Theory to Operations Research. Modules and Monographs in Undergraduate Mathematics and Its Applications Project. UMAP Unit 340.

    ERIC Educational Resources Information Center

    Wilde, Carroll O.

    The Poisson probability distribution is seen to provide a mathematical model from which useful information can be obtained in practical applications. The distribution and some situations to which it applies are studied, and ways to find answers to practical questions are noted. The unit includes exercises and a model exam, and provides answers to…

  13. Waiting-time distributions of magnetic discontinuities: Clustering or Poisson process?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greco, A.; Matthaeus, W. H.; Servidio, S.

    2009-10-15

    Using solar wind data from the Advanced Composition Explorer spacecraft, with the support of Hall magnetohydrodynamic simulations, the waiting-time distributions of magnetic discontinuities have been analyzed. A possible phenomenon of clusterization of these discontinuities is studied in detail. We perform a local Poisson's analysis in order to establish if these intermittent events are randomly distributed or not. Possible implications about the nature of solar wind discontinuities are discussed.

  14. Harmonic statistics

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo

    2017-05-01

    The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, yet they fall short in their 'public relations' for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford's law, and 1/f noise.
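
    The scale-invariance property can be sketched numerically. A minimal simulation, assuming the simplest harmonic intensity λ(x) = 1/x (a special case; the abstract's general object may differ): the point count in an interval [a, b] is then Poisson with mean log(b/a), which depends only on the ratio b/a.

```python
import numpy as np

# Sketch (not from the paper): a Poisson process on the positive half-line
# with harmonic intensity lambda(x) = 1/x.  The number of points in [a, b]
# is Poisson with mean log(b/a) -- a function of the *ratio* b/a only,
# which is the scale invariance alluded to in the abstract.
rng = np.random.default_rng(42)

def harmonic_counts(a, b, trials):
    """Sample the point count in [a, b] over `trials` realizations."""
    mu = np.log(b / a)            # mean measure of [a, b] under intensity 1/x
    return rng.poisson(mu, size=trials)

trials = 200_000
c1 = harmonic_counts(1.0, 2.0, trials)     # interval [1, 2]
c2 = harmonic_counts(10.0, 20.0, trials)   # same ratio, ten times the scale

mean1, mean2 = c1.mean(), c2.mean()
# Both means should be close to log(2) ~= 0.693, regardless of scale.
```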

  15. A comparison of bivariate, multivariate random-effects, and Poisson correlated gamma-frailty models to meta-analyze individual patient data of ordinal scale diagnostic tests.

    PubMed

    Simoneau, Gabrielle; Levis, Brooke; Cuijpers, Pim; Ioannidis, John P A; Patten, Scott B; Shrier, Ian; Bombardier, Charles H; de Lima Osório, Flavia; Fann, Jesse R; Gjerdingen, Dwenda; Lamers, Femke; Lotrakul, Manote; Löwe, Bernd; Shaaban, Juwita; Stafford, Lesley; van Weert, Henk C P M; Whooley, Mary A; Wittkampf, Karin A; Yeung, Albert S; Thombs, Brett D; Benedetti, Andrea

    2017-11-01

    Individual patient data (IPD) meta-analyses are increasingly common in the literature. In the context of estimating the diagnostic accuracy of ordinal or semi-continuous scale tests, sensitivity and specificity are often reported for a given threshold or a small set of thresholds, and a meta-analysis is conducted via a bivariate approach to account for their correlation. When IPD are available, sensitivity and specificity can be pooled for every possible threshold. Our objective was to compare the bivariate approach, which can be applied separately at every threshold, to two multivariate methods: the ordinal multivariate random-effects model and the Poisson correlated gamma-frailty model. Our comparison was empirical, using IPD from 13 studies that evaluated the diagnostic accuracy of the 9-item Patient Health Questionnaire depression screening tool, and included simulations. The empirical comparison showed that the implementation of the two multivariate methods is more laborious in terms of computational time and sensitivity to user-supplied values compared to the bivariate approach. Simulations showed that ignoring the within-study correlation of sensitivity and specificity across thresholds did not worsen inferences with the bivariate approach compared to the Poisson model. The ordinal approach was not suitable for simulations because the model was highly sensitive to user-supplied starting values. We tentatively recommend the bivariate approach rather than more complex multivariate methods for IPD diagnostic accuracy meta-analyses of ordinal scale tests, although the limited type of diagnostic data considered in the simulation study restricts the generalization of our findings. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. The Significance of an Excess in a Counting Experiment: Assessing the Impact of Systematic Uncertainties and the Case with a Gaussian Background

    NASA Astrophysics Data System (ADS)

    Vianello, Giacomo

    2018-05-01

    Several experiments in high-energy physics and astrophysics can be treated as on/off measurements, where an observation potentially containing a new source or effect (“on” measurement) is contrasted with a background-only observation free of the effect (“off” measurement). In counting experiments, the significance of the new source or effect can be estimated with a widely used formula from Li & Ma, which assumes that both measurements are Poisson random variables. In this paper we study three other cases: (i) the ideal case where the background measurement has no uncertainty, which can be used to study the maximum sensitivity that an instrument can achieve, (ii) the case where the background estimate b in the off measurement has an additional systematic uncertainty, and (iii) the case where b is a Gaussian random variable instead of a Poisson random variable. The latter case applies when b comes from a model fitted on archival or ancillary data, or from the interpolation of a function fitted on data surrounding the candidate new source/effect. Practitioners typically use a formula that is only valid when b is large and when its uncertainty is very small, while we derive a general formula that can be applied in all regimes. We also develop simple methods that can be used to assess how much an estimate of significance is sensitive to systematic uncertainties on the efficiency or on the background. Examples of applications include the detection of short gamma-ray bursts and of new X-ray or γ-ray sources. All the techniques presented in this paper are made available in a Python code that is ready to use.
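
    For reference, the widely used Li & Ma formula mentioned above (their Eq. 17) can be computed directly. The sketch below implements only that baseline Poisson-background case, not the paper's generalizations for systematic uncertainties or Gaussian backgrounds.

```python
import math

def li_ma_significance(n_on, n_off, alpha):
    """Li & Ma (1983) Eq. 17 significance for an on/off counting
    measurement; alpha is the on/off exposure ratio t_on / t_off."""
    term_on = n_on * math.log((1 + alpha) / alpha * n_on / (n_on + n_off))
    term_off = n_off * math.log((1 + alpha) * n_off / (n_on + n_off))
    return math.sqrt(2.0 * (term_on + term_off))

# Equal exposures (alpha = 1): 50 counts on-source vs 20 off-source.
s = li_ma_significance(50, 20, 1.0)   # ~3.6 sigma
```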

  17. Applying the zero-inflated Poisson model with random effects to detect abnormal rises in school absenteeism indicating infectious diseases outbreak.

    PubMed

    Song, X X; Zhao, Q; Tao, T; Zhou, C M; Diwan, V K; Xu, B

    2018-05-30

    Records of absenteeism from primary schools are valuable data for infectious disease surveillance. However, the analysis of absenteeism data is complicated by clustering at zero, non-independence and overdispersion. This study aimed to generate an appropriate model for the absenteeism data collected in a European Commission-funded project for infectious disease surveillance in rural China, and to evaluate the validity and timeliness of the resulting model for early warning of infectious disease outbreaks. Four steps were taken: (1) building a 'well-fitting' model by the zero-inflated Poisson model with random effects (ZIP-RE) using the absenteeism data from the first implementation year; (2) applying the resulting model to predict the 'expected' number of absenteeism events in the second implementation year; (3) computing the differences between the observations and the expected values (O-E values) to generate an alternative series of data; (4) evaluating the early-warning validity and timeliness of the observational data and model-based O-E values via the EARS-3C algorithms with regard to the detection of real cluster events. The results indicate that ZIP-RE and its corresponding O-E values improve the detection of aberrations, reduce false-positive signals and are applicable to zero-inflated data.
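
    A minimal sketch of the zero-inflated Poisson data-generating mechanism such a model targets (parameters invented, not estimated from the study's data): with structural-zero probability pi and Poisson mean mu, P(Y = 0) = pi + (1 - pi)·exp(-mu) and E[Y] = (1 - pi)·mu.

```python
import numpy as np

# Illustrative ZIP simulation (invented parameters): a share of units
# produce structural zeros; the rest produce ordinary Poisson counts.
rng = np.random.default_rng(7)
pi, mu, n = 0.3, 2.0, 100_000

structural_zero = rng.random(n) < pi          # e.g. days with no absence reports
counts = np.where(structural_zero, 0, rng.poisson(mu, size=n))

p_zero = (counts == 0).mean()    # approaches 0.3 + 0.7*exp(-2) ~= 0.395
mean_count = counts.mean()       # approaches 0.7 * 2 = 1.4
```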

  18. Log-normal frailty models fitted as Poisson generalized linear mixed models.

    PubMed

    Hirsch, Katharina; Wienke, Andreas; Kuss, Oliver

    2016-12-01

    The equivalence of a survival model with a piecewise constant baseline hazard function and a Poisson regression model has been known for decades. As shown in recent studies, this equivalence carries over to clustered survival data: A frailty model with a log-normal frailty term can be interpreted and estimated as a generalized linear mixed model with a binary response, a Poisson likelihood, and a specific offset. Proceeding this way, statistical theory and software for generalized linear mixed models are readily available for fitting frailty models. This gain in flexibility comes at the small price of (1) having to fix the number of pieces for the baseline hazard in advance and (2) having to "explode" the data set by the number of pieces. In this paper we extend the simulations of former studies by using a more realistic baseline hazard (Gompertz) and by comparing the model under consideration with competing models. Furthermore, the SAS macro %PCFrailty is introduced to apply the Poisson generalized linear mixed approach to frailty models. The simulations show good results for the shared frailty model. Our new %PCFrailty macro provides proper estimates, especially in the case of 4 events per piece. The suggested Poisson generalized linear mixed approach for log-normal frailty models based on the %PCFrailty macro provides several advantages in the analysis of clustered survival data with respect to more flexible modelling of fixed and random effects, exact (in the sense of non-approximate) maximum likelihood estimation, and standard errors and different types of confidence intervals for all variance parameters. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
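
    The "explode" step can be sketched in a few lines (cut points and follow-up times invented for illustration; this shows the data layout, not the %PCFrailty macro itself): each subject's follow-up is split at pre-chosen cut points into pieces, each contributing a pseudo-observation with a binary event indicator and log(exposure) as the offset.

```python
# Sketch of exploding one subject's survival record into piecewise-exponential
# (Poisson) pseudo-observations.  Values are invented for illustration.
def explode(time, event, cuts):
    """Return (piece_index, exposure, event_indicator) rows for one subject."""
    rows = []
    start = 0.0
    for j, end in enumerate(cuts):
        if time <= start:
            break
        exposure = min(time, end) - start     # time at risk within this piece
        died_here = int(bool(event) and time <= end)
        rows.append((j, exposure, died_here))
        start = end
    return rows

# Subject observed for 3.5 time units, then experiences the event.
rows = explode(3.5, event=1, cuts=[1.0, 2.0, 4.0, 6.0])
# -> pieces 0, 1, 2 with exposures 1.0, 1.0, 1.5; event flagged only in the last.
```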

  19. The analysis of incontinence episodes and other count data in patients with overactive bladder by Poisson and negative binomial regression.

    PubMed

    Martina, R; Kay, R; van Maanen, R; Ridder, A

    2015-01-01

    Clinical studies in overactive bladder have traditionally used analysis of covariance or nonparametric methods to analyse the number of incontinence episodes and other count data. It is known that if the underlying distributional assumptions of a particular parametric method do not hold, an alternative parametric method may be more efficient than a nonparametric one, which makes no assumptions regarding the underlying distribution of the data. Therefore, there are advantages in using methods based on the Poisson distribution or extensions of that method, which incorporate specific features that provide a modelling framework for count data. One challenge with count data is overdispersion, but methods are available that can account for this through the introduction of random effect terms in the modelling, and it is this modelling framework that leads to the negative binomial distribution. These models can also provide clinicians with a clearer and more appropriate interpretation of treatment effects in terms of rate ratios. In this paper, the previously used parametric and nonparametric approaches are contrasted with those based on Poisson regression and various extensions in trials evaluating solifenacin and mirabegron in patients with overactive bladder. In these applications, negative binomial models are seen to fit the data well. Copyright © 2014 John Wiley & Sons, Ltd.
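
    The random-effect mechanism described above can be sketched numerically (parameters invented): mixing a gamma-distributed rate over a Poisson count produces negative binomial counts, whose variance exceeds the mean — the overdispersion the negative binomial model absorbs.

```python
import numpy as np

# Gamma-mixed Poisson counts are negative binomial: with gamma shape k and
# mean m, Var(Y) = m + m^2 / k > m.  Parameters below are invented.
rng = np.random.default_rng(0)
shape, scale, n = 2.0, 2.5, 200_000   # gamma mean = shape * scale = 5

rates = rng.gamma(shape, scale, size=n)   # patient-specific episode rates
counts = rng.poisson(rates)               # observed episode counts

m, v = counts.mean(), counts.var()
# Negative binomial prediction: Var = 5 + 25/2 = 17.5, well above the mean 5.
```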

  20. Isolation and Connectivity in Random Geometric Graphs with Self-similar Intensity Measures

    NASA Astrophysics Data System (ADS)

    Dettmann, Carl P.

    2018-05-01

    Random geometric graphs consist of randomly distributed nodes (points), with pairs of nodes within a given mutual distance linked. In the usual model the distribution of nodes is uniform on a square, and in the limit of infinitely many nodes and shrinking linking range, the number of isolated nodes is Poisson distributed, and the probability of no isolated nodes is equal to the probability the whole graph is connected. Here we examine these properties for several self-similar node distributions, including smooth and fractal, uniform and nonuniform, and finitely ramified or otherwise. We show that nonuniformity can break the Poisson distribution property, but it strengthens the link between isolation and connectivity. It also stretches out the connectivity transition. Finite ramification is another mechanism for lack of connectivity. The same considerations apply to fractal distributions as smooth, with some technical differences in evaluation of the integrals and analytical arguments.
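
    A small simulation of the uniform-square baseline (parameters invented) illustrates the Poisson property of the isolated-node count: for a Poisson-distributed count, the sample variance-to-mean ratio should be near 1.

```python
import numpy as np

# Count isolated nodes in uniform random geometric graphs on the unit square.
# For n nodes and linking radius r in the sparse regime, the isolated-node
# count is approximately Poisson (variance ~= mean).  Parameters invented.
rng = np.random.default_rng(1)
n_nodes, radius, trials = 200, 0.08, 400

isolated = np.empty(trials)
for t in range(trials):
    pts = rng.random((n_nodes, 2))                    # uniform node positions
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                       # ignore self-distances
    isolated[t] = (d.min(axis=1) > radius).sum()      # nodes with no neighbor

m, v = isolated.mean(), isolated.var()
vmr = v / m    # variance-to-mean ratio, near 1 for a Poisson count
```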

  1. Approximations to camera sensor noise

    NASA Astrophysics Data System (ADS)

    Jin, Xiaodan; Hirakawa, Keigo

    2013-02-01

    Noise is present in all image sensor data. The Poisson distribution is commonly used to model the stochastic nature of the photon arrival process, while readout/thermal noise is typically approximated by additive white Gaussian noise (AWGN). Other sources of signal-dependent noise, such as Fano and quantization noise, also contribute to the overall noise profile. Questions remain, however, about how best to model the combined sensor noise. Though additive Gaussian noise with signal-dependent noise variance (SD-AWGN) and Poisson corruption are two widely used models to approximate the actual sensor noise distribution, the justifications given for these models are based on limited evidence. The goal of this paper is to provide a more comprehensive characterization of random noise. We conclude by presenting concrete evidence that the Poisson model is a better approximation to real camera noise than SD-AWGN. We suggest further modifications to the Poisson model that may improve the noise model.
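
    One concrete way the two models differ can be sketched on synthetic data (an illustration, not the paper's characterization): at low photon counts, Poisson noise is visibly skewed, while the signal-dependent Gaussian approximation is symmetric.

```python
import numpy as np

# Compare Poisson photon noise with its SD-AWGN approximation (variance equal
# to the signal) at a low mean photon count, where the two diverge most.
rng = np.random.default_rng(3)
signal, n = 2.0, 200_000   # mean photon count per pixel (low-light regime)

poisson_obs = rng.poisson(signal, n)
gauss_obs = signal + rng.normal(0.0, np.sqrt(signal), n)  # SD-AWGN: var = signal

def skew(x):
    z = x - x.mean()
    return (z**3).mean() / (z**2).mean() ** 1.5

s_poisson, s_gauss = skew(poisson_obs), skew(gauss_obs)
# Poisson skewness is 1/sqrt(signal) ~= 0.71 here; the Gaussian's is ~0.
```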

  2. Random matrices and the New York City subway system

    NASA Astrophysics Data System (ADS)

    Jagannath, Aukosh; Trogdon, Thomas

    2017-09-01

    We analyze subway arrival times in the New York City subway system. We find regimes where the gaps between trains are well modeled by (unitarily invariant) random matrix statistics and Poisson statistics. The departure from random matrix statistics is captured by the value of the Coulomb potential along the subway route. This departure becomes more pronounced as trains make more stops.

  3. Simulation of flight maneuver-load distributions by utilizing stationary, non-Gaussian random load histories

    NASA Technical Reports Server (NTRS)

    Leybold, H. A.

    1971-01-01

    Random numbers were generated with the aid of a digital computer and transformed such that the probability density function of a discrete random load history composed of these random numbers had one of the following non-Gaussian distributions: Poisson, binomial, log-normal, Weibull, and exponential. The resulting random load histories were analyzed to determine their peak statistics and were compared with cumulative peak maneuver-load distributions for fighter and transport aircraft in flight.
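
    The transformation idea can be sketched with the standard inverse-CDF method (the report's exact procedure may differ): uniform random numbers are mapped through an inverse cumulative distribution function to obtain non-Gaussian variates, shown here for the exponential and Weibull cases with invented parameters.

```python
import math
import random

# Inverse-CDF transforms of uniform variates to two of the distributions
# named in the abstract (exponential and Weibull).
random.seed(1971)

def exponential(rate):
    # If U ~ Uniform(0,1), then -ln(1-U)/rate is exponential with that rate.
    return -math.log(1.0 - random.random()) / rate

def weibull(shape, scale):
    # Weibull via inverse CDF: scale * (-ln(1-U))^(1/shape).
    return scale * (-math.log(1.0 - random.random())) ** (1.0 / shape)

draws = [exponential(rate=2.0) for _ in range(100_000)]
mean_exp = sum(draws) / len(draws)   # should approach 1/rate = 0.5
```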

  4. Eruption patterns of the chilean volcanoes Villarrica, Llaima, and Tupungatito

    NASA Astrophysics Data System (ADS)

    Muñoz, Miguel

    1983-09-01

    The historical eruption records of three Chilean volcanoes have been subjected to many statistical tests, and none have been found to differ significantly from random, or Poissonian, behaviour. The statistical analysis shows rough conformity with the descriptions determined from the eruption rate functions. It is possible that a constant eruption rate describes the activity of Villarrica; Llaima and Tupungatito present complex eruption rate patterns that appear, however, to have no statistical significance. Questions related to loading and extinction processes and to the existence of shallow secondary magma chambers to which magma is supplied from a deeper system are also addressed. The analysis and the computation of the serial correlation coefficients indicate that the three series may be regarded as stationary renewal processes. None of the test statistics indicates rejection of the Poisson hypothesis at a level less than 5%, but the coefficient of variation for the eruption series at Llaima is significantly different from the value expected for a Poisson process. Also, the estimates of the normalized spectrum of the counting process for the three series suggest a departure from the random model, but the deviations are not found to be significant at the 5% level. Kolmogorov-Smirnov and chi-squared test statistics, applied directly to ascertain the probability P with which the random Poisson model fits the data, indicate that there is significant agreement in the case of Villarrica (P = 0.59) and Tupungatito (P = 0.3). Even though the P-value for Llaima is a marginally significant 0.1 (which is equivalent to rejecting the Poisson model at the 90% confidence level), the series suggests that nonrandom features are possibly present in the eruptive activity of this volcano.

  5. Applying the Anderson-Darling test to suicide clusters: evidence of contagion at U. S. universities?

    PubMed

    MacKenzie, Donald W

    2013-01-01

    Suicide clusters at Cornell University and the Massachusetts Institute of Technology (MIT) prompted popular and expert speculation of suicide contagion. However, some clustering is to be expected in any random process. This work tested whether suicide clusters at these two universities differed significantly from those expected under a homogeneous Poisson process, in which suicides occur randomly and independently of one another. Suicide dates were collected for MIT and Cornell for 1990-2012. The Anderson-Darling statistic was used to test the goodness-of-fit of the intervals between suicides to the distribution expected under the Poisson process. Suicides at MIT were consistent with the homogeneous Poisson process, while those at Cornell showed clustering inconsistent with such a process (p = .05). The Anderson-Darling test provides a statistically powerful means to identify suicide clustering in small samples. Practitioners can use this method to test for clustering in relevant communities. The difference in clustering behavior between the two institutions suggests that more institutions should be studied to determine the prevalence of suicide clustering in universities and its causes.
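
    The testing idea can be sketched on synthetic event gaps (not the MIT/Cornell records): under a homogeneous Poisson process the inter-event intervals are exponential, so an Anderson-Darling test for exponentiality applied to the gaps probes for clustering.

```python
import numpy as np
from scipy import stats

# Compare exponential (Poisson-process) gaps with a clustered alternative:
# a mixture of short within-burst gaps and long between-burst lulls.
rng = np.random.default_rng(2013)

poisson_gaps = rng.exponential(30.0, size=100)        # memoryless gaps (days)
clustered_gaps = np.concatenate([
    rng.exponential(2.0, size=50),                    # bursts
    rng.exponential(120.0, size=50),                  # lulls
])

res_pois = stats.anderson(poisson_gaps, dist='expon')
res_clust = stats.anderson(clustered_gaps, dist='expon')
# A statistic above the critical values signals departure from the
# exponential (i.e., homogeneous Poisson) hypothesis.
```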

  6. De Rham-Hodge decomposition and vanishing of harmonic forms by derivation operators on the Poisson space

    NASA Astrophysics Data System (ADS)

    Privault, Nicolas

    2016-05-01

    We construct differential forms of all orders and a covariant derivative together with its adjoint on the probability space of a standard Poisson process, using derivation operators. In this framework we derive a de Rham-Hodge-Kodaira decomposition as well as Weitzenböck and Clark-Ocone formulas for random differential forms. As in the Wiener space setting, this construction provides two distinct approaches to the vanishing of harmonic differential forms.

  7. [Application of negative binomial regression and modified Poisson regression in the research of risk factors for injury frequency].

    PubMed

    Cao, Qingqing; Wu, Zhenqiang; Sun, Ying; Wang, Tiezhu; Han, Tengwei; Gu, Chaomei; Sun, Yehuan

    2011-11-01

    To explore the application of negative binomial regression and modified Poisson regression in analyzing the influential factors for injury frequency and the risk factors leading to increased injury frequency. 2917 primary and secondary school students were selected from Hefei by a cluster random sampling method and surveyed by questionnaire. The count data on injury events were used to fit modified Poisson regression and negative binomial regression models. The risk factors for increased unintentional injury frequency among the students were explored, so as to probe the efficiency of these two models in studying the influential factors for injury frequency. The Poisson model showed over-dispersion (P < 0.0001) based on the Lagrange multiplier test. Therefore, the over-dispersed data were better fitted by the modified Poisson regression and negative binomial regression models. Both showed that male gender, younger age, a father working outside the hometown, a guardian's education above junior high school, and smoking might be associated with higher injury frequencies. For clustered count data on injury events, both modified Poisson regression analysis and negative binomial regression analysis can be used. However, based on our data, the modified Poisson regression fitted better and could give a more accurate interpretation of relevant factors affecting the frequency of injury.
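
    The Lagrange multiplier (score) test for overdispersion mentioned above can be sketched in its simplest intercept-only form on simulated data (not the Hefei survey): under the Poisson null, T = [Σ((y − μ)² − y)]² / (2Σμ²) is approximately chi-square with 1 degree of freedom.

```python
import numpy as np

# Intercept-only LM overdispersion test on simulated counts: an equidispersed
# Poisson sample versus a gamma-mixed (overdispersed) sample.
rng = np.random.default_rng(2011)

def lm_overdispersion(y):
    mu = y.mean()                          # fitted mean, intercept-only Poisson
    num = ((y - mu) ** 2 - y).sum() ** 2
    return num / (2.0 * (mu ** 2) * len(y))

n = 50_000
equi = rng.poisson(1.5, n)                     # Poisson: variance = mean
over = rng.poisson(rng.gamma(1.0, 1.5, n))     # gamma-mixed: variance > mean

t_equi, t_over = lm_overdispersion(equi), lm_overdispersion(over)
# t_over vastly exceeds the chi-square(1) 5% cutoff of 3.84.
```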

  8. A poisson process model for hip fracture risk.

    PubMed

    Schechner, Zvi; Luo, Gangming; Kaufman, Jonathan J; Siffert, Robert S

    2010-08-01

    The primary method for assessing fracture risk in osteoporosis relies on measurement of bone mass. Estimation of fracture risk is most often evaluated using logistic or proportional hazards models. Notwithstanding the success of these models, there is still much uncertainty as to who will or will not suffer a fracture. This has led to a search for other components besides mass that affect bone strength. The purpose of this paper is to introduce a new mechanistic stochastic model that characterizes the risk of hip fracture in an individual. A Poisson process is used to model the occurrence of falls, which are assumed to occur at a rate λ. The load induced by a fall is assumed to be a random variable that has a Weibull probability distribution. The combination of falls together with loads leads to a compound Poisson process. By retaining only those occurrences of the compound Poisson process that result in a hip fracture, a thinned Poisson process is defined that itself is a Poisson process. The fall rate is modeled as an affine function of age, and hip strength is modeled as a power law function of bone mineral density (BMD). The risk of hip fracture can then be computed as a function of age and BMD. By extending the analysis to a Bayesian framework, the conditional densities of BMD given a prior fracture and no prior fracture can be computed and shown to be consistent with clinical observations. In addition, the conditional probabilities of fracture given a prior fracture and no prior fracture can also be computed, and also demonstrate results similar to clinical data. The model elucidates the fact that the hip fracture process is inherently random and improvements in hip strength estimation over and above that provided by BMD operate in a highly "noisy" environment and may therefore have little ability to impact clinical practice.
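
    The model structure is easy to simulate (all parameters invented, not the paper's fitted values): falls arrive as a Poisson process, each fall's load is Weibull, and thinning by the fracture condition yields a Poisson fracture process with rate λ·P(load > strength).

```python
import numpy as np

# Thinned compound-Poisson sketch: Poisson falls, Weibull loads, and a fixed
# strength threshold.  With Weibull shape k and scale c,
# P(load > s) = exp(-(s/c)^k), so fractures are Poisson with rate
# fall_rate * exp(-(s/c)^k).
rng = np.random.default_rng(2010)

fall_rate, horizon = 2.0, 1.5            # falls/year, years of follow-up
shape, scale, strength = 2.0, 1.0, 1.0   # Weibull load; strength threshold

n_subjects = 50_000
falls = rng.poisson(fall_rate * horizon, size=n_subjects)
loads = scale * rng.weibull(shape, size=falls.sum())

# Map each load back to its subject and count loads exceeding strength.
subject = np.repeat(np.arange(n_subjects), falls)
fractures = np.bincount(subject[loads > strength], minlength=n_subjects)

mean_frac = fractures.mean()
# Thinning prediction: 2 * 1.5 * exp(-1) ~= 1.104 fractures per subject.
```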

  9. Kernel-Correlated Levy Field Driven Forward Rate and Application to Derivative Pricing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bo Lijun; Wang Yongjin; Yang Xuewei, E-mail: xwyangnk@yahoo.com.cn

    2013-08-01

    We propose a term structure of forward rates driven by a kernel-correlated Levy random field under the HJM framework. The kernel-correlated Levy random field is composed of a kernel-correlated Gaussian random field and a centered Poisson random measure. We shall give a criterion to preclude arbitrage under the risk-neutral pricing measure. As applications, an interest rate derivative with general payoff functional is priced under this pricing measure.

  10. Saint-Venant end effects for materials with negative Poisson's ratios

    NASA Technical Reports Server (NTRS)

    Lakes, R. S.

    1992-01-01

    Results are presented from an analysis of Saint-Venant end effects for materials with negative Poisson's ratio. Examples are presented showing that slow decay of end stress occurs in circular cylinders of negative Poisson's ratio, whereas a sandwich panel containing rigid face sheets and a compliant core exhibits no anomalous effects for negative Poisson's ratio (but exhibits slow stress decay for core Poisson's ratios approaching 0.5). In sandwich panels with stiff but not perfectly rigid face sheets, a negative Poisson's ratio results in end stress decay, which is faster than it would be otherwise. It is suggested that the slow decay previously predicted for sandwich strips in plane deformation as a result of the geometry can be mitigated by the use of a negative Poisson's ratio material for the core.

  11. Survival analysis of clinical mastitis data using a nested frailty Cox model fit as a mixed-effects Poisson model.

    PubMed

    Elghafghuf, Adel; Dufour, Simon; Reyher, Kristen; Dohoo, Ian; Stryhn, Henrik

    2014-12-01

    Mastitis is a complex disease affecting dairy cows and is considered to be the most costly disease of dairy herds. The hazard of mastitis is a function of many factors, both managerial and environmental, making its control a difficult issue to milk producers. Observational studies of clinical mastitis (CM) often generate datasets with a number of characteristics which influence the analysis of those data: the outcome of interest may be the time to occurrence of a case of mastitis, predictors may change over time (time-dependent predictors), the effects of factors may change over time (time-dependent effects), there are usually multiple hierarchical levels, and datasets may be very large. Analysis of such data often requires expansion of the data into the counting-process format - leading to larger datasets - thus complicating the analysis and requiring excessive computing time. In this study, a nested frailty Cox model with time-dependent predictors and effects was applied to Canadian Bovine Mastitis Research Network data in which 10,831 lactations of 8035 cows from 69 herds were followed through lactation until the first occurrence of CM. The model was fit to the data as a Poisson model with nested normally distributed random effects at the cow and herd levels. Risk factors associated with the hazard of CM during the lactation were identified, such as parity, calving season, herd somatic cell score, pasture access, fore-stripping, and proportion of treated cases of CM in a herd. The analysis showed that most of the predictors had a strong effect early in lactation and also demonstrated substantial variation in the baseline hazard among cows and between herds. A small simulation study for a setting similar to the real data was conducted to evaluate the Poisson maximum likelihood estimation approach with both Gaussian quadrature method and Laplace approximation. 
Further, the performance of the two methods was compared with the performance of a widely used estimation approach for frailty Cox models based on the penalized partial likelihood. The simulation study showed good performance for the Poisson maximum likelihood approach with Gaussian quadrature and biased variance component estimates for both the Poisson maximum likelihood with Laplace approximation and penalized partial likelihood approaches. Copyright © 2014. Published by Elsevier B.V.

  12. Multilevel poisson regression modelling for determining factors of dengue fever cases in bandung

    NASA Astrophysics Data System (ADS)

    Arundina, Davila Rubianti; Tantular, Bertho; Pontoh, Resa Septiani

    2017-03-01

    Dengue fever is caused by a virus of the genus Flavivirus, known as dengue virus, and is transmitted through the bites of Aedes aegypti mosquitoes infected with the virus. The study was conducted in 151 villages in Bandung. Health analysts believe that two kinds of factors affect dengue cases: internal (individual) factors and external (environmental) factors. The data used in this research are hierarchical, with villages at level 1 and sub-districts at level 2, and the method used for modelling such hierarchical data is the multilevel method. According to exploratory data analysis, the suitable multilevel specification is the random intercept model. The Penalized Quasi-Likelihood (PQL) approach to the multilevel Poisson model is an appropriate analysis for determining the factors affecting dengue cases in the city of Bandung. The Clean and Healthy Behavior factor at the village level has an effect on the number of dengue fever cases in the city of Bandung, while factors at the sub-district level have no effect.

  13. Outcomes of a pilot hand hygiene randomized cluster trial to reduce communicable infections among US office-based employees.

    PubMed

    Stedman-Smith, Maggie; DuBois, Cathy L Z; Grey, Scott F; Kingsbury, Diana M; Shakya, Sunita; Scofield, Jennifer; Slenkovich, Ken

    2015-04-01

    To determine the effectiveness of an office-based multimodal hand hygiene improvement intervention in reducing self-reported communicable infections and work-related absence. A randomized cluster trial including an electronic training video, hand sanitizer, and educational posters (n = 131, intervention; n = 193, control). Primary outcomes include (1) self-reported acute respiratory infections (ARIs)/influenza-like illness (ILI) and/or gastrointestinal (GI) infections during the prior 30 days; and (2) related lost work days. Incidence rate ratios calculated using generalized linear mixed models with a Poisson distribution, adjusted for confounders and random cluster effects. A 31% relative reduction in self-reported combined ARI-ILI/GI infections (incidence rate ratio: 0.69; 95% confidence interval, 0.49 to 0.98). A 21% nonsignificant relative reduction in lost work days. An office-based multimodal hand hygiene improvement intervention demonstrated a substantive reduction in self-reported combined ARI-ILI/GI infections.

  14. Random transitions described by the stochastic Smoluchowski-Poisson system and by the stochastic Keller-Segel model.

    PubMed

    Chavanis, P H; Delfini, L

    2014-03-01

    We study random transitions between two metastable states that appear below a critical temperature in a one-dimensional self-gravitating Brownian gas with a modified Poisson equation experiencing a second order phase transition from a homogeneous phase to an inhomogeneous phase [P. H. Chavanis and L. Delfini, Phys. Rev. E 81, 051103 (2010)]. We numerically solve the N-body Langevin equations and the stochastic Smoluchowski-Poisson system, which takes fluctuations (finite N effects) into account. The system switches back and forth between the two metastable states (bistability) and the particles accumulate successively at the center or at the boundary of the domain. We explicitly show that these random transitions exhibit the phenomenology of the ordinary Kramers problem for a Brownian particle in a double-well potential. The distribution of the residence time is Poissonian and the average lifetime of a metastable state is given by the Arrhenius law; i.e., it is proportional to the exponential of the barrier of free energy ΔF divided by the energy of thermal excitation kBT. Since the free energy is proportional to the number of particles N for a system with long-range interactions, the lifetime of metastable states scales as eN and is considerable for N≫1. As a result, in many applications, metastable states of systems with long-range interactions can be considered as stable states. However, for moderate values of N, or close to a critical point, the lifetime of the metastable states is reduced since the barrier of free energy decreases. In that case, the fluctuations become important and the mean field approximation is no longer valid. This is the situation considered in this paper. By an appropriate change of notations, our results also apply to bacterial populations experiencing chemotaxis in biology. Their dynamics can be described by a stochastic Keller-Segel model that takes fluctuations into account and goes beyond the usual mean field approximation.

  15. Filling of a Poisson trap by a population of random intermittent searchers.

    PubMed

    Bressloff, Paul C; Newby, Jay M

    2012-03-01

    We extend the continuum theory of random intermittent search processes to the case of N independent searchers looking to deliver cargo to a single hidden target located somewhere on a semi-infinite track. Each searcher randomly switches between a stationary state and either a leftward or rightward constant-velocity state. We assume that all of the particles start at one end of the track and realize sample trajectories independently generated from the same underlying stochastic process. The hidden target is treated as a partially absorbing trap in which a particle can only detect the target and deliver its cargo if it is stationary and within range of the target; the particle is removed from the system after delivering its cargo. As a further generalization of previous models, we assume that up to n successive particles can find the target and deliver their cargo. Assuming that the rate of target detection scales as 1/N, we show that there exists a well-defined mean-field limit N→∞, in which the stochastic model reduces to a deterministic system of linear reaction-hyperbolic equations for the concentrations of particles in each of the internal states. These equations decouple from the stochastic process associated with filling the target with cargo. The latter can be modeled as a Poisson process in which the time-dependent rate of filling λ(t) depends on the concentration of stationary particles within the target domain. Hence, we refer to the target as a Poisson trap. We analyze the efficiency of filling the Poisson trap with n particles in terms of the waiting-time density f_n(t). The latter is determined by the integrated Poisson rate μ(t) = ∫_0^t λ(s) ds, which in turn depends on the solution to the reaction-hyperbolic equations. We obtain an approximate solution for the particle concentrations by reducing the system of reaction-hyperbolic equations to a scalar advection-diffusion equation using a quasi-steady-state analysis. 
We compare our analytical results for the mean-field model with Monte Carlo simulations for finite N. We thus determine how the mean first passage time (MFPT) for filling the target depends on N and n.
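For an inhomogeneous Poisson trap, the waiting-time density for the n-th delivery follows directly from the integrated rate. A minimal numerical sketch (the rate function below is an arbitrary stand-in, not the one obtained from the reaction-hyperbolic equations):

```python
import numpy as np
from math import factorial

def waiting_time_density(lam, t, n):
    """f_n(t) = lam(t) * mu(t)**(n-1) * exp(-mu(t)) / (n-1)!  for an
    inhomogeneous Poisson process, where mu(t) = integral_0^t lam(s) ds
    (cumulative trapezoid approximation on the grid t)."""
    t = np.asarray(t, dtype=float)
    rates = lam(t)
    mu = np.concatenate(([0.0], np.cumsum(0.5*(rates[1:] + rates[:-1])*np.diff(t))))
    return rates * mu**(n - 1) * np.exp(-mu) / factorial(n - 1)

# sanity check: a constant rate reduces f_1 to the exponential density
lam0 = 2.0
t = np.linspace(0.0, 5.0, 2001)
f1 = waiting_time_density(lambda s: np.full_like(s, lam0), t, 1)
```

For constant λ the case n = 1 reduces to the exponential density λe^{-λt}, which the sanity check exploits.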

  16. Probabilistic reasoning in data analysis.

    PubMed

    Sirovich, Lawrence

    2011-09-20

    This Teaching Resource provides lecture notes, slides, and a student assignment for a lecture on probabilistic reasoning in the analysis of biological data. General probabilistic frameworks are introduced, and a number of standard probability distributions are described using simple intuitive ideas. Particular attention is focused on random arrivals that are independent of prior history (Markovian events), with an emphasis on waiting times, Poisson processes, and Poisson probability distributions. The use of these various probability distributions is applied to biomedical problems, including several classic experimental studies.
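The central fact emphasized here, that memoryless (Markovian) arrivals with exponential waiting times produce Poisson-distributed counts, can be verified in a few lines (a self-contained simulation, not part of the teaching materials):

```python
import numpy as np

rng = np.random.default_rng(0)
rate, T, trials = 3.0, 1.0, 20000

# memoryless (Markovian) arrivals: exponential waiting times between events
counts = np.empty(trials, dtype=int)
for i in range(trials):
    t, n = 0.0, 0
    while True:
        t += rng.exponential(1.0/rate)
        if t > T:
            break
        n += 1
    counts[i] = n

# Poisson hallmark: mean and variance both equal rate*T = 3
print(counts.mean(), counts.var())
```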

  17. Fractional Brownian motion and long term clinical trial recruitment

    PubMed Central

    Zhang, Qiang; Lai, Dejian

    2015-01-01

Prediction of recruitment in clinical trials has been a challenging task. Many methods have been studied, including models based on the Poisson process and its large-sample approximation by Brownian motion (BM). However, when the independent-increment structure of the BM model is violated, fractional Brownian motion can be used to model and approximate the underlying Poisson processes with random rates. In this paper, fractional Brownian motion (FBM) is considered for such conditions and compared to the BM model, with illustrative examples from different trials and simulations. PMID:26347306

  18. Fractional Brownian motion and long term clinical trial recruitment.

    PubMed

    Zhang, Qiang; Lai, Dejian

    2011-05-01

Prediction of recruitment in clinical trials has been a challenging task. Many methods have been studied, including models based on the Poisson process and its large-sample approximation by Brownian motion (BM). However, when the independent-increment structure of the BM model is violated, fractional Brownian motion can be used to model and approximate the underlying Poisson processes with random rates. In this paper, fractional Brownian motion (FBM) is considered for such conditions and compared to the BM model, with illustrative examples from different trials and simulations.

  19. Modeling urban coastal flood severity from crowd-sourced flood reports using Poisson regression and Random Forest

    NASA Astrophysics Data System (ADS)

    Sadler, J. M.; Goodall, J. L.; Morsy, M. M.; Spencer, K.

    2018-04-01

Sea level rise has already caused more frequent and severe coastal flooding and this trend will likely continue. Flood prediction is an essential part of a coastal city's capacity to adapt to and mitigate this growing problem. Complex coastal urban hydrological systems, however, do not always lend themselves easily to physically-based flood prediction approaches. This paper presents a method for using a data-driven approach to estimate flood severity in an urban coastal setting using crowd-sourced data, a non-traditional but growing data source, along with environmental observation data. Two data-driven models, Poisson regression and Random Forest regression, are trained to predict the number of flood reports per storm event as a proxy for flood severity, given extensive environmental data (i.e., rainfall, tide, groundwater table level, and wind conditions) as input. The method is demonstrated using data from Norfolk, Virginia, USA from September 2010 to October 2016. Quality-controlled, crowd-sourced street flooding reports ranging from 1 to 159 per storm event for 45 storm events are used to train and evaluate the models. Random Forest performed better than Poisson regression at predicting the number of flood reports and had a lower false negative rate. From the Random Forest model, total cumulative rainfall was by far the most dominant input variable in predicting flood severity, followed by low tide and lower low tide. These methods serve as a first step toward using data-driven methods for spatially and temporally detailed coastal urban flood prediction.
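A Poisson regression of report counts on environmental covariates can be sketched without a modeling library via iteratively reweighted least squares. The data below are synthetic stand-ins for the storm-event table (hypothetical covariates, not the Norfolk data):

```python
import numpy as np

rng = np.random.default_rng(1)

# synthetic stand-in: columns = [intercept, cumulative rainfall, tide level],
# report counts drawn as Poisson(exp(X @ beta_true))
n = 500
X = np.column_stack([np.ones(n), rng.uniform(0, 2, n), rng.uniform(-1, 1, n)])
beta_true = np.array([0.5, 1.2, -0.4])
y = rng.poisson(np.exp(X @ beta_true))

# Poisson regression fitted by IRLS (Newton's method on the log-likelihood)
beta = np.zeros(3)
for _ in range(25):
    mu = np.exp(X @ beta)
    z = X @ beta + (y - mu)/mu            # working response
    W = mu                                # working weights for the log link
    beta = np.linalg.solve(X.T @ (W[:, None]*X), X.T @ (W*z))

print(beta)  # should land close to beta_true
```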

  20. Atomic clocks and the continuous-time random-walk

    NASA Astrophysics Data System (ADS)

    Formichella, Valerio; Camparo, James; Tavella, Patrizia

    2017-11-01

    Atomic clocks play a fundamental role in many fields, most notably they generate Universal Coordinated Time and are at the heart of all global navigation satellite systems. Notwithstanding their excellent timekeeping performance, their output frequency does vary: it can display deterministic frequency drift; diverse continuous noise processes result in nonstationary clock noise (e.g., random-walk frequency noise, modelled as a Wiener process), and the clock frequency may display sudden changes (i.e., "jumps"). Typically, the clock's frequency instability is evaluated by the Allan or Hadamard variances, whose functional forms can identify the different operative noise processes. Here, we show that the Allan and Hadamard variances of a particular continuous-time random-walk, the compound Poisson process, have the same functional form as for a Wiener process with drift. The compound Poisson process, introduced as a model for observed frequency jumps, is an alternative to the Wiener process for modelling random walk frequency noise. This alternate model fits well the behavior of the rubidium clocks flying on GPS Block-IIR satellites. Further, starting from jump statistics, the model can be improved by considering a more general form of continuous-time random-walk, and this could bring new insights into the physics of atomic clocks.
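The random-walk signature of the compound Poisson process is easy to exhibit numerically: its variance grows linearly in time, exactly as for a Wiener process. A seeded sketch with illustrative jump-rate and jump-size parameters (not fitted to GPS clock data):

```python
import numpy as np

rng = np.random.default_rng(2)
lam, sigma_j, T, paths = 5.0, 0.3, 10.0, 4000

# compound Poisson frequency offset: N(0, sigma_j^2) jumps at Poisson(lam) times
Y = np.array([rng.normal(0.0, sigma_j, rng.poisson(lam*T)).sum()
              for _ in range(paths)])

# random-walk signature shared with a Wiener process: Var[Y(T)] linear in T
print(Y.var())   # theory: lam * T * sigma_j**2 = 4.5
```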

  1. Poisson-Like Spiking in Circuits with Probabilistic Synapses

    PubMed Central

    Moreno-Bote, Rubén

    2014-01-01

    Neuronal activity in cortex is variable both spontaneously and during stimulation, and it has the remarkable property that it is Poisson-like over broad ranges of firing rates covering from virtually zero to hundreds of spikes per second. The mechanisms underlying cortical-like spiking variability over such a broad continuum of rates are currently unknown. We show that neuronal networks endowed with probabilistic synaptic transmission, a well-documented source of variability in cortex, robustly generate Poisson-like variability over several orders of magnitude in their firing rate without fine-tuning of the network parameters. Other sources of variability, such as random synaptic delays or spike generation jittering, do not lead to Poisson-like variability at high rates because they cannot be sufficiently amplified by recurrent neuronal networks. We also show that probabilistic synapses predict Fano factor constancy of synaptic conductances. Our results suggest that synaptic noise is a robust and sufficient mechanism for the type of variability found in cortex. PMID:25032705

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eliazar, Iddo, E-mail: eliazar@post.tau.ac.il

The exponential, the normal, and the Poisson statistical laws are of major importance due to their universality. Harmonic statistics are as universal as the three aforementioned laws, yet they fall short in their ‘public relations’ for the following reason: the full scope of harmonic statistics cannot be described in terms of a statistical law. In this paper we describe harmonic statistics, in their full scope, via an object termed the harmonic Poisson process: a Poisson process, over the positive half-line, with a harmonic intensity. The paper reviews the harmonic Poisson process, investigates its properties, and presents the connections of this object to an assortment of topics: uniform statistics, scale invariance, random multiplicative perturbations, Pareto and inverse-Pareto statistics, exponential growth and exponential decay, power-law renormalization, convergence and domains of attraction, the Langevin equation, diffusions, Benford’s law, and 1/f noise. - Highlights: • Harmonic statistics are described and reviewed in detail. • Connections to various statistical laws are established. • Connections to perturbation, renormalization and dynamics are established.
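Restricted to an interval [a, b], a Poisson process with harmonic intensity c/x has Poisson(c·ln(b/a)) points that are log-uniformly distributed, which is one route to the Benford's-law connection mentioned above. A seeded sketch with illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(3)
c, a, b = 1000.0, 1.0, 10.0       # harmonic intensity lam(x) = c/x on [a, b]

# on [a, b] the total mass is c*ln(b/a); conditioned on the count, the
# points are log-uniform, i.e. scale-invariant
n = rng.poisson(c*np.log(b/a))
x = a*(b/a)**rng.uniform(size=n)

# scale invariance surfaces as Benford's law for the leading digit:
lead = x.astype(int)              # x in [1, 10): leading digit = integer part
freq1 = (lead == 1).mean()
print(freq1)                      # Benford predicts log10(2) ≈ 0.301
```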

  3. A new multivariate zero-adjusted Poisson model with applications to biomedicine.

    PubMed

    Liu, Yin; Tian, Guo-Liang; Tang, Man-Lai; Yuen, Kam Chuen

    2018-05-25

Recently, although advances were made on modeling multivariate count data, existing models still have several limitations: (i) the multivariate Poisson log-normal model (Aitchison and Ho, ) cannot be used to fit multivariate count data with excess zero-vectors; (ii) the multivariate zero-inflated Poisson (ZIP) distribution (Li et al., 1999) cannot be used to model zero-truncated/deflated count data and is difficult to apply to high-dimensional cases; (iii) the Type I multivariate zero-adjusted Poisson (ZAP) distribution (Tian et al., 2017) can only model multivariate count data with a special correlation structure in which the correlations between components are all positive or all negative. In this paper, we first introduce a new multivariate ZAP distribution, based on a multivariate Poisson distribution, which allows a more flexible dependency structure between components; that is, some of the correlation coefficients can be positive while others are negative. We then develop its important distributional properties, and provide efficient statistical inference methods for the multivariate ZAP model with or without covariates. Two real data examples in biomedicine are used to illustrate the proposed methods. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Effect of Poisson's loss factor of rubbery material on underwater sound absorption of anechoic coatings

    NASA Astrophysics Data System (ADS)

    Zhong, Jie; Zhao, Honggang; Yang, Haibin; Yin, Jianfei; Wen, Jihong

    2018-06-01

Rubbery coatings embedded with air cavities are commonly used on underwater structures to reduce reflection of incoming sound waves. In this paper, the relationships between the Poisson's and modulus loss factors of rubbery materials are theoretically derived, and the distinct effects of a tiny Poisson's loss factor on the loss factors of the shear and longitudinal moduli are revealed. Given a complex Young's modulus and dynamic Poisson's ratio, it is found that the shear loss factor varies almost imperceptibly with the Poisson's loss factor and is very close to the loss factor of Young's modulus, while the longitudinal loss factor decreases almost linearly with increasing Poisson's loss factor. Then, a finite element (FE) model is used to investigate the effect of the tiny Poisson's loss factor, which is generally neglected in some FE models, on the underwater sound absorption of rubbery coatings. Results show that the tiny Poisson's loss factor has a significant effect on the sound absorption of homogeneous coatings within the concerned frequency range, while it has both frequency- and structure-dependent influence on the sound absorption of inhomogeneous coatings with embedded air cavities. Given the material parameters and cavity dimensions, a more obvious effect can be observed for a rubbery coating with a larger lattice constant and/or a thicker cover layer.
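The contrast between the two loss factors can be reproduced with complex-modulus arithmetic alone. The material values and the sign convention for the complex Poisson's ratio below are illustrative assumptions, not taken from the paper; the point is the trend: the shear loss factor stays pinned near the Young's-modulus loss factor, while the longitudinal loss factor falls off rapidly because (1 − 2ν) is tiny for rubber:

```python
import numpy as np

# illustrative rubbery-material values (NOT from the paper): complex Young's
# modulus E*(1 + i*eta_E) and complex Poisson's ratio nu*(1 - i*eta_nu)
# under one common sign convention
E, eta_E, nu = 1.4e8, 0.3, 0.49

def loss_factors(eta_nu):
    Ec = E*(1 + 1j*eta_E)
    nuc = nu*(1 - 1j*eta_nu)
    G = Ec/(2*(1 + nuc))                           # shear modulus
    M = Ec*(1 - nuc)/((1 + nuc)*(1 - 2*nuc))       # longitudinal modulus
    return G.imag/G.real, M.imag/M.real

for eta_nu in (0.0, 0.005, 0.01):
    print(eta_nu, loss_factors(eta_nu))
```

With η_ν = 0 both loss factors equal η_E exactly; even η_ν of order 0.01 barely moves the shear loss factor but drops the longitudinal one substantially.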

  5. Method of model reduction and multifidelity models for solute transport in random layered porous media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Zhijie; Tartakovsky, Alexandre M.

This work presents a hierarchical model for solute transport in bounded layered porous media with random permeability. The model generalizes the Taylor-Aris dispersion theory to stochastic transport in random layered porous media with a known velocity covariance function. In the hierarchical model, we represent (random) concentration in terms of its cross-sectional average and a variation function. We derive a one-dimensional stochastic advection-dispersion-type equation for the average concentration and a stochastic Poisson equation for the variation function, as well as expressions for the effective velocity and dispersion coefficient. We observe that velocity fluctuations enhance dispersion in a non-monotonic fashion: the dispersion initially increases with correlation length λ, reaches a maximum, and decreases to zero at infinity. Maximum enhancement can be obtained at a correlation length of about 0.25 times the size of the porous media perpendicular to flow.

  6. Conditional modeling of antibody titers using a zero-inflated poisson random effects model: application to Fabrazyme.

    PubMed

    Bonate, Peter L; Sung, Crystal; Welch, Karen; Richards, Susan

    2009-10-01

    Patients that are exposed to biotechnology-derived therapeutics often develop antibodies to the therapeutic, the magnitude of which is assessed by measuring antibody titers. A statistical approach for analyzing antibody titer data conditional on seroconversion is presented. The proposed method is to first transform the antibody titer data based on a geometric series using a common ratio of 2 and a scale factor of 50 and then analyze the exponent using a zero-inflated or hurdle model assuming a Poisson or negative binomial distribution with random effects to account for patient heterogeneity. Patient specific covariates can be used to model the probability of developing an antibody response, i.e., seroconversion, as well as the magnitude of the antibody titer itself. The method was illustrated using antibody titer data from 87 male seroconverted Fabry patients receiving Fabrazyme. Titers from five clinical trials were collected over 276 weeks of therapy with anti-Fabrazyme IgG titers ranging from 100 to 409,600 after exclusion of seronegative patients. The best model to explain seroconversion was a zero-inflated Poisson (ZIP) model where cumulative dose (under a constant dose regimen of dosing every 2 weeks) influenced the probability of seroconversion. There was an 80% chance of seroconversion when the cumulative dose reached 210 mg (90% confidence interval: 194-226 mg). No difference in antibody titers was noted between Japanese or Western patients. Once seroconverted, antibody titers did not remain constant but decreased in an exponential manner from an initial magnitude to a new lower steady-state value. The expected titer after the new steady-state titer had been achieved was 870 (90% CI: 630-1109). The half-life to the new steady-state value after seroconversion was 44 weeks (90% CI: 17-70 weeks). Time to seroconversion did not appear to be correlated with titer at the time of seroconversion. The method can be adequately used to model antibody titer data.
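The transformation step can be made concrete: titers on the dilution ladder 50·2^k map to integer exponents, and the zero-inflated Poisson then acts on those exponents. A small sketch (the ZIP weight and rate are arbitrary examples, not the fitted Fabrazyme values):

```python
import numpy as np
from math import exp, factorial

# titers lie on the dilution ladder 50 * 2^k, so the exponent
# k = log2(titer / 50) is the modeled count
titers = np.array([100, 400, 3200, 409600])
k = np.log2(titers/50).astype(int)
print(k)  # [ 1  3  6 13]

# zero-inflated Poisson pmf for the exponent (p0 = extra mass at zero);
# parameters below are arbitrary illustrations
def zip_pmf(j, p0, lam):
    pois = exp(-lam) * lam**j / factorial(j)
    return p0*(j == 0) + (1 - p0)*pois
```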

  7. Measures of clustering and heterogeneity in multilevel Poisson regression analyses of rates/count data

    PubMed Central

    Austin, Peter C.; Stryhn, Henrik; Leckie, George; Merlo, Juan

    2017-01-01

    Multilevel data occur frequently in many research areas like health services research and epidemiology. A suitable way to analyze such data is through the use of multilevel regression models. These models incorporate cluster‐specific random effects that allow one to partition the total variation in the outcome into between‐cluster variation and between‐individual variation. The magnitude of the effect of clustering provides a measure of the general contextual effect. When outcomes are binary or time‐to‐event in nature, the general contextual effect can be quantified by measures of heterogeneity like the median odds ratio or the median hazard ratio, respectively, which can be calculated from a multilevel regression model. Outcomes that are integer counts denoting the number of times that an event occurred are common in epidemiological and medical research. The median (incidence) rate ratio in multilevel Poisson regression for counts that corresponds to the median odds ratio or median hazard ratio for binary or time‐to‐event outcomes respectively is relatively unknown and is rarely used. The median rate ratio is the median relative change in the rate of the occurrence of the event when comparing identical subjects from 2 randomly selected different clusters that are ordered by rate. We also describe how the variance partition coefficient, which denotes the proportion of the variation in the outcome that is attributable to between‐cluster differences, can be computed with count outcomes. We illustrate the application and interpretation of these measures in a case study analyzing the rate of hospital readmission in patients discharged from hospital with a diagnosis of heart failure. PMID:29114926
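The median rate ratio has the same closed form as the median odds ratio, with the cluster-level variance plugged in. A one-function sketch using the standard formula (the variance value is illustrative, not from the heart-failure case study):

```python
from math import exp, sqrt
from statistics import NormalDist

def median_rate_ratio(sigma2):
    """MRR = exp(sqrt(2*sigma2) * Phi^-1(0.75)) for a multilevel Poisson
    model with normal cluster random effects of variance sigma2 (the count
    analogue of the median odds ratio)."""
    return exp(sqrt(2.0*sigma2)*NormalDist().inv_cdf(0.75))

print(median_rate_ratio(0.25))  # ≈ 1.611: median relative change in rate
                                # between two randomly chosen clusters
```

An MRR of 1 means clustering contributes nothing; larger values quantify the general contextual effect on the rate scale.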

  8. Coupling Poisson rectangular pulse and multiplicative microcanonical random cascade models to generate sub-daily precipitation timeseries

    NASA Astrophysics Data System (ADS)

    Pohle, Ina; Niebisch, Michael; Müller, Hannes; Schümberg, Sabine; Zha, Tingting; Maurer, Thomas; Hinz, Christoph

    2018-07-01

To simulate the impacts of within-storm rainfall variabilities on fast hydrological processes, long precipitation time series with high temporal resolution are required. Due to limited availability of observed data, such time series are typically obtained from stochastic models. However, most existing rainfall models are limited in their ability to conserve rainfall event statistics that are relevant for hydrological processes. Poisson rectangular pulse models are widely applied to generate long time series of alternating precipitation event durations and mean intensities as well as interstorm period durations. Multiplicative microcanonical random cascade (MRC) models are used to disaggregate precipitation time series from coarse to fine temporal resolution. To overcome the inconsistencies between the temporal structure of the Poisson rectangular pulse model and the MRC model, we developed a new coupling approach by introducing two modifications to the MRC model. These modifications comprise (a) a modified cascade model ("constrained cascade") which preserves the event durations generated by the Poisson rectangular pulse model by constraining the first and last interval of a precipitation event to contain precipitation and (b) continuous sigmoid functions of the multiplicative weights to consider the scale-dependency in the disaggregation of precipitation events of different durations. The constrained cascade model was evaluated in its ability to disaggregate observed precipitation events in comparison to existing MRC models. For that, we used a 20-year record of hourly precipitation at six stations across Germany. The constrained cascade model showed markedly better agreement with the observed data in terms of both the temporal pattern of the precipitation time series (e.g. the dry and wet spell durations and autocorrelations) and event characteristics (e.g. intra-event intermittency and intensity fluctuation within events).
The constrained cascade model also slightly outperformed the other MRC models with respect to the intensity-frequency relationship. To assess the performance of the coupled Poisson rectangular pulse and constrained cascade model, precipitation events were stochastically generated by the Poisson rectangular pulse model and then disaggregated by the constrained cascade model. We found that the coupled model performs satisfactorily in terms of the temporal pattern of the precipitation time series, event characteristics and the intensity-frequency relationship.
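The defining property of a microcanonical cascade, exact mass conservation at every branching, can be sketched in a few lines. The weight model below (all-left / all-right / uniform) is a simplified illustration, not the sigmoid-parameterized weights fitted in the paper:

```python
import numpy as np

rng = np.random.default_rng(4)

def mrc_split(volumes, p_left=0.2, p_right=0.2):
    """One level of a multiplicative microcanonical cascade: each interval's
    volume V is split into (w*V, (1-w)*V), so mass is conserved at every
    branching. Weight model is illustrative only."""
    out = []
    for V in volumes:
        u = rng.random()
        w = 1.0 if u < p_left else 0.0 if u < p_left + p_right else rng.random()
        out += [w*V, (1.0 - w)*V]
    return np.array(out)

hourly = np.array([4.0, 0.0, 2.5])
half_hourly = mrc_split(hourly)            # disaggregate 1 h -> 30 min
print(half_hourly.sum(), hourly.sum())     # totals agree (mass conserved)
```

Repeated application takes a coarse series down to the target resolution; the paper's "constrained" variant additionally forces the first and last sub-interval of an event to stay wet.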

  9. Pervasive randomness in physics: an introduction to its modelling and spectral characterisation

    NASA Astrophysics Data System (ADS)

    Howard, Roy

    2017-10-01

    An introduction to the modelling and spectral characterisation of random phenomena is detailed at a level consistent with a first exposure to the subject at an undergraduate level. A signal framework for defining a random process is provided and this underpins an introduction to common random processes including the Poisson point process, the random walk, the random telegraph signal, shot noise, information signalling random processes, jittered pulse trains, birth-death random processes and Markov chains. An introduction to the spectral characterisation of signals and random processes, via either an energy spectral density or a power spectral density, is detailed. The important case of defining a white noise random process concludes the paper.
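One of the listed processes, the random telegraph signal, makes a compact worked example: when switching is Poisson with rate λ, its autocovariance decays as exp(−2λτ). A seeded simulation with illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(5)

# random telegraph signal: +/-1 states, switching at Poisson(lam) event times,
# simulated on a fine grid by counting events per step and tracking parity
lam, T, dt = 4.0, 200.0, 0.01
steps = int(T/dt)
switches = rng.poisson(lam*dt, size=steps)
x = np.where(np.cumsum(switches) % 2 == 0, 1.0, -1.0)

# theory: autocovariance R(tau) = exp(-2*lam*tau)
k = 10                                    # lag tau = k*dt = 0.1
R = np.mean(x[:-k]*x[k:])
print(R)                                  # close to exp(-0.8) ≈ 0.449
```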

  10. Preference heterogeneity in a count data model of demand for off-highway vehicle recreation

    Treesearch

    Thomas P Holmes; Jeffrey E Englin

    2010-01-01

    This paper examines heterogeneity in the preferences for OHV recreation by applying the random parameters Poisson model to a data set of off-highway vehicle (OHV) users at four National Forest sites in North Carolina. The analysis develops estimates of individual consumer surplus and finds that estimates are systematically affected by the random parameter specification...

  11. Poisson-Box Sampling algorithms for three-dimensional Markov binary mixtures

    NASA Astrophysics Data System (ADS)

    Larmier, Coline; Zoia, Andrea; Malvagi, Fausto; Dumonteil, Eric; Mazzolo, Alain

    2018-02-01

    Particle transport in Markov mixtures can be addressed by the so-called Chord Length Sampling (CLS) methods, a family of Monte Carlo algorithms taking into account the effects of stochastic media on particle propagation by generating on-the-fly the material interfaces crossed by the random walkers during their trajectories. Such methods enable a significant reduction of computational resources as opposed to reference solutions obtained by solving the Boltzmann equation for a large number of realizations of random media. CLS solutions, which neglect correlations induced by the spatial disorder, are faster albeit approximate, and might thus show discrepancies with respect to reference solutions. In this work we propose a new family of algorithms (called 'Poisson Box Sampling', PBS) aimed at improving the accuracy of the CLS approach for transport in d-dimensional binary Markov mixtures. In order to probe the features of PBS methods, we will focus on three-dimensional Markov media and revisit the benchmark problem originally proposed by Adams, Larsen and Pomraning [1] and extended by Brantley [2]: for these configurations we will compare reference solutions, standard CLS solutions and the new PBS solutions for scalar particle flux, transmission and reflection coefficients. PBS will be shown to perform better than CLS at the expense of a reasonable increase in computational time.
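The on-the-fly generation of material interfaces that CLS-type methods rely on can be sketched in one dimension: chords through a binary Markov mixture are exponentially distributed per material. The mean chord lengths below are illustrative, not the benchmark's:

```python
import numpy as np

rng = np.random.default_rng(6)

# along a 1D trajectory through a binary Markov mixture, material chords are
# sampled on the fly as exponentials with per-material mean chord lengths
lc = (1.0, 0.5)

def sample_materials(L):
    x, m, segments = 0.0, int(rng.integers(2)), []
    while x < L:
        ell = float(rng.exponential(lc[m]))
        segments.append((m, min(ell, L - x)))
        x += ell
        m = 1 - m                 # alternate material at each interface
    return segments

segments = sample_materials(1000.0)
frac0 = sum(l for m, l in segments if m == 0)/1000.0
print(frac0)   # long-run volume fraction of material 0: lc0/(lc0+lc1) = 2/3
```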

  12. Efficacy of a savings-led microfinance intervention to reduce sexual risk for HIV among women engaged in sex work: a randomized clinical trial.

    PubMed

    Witte, Susan S; Aira, Toivgoo; Tsai, Laura Cordisco; Riedel, Marion; Offringa, Reid; Chang, Mingway; El-Bassel, Nabila; Ssewamala, Fred

    2015-03-01

    We tested whether a structural intervention combining savings-led microfinance and HIV prevention components would achieve enhanced reductions in sexual risk among women engaging in street-based sex work in Ulaanbaatar, Mongolia, compared with an HIV prevention intervention alone. Between November 2011 and August 2012, we randomized 107 eligible women who completed baseline assessments to either a 4-session HIV sexual risk reduction intervention (HIVSRR) alone (n=50) or a 34-session HIVSRR plus a savings-led microfinance intervention (n=57). At 3- and 6-month follow-up assessments, participants reported unprotected acts of vaginal intercourse with paying partners and number of paying partners with whom they engaged in sexual intercourse in the previous 90 days. Using Poisson and zero-inflated Poisson model regressions, we examined the effects of assignment to treatment versus control condition on outcomes. At 6-month follow-up, the HIVSRR plus microfinance participants reported significantly fewer paying sexual partners and were more likely to report zero unprotected vaginal sex acts with paying sexual partners. Findings advance the HIV prevention repertoire for women, demonstrating that risk reduction may be achieved through a structural intervention that relies on asset building, including savings, and alternatives to income from sex work.

  13. Dual Roles for Spike Signaling in Cortical Neural Populations

    PubMed Central

    Ballard, Dana H.; Jehee, Janneke F. M.

    2011-01-01

    A prominent feature of signaling in cortical neurons is that of randomness in the action potential. The output of a typical pyramidal cell can be well fit with a Poisson model, and variations in the Poisson rate repeatedly have been shown to be correlated with stimuli. However while the rate provides a very useful characterization of neural spike data, it may not be the most fundamental description of the signaling code. Recent data showing γ frequency range multi-cell action potential correlations, together with spike timing dependent plasticity, are spurring a re-examination of the classical model, since precise timing codes imply that the generation of spikes is essentially deterministic. Could the observed Poisson randomness and timing determinism reflect two separate modes of communication, or do they somehow derive from a single process? We investigate in a timing-based model whether the apparent incompatibility between these probabilistic and deterministic observations may be resolved by examining how spikes could be used in the underlying neural circuits. The crucial component of this model draws on dual roles for spike signaling. In learning receptive fields from ensembles of inputs, spikes need to behave probabilistically, whereas for fast signaling of individual stimuli, the spikes need to behave deterministically. Our simulations show that this combination is possible if deterministic signals using γ latency coding are probabilistically routed through different members of a cortical cell population at different times. This model exhibits standard features characteristic of Poisson models such as orientation tuning and exponential interval histograms. In addition, it makes testable predictions that follow from the γ latency coding. PMID:21687798

  14. Linear and Poisson models for genetic evaluation of tick resistance in cross-bred Hereford x Nellore cattle.

    PubMed

    Ayres, D R; Pereira, R J; Boligon, A A; Silva, F F; Schenkel, F S; Roso, V M; Albuquerque, L G

    2013-12-01

    Cattle resistance to ticks is measured by the number of ticks infesting the animal. The model used for the genetic analysis of cattle resistance to ticks frequently requires logarithmic transformation of the observations. The objective of this study was to evaluate the predictive ability and goodness of fit of different models for the analysis of this trait in cross-bred Hereford x Nellore cattle. Three models were tested: a linear model using logarithmic transformation of the observations (MLOG); a linear model without transformation of the observations (MLIN); and a generalized linear Poisson model with residual term (MPOI). All models included the classificatory effects of contemporary group and genetic group and the covariates age of animal at the time of recording and individual heterozygosis, as well as additive genetic effects as random effects. Heritability estimates were 0.08 ± 0.02, 0.10 ± 0.02 and 0.14 ± 0.04 for MLIN, MLOG and MPOI models, respectively. The model fit quality, verified by deviance information criterion (DIC) and residual mean square, indicated fit superiority of MPOI model. The predictive ability of the models was compared by validation test in independent sample. The MPOI model was slightly superior in terms of goodness of fit and predictive ability, whereas the correlations between observed and predicted tick counts were practically the same for all models. A higher rank correlation between breeding values was observed between models MLOG and MPOI. Poisson model can be used for the selection of tick-resistant animals. © 2013 Blackwell Verlag GmbH.

  15. The Poisson model limits in NBA basketball: Complexity in team sports

    NASA Astrophysics Data System (ADS)

    Martín-González, Juan Manuel; de Saá Guerra, Yves; García-Manso, Juan Manuel; Arriaza, Enrique; Valverde-Estévez, Teresa

    2016-12-01

Team sports are frequently studied by researchers. There is a presumption that scoring in basketball is a random process that can be described using the Poisson model. Basketball is a collaboration-opposition sport, where the non-linear local interactions among players are reflected in the evolution of the score that ultimately determines the winner. In the NBA, the outcomes of close games are often decided in the last minute, where fouls play a main role. We examined 6130 NBA games in order to analyze the time intervals between baskets and scoring dynamics. Most numbers of baskets (n) over a time interval (ΔT) follow a Poisson distribution, but some (e.g., ΔT = 10 s, n > 3) behave as a Power Law. The Poisson distribution includes most baskets in any game, in most game situations, but in close games in the last minute, the numbers of events are distributed following a Power Law. The number of events can be adjusted by a mixture of two distributions. In close games, both teams try to maintain their advantage solely in order to reach the last minute: a completely different game. For this reason, we propose to use the Poisson model as a reference. The complex dynamics will emerge from the limits of this model.
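The rarity of the Power-Law events under a pure Poisson model is easy to quantify. Assuming, purely for illustration, a rate of about one basket per 24 s (the shot-clock scale, not a value fitted to the 6130 games):

```python
from math import exp, factorial

# expected count in a dT = 10 s window at ~1 basket per 24 s
mu = 10/24
# Poisson tail probability P(n > 3)
p_gt3 = 1 - sum(exp(-mu)*mu**n/factorial(n) for n in range(4))
print(p_gt3)  # under the pure Poisson model, >3 baskets in 10 s is ~1e-3
```

The empirical excess of such windows in close-game endings is exactly the deviation from this reference that the paper highlights.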

  16. Outcomes of a Pilot Hand Hygiene Randomized Cluster Trial to Reduce Communicable Infections Among US Office-Based Employees

    PubMed Central

    DuBois, Cathy L.Z.; Grey, Scott F.; Kingsbury, Diana M.; Shakya, Sunita; Scofield, Jennifer; Slenkovich, Ken

    2015-01-01

    Objective: To determine the effectiveness of an office-based multimodal hand hygiene improvement intervention in reducing self-reported communicable infections and work-related absence. Methods: A randomized cluster trial including an electronic training video, hand sanitizer, and educational posters (n = 131, intervention; n = 193, control). Primary outcomes include (1) self-reported acute respiratory infections (ARIs)/influenza-like illness (ILI) and/or gastrointestinal (GI) infections during the prior 30 days; and (2) related lost work days. Incidence rate ratios calculated using generalized linear mixed models with a Poisson distribution, adjusted for confounders and random cluster effects. Results: A 31% relative reduction in self-reported combined ARI-ILI/GI infections (incidence rate ratio: 0.69; 95% confidence interval, 0.49 to 0.98). A 21% nonsignificant relative reduction in lost work days. Conclusions: An office-based multimodal hand hygiene improvement intervention demonstrated a substantive reduction in self-reported combined ARI-ILI/GI infections. PMID:25719534

  17. Nonlinear Poisson Equation for Heterogeneous Media

    PubMed Central

    Hu, Langhua; Wei, Guo-Wei

    2012-01-01

The Poisson equation is a widely accepted model for electrostatic analysis. However, the Poisson equation is derived based on electric polarizations in a linear, isotropic, and homogeneous dielectric medium. This article introduces a nonlinear Poisson equation to take into consideration hyperpolarization effects due to intensive charges and possible nonlinear, anisotropic, and heterogeneous media. A variational principle is utilized to derive the nonlinear Poisson model from an electrostatic energy functional. To apply the proposed nonlinear Poisson equation to the solvation analysis, we also construct a nonpolar solvation energy functional based on the nonlinear Poisson equation by using geometric measure theory. At a fixed temperature, the proposed nonlinear Poisson theory is extensively validated by the electrostatic analysis of the Kirkwood model and a set of 20 proteins, and the solvation analysis of a set of 17 small molecules whose experimental measurements are also available for comparison. Moreover, the nonlinear Poisson equation is further applied to the solvation analysis of 21 compounds at different temperatures. Numerical results are compared to theoretical predictions, experimental measurements, and those obtained from other theoretical methods in the literature. A good agreement between our results and experimental data as well as theoretical results suggests that the proposed nonlinear Poisson model is a potentially useful model for electrostatic analysis involving hyperpolarization effects. PMID:22947937

  18. Statistical properties of several models of fractional random point processes

    NASA Astrophysics Data System (ADS)

    Bendjaballah, C.

    2011-08-01

    Statistical properties of several models of fractional random point processes have been analyzed from the counting and time interval statistics points of view. Based on the criterion of the reduced variance, it is seen that such processes exhibit nonclassical properties. The conditions for these processes to be treated as conditional Poisson processes are examined. Numerical simulations illustrate part of the theoretical calculations.
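    The reduced-variance criterion can be made concrete with the Fano factor Var(N)/E(N), which equals one for an ordinary Poisson process; sub-Poissonian values below one are the nonclassical signature the abstract refers to. A stdlib-only sketch (sampler and parameters are illustrative):

```python
import math
import random

def poisson_sample(lam, rng):
    """Knuth's Poisson sampler: multiply uniforms until below exp(-lam)."""
    threshold = math.exp(-lam)
    n, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return n
        n += 1

def fano_factor(counts):
    """Variance-to-mean ratio of count data: 1 for Poisson counting,
    < 1 ('reduced variance') is the nonclassical regime."""
    m = sum(counts) / len(counts)
    var = sum((c - m) ** 2 for c in counts) / len(counts)
    return var / m

rng = random.Random(0)
samples = [poisson_sample(4.0, rng) for _ in range(20000)]
```

    With 20 000 draws the empirical Fano factor sits close to one, the Poisson benchmark against which the fractional processes are compared.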

  19. The non-equilibrium allele frequency spectrum in a Poisson random field framework.

    PubMed

    Kaj, Ingemar; Mugal, Carina F

    2016-10-01

    In population genetic studies, the allele frequency spectrum (AFS) efficiently summarizes genome-wide polymorphism data and shapes a variety of allele frequency-based summary statistics. While existing theory typically features equilibrium conditions, emerging methodology requires an analytical understanding of the build-up of the allele frequencies over time. In this work, we use the framework of Poisson random fields to derive new representations of the non-equilibrium AFS for the case of a Wright-Fisher population model with selection. In our approach, the AFS is a scaling-limit of the expectation of a Poisson stochastic integral and the representation of the non-equilibrium AFS arises in terms of a fixation time probability distribution. The known duality between the Wright-Fisher diffusion process and a birth and death process generalizing Kingman's coalescent yields an additional representation. The results carry over to the setting of a random sample drawn from the population and provide the non-equilibrium behavior of sample statistics. Our findings are consistent with and extend a previous approach where the non-equilibrium AFS solves a partial differential forward equation with a non-traditional boundary condition. Moreover, we provide a bridge to previous coalescent-based work, and hence tie several frameworks together. Since frequency-based summary statistics are widely used in population genetics, for example, to identify candidate loci of adaptive evolution, to infer the demographic history of a population, or to improve our understanding of the underlying mechanics of speciation events, the presented results are potentially useful for a broad range of topics. Copyright © 2016 Elsevier Inc. All rights reserved.
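    For orientation, the equilibrium baseline of the Poisson random field framework is the classical neutral expectation E[ξ_i] = θ/i for the allele frequency spectrum; a quick numeric check of this identity (the paper's non-equilibrium representations are not reproduced here):

```python
def expected_sfs(theta, n):
    """Equilibrium expected allele frequency spectrum under neutrality:
    E[xi_i] = theta / i for derived-allele counts i = 1, ..., n-1.
    This is the classical Poisson random field baseline on which the
    paper's non-equilibrium representations build."""
    return [theta / i for i in range(1, n)]

sfs = expected_sfs(2.0, 5)
# the total expected number of segregating sites is theta * H_{n-1},
# the harmonic-number identity behind Watterson's estimator
total = sum(sfs)
```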

  20. Poisson Spot with Magnetic Levitation

    ERIC Educational Resources Information Center

    Hoover, Matthew; Everhart, Michael; D'Arruda, Jose

    2010-01-01

    In this paper we describe a unique method for obtaining the famous Poisson spot without adding obstacles to the light path, which could interfere with the effect. A Poisson spot is the interference effect from parallel rays of light diffracting around a solid spherical object, creating a bright spot in the center of the shadow.

  1. Optimal linear reconstruction of dark matter from halo catalogues

    DOE PAGES

    Cai, Yan -Chuan; Bernstein, Gary; Sheth, Ravi K.

    2011-04-01

    The dark matter lumps (or "halos") that contain galaxies have locations in the Universe that are to some extent random with respect to the overall matter distributions. We investigate how best to estimate the total matter distribution from the locations of the halos. We derive the weight function w(M) to apply to dark-matter haloes that minimizes the stochasticity between the weighted halo distribution and its underlying mass density field. The optimal w(M) depends on the range of masses of halos being used. While the standard biased-Poisson model of the halo distribution predicts that bias weighting is optimal, the simple fact that the mass is comprised of haloes implies that the optimal w(M) will be a mixture of mass-weighting and bias-weighting. In N-body simulations, the Poisson estimator is up to 15× noisier than the optimal. Optimal weighting could make cosmological tests based on the matter power spectrum or cross-correlations much more powerful and/or cost effective.

  2. MODEL FOR INSTANTANEOUS RESIDENTIAL WATER DEMANDS

    EPA Science Inventory

    Residential water use is visualized as a customer-server interaction often encountered in queueing theory. Individual customers are assumed to arrive according to a nonhomogeneous Poisson process, then engage water servers for random lengths of time. Busy servers are assumed t...
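    The arrival half of such a queueing model can be sketched by simulating a nonhomogeneous Poisson process via Lewis-Shedler thinning; the diurnal intensity below is an illustrative stand-in, not the report's fitted demand model.

```python
import math
import random

def nhpp_arrivals(rate, rate_max, horizon, rng):
    """Nonhomogeneous Poisson arrivals on [0, horizon] by thinning
    (Lewis-Shedler): candidates arrive at the constant rate rate_max
    and each is kept with probability rate(t) / rate_max."""
    t, events = 0.0, []
    while True:
        t += rng.expovariate(rate_max)   # next candidate arrival
        if t > horizon:
            return events
        if rng.random() < rate(t) / rate_max:
            events.append(t)

# illustrative diurnal demand intensity (arrivals per hour) over one
# day; NOT the report's fitted model
demand = lambda t: 3.0 + 2.0 * math.sin(2.0 * math.pi * t / 24.0)
rng = random.Random(1)
arrivals = nhpp_arrivals(demand, 5.0, 24.0, rng)
```

    Each accepted arrival would then seize a "water server" for a random service time to complete the customer-server picture.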

  3. Nonlinear Poisson equation for heterogeneous media.

    PubMed

    Hu, Langhua; Wei, Guo-Wei

    2012-08-22

    The Poisson equation is a widely accepted model for electrostatic analysis. However, the Poisson equation is derived based on electric polarizations in a linear, isotropic, and homogeneous dielectric medium. This article introduces a nonlinear Poisson equation to take into consideration hyperpolarization effects due to intensive charges and possible nonlinear, anisotropic, and heterogeneous media. A variational principle is utilized to derive the nonlinear Poisson model from an electrostatic energy functional. To apply the proposed nonlinear Poisson equation to solvation analysis, we also construct a nonpolar solvation energy functional based on the nonlinear Poisson equation by using geometric measure theory. At a fixed temperature, the proposed nonlinear Poisson theory is extensively validated by the electrostatic analysis of the Kirkwood model and a set of 20 proteins, and by the solvation analysis of a set of 17 small molecules whose experimental measurements are also available for comparison. Moreover, the nonlinear Poisson equation is further applied to the solvation analysis of 21 compounds at different temperatures. Numerical results are compared to theoretical predictions, experimental measurements, and those obtained from other theoretical methods in the literature. Good agreement between our results and experimental data as well as theoretical results suggests that the proposed nonlinear Poisson model is a potentially useful model for electrostatic analysis involving hyperpolarization effects. Copyright © 2012 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  4. Nonlocal Poisson-Fermi model for ionic solvent.

    PubMed

    Xie, Dexuan; Liu, Jinn-Liang; Eisenberg, Bob

    2016-07-01

    We propose a nonlocal Poisson-Fermi model for ionic solvent that includes ion size effects and polarization correlations among water molecules in the calculation of electrostatic potential. It includes the previous Poisson-Fermi models as special cases, and its solution is the convolution of a solution of the corresponding nonlocal Poisson dielectric model with a Yukawa-like kernel function. The Fermi distribution is shown to be a set of optimal ionic concentration functions in the sense of minimizing an electrostatic potential free energy. Numerical results are reported to show the difference between a Poisson-Fermi solution and a corresponding Poisson solution.

  5. Influence diagnostics for count data under AB-BA crossover trials.

    PubMed

    Hao, Chengcheng; von Rosen, Dietrich; von Rosen, Tatjana

    2017-12-01

    This paper aims to develop diagnostic measures to assess the influence of data perturbations on estimates in AB-BA crossover studies with a Poisson distributed response. Generalised mixed linear models with normally distributed random effects are utilised. We show that in this special case, the model can be decomposed into two independent sub-models which allow to derive closed-form expressions to evaluate the changes in the maximum likelihood estimates under several perturbation schemes. The performance of the new influence measures is illustrated by simulation studies and the analysis of a real dataset.

  6. [Study protocol on the effect of the economic crisis on mortality and reproductive health and health inequalities in Spain].

    PubMed

    Pérez, Glòria; Gotsens, Mercè; Palència, Laia; Marí-Dell'Olmo, Marc; Domínguez-Berjón, M Felicitas; Rodríguez-Sanz, Maica; Puig, Vanessa; Bartoll, Xavier; Gandarillas, Ana; Martín, Unai; Bacigalupe, Amaia; Díez, Elia; Ruiz, Miguel; Esnaola, Santiago; Calvo, Montserrat; Sánchez, Pablo; Luque Fernández, Miguel Ángel; Borrell, Carme

    The aim is to present the protocol of the two sub-studies on the effect of the economic crisis on mortality and reproductive health and health inequalities in Spain. Substudy 1: describe the evolution of mortality and reproductive health between 1990 and 2013 through a longitudinal ecological study in the Autonomous Communities. This study will identify changes caused by the economic crisis in trends of reproductive health and mortality indicators using panel data (17 Autonomous Communities per study year) and fitting Poisson models with random effects. Substudy 2: analyse inequalities by socioeconomic deprivation in mortality and reproductive health in several areas of Spain. An ecological study analysing trends in the pre-crisis (1999-2003 and 2004-2008) and crisis (2009-2013) periods will be performed. Besag, York and Mollié random effects models will be fitted to estimate smoothed mortality and reproductive health indicators at the census-tract level. Copyright © 2016 SESPAS. Publicado por Elsevier España, S.L.U. All rights reserved.

  7. Some functional limit theorems for compound Cox processes

    NASA Astrophysics Data System (ADS)

    Korolev, Victor Yu.; Chertok, A. V.; Korchagin, A. Yu.; Kossova, E. V.; Zeifman, Alexander I.

    2016-06-01

    An improved version of the functional limit theorem is proved establishing weak convergence of random walks generated by compound doubly stochastic Poisson processes (compound Cox processes) to Lévy processes in the Skorokhod space under more realistic moment conditions. As corollaries, theorems are proved on convergence of random walks with jumps having finite variances to Lévy processes with variance-mean mixed normal distributions, in particular, to stable Lévy processes.
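    A compound Poisson process is the constant-rate special case of the compound Cox processes above (a Cox process additionally randomizes the rate); a minimal simulation sketch showing the random-sum structure and its mean λT·E[jump]:

```python
import random

def compound_poisson(lam, horizon, jump, rng):
    """Value at time `horizon` of a compound Poisson process: a
    Poisson(lam * horizon) number of i.i.d. jumps summed together.
    A compound Cox process would draw the rate lam itself at random."""
    n, t = 0, 0.0
    while True:
        t += rng.expovariate(lam)   # exponential inter-event times
        if t > horizon:
            break
        n += 1
    return sum(jump(rng) for _ in range(n))

rng = random.Random(7)
# jumps with mean 1 and finite variance (illustrative choice), so the
# process mean at T = 1 with rate 2 is 2
values = [compound_poisson(2.0, 1.0, lambda r: r.gauss(1.0, 1.0), rng)
          for _ in range(5000)]
```

    The finite jump variance here is exactly the moment condition under which such random walks converge to Lévy processes in the theorem above.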

  8. Ages of Records in Random Walks

    NASA Astrophysics Data System (ADS)

    Szabó, Réka; Vető, Bálint

    2016-12-01

    We consider random walks with continuous and symmetric step distributions. We prove universal asymptotics for the average proportion of the age of the kth longest lasting record for k = 1, 2, … and for the probability that the record of the kth longest age is broken at step n. Due to the relation to the Chinese restaurant process, the ranked sequence of proportions of ages converges to the Poisson-Dirichlet distribution.

  9. Slow diffusion by Markov random flights

    NASA Astrophysics Data System (ADS)

    Kolesnik, Alexander D.

    2018-06-01

    We present a conception of slow diffusion processes in the Euclidean spaces R^m, m ≥ 1, based on the theory of random flights with small constant speed that are driven by a homogeneous Poisson process of small rate. The slow diffusion condition that, on long time intervals, leads to the stationary distributions is given. The stationary distributions of slow diffusion processes in some Euclidean spaces of low dimensions are presented.

  10. Some functional limit theorems for compound Cox processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Korolev, Victor Yu.; Institute of Informatics Problems FRC CSC RAS; Chertok, A. V.

    2016-06-08

    An improved version of the functional limit theorem is proved establishing weak convergence of random walks generated by compound doubly stochastic Poisson processes (compound Cox processes) to Lévy processes in the Skorokhod space under more realistic moment conditions. As corollaries, theorems are proved on convergence of random walks with jumps having finite variances to Lévy processes with variance-mean mixed normal distributions, in particular, to stable Lévy processes.

  11. Assessing historical rate changes in global tsunami occurrence

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2011-01-01

    The global catalogue of tsunami events is examined to determine if transient variations in tsunami rates are consistent with a Poisson process commonly assumed for tsunami hazard assessments. The primary data analyzed are tsunamis with maximum sizes >1m. The record of these tsunamis appears to be complete since approximately 1890. A secondary data set of tsunamis >0.1m is also analyzed that appears to be complete since approximately 1960. Various kernel density estimates used to determine the rate distribution with time indicate a prominent rate change in global tsunamis during the mid-1990s. Less prominent rate changes occur in the early- and mid-20th century. To determine whether these rate fluctuations are anomalous, the distribution of annual event numbers for the tsunami catalogue is compared to Poisson and negative binomial distributions, the latter of which includes the effects of temporal clustering. Compared to a Poisson distribution, the negative binomial distribution model provides a consistent fit to tsunami event numbers for the >1m data set, but the Poisson null hypothesis cannot be falsified for the shorter duration >0.1m data set. Temporal clustering of tsunami sources is also indicated by the distribution of interevent times for both data sets. Tsunami event clusters consist only of two to four events, in contrast to protracted sequences of earthquakes that make up foreshock-main shock-aftershock sequences. From past studies of seismicity, it is likely that there is a physical triggering mechanism responsible for events within the tsunami source 'mini-clusters'. In conclusion, prominent transient rate increases in the occurrence of global tsunamis appear to be caused by temporal grouping of geographically distinct mini-clusters, in addition to the random preferential location of global M >7 earthquakes along offshore fault zones.
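    The negative binomial model favored for the >1 m data set arises as a gamma-mixed Poisson: letting the annual rate vary between years inflates the variance to mean + mean²/k, the overdispersion signature of temporal clustering. A simulation sketch (parameters illustrative, not fitted to the tsunami catalogue):

```python
import math
import random

def gamma_poisson_sample(mean, k, rng):
    """Negative binomial count as a gamma-mixed Poisson: draw a rate
    from Gamma(shape=k, mean=mean), then a Poisson count at that rate.
    Var = mean + mean**2 / k, i.e. overdispersed relative to Poisson."""
    rate = rng.gammavariate(k, mean / k)
    threshold = math.exp(-rate)       # Knuth's Poisson sampler
    n, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return n
        n += 1

rng = random.Random(3)
# illustrative annual event counts with mean 5 and clustering k = 2
counts = [gamma_poisson_sample(5.0, 2.0, rng) for _ in range(8000)]
```

    A variance-to-mean ratio well above one in such simulated annual counts mirrors the test that lets the negative binomial beat the Poisson null for the catalogue.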

  12. Efficacy of a Savings-Led Microfinance Intervention to Reduce Sexual Risk for HIV Among Women Engaged in Sex Work: A Randomized Clinical Trial

    PubMed Central

    Aira, Toivgoo; Tsai, Laura Cordisco; Riedel, Marion; Offringa, Reid; Chang, Mingway; El-Bassel, Nabila; Ssewamala, Fred

    2015-01-01

    Objectives. We tested whether a structural intervention combining savings-led microfinance and HIV prevention components would achieve enhanced reductions in sexual risk among women engaging in street-based sex work in Ulaanbaatar, Mongolia, compared with an HIV prevention intervention alone. Methods. Between November 2011 and August 2012, we randomized 107 eligible women who completed baseline assessments to either a 4-session HIV sexual risk reduction intervention (HIVSRR) alone (n = 50) or a 34-session HIVSRR plus a savings-led microfinance intervention (n = 57). At 3- and 6-month follow-up assessments, participants reported unprotected acts of vaginal intercourse with paying partners and number of paying partners with whom they engaged in sexual intercourse in the previous 90 days. Using Poisson and zero-inflated Poisson model regressions, we examined the effects of assignment to treatment versus control condition on outcomes. Results. At 6-month follow-up, the HIVSRR plus microfinance participants reported significantly fewer paying sexual partners and were more likely to report zero unprotected vaginal sex acts with paying sexual partners. Conclusions. Findings advance the HIV prevention repertoire for women, demonstrating that risk reduction may be achieved through a structural intervention that relies on asset building, including savings, and alternatives to income from sex work. PMID:25602889
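    The zero-inflated Poisson model used in the analysis mixes a structural-zero component with an ordinary Poisson count; its probability mass function is short enough to state directly (parameters below are illustrative, not the fitted regression):

```python
import math

def zip_pmf(y, pi_zero, lam):
    """Zero-inflated Poisson: with probability pi_zero the outcome is a
    structural zero (e.g. no unprotected acts at all); otherwise the
    count is Poisson(lam)."""
    poisson = math.exp(-lam) * lam ** y / math.factorial(y)
    return pi_zero * (1 if y == 0 else 0) + (1.0 - pi_zero) * poisson

# mass concentrates at zero beyond what Poisson(2) alone would give
p0 = zip_pmf(0, 0.3, 2.0)
```

    Setting pi_zero = 0 recovers the plain Poisson pmf, which is why the zero-inflated variant is the natural extension when "excess zeros" such as reporting zero unprotected acts are a substantively distinct outcome.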

  13. Punctuated equilibrium dynamics in human communications

    NASA Astrophysics Data System (ADS)

    Peng, Dan; Han, Xiao-Pu; Wei, Zong-Wen; Wang, Bing-Hong

    2015-10-01

    A minimal model based on a network incorporating individual interactions is proposed to study the non-Poisson statistical properties of human behavior: individuals in the system interact with their neighbors, the probability of an individual acting correlates with its activity, and all the individuals involved in an action change their activities randomly. The model reproduces a variety of spatial-temporal patterns observed in empirical studies of human daily communications, including an intriguing bimodal phenomenon, providing insight into various human activities and embracing a range of realistic social interacting systems. This model bridges priority queueing theory and punctuated equilibrium dynamics, and our modeling and analysis are likely to shed light on non-Poisson phenomena in many complex systems.

  14. A New Model that Generates Lotka's Law.

    ERIC Educational Resources Information Center

    Huber, John C.

    2002-01-01

    Develops a new model for a process that generates Lotka's Law. Topics include measuring scientific productivity through the number of publications; rate of production; career duration; randomness; Poisson distribution; computer simulations; goodness-of-fit; theoretical support for the model; and future research. (Author/LRW)

  15. Effect of Temperature on Mechanical Properties of Nanoclay Reinforced Polymeric Nanocomposites. Part 1. Experimental Results

    DTIC Science & Technology

    2012-04-23

    Temperature and nanoclay reinforcement also affect the Poisson's ratio, but this effect is less significant. In general, as the temperature increases, the Poisson's ratio also increases. However, an increase in nanoclay reinforcement generally reduces the Poisson's ratio. It is also noted that the type of resin used may have a significant effect on the Poisson's ratio.

  16. Generalized master equation via aging continuous-time random walks.

    PubMed

    Allegrini, Paolo; Aquino, Gerardo; Grigolini, Paolo; Palatella, Luigi; Rosa, Angelo

    2003-11-01

    We discuss the problem of the equivalence between continuous-time random walk (CTRW) and generalized master equation (GME). The walker, making instantaneous jumps from one site of the lattice to another, resides in each site for extended times. The sojourn times have a distribution density ψ(t) that is assumed to be an inverse power law with the power index μ. We assume that the Onsager principle is fulfilled, and we use this assumption to establish a complete equivalence between GME and the Montroll-Weiss CTRW. We prove that this equivalence is confined to the case where ψ(t) is an exponential. We argue that this is so because the Montroll-Weiss CTRW, as recently proved by Barkai [E. Barkai, Phys. Rev. Lett. 90, 104101 (2003)], is nonstationary, thereby implying aging, while the Onsager principle is valid only in the case of fully aged systems. The case of a Poisson distribution of sojourn times is the only one with no aging associated to it, and consequently with no need to establish special initial conditions to fulfill the Onsager principle. We consider the case of a dichotomous fluctuation, and we prove that the Onsager principle is fulfilled for any form of regression to equilibrium provided that the stationary condition holds true. We set the stationary condition on both the CTRW and the GME, thereby creating a condition of total equivalence, regardless of the nature of the waiting-time distribution. As a consequence of this procedure we create a GME that is a bona fide master equation, in spite of being non-Markov. We note that the memory kernel of the GME affords information on the interaction between the system of interest and its bath. The Poisson case yields a bath with infinitely fast fluctuations. We argue that departing from the Poisson form has the effect of creating a condition of infinite memory and that these results might be useful to shed light on the problem of how to unravel non-Markov quantum master equations.

  17. ELLIPTICAL WEIGHTED HOLICs FOR WEAK LENSING SHEAR MEASUREMENT. III. THE EFFECT OF RANDOM COUNT NOISE ON IMAGE MOMENTS IN WEAK LENSING ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Okura, Yuki; Futamase, Toshifumi, E-mail: yuki.okura@nao.ac.jp, E-mail: tof@astr.tohoku.ac.jp

    This is the third paper on the improvement of systematic errors in weak lensing analysis using an elliptical weight function, referred to as E-HOLICs. In previous papers, we succeeded in avoiding errors that depend on the ellipticity of the background image. In this paper, we investigate the systematic error that depends on the signal-to-noise ratio of the background image. We find that the origin of this error is the random count noise that comes from the Poisson noise of sky counts. The random count noise makes additional moments and centroid shift error, and those first-order effects are canceled in averaging, but the second-order effects are not canceled. We derive the formulae that correct this systematic error due to the random count noise in measuring the moments and ellipticity of the background image. The correction formulae obtained are expressed as combinations of complex moments of the image, and thus can correct the systematic errors caused by each object. We test their validity using a simulated image and find that the systematic error becomes less than 1% in the measured ellipticity for objects with an IMCAT significance threshold of ν ≈ 11.7.

  18. Spectral statistics of random geometric graphs

    NASA Astrophysics Data System (ADS)

    Dettmann, C. P.; Georgiou, O.; Knight, G.

    2017-04-01

    We use random matrix theory to study the spectrum of random geometric graphs, a fundamental model of spatial networks. Considering ensembles of random geometric graphs we look at short-range correlations in the level spacings of the spectrum via the nearest-neighbour and next-nearest-neighbour spacing distribution and long-range correlations via the spectral rigidity Δ3 statistic. These correlations in the level spacings give information about localisation of eigenvectors, level of community structure and the level of randomness within the networks. We find a parameter-dependent transition between Poisson and Gaussian orthogonal ensemble statistics. That is, the spectral statistics of spatial random geometric graphs fit the universality of random matrix theory found in other models such as Erdős-Rényi, Barabási-Albert and Watts-Strogatz random graphs.

  19. Universal Hitting Time Statistics for Integrable Flows

    NASA Astrophysics Data System (ADS)

    Dettmann, Carl P.; Marklof, Jens; Strömbergsson, Andreas

    2017-02-01

    The perceived randomness in the time evolution of "chaotic" dynamical systems can be characterized by universal probabilistic limit laws, which do not depend on the fine features of the individual system. One important example is the Poisson law for the times at which a particle with random initial data hits a small set. This was proved in various settings for dynamical systems with strong mixing properties. The key result of the present study is that, despite the absence of mixing, the hitting times of integrable flows also satisfy universal limit laws which are, however, not Poisson. We describe the limit distributions for "generic" integrable flows and a natural class of target sets, and illustrate our findings with two examples: the dynamics in central force fields and ellipse billiards. The convergence of the hitting time process follows from a new equidistribution theorem in the space of lattices, which is of independent interest. Its proof exploits Ratner's measure classification theorem for unipotent flows, and extends earlier work of Elkies and McMullen.

  20. Probabilistic structural analysis methods for improving Space Shuttle engine reliability

    NASA Technical Reports Server (NTRS)

    Boyce, L.

    1989-01-01

    Probabilistic structural analysis methods are particularly useful in the design and analysis of critical structural components and systems that operate in very severe and uncertain environments. These methods have recently found application in space propulsion systems to improve the structural reliability of Space Shuttle Main Engine (SSME) components. A computer program, NESSUS, based on a deterministic finite-element program and a method of probabilistic analysis (fast probability integration) provides probabilistic structural analysis for selected SSME components. While computationally efficient, it considers both correlated and nonnormal random variables as well as an implicit functional relationship between independent and dependent variables. The program is used to determine the response of a nickel-based superalloy SSME turbopump blade. Results include blade tip displacement statistics due to the variability in blade thickness, modulus of elasticity, Poisson's ratio or density. Modulus of elasticity significantly contributed to blade tip variability while Poisson's ratio did not. Thus, a rational method for choosing parameters to be modeled as random is provided.

  1. Robustness of Quadratic Hedging Strategies in Finance via Backward Stochastic Differential Equations with Jumps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Di Nunno, Giulia, E-mail: giulian@math.uio.no; Khedher, Asma, E-mail: asma.khedher@tum.de; Vanmaele, Michèle, E-mail: michele.vanmaele@ugent.be

    We consider a backward stochastic differential equation with jumps (BSDEJ) which is driven by a Brownian motion and a Poisson random measure. We present two candidate-approximations to this BSDEJ and we prove that the solution of each candidate-approximation converges to the solution of the original BSDEJ in a space which we specify. We use this result to investigate in further detail the consequences of the choice of the model to (partial) hedging in incomplete markets in finance. As an application, we consider models in which the small variations in the price dynamics are modeled with a Poisson random measure with infinite activity and models in which these small variations are modeled with a Brownian motion or are cut off. Using the convergence results on BSDEJs, we show that quadratic hedging strategies are robust towards the approximation of the market prices and we derive an estimation of the model risk.

  2. Concurrent topological design of composite structures and materials containing multiple phases of distinct Poisson's ratios

    NASA Astrophysics Data System (ADS)

    Long, Kai; Yuan, Philip F.; Xu, Shanqing; Xie, Yi Min

    2018-04-01

    Most studies on composites assume that the constituent phases have different values of stiffness. Little attention has been paid to the effect of constituent phases having distinct Poisson's ratios. This research focuses on a concurrent optimization method for simultaneously designing composite structures and materials with distinct Poisson's ratios. The proposed method aims to minimize the mean compliance of the macrostructure with a given mass of base materials. In contrast to the traditional interpolation of the stiffness matrix through numerical results, an interpolation scheme of the Young's modulus and Poisson's ratio using different parameters is adopted. The numerical results demonstrate that the Poisson effect plays a key role in reducing the mean compliance of the final design. An important contribution of the present study is that the proposed concurrent optimization method can automatically distribute base materials with distinct Poisson's ratios between the macrostructural and microstructural levels under a single constraint of the total mass.

  3. Stochastic and Deterministic Models for the Metastatic Emission Process: Formalisms and Crosslinks.

    PubMed

    Gomez, Christophe; Hartung, Niklas

    2018-01-01

    Although the detection of metastases radically changes prognosis of and treatment decisions for a cancer patient, clinically undetectable micrometastases hamper a consistent classification into localized or metastatic disease. This chapter discusses mathematical modeling efforts that could help to estimate the metastatic risk in such a situation. We focus on two approaches: (1) a stochastic framework describing metastatic emission events at random times, formalized via Poisson processes, and (2) a deterministic framework describing the micrometastatic state through a size-structured density function in a partial differential equation model. Three aspects are addressed in this chapter. First, a motivation for the Poisson process framework is presented and modeling hypotheses and mechanisms are introduced. Second, we extend the Poisson model to account for secondary metastatic emission. Third, we highlight an inherent crosslink between the stochastic and deterministic frameworks and discuss its implications. For increased accessibility the chapter is split into an informal presentation of the results using a minimum of mathematical formalism and a rigorous mathematical treatment for more theoretically interested readers.

  4. Fractional Poisson-Nernst-Planck Model for Ion Channels I: Basic Formulations and Algorithms.

    PubMed

    Chen, Duan

    2017-11-01

    In this work, we propose a fractional Poisson-Nernst-Planck model to describe ion permeation in gated ion channels. Due to the intrinsic conformational changes, crowdedness in narrow channel pores, and binding and trapping introduced by functioning units of channel proteins, ionic transport in the channel exhibits power-law-like anomalous diffusion dynamics. We start from a continuous-time random walk model for a single ion and use a long-tailed density distribution function for the particle jump waiting time to derive the fractional Fokker-Planck equation. Then, it is generalized to the macroscopic fractional Poisson-Nernst-Planck model for ionic concentrations. Necessary computational algorithms are designed to implement numerical simulations for the proposed model, and the dynamics of the gating current is investigated. Numerical simulations show that the fractional PNP model provides a more qualitatively reasonable match to the profile of gating currents from experimental observations. Meanwhile, the proposed model motivates new challenges in terms of mathematical modeling and computations.

  5. Poisson Mixture Regression Models for Heart Disease Prediction.

    PubMed

    Mufudza, Chipo; Erol, Hamza

    2016-01-01

    Early heart disease control can be achieved by high disease prediction and diagnosis efficiency. This paper focuses on the use of model-based clustering techniques to predict and diagnose heart disease via Poisson mixture regression models. Analysis and application of Poisson mixture regression models is here addressed under two different classes: standard and concomitant variable mixture regression models. Results show that a two-component concomitant variable Poisson mixture regression model predicts heart disease better than both the standard Poisson mixture regression model and the ordinary general linear Poisson regression model due to its low Bayesian Information Criterion value. Furthermore, a zero-inflated Poisson mixture regression model turned out to be the best model overall for heart disease prediction, as it both clusters individuals into high- or low-risk categories and predicts the rate of heart disease componentwise within the available clusters. It is deduced that heart disease prediction can be done effectively by identifying the major risks componentwise using a Poisson mixture regression model.
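    The clustering step of such a mixture model assigns each count a posterior membership probability in the high- or low-risk component; this responsibility computation is the E-step of the EM algorithm commonly used to fit Poisson mixtures. A sketch with illustrative parameters:

```python
import math

def poisson_pmf(y, lam):
    """Ordinary Poisson probability mass function."""
    return math.exp(-lam) * lam ** y / math.factorial(y)

def responsibilities(y, weights, lams):
    """Posterior component-membership probabilities for count y under a
    Poisson mixture (the E-step of EM); weights must sum to one."""
    joint = [w * poisson_pmf(y, lam) for w, lam in zip(weights, lams)]
    total = sum(joint)
    return [j / total for j in joint]

# a count of 8 is far more plausible under the high-rate component
# (component rates 2.0 and 9.0 are illustrative, not fitted values)
r = responsibilities(8, [0.5, 0.5], [2.0, 9.0])
```

    Iterating this E-step with weight and rate updates (the M-step) fits the mixture; the concomitant-variable variant additionally lets the weights depend on covariates.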

  6. Poisson Mixture Regression Models for Heart Disease Prediction

    PubMed Central

    Erol, Hamza

    2016-01-01

    Early heart disease control can be achieved by high disease prediction and diagnosis efficiency. This paper focuses on the use of model-based clustering techniques to predict and diagnose heart disease via Poisson mixture regression models. Analysis and application of Poisson mixture regression models is here addressed under two different classes: standard and concomitant variable mixture regression models. Results show that a two-component concomitant variable Poisson mixture regression model predicts heart disease better than both the standard Poisson mixture regression model and the ordinary general linear Poisson regression model due to its low Bayesian Information Criterion value. Furthermore, a zero-inflated Poisson mixture regression model turned out to be the best model overall for heart disease prediction, as it both clusters individuals into high- or low-risk categories and predicts the rate of heart disease componentwise within the available clusters. It is deduced that heart disease prediction can be done effectively by identifying the major risks componentwise using a Poisson mixture regression model. PMID:27999611

  7. Percolation and Reinforcement on Complex Networks

    NASA Astrophysics Data System (ADS)

    Yuan, Xin

    Complex networks appear in almost every aspect of our daily life and are widely studied in the fields of physics, mathematics, finance, biology and computer science. This work utilizes percolation theory in statistical physics to explore the percolation properties of complex networks and develops a reinforcement scheme for improving network resilience. This dissertation covers two major parts of my Ph.D. research on complex networks: i) probe--in the context of both traditional percolation and k-core percolation--the resilience of complex networks with tunable degree distributions or directed dependency links under random, localized or targeted attacks; ii) develop and propose a reinforcement scheme to eradicate the catastrophic collapses that occur very often in interdependent networks. We first use generating function and probabilistic methods to obtain analytical solutions for percolation properties of interest, such as the giant component size and the critical occupation probability. We study uncorrelated random networks with Poisson, bi-Poisson, power-law, and Kronecker-delta degree distributions and construct these networks using the configuration model. The computer simulation results show remarkable agreement with theoretical predictions. We discover an increase of network robustness as the degree distribution broadens and a decrease of network robustness as directed dependency links come into play under random attacks. We also find that targeted attacks exert the biggest damage to the structure of both single and interdependent networks in k-core percolation. To strengthen the resilience of interdependent networks, we develop and propose a reinforcement strategy and obtain the critical amount of reinforced nodes analytically for interdependent Erdős-Rényi networks and numerically for scale-free and random regular networks. Our mechanism leads to improved network stability of the West U.S. power grid.
This dissertation provides us with a deeper understanding of the effects of structural features on network stability and fresher insights into designing resilient interdependent infrastructure networks.
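
    For the Poisson (Erdős-Rényi) degree distribution mentioned above, the generating-function approach reduces to a one-line self-consistency equation for the giant component fraction, S = 1 - exp(-cS), which can be solved by fixed-point iteration. This is a sketch of the standard result, not the dissertation's code:

```python
import numpy as np

def giant_component_size(c, iters=1000):
    """Fraction S of nodes in the giant component of an uncorrelated
    random network with Poisson degree distribution of mean c, from
    the self-consistency relation S = 1 - exp(-c * S)."""
    S = 1.0
    for _ in range(iters):
        S = 1.0 - np.exp(-c * S)
    return S

print(giant_component_size(0.5))  # below the threshold c = 1: S -> 0
print(giant_component_size(2.0))  # above the threshold: S ≈ 0.797
```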

  8. Soft network materials with isotropic negative Poisson's ratios over large strains.

    PubMed

    Liu, Jianxing; Zhang, Yihui

    2018-01-31

    Auxetic materials with negative Poisson's ratios have important applications across a broad range of engineering areas, such as biomedical devices, aerospace engineering and automotive engineering. A variety of design strategies have been developed to achieve artificial auxetic materials with controllable responses in the Poisson's ratio. The development of designs that can offer isotropic negative Poisson's ratios over large strains can open up new opportunities in emerging biomedical applications, which, however, remains a challenge. Here, we introduce deterministic routes to soft architected materials that can be tailored precisely to yield the values of Poisson's ratio in the range from -1 to 1, in an isotropic manner, with a tunable strain range from 0% to ∼90%. The designs rely on a network construction in a periodic lattice topology, which incorporates zigzag microstructures as building blocks to connect lattice nodes. Combined experimental and theoretical studies on broad classes of network topologies illustrate the wide-ranging utility of these concepts. Quantitative mechanics modeling under both infinitesimal and finite deformations allows the development of a rigorous design algorithm that determines the necessary network geometries to yield target Poisson ratios over desired strain ranges. Demonstrative examples in artificial skin with both the negative Poisson's ratio and the nonlinear stress-strain curve precisely matching those of the cat's skin and in unusual cylindrical structures with engineered Poisson effect and shape memory effect suggest potential applications of these network materials.

  9. Statistical properties of a filtered Poisson process with additive random noise: distributions, correlations and moment estimation

    NASA Astrophysics Data System (ADS)

    Theodorsen, A.; Garcia, O. E.; Rypdal, M.

    2017-05-01

    Filtered Poisson processes are often used as reference models for intermittent fluctuations in physical systems. Such a process is here extended by adding a noise term, either as a purely additive term to the process or as a dynamical term in a stochastic differential equation. The lowest order moments, probability density function, auto-correlation function and power spectral density are derived and used to identify and compare the effects of the two different noise terms. Monte-Carlo studies of synthetic time series are used to investigate the accuracy of model parameter estimation and to identify methods for distinguishing the noise types. It is shown that the probability density function and the three lowest order moments provide accurate estimations of the model parameters, but are unable to separate the noise types. The auto-correlation function and the power spectral density also provide methods for estimating the model parameters, as well as being capable of identifying the noise type. The number of times the signal crosses a prescribed threshold level in the positive direction also promises to be able to differentiate the noise type.
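
    A filtered Poisson process of the kind described, with one-sided exponential pulses and a purely additive Gaussian noise term, can be synthesised directly; all parameter values below are illustrative. The sample mean should approach the stationary value (rate times pulse duration times mean amplitude):

```python
import numpy as np

rng = np.random.default_rng(1)
dt, T = 0.01, 500.0
t = np.arange(0.0, T, dt)
rate, tau, amp_mean, noise_sd = 2.0, 1.0, 1.0, 0.2  # illustrative parameters

# Poisson arrival times with exponentially distributed amplitudes
n_events = rng.poisson(rate * T)
arrivals = rng.uniform(0.0, T, n_events)
amps = rng.exponential(amp_mean, n_events)

# superpose one-sided exponential pulses
signal = np.zeros_like(t)
for tk, a in zip(arrivals, amps):
    mask = t >= tk
    signal[mask] += a * np.exp(-(t[mask] - tk) / tau)

# purely additive Gaussian noise term
observed = signal + rng.normal(0.0, noise_sd, t.size)
print(observed.mean())  # theory: rate * tau * amp_mean = 2.0
```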

  10. Lindley frailty model for a class of compound Poisson processes

    NASA Astrophysics Data System (ADS)

    Kadilar, Gamze Özel; Ata, Nihal

    2013-10-01

    The Lindley distribution gains importance in survival analysis owing to its similarity to the exponential distribution and its allowance for different shapes of the hazard function. Frailty models provide an alternative to the proportional hazards model in which misspecified or omitted covariates are described by an unobservable random variable. Although the frailty distribution is generally assumed to be continuous, it is appropriate to consider discrete frailty distributions in some circumstances. In this paper, frailty models with a discrete compound Poisson process for Lindley-distributed failure times are introduced. Survival functions are derived and maximum likelihood estimation procedures for the parameters are studied. Then, the fit of the models to an earthquake data set from Turkey is examined.
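
    For simulation studies of such models, the Lindley distribution is conveniently sampled as a two-component mixture: with probability theta/(theta+1) draw from Exp(theta), otherwise from Gamma(2, theta). The value theta = 1.5 below is an arbitrary illustrative choice:

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n = 1.5, 200_000

# Lindley(theta) as a mixture of Exp(theta) and Gamma(2, theta)
take_exp = rng.random(n) < theta / (theta + 1.0)
x = np.where(take_exp,
             rng.exponential(1.0 / theta, n),
             rng.gamma(2.0, 1.0 / theta, n))

print(x.mean())  # theory: (theta + 2) / (theta * (theta + 1)) ≈ 0.933
```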

  11. Intervention-Based Stochastic Disease Eradication

    NASA Astrophysics Data System (ADS)

    Billings, Lora; Mier-Y-Teran-Romero, Luis; Lindley, Brandon; Schwartz, Ira

    2013-03-01

    Disease control is of paramount importance in public health with infectious disease extinction as the ultimate goal. Intervention controls, such as vaccination of susceptible individuals and/or treatment of infectives, are typically based on a deterministic schedule, such as periodically vaccinating susceptible children based on school calendars. In reality, however, such policies are administered as a random process, while still possessing a mean period. Here, we consider the effect of randomly distributed intervention as disease control on large finite populations. We show explicitly how intervention control, based on mean period and treatment fraction, modulates the average extinction times as a function of population size and the speed of infection. In particular, our results show an exponential improvement in extinction times even though the controls are implemented using a random Poisson distribution. Finally, we discover those parameter regimes where random treatment yields an exponential improvement in extinction times over the application of strictly periodic intervention. The implication of our results is discussed in light of the availability of limited resources for control. Supported by the National Institute of General Medical Sciences Award No. R01GM090204

  12. Smooth invariant densities for random switching on the torus

    NASA Astrophysics Data System (ADS)

    Bakhtin, Yuri; Hurth, Tobias; Lawley, Sean D.; Mattingly, Jonathan C.

    2018-04-01

    We consider a random dynamical system obtained by switching between the flows generated by two smooth vector fields on the 2d-torus, with the random switchings happening according to a Poisson process. Assuming that the driving vector fields are transversal to each other at all points of the torus and that each of them allows for a smooth invariant density and no periodic orbits, we prove that the switched system also has a smooth invariant density, for every switching rate. Our approach is based on an integration by parts formula inspired by techniques from Malliavin calculus.

  13. Intermediate quantum maps for quantum computation

    NASA Astrophysics Data System (ADS)

    Giraud, O.; Georgeot, B.

    2005-10-01

    We study quantum maps displaying spectral statistics intermediate between Poisson and Wigner-Dyson. It is shown that they can be simulated on a quantum computer with a small number of gates, and efficiently yield information about fidelity decay or spectral statistics. We study their matrix elements and entanglement production and show that they converge with time to distributions which differ from random matrix predictions. A randomized version of these maps can be implemented even more economically and yields pseudorandom operators with original properties, enabling, for example, one to produce fractal random vectors. These algorithms are within reach of present-day quantum computers.

  14. Negative Binomial Process Count and Mixture Modeling.

    PubMed

    Zhou, Mingyuan; Carin, Lawrence

    2015-02-01

    The seemingly disjoint problems of count and mixture modeling are united under the negative binomial (NB) process. A gamma process is employed to model the rate measure of a Poisson process, whose normalization provides a random probability measure for mixture modeling and whose marginalization leads to an NB process for count modeling. A draw from the NB process consists of a Poisson distributed finite number of distinct atoms, each of which is associated with a logarithmic distributed number of data samples. We reveal relationships between various count- and mixture-modeling distributions and construct a Poisson-logarithmic bivariate distribution that connects the NB and Chinese restaurant table distributions. Fundamental properties of the models are developed, and we derive efficient Bayesian inference. It is shown that with augmentation and normalization, the NB process and gamma-NB process can be reduced to the Dirichlet process and hierarchical Dirichlet process, respectively. These relationships highlight theoretical, structural, and computational advantages of the NB process. A variety of NB processes, including the beta-geometric, beta-NB, marked-beta-NB, marked-gamma-NB and zero-inflated-NB processes, with distinct sharing mechanisms, are also constructed. These models are applied to topic modeling, with connections made to existing algorithms under Poisson factor analysis. Example results show the importance of inferring both the NB dispersion and probability parameters.
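
    The core construction above — a gamma-distributed rate mixed over a Poisson count yielding a negative binomial marginal — can be checked numerically in a few lines; the shape and scale values are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(3)
shape, scale = 3.0, 2.0                  # illustrative gamma parameters
lam = rng.gamma(shape, scale, 100_000)   # random Poisson rates
counts = rng.poisson(lam)                # marginally negative binomial

# NB(r = shape, p = 1/(1 + scale)): mean r*scale, variance r*scale*(1 + scale)
print(counts.mean())  # ≈ 6
print(counts.var())   # ≈ 18, overdispersed relative to Poisson
```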

  15. On Models for Binomial Data with Random Numbers of Trials

    PubMed Central

    Comulada, W. Scott; Weiss, Robert E.

    2010-01-01

    Summary A binomial outcome is a count s of the number of successes out of the total number of independent trials n = s + f, where f is a count of the failures. The n are random variables not fixed by design in many studies. Joint modeling of (s, f) can provide additional insight into the science and into the probability π of success that cannot be directly incorporated by the logistic regression model. Observations where n = 0 are excluded from the binomial analysis yet may be important to understanding how π is influenced by covariates. Correlation between s and f may exist and be of direct interest. We propose Bayesian multivariate Poisson models for the bivariate response (s, f), correlated through random effects. We extend our models to the analysis of longitudinal and multivariate longitudinal binomial outcomes. Our methodology was motivated by two disparate examples, one from teratology and one from an HIV tertiary intervention study. PMID:17688514
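
    A minimal sketch of the proposed construction: correlating the success and failure counts (s, f) through a shared multiplicative random effect. A log-normal effect and the rates below are used purely for illustration; the paper's models are Bayesian multivariate Poisson models:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 50_000
# a shared random effect u induces correlation between s and f
u = rng.lognormal(0.0, 0.5, n)
s = rng.poisson(2.0 * u)  # successes
f = rng.poisson(1.0 * u)  # failures

corr = np.corrcoef(s, f)[0, 1]
print(corr)  # positive, driven entirely by the shared effect u
```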

  16. Post-stratification sampling in small area estimation (SAE) model for unemployment rate estimation by Bayes approach

    NASA Astrophysics Data System (ADS)

    Hanike, Yusrianti; Sadik, Kusman; Kurnia, Anang

    2016-02-01

    This research modeled the unemployment rate in Indonesia based on the Poisson distribution, estimated through a modified post-stratification and Small Area Estimation (SAE) model. Post-stratification is a sampling technique in which strata are formed after the survey data have been collected; it is used when the survey data do not directly support estimation for the area of interest. The area of interest here was unemployment by education level, separated into seven categories. The data were obtained from the National Labour Force Survey (Sakernas) collected by BPS, Statistics Indonesia. This national survey yields samples that are too small at the district level, and SAE models are one alternative for addressing this; accordingly, we combined post-stratification sampling with an SAE model. Two main post-stratification models were specified: Model I treated the education category as a dummy variable, while Model II treated it as an area random effect. Both models initially violated the Poisson assumption. Using a Poisson-Gamma model, the overdispersion in Model I was reduced from 1.23 to 0.91 chi-square/df, and the underdispersion in Model II was corrected from 0.35 to 0.94 chi-square/df. Empirical Bayes was applied to estimate the proportion of unemployment in each education category. By the Bayesian Information Criterion (BIC), Model I had a smaller mean square error (MSE) than Model II.
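
    The over/underdispersion diagnostic used above (Pearson chi-square per degree of freedom) is straightforward to compute; the data below are synthetic and illustrative, not the Sakernas data:

```python
import numpy as np

def pearson_dispersion(y, mu, n_params=1):
    """Pearson chi-square / df for a Poisson fit with fitted means mu.
    Values well above 1 indicate overdispersion, well below 1
    underdispersion."""
    chi2 = np.sum((y - mu) ** 2 / mu)
    return chi2 / (len(y) - n_params)

rng = np.random.default_rng(5)
mu = np.full(500, 4.0)
y_equi = rng.poisson(mu)                           # equidispersed counts
y_over = rng.negative_binomial(2, 1.0 / 3.0, 500)  # mean 4, variance 12

d_equi = pearson_dispersion(y_equi, mu)
d_over = pearson_dispersion(y_over, mu)
print(d_equi)  # ≈ 1
print(d_over)  # ≈ 3
```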

  17. Minimum risk wavelet shrinkage operator for Poisson image denoising.

    PubMed

    Cheng, Wu; Hirakawa, Keigo

    2015-05-01

    The pixel values of images taken by an image sensor are said to be corrupted by Poisson noise. To date, multiscale Poisson image denoising techniques have processed Haar frame and wavelet coefficients, with the modeling of coefficients enabled by Skellam distribution analysis. We extend these results by solving for shrinkage operators for the Skellam distribution that minimize the risk functional in the multiscale Poisson image denoising setting. The minimum risk shrinkage operator of this kind effectively produces denoised wavelet coefficients with the minimum attainable L2 error.
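
    The Skellam distribution enters because the difference of two independent Poisson variables — as arises for Haar detail coefficients of Poisson data — is Skellam distributed, which is easy to verify numerically (the rates mu1, mu2 are illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
mu1, mu2 = 5.0, 3.0
n = 200_000
# difference of two independent Poisson counts, as in a Haar detail coefficient
d = rng.poisson(mu1, n) - rng.poisson(mu2, n)

ks = np.arange(-5, 15)
emp = np.array([(d == k).mean() for k in ks])
theo = stats.skellam.pmf(ks, mu1, mu2)
err = np.abs(emp - theo).max()
print(err)  # small: the empirical pmf matches the Skellam pmf
```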

  18. On a Stochastic Failure Model under Random Shocks

    NASA Astrophysics Data System (ADS)

    Cha, Ji Hwan

    2013-02-01

    In most conventional settings, the events caused by an external shock are initiated at the moment of its occurrence. In this paper, we study a new class of shock models, where each shock from a nonhomogeneous Poisson process can trigger a failure of a system not immediately, as in classical extreme shock models, but after a delay of some random time. We derive the corresponding survival and failure rate functions. Furthermore, we study the limiting behaviour of the failure rate function where it is applicable.

  19. A Tutorial on Multilevel Survival Analysis: Methods, Models and Applications

    PubMed Central

    Austin, Peter C.

    2017-01-01

    Summary Data that have a multilevel structure occur frequently across a range of disciplines, including epidemiology, health services research, public health, education and sociology. We describe three families of regression models for the analysis of multilevel survival data. First, Cox proportional hazards models with mixed effects incorporate cluster-specific random effects that modify the baseline hazard function. Second, piecewise exponential survival models partition the duration of follow-up into mutually exclusive intervals and fit a model that assumes that the hazard function is constant within each interval. This is equivalent to a Poisson regression model that incorporates the duration of exposure within each interval. By incorporating cluster-specific random effects, generalised linear mixed models can be used to analyse these data. Third, after partitioning the duration of follow-up into mutually exclusive intervals, one can use discrete time survival models that use a complementary log–log generalised linear model to model the occurrence of the outcome of interest within each interval. Random effects can be incorporated to account for within-cluster homogeneity in outcomes. We illustrate the application of these methods using data consisting of patients hospitalised with a heart attack. We illustrate the application of these methods using three statistical programming languages (R, SAS and Stata). PMID:29307954

  20. A Tutorial on Multilevel Survival Analysis: Methods, Models and Applications.

    PubMed

    Austin, Peter C

    2017-08-01

    Data that have a multilevel structure occur frequently across a range of disciplines, including epidemiology, health services research, public health, education and sociology. We describe three families of regression models for the analysis of multilevel survival data. First, Cox proportional hazards models with mixed effects incorporate cluster-specific random effects that modify the baseline hazard function. Second, piecewise exponential survival models partition the duration of follow-up into mutually exclusive intervals and fit a model that assumes that the hazard function is constant within each interval. This is equivalent to a Poisson regression model that incorporates the duration of exposure within each interval. By incorporating cluster-specific random effects, generalised linear mixed models can be used to analyse these data. Third, after partitioning the duration of follow-up into mutually exclusive intervals, one can use discrete time survival models that use a complementary log-log generalised linear model to model the occurrence of the outcome of interest within each interval. Random effects can be incorporated to account for within-cluster homogeneity in outcomes. We illustrate the application of these methods using data consisting of patients hospitalised with a heart attack. We illustrate the application of these methods using three statistical programming languages (R, SAS and Stata).
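
    The equivalence noted above between a piecewise exponential survival model and Poisson regression with a log-exposure offset can be illustrated in the one-interval case, where both reduce to hazard = events / total exposure. The hazard 0.2 and the censoring time below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(7)
true_hazard, n = 0.2, 10_000
t_event = rng.exponential(1.0 / true_hazard, n)
censor_time = 4.0                             # administrative censoring
time = np.minimum(t_event, censor_time)       # observed follow-up
event = (t_event <= censor_time).astype(int)  # event indicator

# with a single interval, the Poisson likelihood with offset log(time)
# is maximised at the occurrence/exposure rate:
hazard_hat = event.sum() / time.sum()
print(hazard_hat)  # ≈ 0.2
```

    With several intervals and covariates, the same calculation becomes a Poisson GLM on interval-specific event counts with log person-time as the offset, which is how the multilevel extension is fitted in practice.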

  1. A two-arm cluster randomized control trial to determine the effectiveness of a pressure ulcer prevention bundle for critically ill patients.

    PubMed

    Tayyib, Nahla; Coyer, Fiona; Lewis, Peter A

    2015-05-01

    This study tested the effectiveness of a pressure ulcer (PU) prevention bundle in reducing the incidence of PUs in critically ill patients in two Saudi intensive care units (ICUs). A two-arm cluster randomized experimental control trial. Participants in the intervention group received the PU prevention bundle, while the control group received standard skin care as per the local ICU policies. Data collected included demographic variables (age, diagnosis, comorbidities, admission trajectory, length of stay) and clinical variables (Braden Scale score, severity of organ function score, mechanical ventilation, PU presence, and staging). All patients were followed every two days from admission through to discharge, death, or up to a maximum of 28 days. Data were analyzed with descriptive correlation statistics, Kaplan-Meier survival analysis, and Poisson regression. The total number of participants recruited was 140: 70 control participants (with a total of 728 days of observation) and 70 intervention participants (784 days of observation). PU cumulative incidence was significantly lower in the intervention group (7.14%) compared to the control group (32.86%). Poisson regression revealed the likelihood of PU development was 70% lower in the intervention group. The intervention group had significantly less Stage I (p = .002) and Stage II PU development (p = .026). Significant improvements were observed in PU-related outcomes with the implementation of the PU prevention bundle in the ICU; PU incidence, severity, and total number of PUs per patient were reduced. Utilizing a bundle approach and standardized nursing language through skin assessment and translation of the knowledge to practice has the potential to impact positively on the quality of care and patient outcome. © 2015 Sigma Theta Tau International.

  2. Performance advantages of maximum likelihood methods in PRBS-modulated time-of-flight electron energy loss spectroscopy

    NASA Astrophysics Data System (ADS)

    Yang, Zhongyu

    This thesis describes the design, experimental performance, and theoretical simulation of a novel time-of-flight analyzer that was integrated into a high resolution electron energy loss spectrometer (TOF-HREELS). First we examined the use of an interleaved comb chopper for chopping a continuous electron beam. Both static and dynamic behaviors were simulated theoretically and measured experimentally, with very good agreement. The finite penetration of the field beyond the plane of the chopper leads to non-ideal chopper response, which is characterized in terms of an "energy corruption" effect and a lead or lag in the time at which the beam responds to the chopper potential. Second we considered the recovery of spectra from pseudo-random binary sequence (PRBS) modulated TOF-HREELS data. The effects of the Poisson noise distribution and the non-ideal behavior of the "interleaved comb" chopper were simulated. We showed, for the first time, that maximum likelihood methods can be combined with PRBS modulation to achieve resolution enhancement, while properly accounting for the Poisson noise distribution and artifacts introduced by the chopper. Our results indicate that meV resolution, similar to that of modern high resolution electron energy loss spectrometers, can be achieved with a dramatic performance advantage over conventional, serial detection analyzers. To demonstrate the capabilities of the TOF-HREELS instrument, we made measurements on a highly oriented thin film polytetrafluoroethylene (PTFE) sample. We demonstrated that the TOF-HREELS can achieve a throughput advantage of a factor of 85 compared to the conventional HREELS instrument. Comparisons were made between the experimental results and theoretical simulations. We discuss various factors which affect inversion of PRBS modulated Time of Flight (TOF) data with the Lucy algorithm. Using simulations, we conclude that the convolution assumption was good under the conditions of our experiment. 
The chopper rise time, Poisson noise, and artifacts of the chopper response are evaluated. Finally, we conclude that the maximum likelihood algorithms are able to gain a multiplex advantage in PRBS modulation, despite the Poisson noise in the detector.
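
    The Lucy algorithm referred to above is the Richardson-Lucy scheme: multiplicative updates that maximise the Poisson likelihood for data blurred by a known response. The sketch below uses a toy spectrum and a Gaussian response, not the instrument's actual response function:

```python
import numpy as np

rng = np.random.default_rng(8)
# toy "spectrum" of three lines and a known, normalised response kernel
x_true = np.zeros(64)
x_true[[20, 25, 40]] = [100.0, 60.0, 80.0]
kernel = np.exp(-0.5 * (np.arange(-8, 9) / 2.0) ** 2)
kernel /= kernel.sum()

y = rng.poisson(np.convolve(x_true, kernel, mode="same")).astype(float)

# Richardson-Lucy iterations (Poisson maximum-likelihood deconvolution)
x = np.full_like(y, y.mean())
for _ in range(200):
    est = np.convolve(x, kernel, mode="same")
    ratio = y / np.maximum(est, 1e-12)
    x *= np.convolve(ratio, kernel[::-1], mode="same")

recovered = np.corrcoef(x, x_true)[0, 1]
print(recovered)  # reconstruction correlates strongly with the true lines
```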

  3. Functional response and capture timing in an individual-based model: predation by northern squawfish (Ptychocheilus oregonensis) on juvenile salmonids in the Columbia River

    USGS Publications Warehouse

    Petersen, James H.; DeAngelis, Donald L.

    1992-01-01

    The behavior of individual northern squawfish (Ptychocheilus oregonensis) preying on juvenile salmonids was modeled to address questions about capture rate and the timing of prey captures (random versus contagious). Prey density, predator weight, prey weight, temperature, and diel feeding pattern were first incorporated into predation equations analogous to Holling Type 2 and Type 3 functional response models. Type 2 and Type 3 equations fit field data from the Columbia River equally well, and both models predicted predation rates on five of seven independent dates. Selecting a functional response type may be complicated by variable predation rates, analytical methods, and assumptions of the model equations. Using the Type 2 functional response, random versus contagious timing of prey capture was tested using two related models. In the simpler model, salmon captures were assumed to be controlled by a Poisson renewal process; in the second model, several salmon captures were assumed to occur during brief "feeding bouts", modeled with a compound Poisson process. Salmon captures by individual northern squawfish were clustered through time, rather than random, based on comparison of model simulations and field data. The contagious-feeding result suggests that salmonids may be encountered as patches or schools in the river.
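
    The two capture-timing hypotheses are easy to contrast in simulation: a simple Poisson process gives variance/mean = 1 for capture totals, while the compound Poisson "feeding bout" model is overdispersed. The rates below are illustrative, not fitted to the Columbia River data:

```python
import numpy as np

rng = np.random.default_rng(9)
T, bout_rate, per_bout = 1000.0, 0.3, 3.0  # illustrative parameters

def total_captures():
    """Total captures when bouts arrive as a Poisson process and each
    bout contains a Poisson number of captures (compound Poisson)."""
    n_bouts = rng.poisson(bout_rate * T)
    return rng.poisson(per_bout, n_bouts).sum()

totals = np.array([total_captures() for _ in range(2000)])
ratio = totals.var() / totals.mean()
print(ratio)  # ≈ 4 here; a plain Poisson process would give 1
```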

  4. Modeling and simulation of electronic structure, material interface and random doping in nano electronic devices

    PubMed Central

    Chen, Duan; Wei, Guo-Wei

    2010-01-01

    The miniaturization of nano-scale electronic devices, such as metal oxide semiconductor field effect transistors (MOSFETs), has given rise to a pressing demand in the new theoretical understanding and practical tactic for dealing with quantum mechanical effects in integrated circuits. Modeling and simulation of this class of problems have emerged as an important topic in applied and computational mathematics. This work presents mathematical models and computational algorithms for the simulation of nano-scale MOSFETs. We introduce a unified two-scale energy functional to describe the electrons and the continuum electrostatic potential of the nano-electronic device. This framework enables us to put microscopic and macroscopic descriptions in an equal footing at nano scale. By optimization of the energy functional, we derive consistently-coupled Poisson-Kohn-Sham equations. Additionally, layered structures are crucial to the electrostatic and transport properties of nano transistors. A material interface model is proposed for more accurate description of the electrostatics governed by the Poisson equation. Finally, a new individual dopant model that utilizes the Dirac delta function is proposed to understand the random doping effect in nano electronic devices. Two mathematical algorithms, the matched interface and boundary (MIB) method and the Dirichlet-to-Neumann mapping (DNM) technique, are introduced to improve the computational efficiency of nano-device simulations. Electronic structures are computed via subband decomposition and the transport properties, such as the I-V curves and electron density, are evaluated via the non-equilibrium Green's functions (NEGF) formalism. Two distinct device configurations, a double-gate MOSFET and a four-gate MOSFET, are considered in our three-dimensional numerical simulations. For these devices, the current fluctuation and voltage threshold lowering effect induced by the discrete dopant model are explored. 
Numerical convergence and model well-posedness are also investigated in the present work. PMID:20396650

  5. Use of ceramic water filtration in the prevention of diarrheal disease: a randomized controlled trial in rural South Africa and Zimbabwe.

    PubMed

    du Preez, Martella; Conroy, Ronán M; Wright, James A; Moyo, Sibonginkosi; Potgieter, Natasha; Gundry, Stephen W

    2008-11-01

    To determine the effectiveness of ceramic filters in reducing diarrhea, we conducted a randomized controlled trial in Zimbabwe and South Africa, in which 61 of 115 households received ceramic filters. Incidence of non-bloody and bloody diarrhea was recorded daily over 6 months using pictorial diaries for children 24-36 months of age. Poisson regression was used to compare incidence rates in intervention and control households. Adjusted for source quality, intervention household drinking water showed reduced Escherichia coli counts (relative risk, 0.67; 95% CI, 0.50-0.89). Zero E. coli were obtained for drinking water in 56.9% of intervention households. The incidence rate ratio for bloody diarrhea was 0.20 (95% CI, 0.09-0.43; P < 0.001) and for non-bloody diarrhea was 0.17 (95% CI, 0.08-0.38; P < 0.001), indicating much lower diarrhea incidence among filter users. The results suggest that ceramic filters are effective in reducing diarrheal disease incidence.

  6. Independence of the effective dielectric constant of an electrolytic solution on the ionic distribution in the linear Poisson-Nernst-Planck model.

    PubMed

    Alexe-Ionescu, A L; Barbero, G; Lelidis, I

    2014-08-28

    We consider the influence of the spatial dependence of the ions distribution on the effective dielectric constant of an electrolytic solution. We show that in the linear version of the Poisson-Nernst-Planck model, the effective dielectric constant of the solution has to be considered independent of any ionic distribution induced by the external field. This result follows from the fact that, in the linear approximation of the Poisson-Nernst-Planck model, the redistribution of the ions in the solvent due to the external field gives rise to a variation of the dielectric constant that is of the first order in the effective potential, and therefore it has to be neglected in the Poisson's equation that relates the actual electric potential across the electrolytic cell to the bulk density of ions. The analysis is performed in the case where the electrodes are perfectly blocking and the adsorption at the electrodes is negligible, and in the absence of any ion dissociation-recombination effect.

  7. Replication of Cancellation Orders Using First-Passage Time Theory in Foreign Currency Market

    NASA Astrophysics Data System (ADS)

    Boilard, Jean-François; Kanazawa, Kiyoshi; Takayasu, Hideki; Takayasu, Misako

    Our research focuses on the annihilation dynamics of limit orders in a spot foreign currency market for various currency pairs. We analyze the cancellation order distribution conditioned on the normalized distance from the mid-price, where the normalized distance is defined as the final distance divided by the initial distance. To reproduce the real data, we introduce two simple models that assume the market price moves randomly and cancellation occurs either after a fixed time t or following a Poisson process. Our models qualitatively reproduce the basic statistical properties of cancellation orders in the data when limit orders are cancelled according to the Poisson process. We briefly discuss the implications of our findings for the construction of more detailed microscopic models.

  8. A new method for the construction of a mutant library with a predictable occurrence rate using Poisson distribution.

    PubMed

    Seong, Ki Moon; Park, Hweon; Kim, Seong Jung; Ha, Hyo Nam; Lee, Jae Yung; Kim, Joon

    2007-06-01

    A yeast transcriptional activator, Gcn4p, induces the expression of genes that are involved in amino acid and purine biosynthetic pathways under amino acid starvation. Gcn4p has an acidic activation domain in the central region and a bZIP domain in the C-terminus that is divided into the DNA-binding motif and the dimerization leucine zipper motif. In order to identify amino acids in the DNA-binding motif of Gcn4p that are involved in transcriptional activation, we constructed mutant libraries in the DNA-binding motif through an innovative application of random mutagenesis. A mutant library made from oligonucleotides that were randomly mutated according to the Poisson distribution showed that the actual mutation frequency was in good agreement with expected values. This method saves the time and effort needed to create a mutant library with a predictable mutation frequency. Based on studies using the mutant libraries constructed by the new method, specific residues of the DNA-binding domain in Gcn4p appear to be involved in the transcriptional activities on a conserved binding site.
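
    The predictable occurrence rate follows directly from the Poisson pmf: if each oligonucleotide carries on average m mutations, the expected fraction of clones with exactly k mutations is exp(-m) m^k / k!. A short check with an illustrative m:

```python
import numpy as np
from scipy import stats

m = 1.5  # illustrative mean number of mutations per oligonucleotide
for k in range(4):
    print(k, stats.poisson.pmf(k, m))

wild_type_fraction = stats.poisson.pmf(0, m)  # equals exp(-m)
```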

  9. The origin of bursts and heavy tails in human dynamics.

    PubMed

    Barabási, Albert-László

    2005-05-12

    The dynamics of many social, technological and economic phenomena are driven by individual human actions, turning the quantitative understanding of human behaviour into a central question of modern science. Current models of human dynamics, used from risk assessment to communications, assume that human actions are randomly distributed in time and thus well approximated by Poisson processes. In contrast, there is increasing evidence that the timing of many human activities, ranging from communication to entertainment and work patterns, follows non-Poisson statistics, characterized by bursts of rapidly occurring events separated by long periods of inactivity. Here I show that the bursty nature of human behaviour is a consequence of a decision-based queuing process: when individuals execute tasks based on some perceived priority, the timing of the tasks will be heavy tailed, with most tasks being rapidly executed, whereas a few experience very long waiting times. In contrast, random or priority-blind execution is well approximated by uniform inter-event statistics. These findings have important implications, ranging from resource management to service allocation, in both communications and retail.
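
    The decision-based queuing mechanism can be sketched with a fixed-length priority queue: execute the highest-priority task each step and refill the freed slot. Most tasks are executed immediately, while a few low-priority tasks wait very long, producing the heavy tail. Queue length 2 and uniform priorities are the simplest illustrative choice, not the paper's exact protocol:

```python
import numpy as np

rng = np.random.default_rng(10)
L, steps = 2, 100_000
prio = rng.random(L)              # current task priorities
arrived = np.zeros(L, dtype=int)  # arrival step of each queued task

waits = np.empty(steps, dtype=int)
for step in range(1, steps + 1):
    i = int(prio.argmax())        # always execute the highest priority
    waits[step - 1] = step - arrived[i]
    prio[i] = rng.random()        # a new task fills the freed slot
    arrived[i] = step

# most tasks are executed at once; a few wait extremely long
print((waits == 1).mean(), waits.max())
```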

  10. Properties of plane discrete Poisson-Voronoi tessellations on triangular tiling formed by the Kolmogorov-Johnson-Mehl-Avrami growth of triangular islands

    NASA Astrophysics Data System (ADS)

    Korobov, A.

    2011-08-01

    Discrete uniform Poisson-Voronoi tessellations of two-dimensional triangular tilings resulting from the Kolmogorov-Johnson-Mehl-Avrami (KJMA) growth of triangular islands have been studied. This shape of tiles and islands, rarely considered in the field of random tessellations, is prompted by the birth-growth process of Ir(210) faceting. The growth mode determines a triangular metric different from the Euclidean metric. Kinetic characteristics of tessellations appear to be metric sensitive, in contrast to area distributions. The latter have been studied for the variant of nuclei growth to the first impingement in addition to the conventional case of complete growth. Kiang conjecture works in both cases. The averaged number of neighbors is six for all studied densities of random tessellations, but neighbors appear to be mainly different in triangular and Euclidean metrics. Also, the applicability of the obtained results for simulating birth-growth processes when the 2D nucleation and impingements are combined with the 3D growth in the particular case of similar shape and the same orientation of growing nuclei is briefly discussed.

  12. A fourth order PDE based fuzzy c- means approach for segmentation of microscopic biopsy images in presence of Poisson noise for cancer detection.

    PubMed

    Kumar, Rajesh; Srivastava, Subodh; Srivastava, Rajeev

    2017-07-01

    For cancer detection from microscopic biopsy images, the image segmentation step used to segment cells and nuclei plays an important role, and the accuracy of the segmentation approach dominates the final results. Microscopic biopsy images also carry intrinsic Poisson noise, and if it is present in the image the segmentation results may not be accurate. The objective is to propose an efficient fuzzy c-means based segmentation approach that can also handle the noise present in the image during the segmentation process itself, i.e. noise removal and segmentation are combined in one step. To address the above issues, this paper proposes a fourth order partial differential equation (FPDE) based nonlinear filter adapted to Poisson noise, combined with a fuzzy c-means segmentation method. This approach is capable of effectively handling the segmentation problem of blocky artifacts while achieving a good tradeoff between Poisson noise removal and edge preservation in the microscopic biopsy images during the segmentation process for cancer detection from cells. The proposed approach is tested on a breast cancer microscopic biopsy data set with region of interest (ROI) segmented ground truth images. The microscopic biopsy data set contains 31 benign and 27 malignant images of size 896 × 768. The region of interest selected ground truth of all 58 images is also available for this data set. Finally, the results obtained from the proposed approach are compared with the results of popular segmentation algorithms: fuzzy c-means, color k-means, texture based segmentation, and total variation fuzzy c-means approaches.
The experimental results show that the proposed approach provides better results in terms of various performance measures such as Jaccard coefficient, dice index, Tanimoto coefficient, area under curve, accuracy, true positive rate, true negative rate, false positive rate, false negative rate, random index, global consistency error, and variation of information as compared to other segmentation approaches used for cancer detection. Copyright © 2017 Elsevier B.V. All rights reserved.
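
    Two of the overlap measures used in the comparison are compact enough to state directly. A minimal sketch for binary segmentation masks, with illustrative toy masks rather than the paper's data:

```python
def jaccard(a, b):
    """Jaccard coefficient |A∩B| / |A∪B| between two flat binary masks."""
    inter = sum(1 for x, y in zip(a, b) if x and y)
    union = sum(1 for x, y in zip(a, b) if x or y)
    return inter / union if union else 1.0

def dice(a, b):
    """Dice index 2|A∩B| / (|A| + |B|)."""
    inter = sum(1 for x, y in zip(a, b) if x and y)
    size = sum(a) + sum(b)
    return 2 * inter / size if size else 1.0

pred = [1, 1, 0, 1, 0, 0]   # hypothetical predicted mask
truth = [1, 0, 0, 1, 1, 0]  # hypothetical ground-truth mask
```

    The two measures are monotonically related (Dice = 2J / (1 + J)), which is why they usually rank segmentation methods in the same order.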

  13. This is SPIRAL-TAP: Sparse Poisson Intensity Reconstruction ALgorithms--theory and practice.

    PubMed

    Harmany, Zachary T; Marcia, Roummel F; Willett, Rebecca M

    2012-03-01

    Observations in many applications consist of counts of discrete events, such as photons hitting a detector, which cannot be effectively modeled using an additive bounded or Gaussian noise model, and instead require a Poisson noise model. As a result, accurate reconstruction of a spatially or temporally distributed phenomenon (f*) from Poisson data (y) cannot be effectively accomplished by minimizing a conventional penalized least-squares objective function. The problem addressed in this paper is the estimation of f* from y in an inverse problem setting, where the number of unknowns may potentially be larger than the number of observations and f* admits sparse approximation. The optimization formulation considered in this paper uses a penalized negative Poisson log-likelihood objective function with nonnegativity constraints (since Poisson intensities are naturally nonnegative). In particular, the proposed approach incorporates key ideas of using separable quadratic approximations to the objective function at each iteration and penalization terms related to l1 norms of coefficient vectors, total variation seminorms, and partition-based multiscale estimation methods.
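
    The penalized objective can be illustrated with a heavily simplified sketch: an identity forward operator, an l1 penalty (which for nonnegative f is just a weighted sum), and a plain projected gradient step in place of SPIRAL's separable quadratic approximations. All parameter values are assumed:

```python
from math import log

def objective(f, y, tau):
    """Penalized negative Poisson log-likelihood (identity forward operator,
    constant terms dropped) plus an l1 penalty; f must stay positive."""
    return sum(f) - sum(yi * log(fi) for yi, fi in zip(y, f)) + tau * sum(f)

def gradient_step(f, y, tau, step=0.1, eps=1e-8):
    """One projected gradient step enforcing the nonnegativity constraint."""
    grad = [1 - yi / fi + tau for yi, fi in zip(y, f)]
    return [max(fi - step * g, eps) for fi, g in zip(f, grad)]

y = [2.0, 0.0, 5.0]   # toy Poisson counts
f = [1.0, 1.0, 1.0]   # initial intensity estimate
before = objective(f, y, 0.1)
f = gradient_step(f, y, 0.1)
after = objective(f, y, 0.1)
```

    Even this crude step decreases the penalized objective while keeping the iterate strictly positive, which is the basic behavior the SPIRAL iterations refine.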

  14. On the estimation variance for the specific Euler-Poincaré characteristic of random networks.

    PubMed

    Tscheschel, A; Stoyan, D

    2003-07-01

    The specific Euler number is an important topological characteristic in many applications. It is considered here for the case of random networks, which may appear in microscopy either as primary objects of investigation or as secondary objects describing in an approximate way other structures such as, for example, porous media. For random networks there is a simple and natural estimator of the specific Euler number. For its estimation variance, a simple Poisson approximation is given. It is based on the general exact formula for the estimation variance. Application of the formulas is demonstrated in two examples of quite different nature and topology.

  15. Approximating SIR-B response characteristics and estimating wave height and wavelength for ocean imagery

    NASA Technical Reports Server (NTRS)

    Tilley, David G.

    1987-01-01

    NASA Space Shuttle Challenger SIR-B ocean scenes are used to derive directional wave spectra for which speckle noise is modeled as a function of Rayleigh random phase coherence downrange and Poisson random amplitude errors inherent in the Doppler measurement of along-track position. A Fourier filter that preserves SIR-B image phase relations is used to correct the stationary and dynamic response characteristics of the remote sensor and scene correlator, as well as to subtract an estimate of the speckle noise component. A two-dimensional map of sea surface elevation is obtained after the filtered image is corrected for both random and deterministic motions.

  16. Generalized Poisson-Kac Processes: Basic Properties and Implications in Extended Thermodynamics and Transport

    NASA Astrophysics Data System (ADS)

    Giona, Massimiliano; Brasiello, Antonio; Crescitelli, Silvestro

    2016-04-01

    We introduce a new class of stochastic processes in R^n, referred to as generalized Poisson-Kac (GPK) processes, that generalizes the Poisson-Kac telegrapher's random motion in higher dimensions. These stochastic processes possess finite propagation velocity, almost everywhere smooth trajectories, and converge in the Kac limit to Brownian motion. GPK processes are defined by coupling the selection of a bounded velocity vector from a family of N distinct ones with a Markovian dynamics controlling probabilistically this selection. This model can be used as a probabilistic tool for a stochastically consistent formulation of extended thermodynamic theories far from equilibrium.
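
    In one dimension a GPK process reduces to the classical Poisson-Kac telegraph motion, whose defining property of finite propagation velocity is easy to check numerically. The speed, switching rate, and horizon below are illustrative values:

```python
import random

random.seed(0)

def telegraph_path(c=1.0, rate=2.0, t_end=10.0):
    """1D Poisson-Kac (telegraph) motion: constant speed c, with direction
    reversals at exponentially distributed (Poisson-paced) times."""
    t, x, v = 0.0, 0.0, c
    xs = [x]
    while t < t_end:
        dt = min(random.expovariate(rate), t_end - t)
        x += v * dt
        v = -v
        t += dt
        xs.append(x)
    return xs

path = telegraph_path()
```

    Unlike Brownian motion, every realization stays inside the light cone |x(t)| ≤ c·t, which is the finite-propagation-velocity property stated in the abstract.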

  17. Generating clustered scale-free networks using Poisson based localization of edges

    NASA Astrophysics Data System (ADS)

    Türker, İlker

    2018-05-01

    We introduce a variety of network models using a Poisson-based edge localization strategy, which result in clustered scale-free topologies. We first verify the success of our localization strategy by realizing a variant of the well-known Watts-Strogatz model with an inverse approach, implying a small-world regime of rewiring from a random network through a regular one. We then apply the rewiring strategy to a pure Barabasi-Albert model and successfully achieve a small-world regime, with a limited degree of the scale-free property. To imitate the high clustering property of scale-free networks with higher accuracy, we adapted the Poisson-based wiring strategy to a growing network with the ingredients of both preferential attachment and local connectivity. To achieve the collocation of these properties, we used a routine of flattening the edges array, sorting it, and applying a mixing procedure to assemble both global connections with preferential attachment and local clusters. As a result, we achieved clustered scale-free networks in a computational fashion, diverging from recent studies by following a simple but efficient approach.

  18. Note on the coefficient of variations of neuronal spike trains.

    PubMed

    Lengler, Johannes; Steger, Angelika

    2017-08-01

    It is known that many neurons in the brain show spike trains with a coefficient of variation (CV) of the interspike times of approximately 1, thus resembling the properties of Poisson spike trains. Computational studies have been able to reproduce this phenomenon. However, the underlying models were too complex to be examined analytically. In this paper, we offer a simple model that shows the same effect but is accessible to an analytic treatment. The model is a random walk model with a reflecting barrier; we give explicit formulas for the CV in the regime of excess inhibition. We also analyze the effect of probabilistic synapses in our model and show that it resembles previous findings that were obtained by simulation.
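
    A minimal version of such a reflecting-barrier model can be simulated directly. The excitation probability and threshold below are assumed values; a "spike" fires whenever the walk reaches the threshold, after which it resets:

```python
import random
from statistics import mean, stdev

random.seed(42)

def spike_intervals(p_exc=0.52, theta=20, n_spikes=300):
    """Random walk with a reflecting barrier at 0: step +1 with probability
    p_exc, -1 otherwise; a spike fires when the walk reaches theta, then the
    walk and the step counter reset."""
    intervals, v, steps = [], 0, 0
    while len(intervals) < n_spikes:
        steps += 1
        v += 1 if random.random() < p_exc else -1
        v = max(v, 0)          # reflecting barrier
        if v >= theta:
            intervals.append(steps)
            v, steps = 0, 0
    return intervals

isi = spike_intervals()
cv = stdev(isi) / mean(isi)
```

    In the excess-inhibition regime (p_exc only slightly above 1/2) the interspike intervals become highly variable, pushing the CV toward the Poisson-like value of 1 discussed in the abstract.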

  19. Implementation of a quantum random number generator based on the optimal clustering of photocounts

    NASA Astrophysics Data System (ADS)

    Balygin, K. A.; Zaitsev, V. I.; Klimov, A. N.; Kulik, S. P.; Molotkov, S. N.

    2017-10-01

    To implement quantum random number generators, it is fundamentally important to have a mathematically provable and experimentally testable process of measurements of a system from which an initial random sequence is generated. This makes sure that randomness indeed has a quantum nature. A quantum random number generator has been implemented with the use of the detection of quasi-single-photon radiation by a silicon photomultiplier (SiPM) matrix, which makes it possible to reliably reach the Poisson statistics of photocounts. The choice and use of the optimal clustering of photocounts for the initial sequence of photodetection events and a method of extraction of a random sequence of 0's and 1's, which is polynomial in the length of the sequence, have made it possible to reach an output rate of 64 Mbit/s for the certified random sequence.
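
    The extraction step can be illustrated with the classic von Neumann debiasing procedure, which turns independent but biased bits into unbiased ones. Note this is a generic textbook sketch, not the polynomial clustering-based extractor used in the paper:

```python
def von_neumann_extract(bits):
    """Classic von Neumann debiasing: consume non-overlapping bit pairs,
    emit the first bit when the pair differs, discard equal pairs."""
    out = []
    for a, b in zip(bits[::2], bits[1::2]):
        if a != b:
            out.append(a)
    return out

raw = [0, 0, 1, 0, 1, 1, 0, 1]   # toy biased input bits
unbiased = von_neumann_extract(raw)
```

    For independent bits with any fixed bias, P(01) = P(10), so the emitted bits are exactly uniform; the price is a reduced output rate, which motivates the more efficient extractors used in practice.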

  20. Application of the Hyper-Poisson Generalized Linear Model for Analyzing Motor Vehicle Crashes.

    PubMed

    Khazraee, S Hadi; Sáez-Castillo, Antonio Jose; Geedipally, Srinivas Reddy; Lord, Dominique

    2015-05-01

    The hyper-Poisson distribution can handle both over- and underdispersion, and its generalized linear model formulation allows the dispersion of the distribution to be observation-specific and dependent on model covariates. This study's objective is to examine the potential applicability of a newly proposed generalized linear model framework for the hyper-Poisson distribution in analyzing motor vehicle crash count data. The hyper-Poisson generalized linear model was first fitted to intersection crash data from Toronto, characterized by overdispersion, and then to crash data from railway-highway crossings in Korea, characterized by underdispersion. The results of this study are promising. When fitted to the Toronto data set, the goodness-of-fit measures indicated that the hyper-Poisson model with a variable dispersion parameter provided a statistical fit as good as the traditional negative binomial model. The hyper-Poisson model was also successful in handling the underdispersed data from Korea; the model performed as well as the gamma probability model and the Conway-Maxwell-Poisson model previously developed for the same data set. The advantages of the hyper-Poisson model studied in this article are noteworthy. Unlike the negative binomial model, which has difficulties in handling underdispersed data, the hyper-Poisson model can handle both over- and underdispersed crash data. Although not a major issue for the Conway-Maxwell-Poisson model, the effect of each variable on the expected mean of crashes is easily interpretable in the case of this new model. © 2014 Society for Risk Analysis.
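
    The over/underdispersion distinction that motivates the hyper-Poisson model can be checked with a simple variance-to-mean ratio. The count vectors below are toy data, not the Toronto or Korea data sets:

```python
from statistics import mean, variance

def dispersion_index(counts):
    """Sample variance-to-mean ratio: about 1 for Poisson-like counts,
    > 1 indicates overdispersion, < 1 underdispersion."""
    return variance(counts) / mean(counts)

over = [0, 0, 1, 0, 9, 0, 7, 0, 0, 8]    # clumped counts (overdispersed)
under = [2, 3, 2, 3, 2, 3, 2, 3, 2, 3]   # very regular counts (underdispersed)
```

    A ratio well above 1 points toward negative binomial-type models, while a ratio below 1 calls for a model such as the hyper-Poisson or Conway-Maxwell-Poisson that can accommodate underdispersion.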

  1. Disease Mapping of Zero-excessive Mesothelioma Data in Flanders

    PubMed Central

    Neyens, Thomas; Lawson, Andrew B.; Kirby, Russell S.; Nuyts, Valerie; Watjou, Kevin; Aregay, Mehreteab; Carroll, Rachel; Nawrot, Tim S.; Faes, Christel

    2016-01-01

    Purpose: To investigate the distribution of mesothelioma in Flanders using Bayesian disease mapping models that account for both an excess of zeros and overdispersion. Methods: The numbers of newly diagnosed mesothelioma cases within all Flemish municipalities between 1999 and 2008 were obtained from the Belgian Cancer Registry. To deal with overdispersion, zero-inflation and geographical association, the hurdle combined model was proposed, which has three components: a Bernoulli zero-inflation mixture component to account for excess zeros, a gamma random effect to adjust for overdispersion and a normal conditional autoregressive random effect to attribute spatial association. This model was compared with other existing methods in literature. Results: The results indicate that hurdle models with a random effects term accounting for extra-variance in the Bernoulli zero-inflation component fit the data better than hurdle models that do not take overdispersion in the occurrence of zeros into account. Furthermore, traditional models that do not take into account excessive zeros but contain at least one random effects term that models extra-variance in the counts have better fits compared to their hurdle counterparts. In other words, the extra-variability, due to an excess of zeros, can be accommodated by spatially structured and/or unstructured random effects in a Poisson model such that the hurdle mixture model is not necessary. Conclusions: Models taking into account zero-inflation do not always provide better fits to data with excessive zeros than less complex models. In this study, a simple conditional autoregressive model identified a cluster in mesothelioma cases near a former asbestos processing plant (Kapelle-op-den-Bos). This observation is likely linked with historical local asbestos exposures. Future research will clarify this. PMID:27908590
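
    The hurdle construction used in these models is compact enough to state directly: a point mass at zero plus a scaled zero-truncated Poisson for the positive counts. The parameter values below are illustrative, not fitted to the Flemish data:

```python
from math import exp, factorial

def hurdle_poisson_pmf(k, p_zero, lam):
    """Hurdle model: P(0) = p_zero; positive counts follow a zero-truncated
    Poisson(lam) scaled by (1 - p_zero)."""
    if k == 0:
        return p_zero
    trunc = exp(-lam) * lam ** k / factorial(k) / (1 - exp(-lam))
    return (1 - p_zero) * trunc

probs = [hurdle_poisson_pmf(k, 0.4, 2.5) for k in range(60)]
```

    Because the zero process and the positive-count process are separated, the hurdle model can represent either more or fewer zeros than a plain Poisson with the same positive-count distribution.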

  2. Disease mapping of zero-excessive mesothelioma data in Flanders.

    PubMed

    Neyens, Thomas; Lawson, Andrew B; Kirby, Russell S; Nuyts, Valerie; Watjou, Kevin; Aregay, Mehreteab; Carroll, Rachel; Nawrot, Tim S; Faes, Christel

    2017-01-01

    To investigate the distribution of mesothelioma in Flanders using Bayesian disease mapping models that account for both an excess of zeros and overdispersion. The numbers of newly diagnosed mesothelioma cases within all Flemish municipalities between 1999 and 2008 were obtained from the Belgian Cancer Registry. To deal with overdispersion, zero inflation, and geographical association, the hurdle combined model was proposed, which has three components: a Bernoulli zero-inflation mixture component to account for excess zeros, a gamma random effect to adjust for overdispersion, and a normal conditional autoregressive random effect to attribute spatial association. This model was compared with other existing methods in literature. The results indicate that hurdle models with a random effects term accounting for extra variance in the Bernoulli zero-inflation component fit the data better than hurdle models that do not take overdispersion in the occurrence of zeros into account. Furthermore, traditional models that do not take into account excessive zeros but contain at least one random effects term that models extra variance in the counts have better fits compared to their hurdle counterparts. In other words, the extra variability, due to an excess of zeros, can be accommodated by spatially structured and/or unstructured random effects in a Poisson model such that the hurdle mixture model is not necessary. Models taking into account zero inflation do not always provide better fits to data with excessive zeros than less complex models. In this study, a simple conditional autoregressive model identified a cluster in mesothelioma cases near a former asbestos processing plant (Kapelle-op-den-Bos). This observation is likely linked with historical local asbestos exposures. Future research will clarify this. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Travelling Randomly on the Poincaré Half-Plane with a Pythagorean Compass

    NASA Astrophysics Data System (ADS)

    Cammarota, V.; Orsingher, E.

    2008-02-01

    A random motion on the Poincaré half-plane is studied. A particle runs on the geodesic lines changing direction at Poisson-paced times. The hyperbolic distance is analyzed, also in the case where returns to the starting point are admitted. The main results concern the mean hyperbolic distance (and also the conditional mean distance) in all versions of the motion envisaged. Also an analogous motion on orthogonal circles of the sphere is examined and the evolution of the mean distance from the starting point is investigated.

  4. Guidelines for Use of the Approximate Beta-Poisson Dose-Response Model.

    PubMed

    Xie, Gang; Roiko, Anne; Stratton, Helen; Lemckert, Charles; Dunn, Peter K; Mengersen, Kerrie

    2017-07-01

    For dose-response analysis in quantitative microbial risk assessment (QMRA), the exact beta-Poisson model is a two-parameter mechanistic dose-response model with parameters α > 0 and β > 0, which involves the Kummer confluent hypergeometric function. Evaluation of a hypergeometric function is a computational challenge. Denoting P_I(d) as the probability of infection at a given mean dose d, the widely used dose-response model P_I(d) = 1 - (1 + d/β)^(-α) is an approximate formula for the exact beta-Poisson model. Notwithstanding the required conditions α < β and β > 1, issues related to the validity and approximation accuracy of this approximate formula have remained largely ignored in practice, partly because these conditions are too general to provide clear guidance. Consequently, this study proposes a probability measure Pr(0 < r < 1 | α̂, β̂) as a validity measure (r is a random variable that follows a gamma distribution; α̂ and β̂ are the maximum likelihood estimates of α and β in the approximate model); and the constraint condition β̂ > (22α̂)^0.50 for 0.02 < α̂ < 2 as a rule of thumb to ensure an accurate approximation (e.g., Pr(0 < r < 1 | α̂, β̂) > 0.99). This validity measure and rule of thumb were validated by application to all the completed beta-Poisson models (related to 85 data sets) from the QMRA community portal (QMRA Wiki). The results showed that the higher the probability Pr(0 < r < 1 | α̂, β̂), the better the approximation. The results further showed that, among the total 85 models examined, 68 models were identified as valid approximate model applications, which all had a near perfect match to the corresponding exact beta-Poisson model dose-response curve. © 2016 Society for Risk Analysis.
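
    The approximate model itself is a one-liner. A small sketch with assumed parameter values, not fitted to any of the 85 data sets:

```python
def beta_poisson_approx(d, alpha, beta):
    """Approximate beta-Poisson dose-response: P_I(d) = 1 - (1 + d/beta)**(-alpha)."""
    return 1.0 - (1.0 + d / beta) ** (-alpha)

doses = [0.0, 1.0, 10.0, 100.0, 1000.0]        # illustrative mean doses
risks = [beta_poisson_approx(d, 0.25, 40.0) for d in doses]
```

    The curve starts at zero risk, increases monotonically with dose, and approaches (but never reaches) 1, which is the qualitative shape the validity measure checks against the exact hypergeometric model.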

  5. Comparison of the Nernst-Planck model and the Poisson-Boltzmann model for electroosmotic flows in microchannels.

    PubMed

    Park, H M; Lee, J S; Kim, T W

    2007-11-15

    In the analysis of electroosmotic flows, the internal electric potential is usually modeled by the Poisson-Boltzmann equation. The Poisson-Boltzmann equation is derived from the assumption of thermodynamic equilibrium where the ionic distributions are not affected by fluid flows. Although this is a reasonable assumption for steady electroosmotic flows through straight microchannels, there are some important cases where convective transport of ions has nontrivial effects. In these cases, it is necessary to adopt the Nernst-Planck equation instead of the Poisson-Boltzmann equation to model the internal electric field. In the present work, the predictions of the Nernst-Planck equation are compared with those of the Poisson-Boltzmann equation for electroosmotic flows in various microchannels where the convective transport of ions is not negligible.
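
    For small potentials the Poisson-Boltzmann equation linearizes to the Debye-Hückel form φ'' = φ/λ_D², whose exponentially screened solution can be verified numerically. The Debye length below is an illustrative value, not taken from the paper:

```python
from math import exp

lam_D = 2.0                       # assumed Debye length, arbitrary units
phi = lambda x: exp(-x / lam_D)   # screened potential of the linearized model

def second_derivative(f, x, h=1e-3):
    """Central finite-difference approximation of f''(x)."""
    return (f(x - h) - 2.0 * f(x) + f(x + h)) / (h * h)

x = 1.0
residual = abs(second_derivative(phi, x) - phi(x) / lam_D ** 2)
```

    The tiny residual confirms that the exponential profile satisfies the linearized equation; the full Poisson-Boltzmann and Nernst-Planck models compared in the paper must instead be solved numerically.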

  6. Efficiency optimization of a fast Poisson solver in beam dynamics simulation

    NASA Astrophysics Data System (ADS)

    Zheng, Dawei; Pöplau, Gisela; van Rienen, Ursula

    2016-01-01

    Calculating the solution of Poisson's equation relating to space charge force is still the major time consumption in beam dynamics simulations and calls for further improvement. In this paper, we summarize a classical fast Poisson solver in beam dynamics simulations: the integrated Green's function method. We introduce three optimization steps of the classical Poisson solver routine: using the reduced integrated Green's function instead of the integrated Green's function; using the discrete cosine transform instead of discrete Fourier transform for the Green's function; using a novel fast convolution routine instead of an explicitly zero-padded convolution. The new Poisson solver routine preserves the advantages of fast computation and high accuracy. This provides a fast routine for high performance calculation of the space charge effect in accelerators.
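
    The Green's-function idea behind such solvers can be shown in one dimension: for -u'' = f on (0,1) with homogeneous Dirichlet conditions, the solution is an integral against G(x,y), approximated here by a midpoint sum. This is a pedagogical sketch, not the paper's integrated Green's function FFT routine:

```python
def greens_poisson_1d(f, n=200):
    """Solve -u'' = f on (0,1) with u(0) = u(1) = 0 by summing against the
    Green's function G(x,y) = x(1-y) for x <= y, y(1-x) otherwise."""
    h = 1.0 / n
    ys = [(j + 0.5) * h for j in range(n)]
    def u(x):
        return h * sum((x * (1 - y) if x <= y else y * (1 - x)) * f(y)
                       for y in ys)
    return u

# For f = 1 the exact solution is u(x) = x(1 - x)/2.
u = greens_poisson_1d(lambda y: 1.0)
err = max(abs(u(x) - x * (1 - x) / 2) for x in [0.1, 0.25, 0.5, 0.75, 0.9])
```

    The direct sum costs O(n²) per evaluation; the optimizations summarized in the paper (reduced kernels, cosine transforms, fast convolution) exist precisely to bring this kind of Green's-function convolution down to FFT cost.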

  7. SIERRA - A 3-D device simulator for reliability modeling

    NASA Astrophysics Data System (ADS)

    Chern, Jue-Hsien; Arledge, Lawrence A., Jr.; Yang, Ping; Maeda, John T.

    1989-05-01

    SIERRA is a three-dimensional general-purpose semiconductor-device simulation program which serves as a foundation for investigating integrated-circuit (IC) device and reliability issues. This program solves the Poisson and continuity equations in silicon under dc, transient, and small-signal conditions. Executing on a vector/parallel minisupercomputer, SIERRA utilizes a matrix solver which uses an incomplete LU (ILU) preconditioned conjugate gradient square (CGS, BCG) method. The ILU-CGS method provides a good compromise between memory size and convergence rate. The authors have observed a 5x to 7x speedup over standard direct methods in simulations of transient problems containing highly coupled Poisson and continuity equations such as those found in reliability-oriented simulations. The application of SIERRA to parasitic CMOS latchup and dynamic random-access memory single-event-upset studies is described.
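
    The Krylov-subspace idea at the core of such matrix solvers can be illustrated with plain (unpreconditioned) conjugate gradients on a tiny symmetric positive-definite system; the ILU-preconditioned CGS method used for the nonsymmetric coupled equations is a more elaborate relative of this sketch:

```python
def conjugate_gradient(A, b, tol=1e-10, max_iter=100):
    """Unpreconditioned conjugate gradient for a small SPD system Ax = b."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                 # residual b - Ax with x = 0
    p = r[:]
    rs = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol * tol:
            break
        p = [r[i] + (rs_new / rs) * p[i] for i in range(n)]
        rs = rs_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]   # toy SPD matrix
b = [1.0, 2.0]
x = conjugate_gradient(A, b)
```

    In exact arithmetic CG converges in at most n iterations; preconditioning (such as the ILU factorization mentioned in the abstract) reduces the effective condition number so that far fewer iterations are needed on large device matrices.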

  8. Selecting a distributional assumption for modelling relative densities of benthic macroinvertebrates

    USGS Publications Warehouse

    Gray, B.R.

    2005-01-01

    The selection of a distributional assumption suitable for modelling macroinvertebrate density data is typically challenging. Macroinvertebrate data often exhibit substantially larger variances than expected under a standard count assumption, that of the Poisson distribution. Such overdispersion may derive from multiple sources, including heterogeneity of habitat (historically and spatially), differing life histories for organisms collected within a single collection in space and time, and autocorrelation. Taken to extreme, heterogeneity of habitat may be argued to explain the frequent large proportions of zero observations in macroinvertebrate data. Sampling locations may consist of habitats defined qualitatively as either suitable or unsuitable. The former category may yield random or stochastic zeroes and the latter structural zeroes. Heterogeneity among counts may be accommodated by treating the count mean itself as a random variable, while extra zeroes may be accommodated using zero-modified count assumptions, including zero-inflated and two-stage (or hurdle) approaches. These and linear assumptions (following log- and square root-transformations) were evaluated using 9 years of mayfly density data from a 52 km, ninth-order reach of the Upper Mississippi River (n = 959). The data exhibited substantial overdispersion relative to that expected under a Poisson assumption (i.e. variance:mean ratio = 23 ≫ 1), and 43% of the sampling locations yielded zero mayflies. Based on the Akaike Information Criterion (AIC), count models were improved most by treating the count mean as a random variable (via a Poisson-gamma distributional assumption) and secondarily by zero modification (i.e. improvements in AIC values = 9184 units and 47-48 units, respectively).
Zeroes were underestimated by the Poisson, log-transform and square root-transform models, slightly by the standard negative binomial model but not by the zero-modified models (61%, 24%, 32%, 7%, and 0%, respectively). However, the zero-modified Poisson models underestimated small counts (1 ≤ y ≤ 4) and overestimated intermediate counts (7 ≤ y ≤ 23). Counts greater than zero were estimated well by zero-modified negative binomial models, while counts greater than one were also estimated well by the standard negative binomial model. Based on AIC and percent zero estimation criteria, the two-stage and zero-inflated models performed similarly. The above inferences were largely confirmed when the models were used to predict values from a separate, evaluation data set (n = 110). An exception was that, using the evaluation data set, the standard negative binomial model appeared superior to its zero-modified counterparts using the AIC (but not percent zero criteria). This and other evidence suggest that a negative binomial distributional assumption should be routinely considered when modelling benthic macroinvertebrate data from low flow environments. Whether negative binomial models should themselves be routinely examined for extra zeroes requires, from a statistical perspective, more investigation. However, this question may best be answered by ecological arguments that may be specific to the sampled species and locations. © 2004 Elsevier B.V. All rights reserved.
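
    The Poisson-gamma (negative binomial) mechanism for overdispersion is easy to demonstrate by simulation: mixing the Poisson mean over a gamma distribution inflates the variance:mean ratio well above 1. The shape and scale values below are assumed, unrelated to the mayfly data:

```python
import random
from math import exp
from statistics import mean, variance

random.seed(7)

def poisson_sample(lam):
    """Knuth's multiplication method for drawing one Poisson variate."""
    L, k, p = exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

# Poisson-gamma mixture: each observation gets its own gamma-distributed mean.
mixed = [poisson_sample(random.gammavariate(0.5, 10.0)) for _ in range(2000)]
plain = [poisson_sample(5.0) for _ in range(2000)]
ratio_mixed = variance(mixed) / mean(mixed)
ratio_plain = variance(plain) / mean(plain)
```

    Both samples share the same overall mean (5), but the mixture's variance:mean ratio is far above 1 while the plain Poisson sample stays near 1, mirroring the overdispersion diagnosed in the field data.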

  9. Composite laminates with negative through-the-thickness Poisson's ratios

    NASA Technical Reports Server (NTRS)

    Herakovich, C. T.

    1984-01-01

    A simple analysis using two-dimensional lamination theory combined with the appropriate three-dimensional anisotropic constitutive equation is presented to show some rather surprising results for the range of values of the through-the-thickness effective Poisson's ratio ν_xz for angle-ply laminates. Results for graphite-epoxy show that the through-the-thickness effective Poisson's ratio can range from a high of 0.49 for a [90] laminate to a low of -0.21 for a [±25]s laminate. It is shown that negative values of ν_xz are also possible for other laminates.

  11. A probabilistic approach to randomness in geometric configuration of scalable origami structures

    NASA Astrophysics Data System (ADS)

    Liu, Ke; Paulino, Glaucio; Gardoni, Paolo

    2015-03-01

    Origami, an ancient paper folding art, has inspired many solutions to modern engineering challenges. The demand for actual engineering applications motivates further investigation in this field. Although rooted in the historic art form, many applications of origami are based on newly designed origami patterns to match the specific requirements of an engineering problem. The application of origami to structural design problems ranges from the micro-structure of materials to large scale deployable shells. For instance, some origami-inspired designs have unique properties such as negative Poisson's ratio and flat foldability. However, origami structures are typically constrained by strict mathematical geometric relationships, which in reality can be easily violated, due to, for example, random imperfections introduced during manufacturing, or non-uniform deformations under working conditions (e.g. due to non-uniform thermal effects). Therefore, the effects of uncertainties in origami-like structures need to be studied in further detail in order to provide a practical guide for scalable origami-inspired engineering designs. Through reliability and probabilistic analysis, we investigate the effect of randomness in origami structures on their mechanical properties. Dislocations of vertices of an origami structure have different impacts on different mechanical properties, and different origami designs could have different sensitivities to imperfections. Thus we aim to provide a preliminary understanding of the structural behavior of some common scalable origami structures subject to randomness in their geometric configurations in order to help transition the technology toward practical applications of origami engineering.

  12. Effect of non-Poisson samples on turbulence spectra from laser velocimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sree, D.; Kjelgaard, S.O.; Sellers, W.L. III

    1994-12-01

    Spectral estimations from LV data are typically based on the assumption of a Poisson sampling process. It is demonstrated here that the sampling distribution must be considered before spectral estimates are used to infer turbulence scales. A non-Poisson sampling process can occur if there is nonhomogeneous distribution of particles in the flow. Based on the study of a simulated first-order spectrum, it has been shown that a non-Poisson sampling process causes the estimated spectrum to deviate from the true spectrum. Also, in this case the prefiltering techniques do not improve the spectral estimates at higher frequencies. 4 refs.

  13. Quantitative model of price diffusion and market friction based on trading as a mechanistic random process.

    PubMed

    Daniels, Marcus G; Farmer, J Doyne; Gillemot, László; Iori, Giulia; Smith, Eric

    2003-03-14

    We model trading and price formation in a market under the assumption that order arrival and cancellations are Poisson random processes. This model makes testable predictions for the most basic properties of markets, such as the diffusion rate of prices (which is the standard measure of financial risk) and the spread and price impact functions (which are the main determinants of transaction cost). Guided by dimensional analysis, simulation, and mean-field theory, we find scaling relations in terms of order flow rates. We show that even under completely random order flow the need to store supply and demand to facilitate trading induces anomalous diffusion and temporal structure in prices.

  14. Quantitative Model of Price Diffusion and Market Friction Based on Trading as a Mechanistic Random Process

    NASA Astrophysics Data System (ADS)

    Daniels, Marcus G.; Farmer, J. Doyne; Gillemot, László; Iori, Giulia; Smith, Eric

    2003-03-01

    We model trading and price formation in a market under the assumption that order arrival and cancellations are Poisson random processes. This model makes testable predictions for the most basic properties of markets, such as the diffusion rate of prices (which is the standard measure of financial risk) and the spread and price impact functions (which are the main determinants of transaction cost). Guided by dimensional analysis, simulation, and mean-field theory, we find scaling relations in terms of order flow rates. We show that even under completely random order flow the need to store supply and demand to facilitate trading induces anomalous diffusion and temporal structure in prices.

  15. Bayesian inference on multiscale models for poisson intensity estimation: applications to photon-limited image denoising.

    PubMed

    Lefkimmiatis, Stamatios; Maragos, Petros; Papandreou, George

    2009-08-01

    We present an improved statistical model for analyzing Poisson processes, with applications to photon-limited imaging. We build on previous work, adopting a multiscale representation of the Poisson process in which the ratios of the underlying Poisson intensities (rates) in adjacent scales are modeled as mixtures of conjugate parametric distributions. Our main contributions include: 1) a rigorous and robust regularized expectation-maximization (EM) algorithm for maximum-likelihood estimation of the rate-ratio density parameters directly from the noisy observed Poisson data (counts); 2) extension of the method to work under a multiscale hidden Markov tree model (HMT) which couples the mixture label assignments in consecutive scales, thus modeling interscale coefficient dependencies in the vicinity of image edges; 3) exploration of a 2-D recursive quad-tree image representation, involving Dirichlet-mixture rate-ratio densities, instead of the conventional separable binary-tree image representation involving beta-mixture rate-ratio densities; and 4) a novel multiscale image representation, which we term Poisson-Haar decomposition, that better models the image edge structure, thus yielding improved performance. Experimental results on standard images with artificially simulated Poisson noise and on real photon-limited images demonstrate the effectiveness of the proposed techniques.

  16. Ion-Conserving Modified Poisson-Boltzmann Theory Considering a Steric Effect in an Electrolyte

    NASA Astrophysics Data System (ADS)

    Sugioka, Hideyuki

    2016-12-01

    The modified Poisson-Nernst-Planck (MPNP) and modified Poisson-Boltzmann (MPB) equations are well known as fundamental equations that consider a steric effect, which prevents unphysical ion concentrations. However, it is unclear whether they are equivalent or not. To clarify this problem, we propose an improved free energy formulation that considers a steric limit with an ion-conserving condition and successfully derive the ion-conserving modified Poisson-Boltzmann (IC-MPB) equations that are equivalent to the MPNP equations. Furthermore, we numerically examine the equivalence by comparing the IC-MPB solutions obtained by the Newton method with the steady MPNP solutions obtained by the finite-element finite-volume method. A surprising aspect of our finding is that the MPB solutions differ markedly from the MPNP (IC-MPB) solutions in a confined space. We consider that our findings will significantly contribute to understanding the surface science between solids and liquids.

  17. Research in Stochastic Processes.

    DTIC Science & Technology

    1983-10-01

    increases. A more detailed investigation for the exceedances themselves (rather than just the cluster centers) was undertaken, together with J. Hüsler and...J. Hüsler and M.R. Leadbetter, Compound Poisson limit theorems for high level exceedances by stationary sequences, Center for Stochastic Processes...stability by a random linear operator. C.D. Hardin, General (asymmetric) stable variables and processes. T. Hsing, J. Hüsler and M.R. Leadbetter, Compound

  18. Prediction of Short-Distance Aerial Movement of Phakopsora pachyrhizi Urediniospores Using Machine Learning.

    PubMed

    Wen, L; Bowen, C R; Hartman, G L

    2017-10-01

    Dispersal of urediniospores by wind is the primary means of spread for Phakopsora pachyrhizi, the cause of soybean rust. Our research focused on the short-distance movement of urediniospores from within the soybean canopy and up to 61 m from field-grown rust-infected soybean plants. Environmental variables were used to develop and compare models, including least absolute shrinkage and selection operator (LASSO) regression, zero-inflated Poisson/regular Poisson regression, random forest, and neural network models, to describe deposition of urediniospores collected in passive and active traps. All four models identified distance of trap from source, humidity, temperature, wind direction, and wind speed as the five most important variables influencing short-distance movement of urediniospores. The random forest model provided the best predictions, explaining 76.1 and 86.8% of the total variation in the passive- and active-trap datasets, respectively. The prediction accuracies, based on the correlation coefficient (r) between predicted and true values, were 0.83 (P < 0.0001) and 0.94 (P < 0.0001) for the passive- and active-trap datasets, respectively. Overall, multiple machine learning techniques identified the most important variables for making the most accurate predictions of short-distance movement of P. pachyrhizi urediniospores.

  19. Solving the problem of negative populations in approximate accelerated stochastic simulations using the representative reaction approach.

    PubMed

    Kadam, Shantanu; Vanka, Kumar

    2013-02-15

    Methods based on the stochastic formulation of chemical kinetics have the potential to accurately reproduce the dynamical behavior of various biochemical systems of interest. However, the computational expense makes them impractical for the study of real systems. Attempts to render these methods practical have led to the development of accelerated methods, where the reaction numbers are modeled by Poisson random numbers. However, for certain systems, such methods give rise to physically unrealistic negative numbers for species populations. The methods which make use of binomial variables, in place of Poisson random numbers, have since become popular, and have been partially successful in addressing this problem. In this manuscript, the development of two new computational methods, based on the representative reaction approach (RRA), has been discussed. The new methods endeavor to solve the problem of negative numbers, by making use of tools like the stochastic simulation algorithm and the binomial method, in conjunction with the RRA. It is found that these newly developed methods perform better than other binomial methods used for stochastic simulations, in resolving the problem of negative populations. Copyright © 2012 Wiley Periodicals, Inc.
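
    The failure mode this record addresses can be reproduced in a minimal sketch (assumed parameters, not the paper's RRA method): for a decay reaction A → ∅, a tau-leap with Poisson-distributed firing counts is unbounded and can drive the population negative, whereas a binomial update caps the count at the current population.

```python
# Sketch: why Poisson tau-leaping can yield negative populations,
# and how a binomial firing count avoids it.
import numpy as np

rng = np.random.default_rng(0)

def tau_leap(a0, k, tau, steps, binomial=False):
    """Tau-leap simulation of the decay reaction A -> 0 with propensity k*A."""
    a = a0
    for _ in range(steps):
        if a <= 0:
            break
        if binomial:
            # Binomial count can never exceed the current population.
            fired = rng.binomial(a, min(1.0, k * tau))
        else:
            # Poisson count is unbounded, so a can go negative.
            fired = rng.poisson(k * a * tau)
        a -= fired
    return a

# A deliberately coarse step exposes the negative-population problem.
poisson_finals = [tau_leap(20, 1.0, 0.5, 50) for _ in range(1000)]
binomial_finals = [tau_leap(20, 1.0, 0.5, 50, binomial=True) for _ in range(1000)]

print("Poisson runs ending negative: ", sum(a < 0 for a in poisson_finals))
print("binomial runs ending negative:", sum(a < 0 for a in binomial_finals))
```

    With this step size a sizable fraction of the Poisson runs overshoot zero, while the binomial runs never do; the paper's RRA-based methods target the same problem by other means.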

  20. The Effect of Improved Water Supply on Diarrhea Prevalence of Children under Five in the Volta Region of Ghana: A Cluster-Randomized Controlled Trial.

    PubMed

    Cha, Seungman; Kang, Douk; Tuffuor, Benedict; Lee, Gyuhong; Cho, Jungmyung; Chung, Jihye; Kim, Myongjin; Lee, Hoonsang; Lee, Jaeeun; Oh, Chunghyeon

    2015-09-25

    Although a number of studies have been conducted to explore the effect of water quality improvement, the majority of them have focused mainly on point-of-use water treatment, and the studies investigating the effect of improved water supply have been based on observational or inadequately randomized trials. We report the results of a matched cluster randomized trial investigating the effect of improved water supply on diarrheal prevalence of children under five living in rural areas of the Volta Region in Ghana. We compared the diarrheal prevalence of 305 children in 10 communities of intervention with 302 children in 10 matched communities with no intervention (October 2012 to February 2014). A modified Poisson regression was used to estimate the prevalence ratio. An intention-to-treat analysis was undertaken. The crude prevalence ratio of diarrhea in the intervention compared with the control communities was 0.85 (95% CI 0.74-0.97) for Krachi West, 0.96 (0.87-1.05) for Krachi East, and 0.91 (0.83-0.98) for both districts. Sanitation was adjusted for in the model to remove the bias due to residual imbalance since it was not balanced even after randomization. The adjusted prevalence ratio was 0.82 (95% CI 0.71-0.96) for Krachi West, 0.95 (0.86-1.04) for Krachi East, and 0.89 (0.82-0.97) for both districts. This study provides a basis for a better approach to water quality interventions.

  1. The Effect of Improved Water Supply on Diarrhea Prevalence of Children under Five in the Volta Region of Ghana: A Cluster-Randomized Controlled Trial

    PubMed Central

    Cha, Seungman; Kang, Douk; Tuffuor, Benedict; Lee, Gyuhong; Cho, Jungmyung; Chung, Jihye; Kim, Myongjin; Lee, Hoonsang; Lee, Jaeeun; Oh, Chunghyeon

    2015-01-01

    Although a number of studies have been conducted to explore the effect of water quality improvement, the majority of them have focused mainly on point-of-use water treatment, and the studies investigating the effect of improved water supply have been based on observational or inadequately randomized trials. We report the results of a matched cluster randomized trial investigating the effect of improved water supply on diarrheal prevalence of children under five living in rural areas of the Volta Region in Ghana. We compared the diarrheal prevalence of 305 children in 10 communities of intervention with 302 children in 10 matched communities with no intervention (October 2012 to February 2014). A modified Poisson regression was used to estimate the prevalence ratio. An intention-to-treat analysis was undertaken. The crude prevalence ratio of diarrhea in the intervention compared with the control communities was 0.85 (95% CI 0.74–0.97) for Krachi West, 0.96 (0.87–1.05) for Krachi East, and 0.91 (0.83–0.98) for both districts. Sanitation was adjusted for in the model to remove the bias due to residual imbalance since it was not balanced even after randomization. The adjusted prevalence ratio was 0.82 (95% CI 0.71–0.96) for Krachi West, 0.95 (0.86–1.04) for Krachi East, and 0.89 (0.82–0.97) for both districts. This study provides a basis for a better approach to water quality interventions. PMID:26404337
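
    As a hedged sketch of the analysis idea (simulated data, not the trial's): "modified Poisson regression" fits a Poisson model to a binary outcome so that the exponentiated coefficient is a prevalence ratio rather than an odds ratio. With a single binary covariate the Poisson MLE of exp(beta) reduces to a simple ratio of group prevalences.

```python
# Sketch: prevalence ratio from Poisson regression on a binary outcome.
import numpy as np

rng = np.random.default_rng(42)
n = 20_000
treated = rng.integers(0, 2, n)
# Hypothetical true prevalences: 0.30 in control, reduced by a ratio of
# 0.85 in the intervention group (roughly the trial's crude estimate).
p = np.where(treated == 1, 0.30 * 0.85, 0.30)
diarrhea = rng.binomial(1, p)

# With one binary covariate, the Poisson-regression MLE of exp(beta)
# equals the ratio of group prevalences: the prevalence ratio.
pr = diarrhea[treated == 1].mean() / diarrhea[treated == 0].mean()
print(f"estimated prevalence ratio: {pr:.2f}")
```

    In practice the model is fitted with robust (sandwich) standard errors, since a binary outcome violates the Poisson variance assumption; that is what makes the regression "modified".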

  2. Robust small area prediction for counts.

    PubMed

    Tzavidis, Nikos; Ranalli, M Giovanna; Salvati, Nicola; Dreassi, Emanuela; Chambers, Ray

    2015-06-01

    A new semiparametric approach to model-based small area prediction for counts is proposed and used for estimating the average number of visits to physicians for Health Districts in Central Italy. The proposed small area predictor can be viewed as an outlier robust alternative to the more commonly used empirical plug-in predictor that is based on a Poisson generalized linear mixed model with Gaussian random effects. Results from the real data application and from a simulation experiment confirm that the proposed small area predictor has good robustness properties and in some cases can be more efficient than alternative small area approaches. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.

  3. Seizure Forecasting and the Preictal State in Canine Epilepsy.

    PubMed

    Varatharajah, Yogatheesan; Iyer, Ravishankar K; Berry, Brent M; Worrell, Gregory A; Brinkmann, Benjamin H

    2017-02-01

    The ability to predict seizures may enable patients with epilepsy to better manage their medications and activities, potentially reducing side effects and improving quality of life. Forecasting epileptic seizures remains a challenging problem, but machine learning methods using intracranial electroencephalographic (iEEG) measures have shown promise. A machine-learning-based pipeline was developed to process iEEG recordings and generate seizure warnings. Results support the ability to forecast seizures at rates greater than a Poisson random predictor for all feature sets and machine learning algorithms tested. In addition, subject-specific neurophysiological changes in multiple features are reported preceding lead seizures, providing evidence supporting the existence of a distinct and identifiable preictal state.

  4. SEIZURE FORECASTING AND THE PREICTAL STATE IN CANINE EPILEPSY

    PubMed Central

    Varatharajah, Yogatheesan; Iyer, Ravishankar K.; Berry, Brent M.; Worrell, Gregory A.; Brinkmann, Benjamin H.

    2017-01-01

    The ability to predict seizures may enable patients with epilepsy to better manage their medications and activities, potentially reducing side effects and improving quality of life. Forecasting epileptic seizures remains a challenging problem, but machine learning methods using intracranial electroencephalographic (iEEG) measures have shown promise. A machine-learning-based pipeline was developed to process iEEG recordings and generate seizure warnings. Results support the ability to forecast seizures at rates greater than a Poisson random predictor for all feature sets and machine learning algorithms tested. In addition, subject-specific neurophysiological changes in multiple features are reported preceding lead seizures, providing evidence supporting the existence of a distinct and identifiable preictal state. PMID:27464854

  5. An investigation of stress wave propagation in a shear deformable nanobeam based on modified couple stress theory

    NASA Astrophysics Data System (ADS)

    Akbarzadeh Khorshidi, Majid; Shariati, Mahmoud

    2016-04-01

    This paper presents a new investigation for propagation of stress wave in a nanobeam based on modified couple stress theory. Using Euler-Bernoulli beam theory, Timoshenko beam theory, and Reddy beam theory, the effect of shear deformation is investigated. This nonclassical model contains a material length scale parameter to capture the size effect and the Poisson effect is incorporated in the current model. Governing equations of motion are obtained by Hamilton's principle and solved explicitly. This solution leads to obtain two phase velocities for shear deformable beams in different directions. Effects of shear deformation, material length scale parameter, and Poisson's ratio on the behavior of these phase velocities are investigated and discussed. The results also show a dual behavior for phase velocities against Poisson's ratio.

  6. Effectiveness of Healthy Relationships Video-Group—A Videoconferencing Group Intervention for Women Living with HIV: Preliminary Findings from a Randomized Controlled Trial

    PubMed Central

    Buhi, Eric R.; Baldwin, Julie; Chen, Henian; Johnson, Ayesha; Lynn, Vickie; Glueckauf, Robert

    2014-01-01

    Abstract Introduction: Expanded access to efficacious interventions is needed for women living with human immunodeficiency virus (WLH) in the United States. Availability of “prevention with (human immunodeficiency virus [HIV]) positives” interventions in rural/remote and low HIV prevalence areas remains limited, leaving WLH in these communities few options for receiving effective behavioral interventions such as Healthy Relationships (HR). Offering such programs via videoconferencing groups (VGs) may expand access. This analysis tests the effectiveness of HR-VG (versus wait-list control) for reducing sexual risk behavior among WLH and explores intervention satisfaction. Subjects and Methods: In this randomized controlled trial, unprotected vaginal/anal sex occasions over the prior 3 months reported at the 6-month follow-up were compared across randomization groups through zero-inflated Poisson regression modeling, controlling for unprotected sex at baseline. Seventy-one WLH were randomized and completed the baseline assessment (n=36 intervention and n=35 control); 59 (83% in each group) had follow-up data. Results: Among those who engaged in unprotected sex at 6-month follow-up, intervention participants had approximately seven fewer unprotected occasions than control participants (95% confidence interval 5.43–7.43). Intervention participants reported high levels of satisfaction with HR-VG; 84% reported being “very satisfied” overall. Conclusions: This study found promising evidence for effective dissemination of HIV risk reduction interventions via VGs. Important next steps will be to determine whether VGs are effective with other subpopulations of people living with HIV (i.e., men and non-English speakers) and to assess cost-effectiveness. Possibilities for using VGs to expand access to other psychosocial and behavioral interventions and reduce stigma are discussed. PMID:24237482

  7. QMRA for Drinking Water: 2. The Effect of Pathogen Clustering in Single-Hit Dose-Response Models.

    PubMed

    Nilsen, Vegard; Wyller, John

    2016-01-01

    Spatial and/or temporal clustering of pathogens will invalidate the commonly used assumption of Poisson-distributed pathogen counts (doses) in quantitative microbial risk assessment. In this work, the theoretically predicted effect of spatial clustering in conventional "single-hit" dose-response models is investigated by employing the stuttering Poisson distribution, a very general family of count distributions that naturally models pathogen clustering and contains the Poisson and negative binomial distributions as special cases. The analysis is facilitated by formulating the dose-response models in terms of probability generating functions. It is shown formally that the theoretical single-hit risk obtained with a stuttering Poisson distribution is lower than that obtained with a Poisson distribution, assuming identical mean doses. A similar result holds for mixed Poisson distributions. Numerical examples indicate that the theoretical single-hit risk is fairly insensitive to moderate clustering, though the effect tends to be more pronounced for low mean doses. Furthermore, using Jensen's inequality, an upper bound on risk is derived that tends to better approximate the exact theoretical single-hit risk for highly overdispersed dose distributions. The bound holds with any dose distribution (characterized by its mean and zero inflation index) and any conditional dose-response model that is concave in the dose variable. Its application is exemplified with published data from Norovirus feeding trials, for which some of the administered doses were prepared from an inoculum of aggregated viruses. The potential implications of clustering for dose-response assessment as well as practical risk characterization are discussed. © 2016 Society for Risk Analysis.
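
    A minimal sketch of the probability-generating-function formulation (hypothetical numbers, not the paper's Norovirus data): for a dose count N with pgf G_N and per-organism hit probability r, the single-hit risk is 1 − G_N(1 − r). Using a negative binomial dose (which, per the abstract, is a special case of the stuttering Poisson family), the risk at a fixed mean dose falls below the Poisson risk and approaches it as the dispersion parameter k grows.

```python
# Sketch: single-hit risk via pgfs for Poisson vs. overdispersed doses.
import math

def poisson_risk(mu, r):
    # Poisson pgf G(s) = exp(mu*(s-1)), so risk = 1 - exp(-mu*r).
    return 1.0 - math.exp(-mu * r)

def negbin_risk(mu, k, r):
    # Negative binomial (mean mu, dispersion k) pgf
    # G(s) = (1 + mu*(1-s)/k)**(-k), so risk = 1 - (1 + mu*r/k)**(-k).
    return 1.0 - (1.0 + mu * r / k) ** (-k)

mu, r = 10.0, 0.1   # hypothetical mean dose and hit probability
for k in (0.5, 2.0, 10.0):   # smaller k = stronger clustering
    print(f"negative binomial, k={k}: risk = {negbin_risk(mu, k, r):.4f}")
print(f"Poisson:                    risk = {poisson_risk(mu, r):.4f}")
```

    The ordering of the printed risks illustrates the abstract's formal result: clustering (overdispersion) lowers the theoretical single-hit risk at identical mean dose.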

  8. Prescription-induced jump distributions in multiplicative Poisson processes.

    PubMed

    Suweis, Samir; Porporato, Amilcare; Rinaldo, Andrea; Maritan, Amos

    2011-06-01

    Generalized Langevin equations (GLE) with multiplicative white Poisson noise pose the usual prescription dilemma leading to different evolution equations (master equations) for the probability distribution. Contrary to the case of multiplicative Gaussian white noise, the Stratonovich prescription does not correspond to the well-known midpoint (or any other intermediate) prescription. By introducing an inertial term in the GLE, we show that the Itô and Stratonovich prescriptions naturally arise depending on two time scales, one induced by the inertial term and the other determined by the jump event. We also show that, when the multiplicative noise is linear in the random variable, one prescription can be made equivalent to the other by a suitable transformation in the jump probability distribution. We apply these results to a recently proposed stochastic model describing the dynamics of primary soil salinization, in which the salt mass balance within the soil root zone requires the analysis of different prescriptions arising from the resulting stochastic differential equation forced by multiplicative white Poisson noise, the features of which are tailored to the characters of the daily precipitation. A method is finally suggested to infer the most appropriate prescription from the data.

  9. Prescription-induced jump distributions in multiplicative Poisson processes

    NASA Astrophysics Data System (ADS)

    Suweis, Samir; Porporato, Amilcare; Rinaldo, Andrea; Maritan, Amos

    2011-06-01

    Generalized Langevin equations (GLE) with multiplicative white Poisson noise pose the usual prescription dilemma leading to different evolution equations (master equations) for the probability distribution. Contrary to the case of multiplicative Gaussian white noise, the Stratonovich prescription does not correspond to the well-known midpoint (or any other intermediate) prescription. By introducing an inertial term in the GLE, we show that the Itô and Stratonovich prescriptions naturally arise depending on two time scales, one induced by the inertial term and the other determined by the jump event. We also show that, when the multiplicative noise is linear in the random variable, one prescription can be made equivalent to the other by a suitable transformation in the jump probability distribution. We apply these results to a recently proposed stochastic model describing the dynamics of primary soil salinization, in which the salt mass balance within the soil root zone requires the analysis of different prescriptions arising from the resulting stochastic differential equation forced by multiplicative white Poisson noise, the features of which are tailored to the characters of the daily precipitation. A method is finally suggested to infer the most appropriate prescription from the data.

  10. Gaussian orthogonal ensemble statistics in graphene billiards with the shape of classically integrable billiards.

    PubMed

    Yu, Pei; Li, Zi-Yuan; Xu, Hong-Ya; Huang, Liang; Dietz, Barbara; Grebogi, Celso; Lai, Ying-Cheng

    2016-12-01

    A crucial result in quantum chaos, which has been established for a long time, is that the spectral properties of classically integrable systems generically are described by Poisson statistics, whereas those of time-reversal symmetric, classically chaotic systems coincide with those of random matrices from the Gaussian orthogonal ensemble (GOE). Does this result hold for two-dimensional Dirac material systems? To address this fundamental question, we investigate the spectral properties in a representative class of graphene billiards with shapes of classically integrable circular-sector billiards. Naively one may expect to observe Poisson statistics, which is indeed true for energies close to the band edges where the quasiparticle obeys the Schrödinger equation. However, for energies near the Dirac point, where the quasiparticles behave like massless Dirac fermions, Poisson statistics is extremely rare in the sense that it emerges only under quite strict symmetry constraints on the straight boundary parts of the sector. An arbitrarily small amount of imperfection of the boundary results in GOE statistics. This implies that, for circular-sector confinements with arbitrary angle, the spectral properties will generically be GOE. These results are corroborated by extensive numerical computation. Furthermore, we provide a physical understanding for our results.

  11. Gaussian orthogonal ensemble statistics in graphene billiards with the shape of classically integrable billiards

    NASA Astrophysics Data System (ADS)

    Yu, Pei; Li, Zi-Yuan; Xu, Hong-Ya; Huang, Liang; Dietz, Barbara; Grebogi, Celso; Lai, Ying-Cheng

    2016-12-01

    A crucial result in quantum chaos, which has been established for a long time, is that the spectral properties of classically integrable systems generically are described by Poisson statistics, whereas those of time-reversal symmetric, classically chaotic systems coincide with those of random matrices from the Gaussian orthogonal ensemble (GOE). Does this result hold for two-dimensional Dirac material systems? To address this fundamental question, we investigate the spectral properties in a representative class of graphene billiards with shapes of classically integrable circular-sector billiards. Naively one may expect to observe Poisson statistics, which is indeed true for energies close to the band edges where the quasiparticle obeys the Schrödinger equation. However, for energies near the Dirac point, where the quasiparticles behave like massless Dirac fermions, Poisson statistics is extremely rare in the sense that it emerges only under quite strict symmetry constraints on the straight boundary parts of the sector. An arbitrarily small amount of imperfection of the boundary results in GOE statistics. This implies that, for circular-sector confinements with arbitrary angle, the spectral properties will generically be GOE. These results are corroborated by extensive numerical computation. Furthermore, we provide a physical understanding for our results.
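
    The Poisson-versus-GOE contrast these records rely on can be sketched numerically (an assumed toy setup, not the authors' graphene computation) with the ratio of consecutive level spacings, whose mean is roughly 0.386 for Poisson spectra and roughly 0.536 for the GOE:

```python
# Sketch: mean consecutive-spacing ratio for Poisson vs. GOE spectra.
import numpy as np

rng = np.random.default_rng(1)

def mean_spacing_ratio(levels):
    """Mean of r_n = min(s_n, s_{n+1}) / max(s_n, s_{n+1})."""
    s = np.diff(np.sort(levels))
    r = np.minimum(s[:-1], s[1:]) / np.maximum(s[:-1], s[1:])
    return float(r.mean())

# Poisson spectrum: independent, uncorrelated levels.
poisson_r = mean_spacing_ratio(rng.uniform(0.0, 1.0, 20_000))

# GOE spectrum: eigenvalues of real symmetric Gaussian matrices
# (central part of the spectrum, away from the edges).
rs = []
for _ in range(20):
    a = rng.normal(size=(400, 400))
    ev = np.linalg.eigvalsh((a + a.T) / 2)
    rs.append(mean_spacing_ratio(ev[100:300]))
goe_r = float(np.mean(rs))

print(f"Poisson <r> ~ {poisson_r:.3f}, GOE <r> ~ {goe_r:.3f}")
```

    The spacing-ratio statistic is convenient here because, unlike the spacing distribution itself, it requires no spectral unfolding.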

  12. A Fock space representation for the quantum Lorentz gas

    NASA Astrophysics Data System (ADS)

    Maassen, H.; Tip, A.

    1995-02-01

    A Fock space representation is given for the quantum Lorentz gas, i.e., for random Schrödinger operators of the form H(ω) = p² + V_ω = p² + ∑_j φ(x - x_j(ω)), acting in H = L²(R^d), with Poisson-distributed x_j's. An operator H is defined in K = H ⊗ P = H ⊗ L²(Ω, P(dω)) = L²(Ω, P(dω); H) by the action of H(ω) on its fibers in a direct integral decomposition. The stationarity of the Poisson process allows a unitarily equivalent description in terms of a new family {H(k) | k ∈ R^d}, where each H(k) acts in P [A. Tip, J. Math. Phys. 35, 113 (1994)]. The space P is then unitarily mapped onto the symmetric Fock space over L²(R^d, ρ dx), with ρ the intensity of the Poisson process (the average number of points x_j per unit volume; the scatterer density), and the equivalent of H(k) is determined. Averages now become vacuum expectation values, and a further unitary transformation (removing ρ in ρ dx) is made which leaves the former invariant. The resulting operator H_F(k) has an interesting structure: on the nth Fock layer we encounter a single particle moving in the field of n scatterers, and the randomness now appears in the coefficient √ρ in a coupling term connecting neighboring Fock layers. We also give a simple direct self-adjointness proof for H_F(k), based upon Nelson's commutator theorem. Restriction to a finite number of layers (a kind of low scatterer density approximation) still gives nontrivial results, as is demonstrated by considering an example.

  13. Estimating relative risks in multicenter studies with a small number of centers - which methods to use? A simulation study.

    PubMed

    Pedroza, Claudia; Truong, Van Thi Thanh

    2017-11-02

    Analyses of multicenter studies often need to account for center clustering to ensure valid inference. For binary outcomes, it is particularly challenging to properly adjust for center when the number of centers or total sample size is small, or when there are few events per center. Our objective was to evaluate the performance of generalized estimating equation (GEE) log-binomial and Poisson models, generalized linear mixed models (GLMMs) assuming binomial and Poisson distributions, and a Bayesian binomial GLMM to account for center effect in these scenarios. We conducted a simulation study with few centers (≤30) and 50 or fewer subjects per center, using both a randomized controlled trial and an observational study design to estimate relative risk. We compared the GEE and GLMM models with a log-binomial model without adjustment for clustering in terms of bias, root mean square error (RMSE), and coverage. For the Bayesian GLMM, we used informative neutral priors that are skeptical of large treatment effects that are almost never observed in studies of medical interventions. All frequentist methods exhibited little bias, and the RMSE was very similar across the models. The binomial GLMM had poor convergence rates, ranging from 27% to 85%, but performed well otherwise. The results show that both GEE models need to use small sample corrections for robust SEs to achieve proper coverage of 95% CIs. The Bayesian GLMM had similar convergence rates but resulted in slightly more biased estimates for the smallest sample sizes. However, it had the smallest RMSE and good coverage across all scenarios. These results were very similar for both study designs. For the analyses of multicenter studies with a binary outcome and few centers, we recommend adjustment for center with either a GEE log-binomial or Poisson model with appropriate small sample corrections or a Bayesian binomial GLMM with informative priors.

  14. Poisson Regression Analysis of Illness and Injury Surveillance Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frome E.L., Watkins J.P., Ellis E.D.

    2012-12-12

    The Department of Energy (DOE) uses illness and injury surveillance to monitor morbidity and assess the overall health of the work force. Data collected from each participating site include health events and a roster file with demographic information. The source data files are maintained in a relational data base, and are used to obtain stratified tables of health event counts and person time at risk that serve as the starting point for Poisson regression analysis. The explanatory variables that define these tables are age, gender, occupational group, and time. Typical response variables of interest are the number of absences duemore » to illness or injury, i.e., the response variable is a count. Poisson regression methods are used to describe the effect of the explanatory variables on the health event rates using a log-linear main effects model. Results of fitting the main effects model are summarized in a tabular and graphical form and interpretation of model parameters is provided. An analysis of deviance table is used to evaluate the importance of each of the explanatory variables on the event rate of interest and to determine if interaction terms should be considered in the analysis. Although Poisson regression methods are widely used in the analysis of count data, there are situations in which over-dispersion occurs. This could be due to lack-of-fit of the regression model, extra-Poisson variation, or both. A score test statistic and regression diagnostics are used to identify over-dispersion. A quasi-likelihood method of moments procedure is used to evaluate and adjust for extra-Poisson variation when necessary. Two examples are presented using respiratory disease absence rates at two DOE sites to illustrate the methods and interpretation of the results. In the first example the Poisson main effects model is adequate. 
In the second example the score test indicates considerable over-dispersion, and a more detailed analysis attributes the over-dispersion to extra-Poisson variation. The R open source software environment for statistical computing and graphics is used for the analysis. Additional details about R and the data that were used in this report are provided in an Appendix. Information on how to obtain R and utility functions that can be used to duplicate results in this report is also provided.
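
    The score-test-then-adjust workflow described above can be sketched as follows: compute a Pearson-based dispersion factor and, if it exceeds 1, inflate the model-based standard errors. The counts, fitted means, and parameter count below are invented for illustration and are not the DOE surveillance data.

```python
import numpy as np

# Illustrative event counts and fitted Poisson means (not the DOE data)
y = np.array([20.0, 3.0, 40.0, 15.0, 2.0, 25.0])   # observed counts per stratum
mu = np.array([10.0, 8.0, 27.0, 25.0, 8.5, 14.0])  # fitted means from a log-linear model
p = 3                                              # assumed number of fitted parameters

# Pearson chi-square and the method-of-moments dispersion factor
pearson = float(np.sum((y - mu) ** 2 / mu))
phi = pearson / (len(y) - p)

# Quasi-likelihood adjustment: inflate model-based standard errors by sqrt(phi)
se_model = 0.05                                    # illustrative coefficient SE
se_adjusted = se_model * np.sqrt(max(phi, 1.0))
```

    When phi is near 1 the plain Poisson model is adequate, as in the report's first example; values well above 1 signal extra-Poisson variation.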

  15. Maternal dietary counseling reduces consumption of energy-dense foods among infants: a randomized controlled trial.

    PubMed

    Vitolo, Marcia Regina; Bortolini, Gisele Ane; Campagnolo, Paula Dal Bo; Hoffman, Daniel J

    2012-01-01

    To evaluate the impact of a dietary counseling in reducing the intake of energy-dense foods by infants. A randomized controlled trial. São Leopoldo, Brazil. Mothers and infants of a low-income-group population were randomized into intervention (n = 163) and received dietary counseling during 10 home visits, or control (n = 234) groups. Child consumption of sugar-dense (SD) and lipid-dense (LD) foods at 12 to 16 months. The effect of the intervention was expressed by relative risks and 95% confidence intervals. Poisson regression analysis was used to determine the association between exclusive breastfeeding and the energy-dense foods intake. A smaller proportion of infants from the intervention group consumed candy, soft drinks, honey, cookies, chocolate, and salty snacks. In the intervention group, there was a reduction of 40% and 50% in the proportion of infants who consumed LD and SD foods, respectively. Being breastfed up to 6 months reduced the risk for consumption of LD and SD foods by 58% and 67%, respectively. Dietary counseling to mothers may be effective in reducing the consumption of energy-dense foods among infants, and it is helpful in improving early dietary habits. Copyright © 2012 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.
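
    The relative-risk-with-confidence-interval summary used in trials like this one can be sketched directly; the 2x2 counts below are hypothetical, not the study's data.

```python
import math

# Hypothetical counts: infants consuming a sugar-dense food in each arm
a, n1 = 60, 163    # intervention group (events, total)
c, n2 = 140, 234   # control group (events, total)

rr = (a / n1) / (c / n2)                           # relative risk
se_log_rr = math.sqrt(1/a - 1/n1 + 1/c - 1/n2)     # SE of log(RR)
ci_low = math.exp(math.log(rr) - 1.96 * se_log_rr)
ci_high = math.exp(math.log(rr) + 1.96 * se_log_rr)
```

    An RR below 1 with an upper confidence limit below 1 indicates a protective intervention effect.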

  16. How well do mean field theories of spiking quadratic-integrate-and-fire networks work in realistic parameter regimes?

    PubMed

    Grabska-Barwińska, Agnieszka; Latham, Peter E

    2014-06-01

    We use mean field techniques to compute the distribution of excitatory and inhibitory firing rates in large networks of randomly connected spiking quadratic integrate and fire neurons. These techniques are based on the assumption that activity is asynchronous and Poisson. For most parameter settings these assumptions are strongly violated; nevertheless, so long as the networks are not too synchronous, we find good agreement between mean field prediction and network simulations. Thus, much of the intuition developed for randomly connected networks in the asynchronous regime applies to mildly synchronous networks.
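
    The asynchronous-Poisson assumption can be probed in simulation by checking that trial-to-trial spike counts have a Fano factor near 1. A minimal homogeneous-Poisson sketch (rates and durations are arbitrary):

```python
import random

random.seed(0)
rate, T, trials = 20.0, 5.0, 400   # spikes/s, trial length in s, number of trials

counts = []
for _ in range(trials):
    t, n = 0.0, 0
    while True:
        t += random.expovariate(rate)   # exponential inter-spike intervals
        if t > T:
            break
        n += 1
    counts.append(n)

mean = sum(counts) / trials
var = sum((c - mean) ** 2 for c in counts) / (trials - 1)
fano = var / mean                  # ~1 for a Poisson process
```

    Real network activity that departs from this baseline (as the abstract notes it typically does) would show a Fano factor well away from 1.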

  17. A heuristic for the distribution of point counts for random curves over a finite field.

    PubMed

    Achter, Jeffrey D; Erman, Daniel; Kedlaya, Kiran S; Wood, Melanie Matchett; Zureick-Brown, David

    2015-04-28

    How many rational points are there on a random algebraic curve of large genus g over a given finite field Fq? We propose a heuristic for this question motivated by a (now proven) conjecture of Mumford on the cohomology of moduli spaces of curves; this heuristic suggests a Poisson distribution with mean q+1+1/(q-1). We prove a weaker version of this statement in which g and q tend to infinity, with q much larger than g. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  18. Partial transpose of random quantum states: Exact formulas and meanders

    NASA Astrophysics Data System (ADS)

    Fukuda, Motohisa; Śniady, Piotr

    2013-04-01

    We investigate the asymptotic behavior of the empirical eigenvalue distribution of the partial transpose of a random quantum state. The limiting distribution was previously investigated via Wishart random matrices indirectly (by approximating the matrix of trace 1 by the Wishart matrix of random trace) and shown to be the semicircular distribution or the free difference of two free Poisson distributions, depending on how dimensions of the concerned spaces grow. Our use of Wishart matrices gives exact combinatorial formulas for the moments of the partial transpose of the random state. We find three natural asymptotic regimes in terms of geodesics on the permutation groups. Two of them correspond to the above two cases; the third one turns out to be a new matrix model for the meander polynomials. Moreover, we prove the convergence to the semicircular distribution together with its extreme eigenvalues under weaker assumptions, and show a large deviation bound for the latter.

  19. Translating a Fall Prevention Intervention Into Practice: A Randomized Community Trial

    PubMed Central

    Peterson, Donna J.; Christiansen, Ann L.; Mahoney, Jane; Laud, Purushottam; Layde, Peter M.

    2015-01-01

    Objectives. We examined whether community translation of an effective evidence-based fall prevention program via standard monetary support can produce a community-wide reduction in fall injuries in older adults and evaluated whether an enhanced version with added technical support and capacity building amplified the fall reduction effect. Methods. We completed a randomized controlled community trial among adults aged 65 and older in (1) 10 control communities receiving no special resources or guidance on fall prevention, (2) 5 standard support communities receiving modest funding to implement Stepping On, and (3) 5 enhanced support communities receiving funding and technical support. The primary outcome was hospital inpatient and emergency department discharges for falls, examined with Poisson regression. Results. Compared with control communities, standard and enhanced support communities showed significantly higher community-wide reductions (9% and 8%, respectively) in fall injuries from baseline (2007–2008) to follow-up (2010–2011). No significant difference was found between enhanced and standard support communities. Conclusions. Population-based fall prevention interventions can be effective when implemented in community settings. More research is needed to identify the barriers and facilitators that influence the successful adoption and implementation of fall prevention interventions into broad community practice. PMID:25602891

  20. Translating a Fall Prevention Intervention Into Practice: A Randomized Community Trial.

    PubMed

    Guse, Clare E; Peterson, Donna J; Christiansen, Ann L; Mahoney, Jane; Laud, Purushottam; Layde, Peter M

    2015-07-01

    We examined whether community translation of an effective evidence-based fall prevention program via standard monetary support can produce a community-wide reduction in fall injuries in older adults and evaluated whether an enhanced version with added technical support and capacity building amplified the fall reduction effect. We completed a randomized controlled community trial among adults aged 65 and older in (1) 10 control communities receiving no special resources or guidance on fall prevention, (2) 5 standard support communities receiving modest funding to implement Stepping On, and (3) 5 enhanced support communities receiving funding and technical support. The primary outcome was hospital inpatient and emergency department discharges for falls, examined with Poisson regression. Compared with control communities, standard and enhanced support communities showed significantly higher community-wide reductions (9% and 8%, respectively) in fall injuries from baseline (2007-2008) to follow-up (2010-2011). No significant difference was found between enhanced and standard support communities. Population-based fall prevention interventions can be effective when implemented in community settings. More research is needed to identify the barriers and facilitators that influence the successful adoption and implementation of fall prevention interventions into broad community practice.
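
    The community-wide reduction reported here is a rate-ratio quantity; a crude version (before any Poisson regression adjustment) can be sketched with hypothetical counts and person-years, which are invented and not the study's data.

```python
import math

# Hypothetical community-level totals (not the study's data):
falls_control, py_control = 520, 10000   # fall injuries, person-years
falls_support, py_support = 480, 10100

rate_ratio = (falls_support / py_support) / (falls_control / py_control)
reduction = 1 - rate_ratio                              # proportional reduction
se = math.sqrt(1 / falls_support + 1 / falls_control)   # SE of log rate ratio
ci = (math.exp(math.log(rate_ratio) - 1.96 * se),
      math.exp(math.log(rate_ratio) + 1.96 * se))
```

    The published analysis additionally adjusts for baseline rates and community-level covariates via Poisson regression, which this crude ratio omits.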

  1. A Bayesian ridge regression analysis of congestion's impact on urban expressway safety.

    PubMed

    Shi, Qi; Abdel-Aty, Mohamed; Lee, Jaeyoung

    2016-03-01

    With the rapid growth of traffic in urban areas, concerns about congestion and traffic safety have been heightened. This study leveraged both the Automatic Vehicle Identification (AVI) system and the Microwave Vehicle Detection System (MVDS) installed on an expressway in Central Florida to explore how congestion impacts crash occurrence in urban areas. Multiple congestion measures from the two systems were developed. To ensure more precise estimates of congestion's effects, the traffic data were aggregated into peak and non-peak hours. Multicollinearity among traffic parameters was examined. The results showed the presence of multicollinearity, especially during peak hours. In response, ridge regression was introduced to cope with this issue. Poisson models with uncorrelated random effects, correlated random effects, and both correlated random effects and random parameters were constructed within the Bayesian framework. It was shown that correlated random effects could significantly enhance model performance. The random parameters model has similar goodness-of-fit to the model with only correlated random effects. However, by accounting for the unobserved heterogeneity, more variables were found to be significantly related to crash frequency. The models indicated that congestion increased crash frequency during peak hours, while during non-peak hours it was not a major contributing factor to crashes. Using the random parameters model, the three congestion measures were compared. It was found that all congestion indicators had similar effects, while the Congestion Index (CI) derived from MVDS data was a better congestion indicator for safety analysis. Also, analyses showed that segments with higher congestion intensity saw increases not only in property damage only (PDO) crashes but also in more severe crashes.
In addition, the necessity of incorporating a specific congestion indicator for congestion's effects on safety and of addressing the multicollinearity between explanatory variables was also discussed. By including a specific congestion indicator, model performance significantly improved. When comparing models with and without ridge regression, the magnitude of the coefficients was altered in the presence of multicollinearity. These conclusions suggest that use of an appropriate congestion measure and consideration of multicollinearity among the variables would improve the models and our understanding of the effects of congestion on traffic safety. Copyright © 2015 Elsevier Ltd. All rights reserved.
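
    Ridge regression's role under multicollinearity can be illustrated with the closed-form penalized estimator; this is a generic numpy sketch with simulated near-collinear predictors, not the paper's Bayesian formulation.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 200, 3
X = rng.normal(size=(n, p))
X[:, 2] = X[:, 0] + 0.01 * rng.normal(size=n)   # nearly collinear with column 0
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + rng.normal(size=n)

lam = 1.0                                        # ridge penalty
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
```

    The ridge solution shrinks the coefficient vector, stabilizing estimates that multicollinearity would otherwise inflate.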

  2. A comparison of observation-level random effect and Beta-Binomial models for modelling overdispersion in Binomial data in ecology & evolution.

    PubMed

    Harrison, Xavier A

    2015-01-01

    Overdispersion is a common feature of models of biological data, but researchers often fail to model the excess variation driving the overdispersion, resulting in biased parameter estimates and standard errors. Quantifying and modeling overdispersion when it is present is therefore critical for robust biological inference. One means to account for overdispersion is to add an observation-level random effect (OLRE) to a model, where each data point receives a unique level of a random effect that can absorb the extra-parametric variation in the data. Although some studies have investigated the utility of OLRE to model overdispersion in Poisson count data, studies doing so for Binomial proportion data are scarce. Here I use a simulation approach to investigate the ability of both OLRE models and Beta-Binomial models to recover unbiased parameter estimates in mixed effects models of Binomial data under various degrees of overdispersion. In addition, as ecologists often fit random intercept terms to models when the random effect sample size is low (<5 levels), I investigate the performance of both model types under a range of random effect sample sizes when overdispersion is present. Simulation results revealed that the efficacy of OLRE depends on the process that generated the overdispersion; OLRE failed to cope with overdispersion generated from a Beta-Binomial mixture model, leading to biased slope and intercept estimates, but performed well for overdispersion generated by adding random noise to the linear predictor. Comparison of parameter estimates from an OLRE model with those from its corresponding Beta-Binomial model readily identified when OLRE were performing poorly due to disagreement between effect sizes, and this strategy should be employed whenever OLRE are used for Binomial data to assess their reliability. Beta-Binomial models performed well across all contexts, but showed a tendency to underestimate effect sizes when modelling non-Beta-Binomial data. 
Finally, both OLRE and Beta-Binomial models performed poorly when models contained <5 levels of the random intercept term, especially for estimating variance components, and this effect appeared independent of total sample size. These results suggest that OLRE are a useful tool for modelling overdispersion in Binomial data, but that they do not perform well in all circumstances and researchers should take care to verify the robustness of parameter estimates of OLRE models.
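
    The "random noise on the linear predictor" mechanism for overdispersion that OLRE handles well is simple to simulate; this numpy sketch contrasts OLRE-style counts with plain Poisson counts (parameters arbitrary).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# OLRE-style generative process: lognormal noise added to the linear predictor
eta = np.log(5.0) + rng.normal(scale=0.7, size=n)
y_olre = rng.poisson(np.exp(eta))

# Plain Poisson counts with the same baseline mean
y_pois = rng.poisson(5.0, size=n)

ratio_olre = y_olre.var() / y_olre.mean()   # well above 1 (overdispersed)
ratio_pois = y_pois.var() / y_pois.mean()   # close to 1
```

    The variance-to-mean ratio is the simplest diagnostic: values far above 1 indicate the extra-parametric variation that an OLRE (or Beta-Binomial, for proportions) is meant to absorb.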

  3. The scaling of oblique plasma double layers

    NASA Technical Reports Server (NTRS)

    Borovsky, J. E.

    1983-01-01

    Strong oblique plasma double layers are investigated using three methods, i.e., electrostatic particle-in-cell simulations, numerical solutions to the Poisson-Vlasov equations, and analytical approximations to the Poisson-Vlasov equations. The solutions to the Poisson-Vlasov equations and numerical simulations show that strong oblique double layers scale in terms of Debye lengths. For very large potential jumps, theory and numerical solutions indicate that all effects of the magnetic field vanish and the oblique double layers follow the same scaling relation as the field-aligned double layers.
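
    Debye-length scaling emerges from solving the Poisson equation; as a minimal related sketch, here is a second-order finite-difference solve of the 1-D Poisson equation d²φ/dx² = -ρ with grounded boundaries, in normalized units with a uniform charge density assumed for the test case.

```python
import numpy as np

# Solve d^2(phi)/dx^2 = -rho on [0, 1] with phi(0) = phi(1) = 0
N = 101
x = np.linspace(0.0, 1.0, N)
h = x[1] - x[0]
rho = np.ones(N)                    # uniform charge density (normalized)

# Tridiagonal second-difference matrix acting on the interior points
A = (np.diag(-2.0 * np.ones(N - 2))
     + np.diag(np.ones(N - 3), 1)
     + np.diag(np.ones(N - 3), -1))
phi_inner = np.linalg.solve(A, -rho[1:-1] * h ** 2)
phi = np.concatenate(([0.0], phi_inner, [0.0]))

exact = x * (1 - x) / 2             # analytic solution for rho = 1
```

    Because the exact solution is quadratic, the second-order scheme reproduces it to rounding error.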

  4. An application of queueing theory to the design of channel requirements for special purpose communications satellites. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Hein, G. F.

    1974-01-01

    Special-purpose satellites are very cost-sensitive to the number of broadcast channels, usually have Poisson arrivals, fairly low utilization (less than 35%), and a very high availability requirement. To solve the problem of determining the effects of limiting C, the number of channels, the Poisson-arrival, infinite-server queueing model will be modified to describe the many-server case. The model is predicated on the reproductive property of the Poisson distribution.
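
    For Poisson arrivals offered to a limited number of channels with no queueing, the probability that a request finds all C channels busy follows the classical Erlang B recursion; the satellite-specific traffic numbers below are invented for illustration.

```python
def erlang_b(channels, offered_load):
    """Blocking probability for an M/M/c/c loss system via the Erlang B recursion."""
    b = 1.0
    for c in range(1, channels + 1):
        b = offered_load * b / (c + offered_load * b)
    return b

# Illustrative: 2 erlangs of offered traffic on 5 channels
blocking = erlang_b(5, 2.0)
```

    An availability requirement then translates into choosing the smallest C whose blocking probability falls below the target.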

  5. Collisional effects on the numerical recurrence in Vlasov-Poisson simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pezzi, Oreste; Valentini, Francesco; Camporeale, Enrico

    The initial state recurrence in numerical simulations of the Vlasov-Poisson system is a well-known phenomenon. Here, we study the effect on recurrence of artificial collisions modeled through the Lenard-Bernstein operator [A. Lenard and I. B. Bernstein, Phys. Rev. 112, 1456–1459 (1958)]. By decomposing the linear Vlasov-Poisson system in the Fourier-Hermite space, the recurrence problem is investigated in the linear regime of the damping of a Langmuir wave and of the onset of the bump-on-tail instability. The analysis is then confirmed and extended to the nonlinear regime through an Eulerian collisional Vlasov-Poisson code. It is found that, despite being routinely used, an artificial collisionality is not a viable way of preventing recurrence in numerical simulations without compromising the kinetic nature of the solution. Moreover, it is shown how numerical effects associated with the generation of fine velocity scales can modify the physical features of the system evolution even in the nonlinear regime. This means that filamentation-like phenomena, usually associated with low-amplitude-fluctuation contexts, can play a role even in the nonlinear regime.
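
    The recurrence at issue follows from free streaming on a discrete velocity grid; assuming the standard Cheng-Knorr estimate T_R = 2π/(kΔv), a trivial sketch:

```python
import math

def recurrence_time(k, dv):
    # Free-streaming recurrence time for Fourier mode k on a velocity grid
    # with uniform spacing dv (standard Cheng-Knorr estimate, assumed here)
    return 2 * math.pi / (k * dv)

t_rec = recurrence_time(0.5, 0.1)   # illustrative wavenumber and grid spacing
```

    Refining the velocity grid (smaller dv) pushes recurrence later, which is the usual remedy; the paper's point is that artificial collisionality is not a clean alternative.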

  6. Statistical properties of superimposed stationary spike trains.

    PubMed

    Deger, Moritz; Helias, Moritz; Boucsein, Clemens; Rotter, Stefan

    2012-06-01

    The Poisson process is an often employed model for the activity of neuronal populations. It is known, though, that superpositions of realistic, non-Poisson spike trains are not in general Poisson processes, not even for large numbers of superimposed processes. Here we construct superimposed spike trains from intracellular in vivo recordings from rat neocortex neurons and compare their statistics to specific point process models. The constructed superimposed spike trains reveal strong deviations from the Poisson model. We find that superpositions of model spike trains that take the effective refractoriness of the neurons into account yield a much better description. A minimal model of this kind is the Poisson process with dead-time (PPD). For this process, and for superpositions thereof, we obtain analytical expressions for some second-order statistical quantities, such as the count variability, inter-spike interval (ISI) variability, and ISI correlations, and demonstrate the match with the in vivo data. We conclude that effective refractoriness is the key property that shapes the statistical properties of the superimposed spike trains. We present new, efficient algorithms to generate superpositions of PPDs and of gamma processes that can be used to provide more realistic background input in simulations of networks of spiking neurons. Using these generators, we show in simulations that neurons which receive superimposed spike trains as input are highly sensitive to the statistical effects induced by neuronal refractoriness.
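
    A Poisson process with dead-time is easy to generate: draw exponential waiting times and add the refractory period. A sketch with arbitrary parameters, checking the ISI properties the authors exploit (no ISI shorter than the dead-time, coefficient of variation below 1):

```python
import random
import statistics

random.seed(1)
rate, dead = 100.0, 0.002   # rate of the exponential part (Hz), 2 ms dead-time

# Inter-spike intervals of a PPD: dead-time plus an exponential waiting time
isis = [dead + random.expovariate(rate) for _ in range(5000)]

cv = statistics.stdev(isis) / statistics.mean(isis)   # < 1, unlike pure Poisson
```

    For this process the theoretical CV is (1/rate)/(dead + 1/rate), here about 0.83, so the dead-time regularizes the train relative to a Poisson process of the same mean rate.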

  7. Effects of various boundary conditions on the response of Poisson-Nernst-Planck impedance spectroscopy analysis models and comparison with a continuous-time random-walk model.

    PubMed

    Macdonald, J Ross

    2011-11-24

    Various electrode reaction rate boundary conditions suitable for mean-field Poisson-Nernst-Planck (PNP) mobile charge frequency response continuum models are defined and incorporated in the resulting Chang-Jaffe (CJ) CJPNP model, the ohmic OHPNP one, and a simplified GPNP one in order to generalize from full to partial blocking of mobile charges at the two plane parallel electrodes. Model responses using exact synthetic PNP data involving only mobile negative charges are discussed and compared for a wide range of CJ dimensionless reaction rate values. The CJPNP and OHPNP ones are shown to be fully equivalent, except possibly for the analysis of nanomaterial structures. The dielectric strengths associated with the CJPNP diffuse double layers at the electrodes were found to decrease toward 0 as the reaction rate increased, consistent with fewer blocked charges and more reacting ones. Parameter estimates from GPNP fits of CJPNP data were shown to lead to accurate calculated values of the CJ reaction rate and of some other CJPNP parameters. Best fits of CaCu(3)Ti(4)O(12) (CCTO) single-crystal data, an electronic conductor, at 80 and 140 K, required the anomalous diffusion model, CJPNPA, and led to medium-size rate estimates of about 0.12 and 0.03, respectively, as well as good estimates of the values of other important CJPNPA parameters such as the independently verified concentration of neutral dissociable centers. These continuum-fit results were found to be only somewhat comparable to those obtained from a composite continuous-time random-walk hopping/trapping semiuniversal UN model.

  8. Nonlinear Analysis of Experimental Measurements 7.6. Theoretical Chemistry

    DTIC Science & Technology

    2015-01-26

    Reported publications include: Jianshu Cao, Robert J. Silbey, and Jaeyoung Sung, "Quantitative Interpretation of the Randomness in Single Enzyme Turnover Times," Biophysical Journal; "Universality of Poisson Indicator and Fano Factor of Transport Event Statistics in Ion Channels and Enzyme Kinetics," J. Phys. B: At. Mol. Opt. Phys.; and Jianshu Cao and Jianlan Wu, "Generalized Michaelis-Menten Equation for Conformation-Modulated Monomeric Enzymes," New

  9. Coma cluster ultradiffuse galaxies are not standard radio galaxies

    NASA Astrophysics Data System (ADS)

    Struble, Mitchell F.

    2018-02-01

    Matching members in the Coma cluster catalogue of ultradiffuse galaxies (UDGs) from SUBARU imaging with a very deep radio continuum survey source catalogue of the cluster using the Karl G. Jansky Very Large Array (VLA) within a rectangular region of ∼1.19 deg2 centred on the cluster core reveals matches consistent with random. An overlapping set of 470 UDGs and 696 VLA radio sources in this rectangular area finds 33 matches within a separation of 25 arcsec; dividing the sample into bins with separations bounded by 5, 10, 20 and 25 arcsec finds 1, 4, 17 and 11 matches. An analytical model estimate, based on the Poisson probability distribution, of the number of randomly expected matches within these same separation bounds is 1.7, 4.9, 19.4 and 14.2, each, respectively, consistent with the 95 per cent Poisson confidence intervals of the observed values. Dividing the data into five clustercentric annuli of 0.1° and into the four separation bins, finds the same result. This random match of UDGs with VLA sources implies that UDGs are not radio galaxies by the standard definition. Those VLA sources having integrated flux >1 mJy at 1.4 GHz in Miller, Hornschemeier and Mobasher without SDSS galaxy matches are consistent with the known surface density of background radio sources. We briefly explore the possibility that some unresolved VLA sources near UDGs could be young, compact, bright, supernova remnants of Type Ia events, possibly in the intracluster volume.
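
    The analytical estimate of random matches is a standard Poisson calculation: the expected number is (UDG count) x (source surface density) x (match area). Using the counts and survey area quoted above:

```python
import math

n_udg, n_src, area_deg2 = 470, 696, 1.19   # counts and survey area from the abstract

def expected_matches(radius_arcsec):
    # Expected number of UDG-radio matches if sources are randomly placed
    r_deg = radius_arcsec / 3600.0
    return n_udg * n_src * math.pi * r_deg ** 2 / area_deg2

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

lam_5 = expected_matches(5)   # ~1.7, matching the quoted value for the 5 arcsec bound
```

    Comparing observed match counts against Poisson confidence intervals around these expectations is exactly the consistency test the abstract describes.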

  10. Exact solution of two interacting run-and-tumble random walkers with finite tumble duration

    NASA Astrophysics Data System (ADS)

    Slowman, A. B.; Evans, M. R.; Blythe, R. A.

    2017-09-01

    We study a model of interacting run-and-tumble random walkers operating under mutual hardcore exclusion on a one-dimensional lattice with periodic boundary conditions. We incorporate a finite, poisson-distributed, tumble duration so that a particle remains stationary whilst tumbling, thus generalising the persistent random walker model. We present the exact solution for the nonequilibrium stationary state of this system in the case of two random walkers. We find this to be characterised by two lengthscales, one arising from the jamming of approaching particles, and the other from one particle moving when the other is tumbling. The first of these lengthscales vanishes in a scaling limit where the continuous-space dynamics is recovered whilst the second remains finite. Thus the nonequilibrium stationary state reveals a rich structure of attractive, jammed and extended pieces.

  11. Void statistics, scaling, and the origins of large-scale structure

    NASA Technical Reports Server (NTRS)

    Fry, J. N.; Giovanelli, Riccardo; Haynes, Martha P.; Melott, Adrian L.; Scherrer, Robert J.

    1989-01-01

    The probability that a volume of the universe of given size and shape spaced at random will be void of galaxies is used here to study various models of the origin of cosmological structures. Numerical simulations are conducted on hot-particle and cold-particle-modulated inflationary models with and without biasing, on isothermal or initially Poisson models, and on models where structure is seeded by loops of cosmic string. For the Pisces-Perseus redshift compilation of Giovanelli and Haynes (1985), it is found that hierarchical scaling is obeyed for subsamples constructed with different limiting magnitudes and subsamples taken at random. This result confirms that the hierarchical ansatz holds valid to high order and supports the idea that structure in the observed universe evolves by a regular process from an almost Gaussian primordial state. Neutrino models without biasing show the effect of a strong feature in the initial power spectrum. Cosmic string models do not agree well with the galaxy data.
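
    For a Poisson (unclustered) point process the void probability has the closed form P0(V) = exp(-nV); departures from this baseline are what the hierarchical-scaling analysis measures. A minimal sketch:

```python
import math

def void_probability_poisson(number_density, volume):
    # Probability that a randomly placed volume V contains no points
    # when points follow a Poisson process with mean density n
    return math.exp(-number_density * volume)

p0 = void_probability_poisson(0.01, 100.0)   # nV = 1 -> P0 = 1/e
```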

  12. Poisson's spot and Gouy phase

    NASA Astrophysics Data System (ADS)

    da Paz, I. G.; Soldati, Rodolfo; Cabral, L. A.; de Oliveira, J. G. G.; Sampaio, Marcos

    2016-12-01

    Recently there have been experimental results on Poisson's spot matter-wave interferometry, followed by theoretical models describing the relative importance of the wave and particle behaviors for the phenomenon. We propose an analytical theoretical model for Poisson's spot with matter waves based on the Babinet principle, in which we use the results for free propagation and single-slit diffraction. We take into account effects of loss of coherence and finite detection area using the propagator for a quantum particle interacting with an environment. We observe that the matter-wave Gouy phase plays a role in the existence of the central peak and thus corroborates the predominantly wavelike character of Poisson's spot. Our model shows remarkable agreement with the experimental data for deuterium (D2) molecules.

  13. Droxidopa and Reduced Falls in a Trial of Parkinson Disease Patients With Neurogenic Orthostatic Hypotension.

    PubMed

    Hauser, Robert A; Heritier, Stephane; Rowse, Gerald J; Hewitt, L Arthur; Isaacson, Stuart H

    2016-01-01

    Droxidopa is a prodrug of norepinephrine indicated for the treatment of orthostatic dizziness, lightheadedness, or the "feeling that you are about to black out" in adult patients with symptomatic neurogenic orthostatic hypotension caused by primary autonomic failure including Parkinson disease (PD). The objective of this study was to compare fall rates in PD patients with symptomatic neurogenic orthostatic hypotension randomized to droxidopa or placebo. Study NOH306 was a 10-week, phase 3, randomized, placebo-controlled, double-blind trial of droxidopa in PD patients with symptomatic neurogenic orthostatic hypotension that included assessments of falls as a key secondary end point. In this report, the principal analysis consisted of a comparison of the rate of patient-reported falls from randomization to end of study in droxidopa versus placebo groups. A total of 225 patients were randomized; 222 patients were included in the safety analyses, and 197 patients provided efficacy data and were included in the falls analyses. The 92 droxidopa patients reported 308 falls, and the 105 placebo patients reported 908 falls. In the droxidopa group, the fall rate was 0.4 falls per patient-week; in the placebo group, the rate was 1.05 falls per patient-week (prespecified Wilcoxon rank sum P = 0.704; post hoc Poisson-inverse Gaussian test P = 0.014), yielding a relative risk reduction of 77% using the Poisson-inverse Gaussian model. Fall-related injuries occurred in 16.7% of droxidopa-treated patients and 26.9% of placebo-treated patients. Treatment with droxidopa appears to reduce falls in PD patients with symptomatic neurogenic orthostatic hypotension, but this finding must be confirmed.
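
    From the reported per-patient-week rates one can form a crude rate ratio; it differs from the 77% model-based figure, presumably because the Poisson-inverse Gaussian model accounts for between-patient heterogeneity in fall rates rather than pooling all falls.

```python
rate_drox = 0.40      # falls per patient-week, droxidopa (as reported)
rate_placebo = 1.05   # falls per patient-week, placebo (as reported)

crude_ratio = rate_drox / rate_placebo
crude_reduction = 1 - crude_ratio   # crude reduction, about 62%
```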

  14. Droxidopa and Reduced Falls in a Trial of Parkinson Disease Patients With Neurogenic Orthostatic Hypotension

    PubMed Central

    Hauser, Robert A.; Heritier, Stephane; Rowse, Gerald J.; Hewitt, L. Arthur; Isaacson, Stuart H.

    2016-01-01

    Objectives Droxidopa is a prodrug of norepinephrine indicated for the treatment of orthostatic dizziness, lightheadedness, or the “feeling that you are about to black out” in adult patients with symptomatic neurogenic orthostatic hypotension caused by primary autonomic failure including Parkinson disease (PD). The objective of this study was to compare fall rates in PD patients with symptomatic neurogenic orthostatic hypotension randomized to droxidopa or placebo. Methods Study NOH306 was a 10-week, phase 3, randomized, placebo-controlled, double-blind trial of droxidopa in PD patients with symptomatic neurogenic orthostatic hypotension that included assessments of falls as a key secondary end point. In this report, the principal analysis consisted of a comparison of the rate of patient-reported falls from randomization to end of study in droxidopa versus placebo groups. Results A total of 225 patients were randomized; 222 patients were included in the safety analyses, and 197 patients provided efficacy data and were included in the falls analyses. The 92 droxidopa patients reported 308 falls, and the 105 placebo patients reported 908 falls. In the droxidopa group, the fall rate was 0.4 falls per patient-week; in the placebo group, the rate was 1.05 falls per patient-week (prespecified Wilcoxon rank sum P = 0.704; post hoc Poisson-inverse Gaussian test P = 0.014), yielding a relative risk reduction of 77% using the Poisson-inverse Gaussian model. Fall-related injuries occurred in 16.7% of droxidopa-treated patients and 26.9% of placebo-treated patients. Conclusions Treatment with droxidopa appears to reduce falls in PD patients with symptomatic neurogenic orthostatic hypotension, but this finding must be confirmed. PMID:27332626

  15. Analysis of Machine Learning Techniques for Heart Failure Readmissions.

    PubMed

    Mortazavi, Bobak J; Downing, Nicholas S; Bucholz, Emily M; Dharmarajan, Kumar; Manhapra, Ajay; Li, Shu-Xia; Negahban, Sahand N; Krumholz, Harlan M

    2016-11-01

    The current ability to predict readmissions in patients with heart failure is modest at best. It is unclear whether machine learning techniques that address higher dimensional, nonlinear relationships among variables would enhance prediction. We sought to compare the effectiveness of several machine learning algorithms for predicting readmissions. Using data from the Telemonitoring to Improve Heart Failure Outcomes trial, we compared the effectiveness of random forests, boosting, random forests combined hierarchically with support vector machines or logistic regression (LR), and Poisson regression against traditional LR to predict 30- and 180-day all-cause readmissions and readmissions because of heart failure. We randomly selected 50% of patients for a derivation set, and a validation set comprised the remaining patients, validated using 100 bootstrapped iterations. We compared C statistics for discrimination and distributions of observed outcomes in risk deciles for predictive range. In 30-day all-cause readmission prediction, the best performing machine learning model, random forests, provided a 17.8% improvement over LR (mean C statistics, 0.628 and 0.533, respectively). For readmissions because of heart failure, boosting improved the C statistic by 24.9% over LR (mean C statistic 0.678 and 0.543, respectively). For 30-day all-cause readmission, the observed readmission rates in the lowest and highest deciles of predicted risk with random forests (7.8% and 26.2%, respectively) showed a much wider separation than LR (14.2% and 16.4%, respectively). Machine learning methods improved the prediction of readmission after hospitalization for heart failure compared with LR and provided the greatest predictive range in observed readmission rates. © 2016 American Heart Association, Inc.
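
    The C statistic used for comparison is the probability that a randomly chosen readmitted patient is scored above a randomly chosen non-readmitted one. A small self-contained implementation (the scores are invented):

```python
def c_statistic(scores_pos, scores_neg):
    # Rank-based AUC: fraction of positive/negative pairs ordered correctly,
    # counting ties as half a win
    wins = 0.0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                wins += 1.0
            elif sp == sn:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

auc = c_statistic([0.9, 0.8, 0.4], [0.1, 0.3, 0.4])
```

    A value of 0.5 corresponds to chance discrimination, which is why the trial's C statistics in the 0.53-0.68 range indicate modest predictive ability.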

  16. A fresh approach to forecasting in astroparticle physics and dark matter searches

    NASA Astrophysics Data System (ADS)

    Edwards, Thomas D. P.; Weniger, Christoph

    2018-02-01

    We present a toolbox of new techniques and concepts for the efficient forecasting of experimental sensitivities. These are applicable to a large range of scenarios in (astro-)particle physics, and based on the Fisher information formalism. Fisher information provides an answer to the question 'what is the maximum extractable information from a given observation?'. It is a common tool for the forecasting of experimental sensitivities in many branches of science, but rarely used in astroparticle physics or searches for particle dark matter. After briefly reviewing the Fisher information matrix of general Poisson likelihoods, we propose very compact expressions for estimating expected exclusion and discovery limits ('equivalent counts method'). We demonstrate by comparison with Monte Carlo results that they remain surprisingly accurate even deep in the Poisson regime. We show how correlated background systematics can be efficiently accounted for by a treatment based on Gaussian random fields. Finally, we introduce the novel concept of Fisher information flux. It can be thought of as a generalization of the commonly used signal-to-noise ratio, while accounting for the non-local properties and saturation effects of background and instrumental uncertainties. It is a powerful and flexible tool ready to be used as core concept for informed strategy development in astroparticle physics and searches for particle dark matter.
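
    For a binned Poisson likelihood with expected counts mu_i(theta), the Fisher information is I_ab = sum_i (d mu_i / d theta_a)(d mu_i / d theta_b) / mu_i. A one-parameter sketch for a signal strength s with mu_i = s*S_i + B_i; the bin contents are invented:

```python
import math

S = [5.0, 3.0, 1.0]      # expected signal counts per bin (illustrative)
B = [10.0, 10.0, 10.0]   # expected background counts per bin (illustrative)

def fisher_info(s):
    # d(mu_i)/ds = S_i, so I(s) = sum_i S_i^2 / mu_i
    return sum(Si ** 2 / (s * Si + Bi) for Si, Bi in zip(S, B))

sigma_s = 1.0 / math.sqrt(fisher_info(1.0))   # Cramer-Rao bound on sigma(s)
```

    The inverse square root of the Fisher information gives the best achievable uncertainty on s, which is the quantity forecasting methods like those above aim to estimate without full Monte Carlo.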

  17. Poisson image reconstruction with Hessian Schatten-norm regularization.

    PubMed

    Lefkimmiatis, Stamatios; Unser, Michael

    2013-11-01

    Poisson inverse problems arise in many modern imaging applications, including biomedical and astronomical ones. The main challenge is to obtain an estimate of the underlying image from a set of measurements degraded by a linear operator and further corrupted by Poisson noise. In this paper, we propose an efficient framework for Poisson image reconstruction, under a regularization approach, which depends on matrix-valued regularization operators. In particular, the employed regularizers involve the Hessian as the regularization operator and Schatten matrix norms as the potential functions. For the solution of the problem, we propose two optimization algorithms that are specifically tailored to the Poisson nature of the noise. These algorithms are based on an augmented-Lagrangian formulation of the problem and correspond to two variants of the alternating direction method of multipliers. Further, we derive a link that relates the proximal map of an l(p) norm with the proximal map of a Schatten matrix norm of order p. This link plays a key role in the development of one of the proposed algorithms. Finally, we provide experimental results on natural and biological images for the task of Poisson image deblurring and demonstrate the practical relevance and effectiveness of the proposed framework.
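The authors' Hessian Schatten-norm ADMM solvers do not reduce to a few lines, but the Poisson data-fidelity term they are built around can be illustrated with the classical Richardson-Lucy (EM) iteration for the same noise model. A 1-D, noiseless, stdlib-only sketch (the kernel and signal are hypothetical):

```python
def convolve(x, k):
    """'Same'-size 1-D convolution with a symmetric kernel k (odd length)."""
    r = len(k) // 2
    return [sum(k[j + r] * x[i + j] for j in range(-r, r + 1)
                if 0 <= i + j < len(x)) for i in range(len(x))]

def richardson_lucy(y, k, n_iter=50):
    """EM iterations for the Poisson likelihood y ~ Poisson(k * x):
    x <- x * (k^T (y / (k * x))).  Assumes a symmetric, normalized kernel."""
    x = [max(v, 1e-8) for v in y]          # positive initial estimate
    for _ in range(n_iter):
        blurred = convolve(x, k)
        ratio = [yi / max(bi, 1e-12) for yi, bi in zip(y, blurred)]
        corr = convolve(ratio, k)          # symmetric kernel: k^T = k
        x = [xi * ci for xi, ci in zip(x, corr)]
    return x

truth = [0.0] * 11
truth[5] = 10.0
psf = [0.25, 0.5, 0.25]
blurred = convolve(truth, psf)             # [... 2.5, 5.0, 2.5 ...]
restored = richardson_lucy(blurred, psf)   # re-concentrates mass at index 5
```

Each iteration multiplies the estimate by the back-projected ratio of data to re-blurred estimate, preserving non-negativity and total counts, two properties any Poisson-likelihood reconstruction scheme must respect.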

  18. H∞ filtering for stochastic systems driven by Poisson processes

    NASA Astrophysics Data System (ADS)

    Song, Bo; Wu, Zheng-Guang; Park, Ju H.; Shi, Guodong; Zhang, Ya

    2015-01-01

    This paper investigates the H∞ filtering problem for stochastic systems driven by Poisson processes. By utilising the martingale theory such as the predictable projection operator and the dual predictable projection operator, this paper transforms the expectation of stochastic integral with respect to the Poisson process into the expectation of Lebesgue integral. Then, based on this, this paper designs an H∞ filter such that the filtering error system is mean-square asymptotically stable and satisfies a prescribed H∞ performance level. Finally, a simulation example is given to illustrate the effectiveness of the proposed filtering scheme.

  19. A stochastic-dynamic model for global atmospheric mass field statistics

    NASA Technical Reports Server (NTRS)

    Ghil, M.; Balgovind, R.; Kalnay-Rivas, E.

    1981-01-01

    A model that yields the spatial correlation structure of atmospheric mass field forecast errors was developed. The model is governed by the potential vorticity equation forced by random noise. Three methods of solution were compared: expansion in spherical harmonics, with the correlation function computed analytically from the expansion coefficients; solution of the finite-difference equivalent using a fast Poisson solver, with the correlation function computed by stratified sampling of the individual realizations of F(omega) and hence of phi(omega); and derivation of a higher-order equation for gamma, solved directly in finite differences by two successive applications of the fast Poisson solver. The methods were compared for accuracy and efficiency, and the third was chosen as clearly superior. The results agree well with the latitude dependence of observed atmospheric correlation data. The value of the parameter c sub o which gives the best fit to the data is close to the value expected from dynamical considerations.

  20. Sojourning with the Homogeneous Poisson Process.

    PubMed

    Liu, Piaomu; Peña, Edsel A

    2016-01-01

    In this pedagogical article, distributional properties, some surprising, pertaining to the homogeneous Poisson process (HPP), when observed over a possibly random window, are presented. Properties of the gap-time that covered the termination time and the correlations among gap-times of the observed events are obtained. Inference procedures, such as estimation and model validation, based on event occurrence data over the observation window, are also presented. We envision that through the results in this paper, a better appreciation of the subtleties involved in the modeling and analysis of recurrent events data will ensue, since the HPP is arguably one of the simplest among recurrent event models. In addition, the use of the theorem of total probability, Bayes theorem, the iterated rules of expectation, variance and covariance, and the renewal equation could be illustrative when teaching distribution theory, mathematical statistics, and stochastic processes at both the undergraduate and graduate levels. This article is targeted towards both instructors and students.
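The HPP over a window can be generated directly from its i.i.d. exponential gap-times, the construction underlying the gap-time properties the article examines. A seeded sketch (the rate and window are hypothetical):

```python
import random

def hpp_events(rate, t_end, rng):
    """Homogeneous Poisson process on (0, t_end]: event times are cumulative
    sums of i.i.d. Exponential(rate) gap-times."""
    events, t = [], 0.0
    while True:
        t += rng.expovariate(rate)
        if t > t_end:
            return events
        events.append(t)

rng = random.Random(42)
# The count over the window is Poisson with mean rate * t_end (here 5 * 2 = 10).
counts = [len(hpp_events(5.0, 2.0, rng)) for _ in range(4000)]
mean_count = sum(counts) / len(counts)
```

The gap-time straddling a fixed termination time is stochastically larger than an ordinary gap-time (the inspection paradox), one of the "surprising" properties the article develops.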

  1. Strategy, structure, and patient quality outcomes in ambulatory surgery centers (1997-2004).

    PubMed

    Chukmaitov, Askar; Devers, Kelly J; Harless, David W; Menachemi, Nir; Brooks, Robert G

    2011-04-01

    The purpose of this study was to examine potential associations among ambulatory surgery centers' (ASCs) organizational strategy, structure, and quality performance. The authors obtained several large-scale, all-payer claims data sets for the 1997 to 2004 period. The authors operationalized quality performance as unplanned hospitalizations at 30 days after outpatient arthroscopy and colonoscopy procedures. Drawing on the related organizational theory, organizational behavior, and health services research literatures to develop their conceptual framework and hypotheses, the authors fitted fixed- and random-effects Poisson regression models to the count of unplanned hospitalizations. Consistent with the key hypotheses formulated, the findings suggest that higher levels of specialization and the volume of procedures may be associated with a decrease in unplanned hospitalizations at ASCs.
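A random effect on the Poisson mean, as in models of this kind, induces overdispersion in the marginal counts; with a gamma-distributed effect the marginal distribution is negative binomial, a classical result. A seeded sketch with hypothetical parameters (not the study's data):

```python
import math, random

def poisson_sample(lam, rng):
    """Knuth's Poisson sampler (fine for moderate lam)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

rng = random.Random(7)
mu, shape = 4.0, 2.0   # marginal mean and gamma shape (hypothetical values)
# Mean-1 gamma frailty: unit-level rate = mu * u with u ~ Gamma(shape, 1/shape).
counts = [poisson_sample(mu * rng.gammavariate(shape, 1.0 / shape), rng)
          for _ in range(5000)]
m = sum(counts) / len(counts)
v = sum((c - m) ** 2 for c in counts) / (len(counts) - 1)
```

The sample variance lands near the negative-binomial value mu + mu**2/shape = 12, well above the Poisson value of 4, which is the overdispersion signature that motivates random-effects Poisson models.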

  2. Probabilities for gravitational lensing by point masses in a locally inhomogeneous universe

    NASA Technical Reports Server (NTRS)

    Isaacson, Jeffrey A.; Canizares, Claude R.

    1989-01-01

    Probability functions for gravitational lensing by point masses that incorporate Poisson statistics and flux conservation are formulated in the Dyer-Roeder construction. Optical depths to lensing for distant sources are calculated using both the method of Press and Gunn (1973) which counts lenses in an otherwise empty cone, and the method of Ehlers and Schneider (1986) which projects lensing cross sections onto the source sphere. These are then used as parameters of the probability density for lensing in the case of a critical (q0 = 1/2) Friedmann universe. A comparison of the probability functions indicates that the effects of angle-averaging can be well approximated by adjusting the average magnification along a random line of sight so as to conserve flux.

  3. Background stratified Poisson regression analysis of cohort data.

    PubMed

    Richardson, David B; Langholz, Bryan

    2012-03-01

    Background stratified Poisson regression is an approach that has been used in the analysis of data derived from a variety of epidemiologically important studies of radiation-exposed populations, including uranium miners, nuclear industry workers, and atomic bomb survivors. We describe a novel approach to fit Poisson regression models that adjust for a set of covariates through background stratification while directly estimating the radiation-disease association of primary interest. The approach makes use of an expression for the Poisson likelihood that treats the coefficients for stratum-specific indicator variables as 'nuisance' variables and avoids the need to explicitly estimate the coefficients for these stratum-specific parameters. Log-linear models, as well as other general relative rate models, are accommodated. This approach is illustrated using data from the Life Span Study of Japanese atomic bomb survivors and data from a study of underground uranium miners. The point estimate and confidence interval obtained from this 'conditional' regression approach are identical to the values obtained using unconditional Poisson regression with model terms for each background stratum. Moreover, it is shown that the proposed approach allows estimation of background stratified Poisson regression models of non-standard form, such as models that parameterize latency effects, as well as regression models in which the number of strata is large, thereby overcoming the limitations of previously available statistical software for fitting background stratified Poisson regression models.
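The profiling idea has a simple one-covariate analogue: with a binary exposure, conditioning on each stratum's total count turns the stratum contribution into a binomial in which the background rate cancels, leaving a one-parameter likelihood for the log rate ratio. A stdlib sketch on synthetic data (the person-times, rates, and Newton solver are illustrative assumptions, not the paper's software):

```python
import math, random

def poisson_sample(lam, rng):
    """Knuth's Poisson sampler."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def conditional_logrr(strata):
    """Each stratum is (y0, t0, y1, t1).  Conditional on n = y0 + y1,
    y1 ~ Binomial(n, t1*e^b / (t1*e^b + t0)): the stratum-specific baseline
    rate cancels, leaving only the log rate ratio b.  Newton-Raphson."""
    b = 0.0
    for _ in range(25):
        score = hess = 0.0
        for y0, t0, y1, t1 in strata:
            w = t1 * math.exp(b)
            p = w / (w + t0)
            n = y0 + y1
            score += y1 - n * p
            hess -= n * p * (1.0 - p)
        b -= score / hess
    return b

rng = random.Random(1)
strata = []
for _ in range(40):
    mu = rng.uniform(2.0, 8.0)     # stratum-specific baseline rate (nuisance)
    t0 = t1 = 10.0                 # person-time per arm (hypothetical)
    strata.append((poisson_sample(mu * t0, rng), t0,
                   poisson_sample(mu * t1 * 2.0, rng), t1))  # true rate ratio 2
b_hat = conditional_logrr(strata)
```

The conditional estimate recovers the true log rate ratio log(2) without ever estimating the 40 stratum intercepts, mirroring the abstract's point that the stratum coefficients can be treated as nuisance parameters and never fitted explicitly.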

  4. Normal forms for Poisson maps and symplectic groupoids around Poisson transversals

    NASA Astrophysics Data System (ADS)

    Frejlich, Pedro; Mărcuț, Ioan

    2018-03-01

    Poisson transversals are submanifolds in a Poisson manifold which intersect all symplectic leaves transversally and symplectically. In this communication, we prove a normal form theorem for Poisson maps around Poisson transversals. A Poisson map pulls a Poisson transversal back to a Poisson transversal, and our first main result states that simultaneous normal forms exist around such transversals, for which the Poisson map becomes transversally linear, and intertwines the normal form data of the transversals. Our second result concerns symplectic integrations. We prove that a neighborhood of a Poisson transversal is integrable exactly when the Poisson transversal itself is integrable, and in that case we prove a normal form theorem for the symplectic groupoid around its restriction to the Poisson transversal, which puts all structure maps in normal form. We conclude by illustrating our results with examples arising from Lie algebras.

  5. Normal forms for Poisson maps and symplectic groupoids around Poisson transversals.

    PubMed

    Frejlich, Pedro; Mărcuț, Ioan

    2018-01-01

    Poisson transversals are submanifolds in a Poisson manifold which intersect all symplectic leaves transversally and symplectically. In this communication, we prove a normal form theorem for Poisson maps around Poisson transversals. A Poisson map pulls a Poisson transversal back to a Poisson transversal, and our first main result states that simultaneous normal forms exist around such transversals, for which the Poisson map becomes transversally linear, and intertwines the normal form data of the transversals. Our second result concerns symplectic integrations. We prove that a neighborhood of a Poisson transversal is integrable exactly when the Poisson transversal itself is integrable, and in that case we prove a normal form theorem for the symplectic groupoid around its restriction to the Poisson transversal, which puts all structure maps in normal form. We conclude by illustrating our results with examples arising from Lie algebras.

  6. Low-contrast lesion detection in tomosynthetic breast imaging using a realistic breast phantom

    NASA Astrophysics Data System (ADS)

    Zhou, Lili; Oldan, Jorge; Fisher, Paul; Gindi, Gene

    2006-03-01

    Tomosynthesis mammography is a potentially valuable technique for detection of breast cancer. In this simulation study, we investigate the efficacy of three different tomographic reconstruction methods, EM, SART and Backprojection, in the context of an especially difficult mammographic detection task. The task is the detection of a very low-contrast mass embedded in very dense fibro-glandular tissue - a clinically useful task for which tomosynthesis may be well suited. The project uses an anatomically realistic 3D digital breast phantom whose normal anatomic variability limits lesion conspicuity. In order to capture anatomical object variability, we generate an ensemble of phantoms, each of which comprises random instances of various breast structures. We construct medium-sized 3D breast phantoms which model random instances of ductal structures, fibrous connective tissue, Cooper's ligaments and power law structural noise for small scale object variability. Random instances of 7-8 mm irregular masses are generated by a 3D random walk algorithm and placed in very dense fibro-glandular tissue. Several other components of the breast phantom are held fixed, i.e. not randomly generated. These include the fixed breast shape and size, nipple structure, fixed lesion location, and a pectoralis muscle. We collect low-dose data using an isocentric tomosynthetic geometry at 11 angles over 50 degrees and add Poisson noise. The data is reconstructed using the three algorithms. Reconstructed slices through the center of the lesion are presented to human observers in a 2AFC (two-alternative-forced-choice) test that measures detectability by computing AUC (area under the ROC curve). The data collected in each simulation includes two sources of variability, that due to the anatomical variability of the phantom and that due to the Poisson data noise. 
We found that, for this difficult task, the AUC value for EM (0.89) was greater than that for SART (0.83) and Backprojection (0.66).
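The 2AFC percent correct produced by the observers estimates the AUC, which for rating data equals the Mann-Whitney probability that a signal-present score beats a signal-absent score. A minimal estimator (the scores below are hypothetical):

```python
def auc_mann_whitney(signal_scores, noise_scores):
    """AUC as the probability that a random signal-present score exceeds a
    random signal-absent score (ties count 1/2), which equals 2AFC percent
    correct for a rating observer."""
    wins = 0.0
    for s in signal_scores:
        for n in noise_scores:
            wins += 1.0 if s > n else (0.5 if s == n else 0.0)
    return wins / (len(signal_scores) * len(noise_scores))

# Hypothetical observer scores: 6 of the 9 (signal, noise) pairs are ordered
# correctly, so AUC = 2/3.
auc = auc_mann_whitney([3, 5, 7], [2, 4, 6])
```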

  7. Mean-square state and parameter estimation for stochastic linear systems with Gaussian and Poisson noises

    NASA Astrophysics Data System (ADS)

    Basin, M.; Maldonado, J. J.; Zendejo, O.

    2016-07-01

    This paper proposes new mean-square filter and parameter estimator design for linear stochastic systems with unknown parameters over linear observations, where unknown parameters are considered as combinations of Gaussian and Poisson white noises. The problem is treated by reducing the original problem to a filtering problem for an extended state vector that includes parameters as additional states, modelled as combinations of independent Gaussian and Poisson processes. The solution to this filtering problem is based on the mean-square filtering equations for incompletely polynomial states confused with Gaussian and Poisson noises over linear observations. The resulting mean-square filter serves as an identifier for the unknown parameters. Finally, a simulation example shows effectiveness of the proposed mean-square filter and parameter estimator.

  8. Transient finite element analysis of electric double layer using Nernst-Planck-Poisson equations with a modified Stern layer.

    PubMed

    Lim, Jongil; Whitcomb, John; Boyd, James; Varghese, Julian

    2007-01-01

    A finite element implementation of the transient nonlinear Nernst-Planck-Poisson (NPP) and Nernst-Planck-Poisson-modified Stern (NPPMS) models is presented. The NPPMS model uses multipoint constraints to account for finite ion size, resulting in realistic ion concentrations even at high surface potential. The Poisson-Boltzmann equation is used to provide a limited check of the transient models for low surface potential and dilute bulk solutions. The effects of the surface potential and bulk molarity on the electric potential and ion concentrations as functions of space and time are studied. The ability of the models to predict realistic energy storage capacity is investigated. The predicted energy is much more sensitive to surface potential than to bulk solution molarity.

  9. Response analysis of a class of quasi-linear systems with fractional derivative excited by Poisson white noise

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Yongge; Xu, Wei, E-mail: weixu@nwpu.edu.cn; Yang, Guidong

    The Poisson white noise, as a typical non-Gaussian excitation, has attracted much attention recently. However, little work has addressed stochastic systems with fractional derivative under Poisson white noise excitation. This paper investigates the stationary response of a class of quasi-linear systems with fractional derivative excited by Poisson white noise. The equivalent stochastic system of the original stochastic system is obtained. Then, approximate stationary solutions are obtained with the help of the perturbation method. Finally, two typical examples are discussed in detail to demonstrate the effectiveness of the proposed method. The analysis also shows that the fractional order and the fractional coefficient significantly affect the responses of the stochastic systems with fractional derivative.

  10. Angiogenic Signaling in Living Breast Tumor Models

    DTIC Science & Technology

    2007-06-01

    Award Number: W81XWH-05-1-0396. Poisson distributed random noise is added in an amount relative to the desired signal to noise ratio. We fit the data using a regressive fitting...

  11. TRUNCATED RANDOM MEASURES

    DTIC Science & Technology

    2018-01-12

    sequential representations, a method is required for determining which to use for the application at hand and, once a representation is selected, for... Consider a Poisson point process on R+ := [0... the heart of the study of truncated CRMs. They provide an iterative method that can be terminated at any point to yield a finite approximation to the...

  12. Logistic quantile regression provides improved estimates for bounded avian counts: a case study of California Spotted Owl fledgling production

    Treesearch

    Brian S. Cade; Barry R. Noon; Rick D. Scherer; John J. Keane

    2017-01-01

    Counts of avian fledglings, nestlings, or clutch size that are bounded below by zero and above by some small integer form a discrete random variable distribution that is not approximated well by conventional parametric count distributions such as the Poisson or negative binomial. We developed a logistic quantile regression model to provide estimates of the empirical...

  13. Hyperuniformity Length in Experimental Foam and Simulated Point Patterns

    NASA Astrophysics Data System (ADS)

    Chieco, Anthony; Roth, Adam; Dreyfus, Remi; Torquato, Salvatore; Durian, Douglas

    2015-03-01

    Systems without long-wavelength number density fluctuations are called hyperuniform (HU). The degree to which a point pattern is HU may be tested in terms of the variance in the number of points inside randomly placed boxes of side length L. If a pattern is HU, the variance is due solely to fluctuations near the boundary rather than throughout the entire volume of the box. To make this concrete we introduce a hyperuniformity length h, equal to the width of the boundary where number fluctuations occur. Thus h helps characterize the disorder. We show how to deduce h from the number variance, and we do so for Poisson and Einstein patterns plus those made by the vertices and bubble centroids in 2d foams. A Poisson pattern is one where points are totally random. These are not HU and h equals L/2. We coin ``Einstein patterns'' to be where points in a lattice are independently displaced from their site by a normally distributed amount. These are HU and h equals the RMS displacement from the lattice sites. Bubble centroids and vertices are both HU. For these, h is less than L/2 and increases slower than linearly in L. The centroids are more HU than the vertices, in that h increases more slowly.
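The box-counting diagnostic behind h can be sketched directly: for a Poisson pattern the number variance equals the mean (fluctuations fill the box, so h = L/2), while for a jittered lattice ("Einstein pattern") the variance is strongly suppressed. A seeded stdlib sketch with hypothetical densities and box size:

```python
import random

def number_variance_ratio(points, box, domain, n_boxes, rng):
    """Var(N)/Mean(N) for counts in randomly placed square boxes:
    about 1 for a Poisson pattern, well below 1 for a hyperuniform one."""
    counts = []
    for _ in range(n_boxes):
        x0 = rng.uniform(0.0, domain - box)
        y0 = rng.uniform(0.0, domain - box)
        counts.append(sum(1 for x, y in points
                          if x0 <= x < x0 + box and y0 <= y < y0 + box))
    m = sum(counts) / len(counts)
    v = sum((c - m) ** 2 for c in counts) / (len(counts) - 1)
    return v / m

rng = random.Random(5)
domain, box = 3.0, 0.5
# Poisson pattern: fully random points (not hyperuniform).
poisson_pts = [(rng.uniform(0, domain), rng.uniform(0, domain))
               for _ in range(900)]
# "Einstein pattern": lattice sites jittered by small Gaussian displacements
# (hyperuniform; per the abstract, h equals the RMS displacement).
einstein_pts = [(0.1 * i + 0.05 + rng.gauss(0, 0.02),
                 0.1 * j + 0.05 + rng.gauss(0, 0.02))
                for i in range(30) for j in range(30)]
r_poisson = number_variance_ratio(poisson_pts, box, domain, 1000, rng)
r_einstein = number_variance_ratio(einstein_pts, box, domain, 1000, rng)
```

The ratio near 1 for the random pattern and far below 1 for the jittered lattice is the hyperuniformity signature; h then repackages the suppressed variance as an effective boundary-layer width.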

  14. Marginalized zero-inflated Poisson models with missing covariates.

    PubMed

    Benecha, Habtamu K; Preisser, John S; Divaris, Kimon; Herring, Amy H; Das, Kalyan

    2018-05-11

    Unlike zero-inflated Poisson regression, marginalized zero-inflated Poisson (MZIP) models for counts with excess zeros provide estimates with direct interpretations for the overall effects of covariates on the marginal mean. In the presence of missing covariates, MZIP and many other count data models are ordinarily fitted using complete case analysis methods due to lack of appropriate statistical methods and software. This article presents an estimation method for MZIP models with missing covariates. The method, which is applicable to other missing data problems, is illustrated and compared with complete case analysis by using simulations and dental data on the caries preventive effects of a school-based fluoride mouthrinse program. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
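The marginalized parameterization is easy to state: if π is the structural-zero probability and ν the marginal mean that covariates act on, the Poisson-part mean must be λ = ν/(1 − π). A seeded sketch with hypothetical values:

```python
import math, random

def zip_sample(pi_zero, lam, rng):
    """Zero-inflated Poisson: structural zero with probability pi_zero,
    otherwise a Poisson(lam) draw (Knuth sampler)."""
    if rng.random() < pi_zero:
        return 0
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

rng = random.Random(11)
pi_zero, nu = 0.3, 2.0         # hypothetical zero-inflation prob and marginal mean
lam = nu / (1.0 - pi_zero)     # MZIP-style: Poisson-part mean implied by nu
draws = [zip_sample(pi_zero, lam, rng) for _ in range(20000)]
m = sum(draws) / len(draws)    # empirical marginal mean, close to nu
```

Because covariates enter through ν directly, an MZIP coefficient has the overall-effect interpretation the abstract emphasizes, rather than an effect conditional on not being a structural zero.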

  15. STDP allows fast rate-modulated coding with Poisson-like spike trains.

    PubMed

    Gilson, Matthieu; Masquelier, Timothée; Hugues, Etienne

    2011-10-01

    Spike timing-dependent plasticity (STDP) has been shown to enable single neurons to detect repeatedly presented spatiotemporal spike patterns. This holds even when such patterns are embedded in equally dense random spiking activity, that is, in the absence of external reference times such as a stimulus onset. Here we demonstrate, both analytically and numerically, that STDP can also learn repeating rate-modulated patterns, which have received more experimental evidence, for example, through post-stimulus time histograms (PSTHs). Each input spike train is generated from a rate function using a stochastic sampling mechanism, chosen to be an inhomogeneous Poisson process here. Learning is feasible provided significant covarying rate modulations occur within the typical timescale of STDP (~10-20 ms) for sufficiently many inputs (~100 among 1000 in our simulations), a condition that is met by many experimental PSTHs. Repeated pattern presentations induce spike-time correlations that are captured by STDP. Despite imprecise input spike times and even variable spike counts, a single trained neuron robustly detects the pattern just a few milliseconds after its presentation. Therefore, temporal imprecision and Poisson-like firing variability are not an obstacle to fast temporal coding. STDP provides an appealing mechanism to learn such rate patterns, which, beyond sensory processing, may also be involved in many cognitive tasks.
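Inhomogeneous Poisson spike trains of the kind used here are conventionally generated by Lewis-Shedler thinning: draw candidates from a homogeneous process at the peak rate and keep each with probability rate(t)/rate_max. A sketch with a hypothetical PSTH-like rate profile:

```python
import math, random

def inhomogeneous_poisson(rate_fn, rate_max, t_end, rng):
    """Lewis-Shedler thinning: homogeneous candidates at rate_max, each kept
    with probability rate_fn(t) / rate_max (requires rate_fn <= rate_max)."""
    spikes, t = [], 0.0
    while True:
        t += rng.expovariate(rate_max)
        if t > t_end:
            return spikes
        if rng.random() < rate_fn(t) / rate_max:
            spikes.append(t)

# Hypothetical rate: 20 Hz baseline with a 10 Hz modulation of 50 ms period,
# on the order of the STDP timescale discussed in the abstract.
rate = lambda t: 20.0 + 10.0 * math.sin(2 * math.pi * t / 0.05)
rng = random.Random(2)
train = inhomogeneous_poisson(rate, 30.0, 1.0, rng)
```

Repeating the same rate profile across trials while the spikes themselves vary reproduces the "rate-modulated pattern with Poisson-like variability" setting of the study.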

  16. STDP Allows Fast Rate-Modulated Coding with Poisson-Like Spike Trains

    PubMed Central

    Hugues, Etienne

    2011-01-01

    Spike timing-dependent plasticity (STDP) has been shown to enable single neurons to detect repeatedly presented spatiotemporal spike patterns. This holds even when such patterns are embedded in equally dense random spiking activity, that is, in the absence of external reference times such as a stimulus onset. Here we demonstrate, both analytically and numerically, that STDP can also learn repeating rate-modulated patterns, which have received more experimental evidence, for example, through post-stimulus time histograms (PSTHs). Each input spike train is generated from a rate function using a stochastic sampling mechanism, chosen to be an inhomogeneous Poisson process here. Learning is feasible provided significant covarying rate modulations occur within the typical timescale of STDP (∼10–20 ms) for sufficiently many inputs (∼100 among 1000 in our simulations), a condition that is met by many experimental PSTHs. Repeated pattern presentations induce spike-time correlations that are captured by STDP. Despite imprecise input spike times and even variable spike counts, a single trained neuron robustly detects the pattern just a few milliseconds after its presentation. Therefore, temporal imprecision and Poisson-like firing variability are not an obstacle to fast temporal coding. STDP provides an appealing mechanism to learn such rate patterns, which, beyond sensory processing, may also be involved in many cognitive tasks. PMID:22046113

  17. A randomized controlled trial of the effect of participatory ergonomic low back pain training on workplace improvement

    PubMed Central

    Kajiki, Shigeyuki; Izumi, Hiroyuki; Hayashida, Kenshi; Kusumoto, Akira; Nagata, Tomohisa; Mori, Koji

    2017-01-01

    Objectives: This study aimed to determine the effects of participatory workplace improvement (PWI) -based provision of ergonomic training and ergonomic action checklists (ACLs) to on-site managers on workplace improvement activities for low back pain (LBP). Methods: A randomized controlled trial (RCT) was conducted at a manufacturing company in Japan. Teams entered in the study were randomly assigned to a control and an intervention group. A total of three interventional training sessions on methods of ergonomics were provided to on-site managers in the intervention group, with 1-month intervals between sessions. Ergonomic ACLs were provided at the same time. After completion of the training sessions, each team then provided a report of improvements each month for the next 10 months. Two people in charge of safety and health chose two major objectives of the implemented activities from the five categories. The reported number of improvements was analyzed using a Poisson regression model. Results: In the intervention group, although the incident rate ratio (IRR) of PWIs in countermeasures for the LBP category was significantly elevated after the training sessions, the IRR of improvements decreased over time during the 10-month follow-up period. No significant difference was observed in the IRR of total PWIs in either the control or intervention group. Conclusions: PWI-based provision of ergonomic training sessions and ergonomics ACLs to on-site managers was shown to be effective for workplace improvement activities targeted at LBP. However, because the effects decrease over time, efforts should be made to maintain the effects through regular interventions. PMID:28320978

  18. A randomized controlled trial of the effect of participatory ergonomic low back pain training on workplace improvement.

    PubMed

    Kajiki, Shigeyuki; Izumi, Hiroyuki; Hayashida, Kenshi; Kusumoto, Akira; Nagata, Tomohisa; Mori, Koji

    2017-05-25

    This study aimed to determine the effects of participatory workplace improvement (PWI) -based provision of ergonomic training and ergonomic action checklists (ACLs) to on-site managers on workplace improvement activities for low back pain (LBP). A randomized controlled trial (RCT) was conducted at a manufacturing company in Japan. Teams entered in the study were randomly assigned to a control and an intervention group. A total of three interventional training sessions on methods of ergonomics were provided to on-site managers in the intervention group, with 1-month intervals between sessions. Ergonomic ACLs were provided at the same time. After completion of the training sessions, each team then provided a report of improvements each month for the next 10 months. Two people in charge of safety and health chose two major objectives of the implemented activities from the five categories. The reported number of improvements was analyzed using a Poisson regression model. In the intervention group, although the incident rate ratio (IRR) of PWIs in countermeasures for the LBP category was significantly elevated after the training sessions, the IRR of improvements decreased over time during the 10-month follow-up period. No significant difference was observed in the IRR of total PWIs in either the control or intervention group. PWI-based provision of ergonomic training sessions and ergonomics ACLs to on-site managers was shown to be effective for workplace improvement activities targeted at LBP. However, because the effects decrease over time, efforts should be made to maintain the effects through regular interventions.

  19. Universal self-similarity of propagating populations

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo; Klafter, Joseph

    2010-07-01

    This paper explores the universal self-similarity of propagating populations. The following general propagation model is considered: particles are randomly emitted from the origin of a d -dimensional Euclidean space and propagate randomly and independently of each other in space; all particles share a statistically common—yet arbitrary—motion pattern; each particle has its own random propagation parameters—emission epoch, motion frequency, and motion amplitude. The universally self-similar statistics of the particles’ displacements and first passage times (FPTs) are analyzed: statistics which are invariant with respect to the details of the displacement and FPT measurements and with respect to the particles’ underlying motion pattern. Analysis concludes that the universally self-similar statistics are governed by Poisson processes with power-law intensities and by the Fréchet and Weibull extreme-value laws.

  20. Universal self-similarity of propagating populations.

    PubMed

    Eliazar, Iddo; Klafter, Joseph

    2010-07-01

    This paper explores the universal self-similarity of propagating populations. The following general propagation model is considered: particles are randomly emitted from the origin of a d-dimensional Euclidean space and propagate randomly and independently of each other in space; all particles share a statistically common--yet arbitrary--motion pattern; each particle has its own random propagation parameters--emission epoch, motion frequency, and motion amplitude. The universally self-similar statistics of the particles' displacements and first passage times (FPTs) are analyzed: statistics which are invariant with respect to the details of the displacement and FPT measurements and with respect to the particles' underlying motion pattern. Analysis concludes that the universally self-similar statistics are governed by Poisson processes with power-law intensities and by the Fréchet and Weibull extreme-value laws.

  1. Clustering, randomness and regularity in cloud fields. I - Theoretical considerations. II - Cumulus cloud fields

    NASA Technical Reports Server (NTRS)

    Weger, R. C.; Lee, J.; Zhu, Tianri; Welch, R. M.

    1992-01-01

    The current controversy over regularity vs. clustering in cloud fields is examined by means of analysis and simulation studies based upon nearest-neighbor cumulative distribution statistics. It is shown that the Poisson representation of random point processes is superior to pseudorandom-number-generated models and that pseudorandom-number-generated models bias the observed nearest-neighbor statistics towards regularity. Interpretation of these nearest-neighbor statistics is discussed for many cases of superpositions of clustering, randomness, and regularity. A detailed analysis is carried out of cumulus cloud field spatial distributions based upon Landsat, AVHRR, and Skylab data, showing that, when both large and small clouds are included in the cloud field distributions, the cloud field always has a strong clustering signal.
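For a 2-D Poisson pattern of density ρ the nearest-neighbor distance is Rayleigh-distributed with mean 1/(2√ρ), the benchmark against which clustering (shorter distances) and regularity (longer distances) are judged in such analyses. A seeded sketch (sample size is hypothetical; edge effects bias the estimate slightly upward):

```python
import math, random

def mean_nn_distance(points):
    """Mean nearest-neighbour distance (O(n^2); fine for small samples)."""
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        best = float("inf")
        for j, (xj, yj) in enumerate(points):
            if i != j:
                best = min(best, math.hypot(xi - xj, yi - yj))
        total += best
    return total / len(points)

rng = random.Random(9)
n = 800                                  # points in the unit square, density rho = n
pts = [(rng.random(), rng.random()) for _ in range(n)]
observed = mean_nn_distance(pts)
expected = 1.0 / (2.0 * math.sqrt(n))    # Poisson expectation, 1/(2*sqrt(rho))
```

A cloud field whose observed mean falls well below this expectation carries the clustering signal the study reports; a field well above it would indicate regularity.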

  2. surrosurv: An R package for the evaluation of failure time surrogate endpoints in individual patient data meta-analyses of randomized clinical trials.

    PubMed

    Rotolo, Federico; Paoletti, Xavier; Michiels, Stefan

    2018-03-01

    Surrogate endpoints are attractive for use in clinical trials instead of well-established endpoints because of practical convenience. To validate a surrogate endpoint, two important measures can be estimated in a meta-analytic context when individual patient data are available: the R²indiv or the Kendall's τ at the individual level, and the R²trial at the trial level. We aimed at providing an R implementation of classical and well-established as well as more recent statistical methods for surrogacy assessment with failure time endpoints. We also intended to incorporate utilities for model checking and visualization and data generating methods described in the literature to date. In the case of failure time endpoints, the classical approach is based on two steps. First, a Kendall's τ is estimated as a measure of individual-level surrogacy using a copula model. Then, the R²trial is computed via a linear regression of the estimated treatment effects; at this second step, the estimation uncertainty can be accounted for via a measurement-error model or via weights. In addition to the classical approach, we recently developed an approach based on bivariate auxiliary Poisson models with individual random effects to measure the Kendall's τ and treatment-by-trial interactions to measure the R²trial. The most common data simulation models described in the literature are based on: copula models, mixed proportional hazard models, and mixtures of half-normal and exponential random variables. The R package surrosurv implements the classical two-step method with Clayton, Plackett, and Hougaard copulas. It also allows optionally adjusting the second-step linear regression for measurement error. The mixed Poisson approach is implemented with different reduced models in addition to the full model. 
We present the package functions for estimating the surrogacy models, for checking their convergence, for performing leave-one-trial-out cross-validation, and for plotting the results. We illustrate their use in practice on individual patient data from a meta-analysis of 4069 patients with advanced gastric cancer from 20 trials of chemotherapy. The surrosurv package provides an R implementation of classical and recent statistical methods for surrogacy assessment of failure time endpoints. Flexible simulation functions are available to generate data according to the methods described in the literature. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Zeroth Poisson Homology, Foliated Cohomology and Perfect Poisson Manifolds

    NASA Astrophysics Data System (ADS)

    Martínez-Torres, David; Miranda, Eva

    2018-01-01

    We prove that, for compact regular Poisson manifolds, the zeroth homology group is isomorphic to the top foliated cohomology group, and we give some applications. In particular, we show that, for regular unimodular Poisson manifolds, top Poisson and foliated cohomology groups are isomorphic. Inspired by the symplectic setting, we define what a perfect Poisson manifold is. We use these Poisson homology computations to provide families of perfect Poisson manifolds.

  4. Random Dopant Induced Threshold Voltage Lowering and Fluctuations in Sub-0.1 (micron)meter MOSFET's: A 3-D 'Atomistic' Simulation Study

    NASA Technical Reports Server (NTRS)

    Asenov, Asen

    1998-01-01

    A three-dimensional (3-D) "atomistic" simulation study of random dopant induced threshold voltage lowering and fluctuations in sub-0.1 μm MOSFET's is presented. For the first time a systematic analysis of random dopant effects down to an individual dopant level was carried out in 3-D on a scale sufficient to provide quantitative statistical predictions. Efficient algorithms based on a single multigrid solution of the Poisson equation followed by the solution of a simplified current continuity equation are used in the simulations. The effects of various MOSFET design parameters, including the channel length and width, oxide thickness and channel doping, on the threshold voltage lowering and fluctuations are studied using typical samples of 200 atomistically different MOSFET's. The atomistic results for the threshold voltage fluctuations were compared with two analytical models based on dopant number fluctuations. Although the analytical models predict the general trends in the threshold voltage fluctuations, they fail to describe quantitatively the magnitude of the fluctuations. The distribution of the atomistically calculated threshold voltage and its correlation with the number of dopants in the channel of the MOSFET's was analyzed based on a sample of 2500 microscopically different devices. The detailed analysis shows that the threshold voltage fluctuations are determined not only by fluctuations in the dopant number, but also by fluctuations in dopant position.
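    The number-fluctuation part of the analytical models can be illustrated with a toy Monte Carlo. A minimal sketch with hypothetical values for the mean dopant count and the per-dopant threshold shift; as the abstract notes, real devices also fluctuate with dopant position, which this sketch ignores.

```python
import math
import random
import statistics

def poisson_sample(lam, rng):
    """Knuth's multiplication algorithm for a Poisson(lam) variate."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

rng = random.Random(0)
mean_dopants = 100          # hypothetical mean dopant count in the channel
dvt_per_dopant = 0.5e-3     # hypothetical threshold shift per dopant (V)
vt = [0.30 + dvt_per_dopant * poisson_sample(mean_dopants, rng)
      for _ in range(5000)]
sigma_vt = statistics.stdev(vt)
# Number fluctuations alone give sigma_vt ~ dvt * sqrt(mean) = 5 mV here
print(round(sigma_vt * 1e3, 1), "mV")
```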

  5. Estimating prevalence of coronary heart disease for small areas using collateral indicators of morbidity.

    PubMed

    Congdon, Peter

    2010-01-01

    Different indicators of morbidity for chronic disease may not necessarily be available at a disaggregated spatial scale (e.g., for small areas with populations under 10 thousand). Instead certain indicators may only be available at a more highly aggregated spatial scale; for example, deaths may be recorded for small areas, but disease prevalence only at a considerably higher spatial scale. Nevertheless prevalence estimates at small area level are important for assessing health need. An instance is provided by England where deaths and hospital admissions for coronary heart disease are available for small areas known as wards, but prevalence is only available for relatively large health authority areas. To estimate CHD prevalence at small area level in such a situation, a shared random effect method is proposed that pools information regarding spatial morbidity contrasts over different indicators (deaths, hospitalizations, prevalence). The shared random effect approach also incorporates differences between small areas in known risk factors (e.g., income, ethnic structure). A Poisson-multinomial equivalence may be used to ensure small area prevalence estimates sum to the known higher area total. An illustration is provided by data for London using hospital admissions and CHD deaths at ward level, together with CHD prevalence totals for considerably larger local health authority areas. The shared random effect involved a spatially correlated common factor, that accounts for clustering in latent risk factors, and also provides a summary measure of small area CHD morbidity.
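    The Poisson-multinomial equivalence mentioned above can be sketched directly: independent Poisson counts conditioned on their observed sum are multinomial with probabilities λ_i/Σλ, so small-area expected counts are shares of the known higher-area total. A minimal illustration with hypothetical linear predictors:

```python
import math

def allocate_prevalence(total_cases, linear_predictors):
    """Poisson-multinomial equivalence: independent Poisson counts
    conditioned on their sum are multinomial with probabilities
    lambda_i / sum(lambda), so expected small-area counts are shares
    of the known higher-area total."""
    lam = [math.exp(lp) for lp in linear_predictors]
    s = sum(lam)
    return [total_cases * l / s for l in lam]

# Hypothetical: 3 wards with log relative risks from covariates and
# a shared spatial random effect
shares = allocate_prevalence(1000, [0.0, 0.5, -0.5])
print([round(x, 1) for x in shares])   # sums exactly to 1000
```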

  6. Estimating Prevalence of Coronary Heart Disease for Small Areas Using Collateral Indicators of Morbidity

    PubMed Central

    Congdon, Peter

    2010-01-01

    Different indicators of morbidity for chronic disease may not necessarily be available at a disaggregated spatial scale (e.g., for small areas with populations under 10 thousand). Instead certain indicators may only be available at a more highly aggregated spatial scale; for example, deaths may be recorded for small areas, but disease prevalence only at a considerably higher spatial scale. Nevertheless prevalence estimates at small area level are important for assessing health need. An instance is provided by England where deaths and hospital admissions for coronary heart disease are available for small areas known as wards, but prevalence is only available for relatively large health authority areas. To estimate CHD prevalence at small area level in such a situation, a shared random effect method is proposed that pools information regarding spatial morbidity contrasts over different indicators (deaths, hospitalizations, prevalence). The shared random effect approach also incorporates differences between small areas in known risk factors (e.g., income, ethnic structure). A Poisson-multinomial equivalence may be used to ensure small area prevalence estimates sum to the known higher area total. An illustration is provided by data for London using hospital admissions and CHD deaths at ward level, together with CHD prevalence totals for considerably larger local health authority areas. The shared random effect involved a spatially correlated common factor, that accounts for clustering in latent risk factors, and also provides a summary measure of small area CHD morbidity. PMID:20195439

  7. A cross-sectional analysis of HIV and hepatitis C clinical trials 2007 to 2010: the relationship between industry sponsorship and randomized study design.

    PubMed

    Goswami, Neela D; Tsalik, Ephraim L; Naggie, Susanna; Miller, William C; Horton, John R; Pfeiffer, Christopher D; Hicks, Charles B

    2014-01-22

    The proportion of clinical research sponsored by industry will likely continue to expand as federal funds for academic research decrease, particularly in the fields of HIV/AIDS and hepatitis C (HCV). While HIV and HCV continue to burden the US population, insufficient data exist on how industry sponsorship affects clinical trials involving these infectious diseases. Debate exists about whether pharmaceutical companies undertake more market-driven research practices to promote therapeutics, or instead conduct more rigorous trials than their non-industry counterparts because of increased resources and scrutiny. The ClinicalTrials.gov registry, which allows investigators to fulfill a federal mandate for public trial registration, provides an opportunity for critical evaluation of study designs for industry-sponsored trials, independent of publication status. As part of a large public policy effort, the Clinical Trials Transformation Initiative (CTTI) recently transformed the ClinicalTrials.gov registry into a searchable dataset to facilitate research on clinical trials themselves. We conducted a cross-sectional analysis of 477 HIV and HCV drug treatment trials, registered with ClinicalTrials.gov from 1 October 2007 to 27 September 2010, to study the relationship of study sponsorship with randomized study design. The likelihood of using randomization given industry (versus non-industry) sponsorship was reported with prevalence ratios (PR). PRs were estimated using crude and stratified tabular analysis and Poisson regression adjusting for presence of a data monitoring committee, enrollment size, study phase, number of study sites, inclusion of foreign study sites, exclusion of persons older than age 65, and disease condition. The crude PR was 1.17 (95% CI 0.94, 1.45). Adjusted Poisson models produced a PR of 1.13 (95% CI 0.82, 1.56). There was a trend toward mild effect measure modification by study phase, but this was not statistically significant.
In stratified tabular analysis the adjusted PR was 1.14 (95% CI 0.78, 1.68) among phase 2/3 trials and 1.06 (95% CI 0.50, 2.22) among phase 4 trials. No significant relationship was found between industry sponsorship and use of randomization in trial design in this cross-sectional study. Prospective studies evaluating other aspects of trial design may shed further light on the relationship between industry sponsorship and appropriate trial methodology.
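    A crude prevalence ratio like the one reported above can be reproduced mechanically from a 2×2 table. A minimal sketch with hypothetical counts (not the paper's data), using the standard Wald interval on the log scale:

```python
import math
from statistics import NormalDist

def prevalence_ratio(a, n1, b, n0, level=0.95):
    """Crude prevalence ratio with a Wald CI on the log scale.
    a of n1 industry-sponsored trials randomized; b of n0
    non-industry trials randomized."""
    pr = (a / n1) / (b / n0)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n0)
    z = NormalDist().inv_cdf(1 - (1 - level) / 2)
    lo = math.exp(math.log(pr) - z * se)
    hi = math.exp(math.log(pr) + z * se)
    return pr, lo, hi

# Hypothetical 2x2 counts, not the paper's data:
pr, lo, hi = prevalence_ratio(180, 240, 150, 237)
print(round(pr, 2), round(lo, 2), round(hi, 2))
```

    The paper's adjusted PRs came from Poisson regression with covariates, which this crude calculation does not attempt.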

  8. A cross-sectional analysis of HIV and hepatitis C clinical trials 2007 to 2010: the relationship between industry sponsorship and randomized study design

    PubMed Central

    2014-01-01

    Background The proportion of clinical research sponsored by industry will likely continue to expand as federal funds for academic research decrease, particularly in the fields of HIV/AIDS and hepatitis C (HCV). While HIV and HCV continue to burden the US population, insufficient data exist on how industry sponsorship affects clinical trials involving these infectious diseases. Debate exists about whether pharmaceutical companies undertake more market-driven research practices to promote therapeutics, or instead conduct more rigorous trials than their non-industry counterparts because of increased resources and scrutiny. The ClinicalTrials.gov registry, which allows investigators to fulfill a federal mandate for public trial registration, provides an opportunity for critical evaluation of study designs for industry-sponsored trials, independent of publication status. As part of a large public policy effort, the Clinical Trials Transformation Initiative (CTTI) recently transformed the ClinicalTrials.gov registry into a searchable dataset to facilitate research on clinical trials themselves. Methods We conducted a cross-sectional analysis of 477 HIV and HCV drug treatment trials, registered with ClinicalTrials.gov from 1 October 2007 to 27 September 2010, to study the relationship of study sponsorship with randomized study design. The likelihood of using randomization given industry (versus non-industry) sponsorship was reported with prevalence ratios (PR). PRs were estimated using crude and stratified tabular analysis and Poisson regression adjusting for presence of a data monitoring committee, enrollment size, study phase, number of study sites, inclusion of foreign study sites, exclusion of persons older than age 65, and disease condition. Results The crude PR was 1.17 (95% CI 0.94, 1.45). Adjusted Poisson models produced a PR of 1.13 (95% CI 0.82, 1.56). 
There was a trend toward mild effect measure modification by study phase, but this was not statistically significant. In stratified tabular analysis the adjusted PR was 1.14 (95% CI 0.78, 1.68) among phase 2/3 trials and 1.06 (95% CI 0.50, 2.22) among phase 4 trials. Conclusions No significant relationship was found between industry sponsorship and use of randomization in trial design in this cross-sectional study. Prospective studies evaluating other aspects of trial design may shed further light on the relationship between industry sponsorship and appropriate trial methodology. PMID:24450313

  9. Sample size calculations for comparative clinical trials with over-dispersed Poisson process data.

    PubMed

    Matsui, Shigeyuki

    2005-05-15

    This paper develops a new formula for sample size calculations for comparative clinical trials with Poisson or over-dispersed Poisson process data. The criterion for sample size calculation is developed on the basis of asymptotic approximations for a two-sample non-parametric test to compare the empirical event rate function between treatment groups. This formula can accommodate time heterogeneity, inter-patient heterogeneity in event rate, and also time-varying treatment effects. An application of the formula to a trial for chronic granulomatous disease is provided. Copyright 2004 John Wiley & Sons, Ltd.
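    For orientation, the classical sample-size formula for comparing two simple Poisson rates (the special case with no overdispersion, time heterogeneity, or time-varying effects; the paper's formula generalizes this) can be sketched as:

```python
import math
from statistics import NormalDist

def n_per_group(lam1, lam0, t, alpha=0.05, power=0.80):
    """Sample size per arm for comparing two simple Poisson event
    rates observed over follow-up time t (no overdispersion; an
    over-dispersed process would require inflating this number)."""
    nd = NormalDist()
    z_a = nd.inv_cdf(1 - alpha / 2)
    z_b = nd.inv_cdf(power)
    return math.ceil((z_a + z_b) ** 2 * (lam1 + lam0)
                     / (t * (lam1 - lam0) ** 2))

# e.g. 1.0 vs 0.7 events per patient-year, one year of follow-up
print(n_per_group(1.0, 0.7, 1.0))   # 149 per group
```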

  10. Partial-Interval Estimation of Count: Uncorrected and Poisson-Corrected Error Levels

    ERIC Educational Resources Information Center

    Yoder, Paul J.; Ledford, Jennifer R.; Harbison, Amy L.; Tapp, Jon T.

    2018-01-01

    A simulation study that used 3,000 computer-generated event streams with known behavior rates, interval durations, and session durations was conducted to test whether the main and interaction effects of true rate and interval duration affect the error level of uncorrected and Poisson-transformed (i.e., "corrected") count as estimated by…

  11. Markov modulated Poisson process models incorporating covariates for rainfall intensity.

    PubMed

    Thayakaran, R; Ramesh, N I

    2013-01-01

    Time series of rainfall bucket tip times at the Beaufort Park station, Bracknell, in the UK are modelled by a class of Markov modulated Poisson processes (MMPP) which may be thought of as a generalization of the Poisson process. Our main focus in this paper is to investigate the effects of incorporating covariate information into the MMPP model framework on statistical properties. In particular, we look at three types of time-varying covariates, namely temperature, sea level pressure, and relative humidity, which are thought to affect the rainfall arrival process. Maximum likelihood estimation is used to obtain the parameter estimates, and likelihood ratio tests are employed in model comparison. Simulated data from the fitted model are used to make statistical inferences about the accumulated rainfall in the discrete time interval. Variability of the daily Poisson arrival rates is studied.
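    A two-state MMPP can be simulated by competing exponential clocks: in each hidden state, the next occurrence is either a Poisson event or a state switch. A minimal sketch with hypothetical "dry" and "showery" tip rates (not the fitted Bracknell parameters):

```python
import random

def simulate_mmpp(q01, q10, lam, t_end, rng):
    """Two-state Markov modulated Poisson process: the hidden state
    switches 0<->1 at rates q01, q10 and events occur at rate
    lam[state].  Competing exponential clocks; returns event times."""
    t, state, events = 0.0, 0, []
    while True:
        total = (q01 if state == 0 else q10) + lam[state]
        t += rng.expovariate(total)
        if t >= t_end:
            return events
        if rng.random() < lam[state] / total:
            events.append(t)      # Poisson event in the current state
        else:
            state = 1 - state     # hidden-state transition

rng = random.Random(42)
# Hypothetical regimes: 'dry' at 0.2 tips/hour, 'showery' at 5 tips/hour
events = simulate_mmpp(q01=0.1, q10=0.3, lam=(0.2, 5.0), t_end=1000.0, rng=rng)
print(len(events))   # stationary mean rate = 0.75*0.2 + 0.25*5 = 1.4/hour
```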

  12. Nonlocal Poisson-Fermi double-layer models: Effects of nonuniform ion sizes on double-layer structure

    NASA Astrophysics Data System (ADS)

    Xie, Dexuan; Jiang, Yi

    2018-05-01

    This paper reports a nonuniform ionic size nonlocal Poisson-Fermi double-layer model (nuNPF) and a uniform ionic size nonlocal Poisson-Fermi double-layer model (uNPF) for an electrolyte mixture of multiple ionic species, variable voltages on electrodes, and variable induced charges on boundary segments. The finite element solvers of nuNPF and uNPF are developed and applied to typical double-layer tests defined on a rectangular box, a hollow sphere, and a hollow rectangle with a charged post. Numerical results show that nuNPF can significantly improve the quality of the ionic concentrations and electric fields generated from uNPF, implying that the effect of nonuniform ion sizes is a key consideration in modeling the double-layer structure.

  13. Discrimination of shot-noise-driven Poisson processes by external dead time - Application of radioluminescence from glass

    NASA Technical Reports Server (NTRS)

    Saleh, B. E. A.; Tavolacci, J. T.; Teich, M. C.

    1981-01-01

    Ways in which dead time can be used to constructively enhance or diminish the effects of point processes that display bunching in the shot-noise-driven doubly stochastic Poisson point process (SNDP) are discussed. Interrelations between photocount bunching arising in the SNDP and the antibunching character arising from dead-time effects are investigated. It is demonstrated that the dead-time-modified count mean and variance for an arbitrary doubly stochastic Poisson point process can be obtained from the Laplace transform of the single-fold and joint-moment-generating functions for the driving rate process. The theory is in good agreement with experimental values for radioluminescence radiation in fused silica, quartz, and glass, and the process has many applications in pulse, particle, and photon detection.
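    The basic dead-time modification of count statistics can be illustrated for an ordinary (not doubly stochastic) Poisson process, where the registered rate of a non-paralyzable counter approaches λ/(1 + λτ). A minimal sketch with hypothetical rate and dead time:

```python
import random

def count_with_dead_time(lam, tau, t_end, rng):
    """Count Poisson(lam) arrivals with a non-paralyzable dead time:
    after each registered event the counter is blind for tau."""
    t, last, n = 0.0, -float("inf"), 0
    while True:
        t += rng.expovariate(lam)
        if t >= t_end:
            return n
        if t - last >= tau:
            n += 1
            last = t

rng = random.Random(7)
lam, tau, t_end = 100.0, 0.005, 200.0
n = count_with_dead_time(lam, tau, t_end, rng)
# Registered rate should approach lam / (1 + lam*tau) = 66.7 here
print(round(n / t_end, 1))
```

    The paper's setting is richer: for the doubly stochastic (SNDP) case the driving rate itself is random, and the dead-time-modified moments follow from the Laplace transform of its moment-generating functions.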

  14. A Generalized QMRA Beta-Poisson Dose-Response Model.

    PubMed

    Xie, Gang; Roiko, Anne; Stratton, Helen; Lemckert, Charles; Dunn, Peter K; Mengersen, Kerrie

    2016-10-01

    Quantitative microbial risk assessment (QMRA) is widely accepted for characterizing the microbial risks associated with food, water, and wastewater. Single-hit dose-response models are the most commonly used dose-response models in QMRA. Denoting PI(d) as the probability of infection at a given mean dose d, a three-parameter generalized QMRA beta-Poisson dose-response model, PI(d|α,β,r*), is proposed in which the minimum number of organisms required for causing infection, K_min, is not fixed, but a random variable following a geometric distribution with parameter 0 < r* ≤ 1.
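    The two-parameter approximate beta-Poisson model that the generalized model extends has the familiar closed form PI(d) = 1 − (1 + d/β)^(−α). A minimal sketch with illustrative (not fitted) parameters:

```python
import math

def beta_poisson(dose, alpha, beta):
    """Approximate beta-Poisson dose-response:
    PI(d) = 1 - (1 + d/beta)**(-alpha), the two-parameter special
    case that the generalized model extends."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

# Illustrative (not fitted) parameters:
alpha, beta = 0.25, 40.0
for d in (1.0, 10.0, 100.0):
    print(d, round(beta_poisson(d, alpha, beta), 3))
```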

  15. The Effects of a Skill-Based Intervention for Victims of Bullying in Brazil.

    PubMed

    da Silva, Jorge Luiz; de Oliveira, Wanderlei Abadio; Braga, Iara Falleiros; Farias, Marilurdes Silva; da Silva Lizzi, Elisangela Aparecida; Gonçalves, Marlene Fagundes Carvalho; Pereira, Beatriz Oliveira; Silva, Marta Angélica Iossi

    2016-10-26

    This study's objective was to verify whether improved social and emotional skills would reduce victimization among Brazilian 6th grade student victims of bullying. The targets of this intervention were victimized students; a total of 78 victims participated. A cognitive-behavioral intervention based on social and emotional skills was held in eight weekly sessions. The sessions focused on civility, the ability to make friends, self-control, emotional expressiveness, empathy, assertiveness, and interpersonal problem-solving capacity. Data were analyzed through Poisson regression models with random effects. Pre- and post-analyses reveal that both the intervention and comparison groups presented significantly reduced victimization by bullying. No significant improvement was found in regard to difficulties in practicing social skills. Victimization reduction cannot be attributed to the program. This study contributes to the incipient literature addressing anti-bullying interventions conducted in developing countries and highlights the need for approaches that do not exclusively focus on the students' individual aspects.

  16. Normal and compound poisson approximations for pattern occurrences in NGS reads.

    PubMed

    Zhai, Zhiyuan; Reinert, Gesine; Song, Kai; Waterman, Michael S; Luan, Yihui; Sun, Fengzhu

    2012-06-01

    Next generation sequencing (NGS) technologies are now widely used in many biological studies. In NGS, sequence reads are randomly sampled from the genome sequence of interest. Most computational approaches for NGS data first map the reads to the genome and then analyze the data based on the mapped reads. Since many organisms have unknown genome sequences and many reads cannot be uniquely mapped to the genomes even if the genome sequences are known, alternative analytical methods are needed for the study of NGS data. Here we suggest using word patterns to analyze NGS data. Word pattern counting (the study of the probabilistic distribution of the number of occurrences of word patterns in one or multiple long sequences) has played an important role in molecular sequence analysis. However, no studies are available on the distribution of the number of occurrences of word patterns in NGS reads. In this article, we build probabilistic models for the background sequence and the sampling process of the sequence reads from the genome. Based on the models, we provide normal and compound Poisson approximations for the number of occurrences of word patterns from the sequence reads, with bounds on the approximation error. The main challenge is to consider the randomness in generating the long background sequence, as well as in the sampling of the reads using NGS. We show the accuracy of these approximations under a variety of conditions for different patterns with various characteristics. Under realistic assumptions, the compound Poisson approximation seems to outperform the normal approximation in most situations. These approximate distributions can be used to evaluate the statistical significance of the occurrence of patterns from NGS data. The theory and the computational algorithm for calculating the approximate distributions are then used to analyze ChIP-Seq data using transcription factor GABP. 
Software is available online (www-rcf.usc.edu/~fsun/Programs/NGS_motif_power/NGS_motif_power.html). In addition, Supplementary Material can be found online (www.liebertonline.com/cmb).
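    The plain Poisson approximation underlying the compound-Poisson refinement can be checked by simulation for a non-self-overlapping word under an iid background model; the read length, word, and sample size below are hypothetical:

```python
import math
import random

def count_occurrences(seq, word):
    """Count (possibly overlapping) occurrences of word in seq."""
    k = len(word)
    return sum(1 for i in range(len(seq) - k + 1) if seq[i:i + k] == word)

rng = random.Random(0)
word, read_len, n_reads = "ACGT", 60, 4000   # hypothetical settings
p = 0.25 ** len(word)                         # iid uniform background
lam = (read_len - len(word) + 1) * p          # expected occurrences/read
reads = ("".join(rng.choice("ACGT") for _ in range(read_len))
         for _ in range(n_reads))
counts = [count_occurrences(r, word) for r in reads]
p_zero_sim = counts.count(0) / n_reads
# For a non-self-overlapping word, Poisson gives P(no occurrence) ~ e^-lam
print(round(p_zero_sim, 3), round(math.exp(-lam), 3))
```

    Self-overlapping words occur in clumps, which is when the compound Poisson approximation studied in the paper outperforms the plain Poisson one.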

  17. DISCRETE COMPOUND POISSON PROCESSES AND TABLES OF THE GEOMETRIC POISSON DISTRIBUTION.

    DTIC Science & Technology

    A concise summary of the salient properties of discrete compound Poisson processes, with emphasis on comparing the geometric and logarithmic Poisson processes. Tables of the geometric Poisson distribution are given for 176 sets of parameter values. New discrete compound Poisson processes are also introduced; these processes have properties that are particularly relevant when the summation of several different Poisson processes is to be analyzed.
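    The geometric Poisson (Pólya-Aeppli) distribution tabulated here is a compound Poisson with geometric cluster sizes, so its pmf can be computed by Panjer's recursion rather than looked up. A minimal sketch:

```python
import math

def geometric_poisson_pmf(lam, theta, n_max):
    """Geometric Poisson (Polya-Aeppli) pmf via Panjer's recursion:
    a Poisson(lam) number of clusters, each of geometric size with
    P(X = j) = (1 - theta) * theta**(j - 1), j >= 1."""
    f = [0.0] + [(1 - theta) * theta ** (j - 1) for j in range(1, n_max + 1)]
    g = [math.exp(-lam)]
    for n in range(1, n_max + 1):
        g.append(lam / n * sum(j * f[j] * g[n - j] for j in range(1, n + 1)))
    return g

pmf = geometric_poisson_pmf(lam=2.0, theta=0.3, n_max=60)
mean = sum(n * p for n, p in enumerate(pmf))
print(round(sum(pmf), 6))   # ~1.0 once the truncated tail is negligible
print(round(mean, 3))       # lam / (1 - theta) = 2 / 0.7 ~ 2.857
```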

  18. Randomized central limit theorems: A unified theory.

    PubMed

    Eliazar, Iddo; Klafter, Joseph

    2010-08-01

    The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles' aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles' extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic-scaling all ensemble components by a common deterministic scale. However, there are "random environment" settings in which the underlying scaling schemes are stochastic-scaling the ensemble components by different random scales. Examples of such settings include Holtsmark's law for gravitational fields and the Stretched Exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs)-in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes-and present "randomized counterparts" to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.

  19. Randomized central limit theorems: A unified theory

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo; Klafter, Joseph

    2010-08-01

    The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles’ aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles’ extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic—scaling all ensemble components by a common deterministic scale. However, there are “random environment” settings in which the underlying scaling schemes are stochastic—scaling the ensemble components by different random scales. Examples of such settings include Holtsmark’s law for gravitational fields and the Stretched Exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs)—in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes—and present “randomized counterparts” to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.

  20. Effect of angle-ply orientation on compression strength of composite laminates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeTeresa, S J; Hoppel, C P

    1999-03-01

    An experimental program was initiated to investigate the effect of angle-ply orientations on the compressive strength (X_1C) of 0° plies in fiber-reinforced composite laminates. Graphite fiber-reinforced epoxy test coupons with the generic architecture [0_2/±θ] (where θ varied between 0° and 90°) and with the quasi-isotropic architecture were evaluated. The effective compressive strength of the 0° plies varied considerably. The results were related to the Poisson's ratios of the laminates, with high Poisson's ratios leading to high transverse tensile strains in the test coupons and lower than expected strengths. Specimens with the [0_2/±30] architecture had both the highest Poisson's ratio and the lowest calculated ply-level compression strength for the 0° plies. This work has implications for the selection of composite failure criteria for compression performance, the design of test coupons for acceptance testing, and the selection of laminate architectures for optimum combinations of compressive and shear behavior. Two commonly used composite failure criteria, the maximum stress and the Tsai-Wu, predict significantly different laminate strengths depending on the Poisson's ratio of the laminate. This implies that the biaxial stress state in the laminate needs to be carefully considered before backing out unidirectional properties.

  1. Supercomputer optimizations for stochastic optimal control applications

    NASA Technical Reports Server (NTRS)

    Chung, Siu-Leung; Hanson, Floyd B.; Xu, Huihuang

    1991-01-01

    Supercomputer optimizations for a computational method of solving stochastic, multibody, dynamic programming problems are presented. The computational method is valid for a general class of optimal control problems that are nonlinear, multibody dynamical systems, perturbed by general Markov noise in continuous time, i.e., nonsmooth Gaussian as well as jump Poisson random white noise. Optimization techniques for vector multiprocessors or vectorizing supercomputers include advanced data structures, loop restructuring, loop collapsing, blocking, and compiler directives. These advanced computing techniques and supercomputing hardware help alleviate Bellman's curse of dimensionality in dynamic programming computations, by permitting the solution of large multibody problems. Possible applications include lumped flight dynamics models for uncertain environments, such as large scale and background random aerospace fluctuations.

  2. Distribution of shortest cycle lengths in random networks

    NASA Astrophysics Data System (ADS)

    Bonneau, Haggai; Hassid, Aviv; Biham, Ofer; Kühn, Reimer; Katzav, Eytan

    2017-12-01

    We present analytical results for the distribution of shortest cycle lengths (DSCL) in random networks. The approach is based on the relation between the DSCL and the distribution of shortest path lengths (DSPL). We apply this approach to configuration model networks, for which analytical results for the DSPL were obtained before. We first calculate the fraction of nodes in the network which reside on at least one cycle. Conditioning on being on a cycle, we provide the DSCL over ensembles of configuration model networks with degree distributions which follow a Poisson distribution (Erdős-Rényi network), degenerate distribution (random regular graph), and a power-law distribution (scale-free network). The mean and variance of the DSCL are calculated. The analytical results are found to be in very good agreement with the results of computer simulations.

  3. Method for resonant measurement

    DOEpatents

    Rhodes, G.W.; Migliori, A.; Dixon, R.D.

    1996-03-05

    A method of measurement of objects to determine object flaws, Poisson's ratio (σ) and shear modulus (μ) is shown and described. First, the frequency for expected degenerate responses is determined for one or more input frequencies, and then splitting of degenerate resonant modes is observed to identify the presence of flaws in the object. Poisson's ratio and the shear modulus can be determined by identifying resonances dependent only on the shear modulus, and then using that shear modulus to find Poisson's ratio from other modes dependent on both the shear modulus and Poisson's ratio. 1 fig.

  4. Maximum Likelihood Time-of-Arrival Estimation of Optical Pulses via Photon-Counting Photodetectors

    NASA Technical Reports Server (NTRS)

    Erkmen, Baris I.; Moision, Bruce E.

    2010-01-01

    Many optical imaging, ranging, and communications systems rely on the estimation of the arrival time of an optical pulse. Recently, such systems have been increasingly employing photon-counting photodetector technology, which changes the statistics of the observed photocurrent. This requires time-of-arrival estimators to be developed and their performances characterized. The statistics of the output of an ideal photodetector, which are well modeled as a Poisson point process, were considered. An analytical model was developed for the mean-square error of the maximum likelihood (ML) estimator, demonstrating two phenomena that cause deviations from the minimum achievable error at low signal power. An approximation was derived to the threshold at which the ML estimator essentially fails to provide better than a random guess of the pulse arrival time. Comparing the analytic model performance predictions to those obtained via simulations, it was verified that the model accurately predicts the ML performance over all regimes considered. There is little prior art that attempts to understand the fundamental limitations to time-of-arrival estimation from Poisson statistics. This work establishes both a simple mathematical description of the error behavior, and the associated physical processes that yield this behavior. Previous work on mean-square error characterization for ML estimators has predominantly focused on additive Gaussian noise. This work demonstrates that the discrete nature of the Poisson noise process leads to a distinctly different error behavior.
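    The ML estimator described above can be sketched by grid search: with a known pulse shape well inside the observation window, the integrated intensity is approximately independent of the shift, so the estimator maximizes the summed log-rate at the photon times. All pulse parameters below are hypothetical:

```python
import math
import random

def ml_toa(arrivals, pulse_rate, taus):
    """Grid-search ML estimate of the pulse arrival time tau.
    For a pulse well inside the window the integrated intensity is
    roughly tau-independent, so ML maximizes the summed log-rate."""
    return max(taus, key=lambda tau: sum(math.log(pulse_rate(t - tau))
                                         for t in arrivals))

def rate(t, amp=200.0, width=0.1, bg=1.0):
    """Hypothetical Gaussian pulse shape on a small constant background."""
    return bg + amp * math.exp(-0.5 * (t / width) ** 2)

# Generate photon arrivals on [0, 1] from rate(t - true_tau) by thinning
rng = random.Random(3)
true_tau, rmax = 0.42, 201.0
arrivals, t = [], 0.0
while True:
    t += rng.expovariate(rmax)
    if t >= 1.0:
        break
    if rng.random() < rate(t - true_tau) / rmax:
        arrivals.append(t)

taus = [i / 1000 for i in range(1000)]
est = ml_toa(arrivals, rate, taus)
print(round(est, 2))
```

    At low signal power (small amp relative to bg), background photons begin to dominate the likelihood and the estimate degrades toward a random guess, which is the threshold behavior the paper characterizes.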

  5. Shear-induced chaos

    NASA Astrophysics Data System (ADS)

    Lin, Kevin K.; Young, Lai-Sang

    2008-05-01

    Guided by a geometric understanding developed in earlier works of Wang and Young, we carry out numerical studies of shear-induced chaos in several parallel but different situations. The settings considered include periodic kicking of limit cycles, random kicks at Poisson times and continuous-time driving by white noise. The forcing of a quasi-periodic model describing two coupled oscillators is also investigated. In all cases, positive Lyapunov exponents are found in suitable parameter ranges when the forcing is suitably directed.

  6. Antibiotic resistance in hospitals: a ward-specific random effect model in a low antibiotic consumption environment.

    PubMed

    Aldrin, Magne; Raastad, Ragnhild; Tvete, Ingunn Fride; Berild, Dag; Frigessi, Arnoldo; Leegaard, Truls; Monnet, Dominique L; Walberg, Mette; Müller, Fredrik

    2013-04-15

    Association between previous antibiotic use and emergence of antibiotic resistance has been reported for several microorganisms. The relationship has been extensively studied, and although the causes of antibiotic resistance are multi-factorial, clear evidence of antibiotic use as a major risk factor exists. Most studies are carried out in countries with high consumption of antibiotics and corresponding high levels of antibiotic resistance, and currently little is known about whether, and at what level, the associations are detectable in a low antibiotic consumption environment. We conduct an ecological, retrospective study aimed at determining the impact of antibiotic consumption on antibiotic-resistant Pseudomonas aeruginosa in three hospitals in Norway, a country with low levels of antibiotic use. We construct a sophisticated statistical model to capture such weak signals. To reduce noise, we conduct our study at hospital ward level. We propose a random effect Poisson or binomial regression model, with a reparametrisation that allows us to reduce the number of parameters. Inference is likelihood based. Through scenario simulation, we study the potential effects of reduced or increased antibiotic use. Results clearly indicate that the effects of consumption on resistance are present under conditions with relatively low use of antibiotic agents. This strengthens the recommendation on prudent use of antibiotics, even when consumption is relatively low. Copyright © 2012 John Wiley & Sons, Ltd.
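    The overdispersion that a ward-level random effect absorbs can be illustrated with a Poisson-lognormal simulation: a lognormal random rate inflates the count variance above the mean, whereas a plain Poisson model forces them to be equal. Parameter values below are hypothetical:

```python
import math
import random
import statistics

def poisson_sample(lam, rng):
    """Knuth's multiplication algorithm for a Poisson(lam) variate."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

rng = random.Random(11)
mu, sigma = math.log(3.0), 0.5    # hypothetical ward-level effect scale
counts = [poisson_sample(math.exp(rng.gauss(mu, sigma)), rng)
          for _ in range(20000)]
m, v = statistics.mean(counts), statistics.variance(counts)
# Poisson alone would give v ~ m; the lognormal random effect inflates it:
# E[Y] = exp(mu + sigma^2/2),  Var[Y] = E[Y] + E[Y]^2*(exp(sigma^2) - 1)
print(round(m, 2), round(v, 2))
```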

  7. Effectiveness of the bucco-lingual technique within a school-based supervised toothbrushing program on preventing caries: a randomized controlled trial

    PubMed Central

    2011-01-01

    Background Supervised toothbrushing programs using fluoride dentifrice have reduced caries increment. However, there is no information about the effectiveness of the professional cross-brushing technique within a community intervention. The aim was to assess whether the bucco-lingual technique can increase the effectiveness of a school-based supervised toothbrushing program in preventing caries. Methods A randomized, double-blinded, controlled community intervention trial, analyzed at the individual level, was conducted in a Brazilian low-income fluoridated area. Six preschools were randomly assigned to the test and control groups, and 284 five-year-old children presenting at least one permanent molar with an emerged/sound occlusal surface participated. In the control group, oral health education and dental plaque dyeing, followed by toothbrushing with fluoride dentifrice directly supervised by a dental assistant, took place four times per year. On the remaining school days the children brushed their teeth under the indirect supervision of their teachers. In the test group, children also underwent professional cross-brushing of the surfaces of the first permanent molars, performed by a specially trained dental assistant five times per year. Enamel and dentin caries were recorded on buccal, occlusal and lingual surfaces of permanent molars during an 18-month follow-up. The exposure time of each surface was calculated, and the incidence density ratio was estimated using a Poisson regression model. Results A difference of 21.6 lesions per 1,000 children between the control and test groups was observed. Among boys, whose caries risk was higher than that of girls, incidence density was 50% lower in the test group (p = 0.016). Conclusion The modified program was effective among the boys. It is reasonable to project a relevant effect over a longer period, suggesting a substantial reduction of dental care needs in a broader population. Trial registration ISRCTN18548869. PMID:21426572
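    The incidence density ratio used above reduces, at its core, to a ratio of event rates per unit of exposure time at risk. A minimal sketch of that calculation (the counts and exposure times below are invented for illustration, not the trial's data):

```python
# Toy incidence-density calculation: events per unit of exposure time,
# compared between two groups. All numbers are hypothetical, not the
# trial's actual data.

def incidence_density(events, exposure_time):
    """Events per unit of person-time (or surface-time) at risk."""
    return events / exposure_time

# Hypothetical counts: 20 lesions over 1000 surface-years in the
# control group, 10 lesions over 1000 surface-years in the test group.
rate_control = incidence_density(20, 1000.0)
rate_test = incidence_density(10, 1000.0)

idr = rate_test / rate_control  # incidence density ratio
print(idr)  # 0.5: the test group's incidence is half the control's
```

    The full analysis adjusts this ratio within a Poisson regression using exposure time as an offset; the raw ratio above is the unadjusted analogue.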

  8. Spatio-energetic cross talk in photon counting detectors: Detector model and correlated Poisson data generator.

    PubMed

    Taguchi, Katsuyuki; Polster, Christoph; Lee, Okkyun; Stierstorfer, Karl; Kappler, Steffen

    2016-12-01

    An x-ray photon interacts with photon counting detectors (PCDs) and generates an electron charge cloud, or multiple clouds. The clouds (and thus the photon energy) may be split between two adjacent PCD pixels when the interaction occurs near pixel boundaries, producing a count at both of the pixels. This is called double-counting with charge sharing. (A photoelectric effect with K-shell fluorescence x-ray emission would result in double-counting as well.) As a result, PCD data are spatially and energetically correlated, although the output of each individual PCD pixel is Poisson distributed. Major problems include the lack of a detector noise model for the spatio-energetic cross talk and the lack of a computationally efficient simulation tool for generating correlated Poisson data. A Monte Carlo (MC) simulation can accurately simulate these phenomena and produce noisy data; however, it is not computationally efficient. In this study, the authors developed a new detector model and implemented it in an efficient software simulator that uses a Poisson random number generator to produce correlated noisy integer counts. The detector model takes the following effects into account: (1) detection efficiency; (2) incomplete charge collection and the ballistic effect; (3) interaction with PCDs via the photoelectric effect (with or without K-shell fluorescence x-ray emission, which may escape from the PCDs or be reabsorbed); and (4) electronic noise. The correlation was modeled using two simplifying assumptions: energy conservation and mutual exclusiveness, the latter meaning that no more than two pixels measure energy from one photon. The effect of the model parameters was studied and the results were compared with MC simulations. The agreement with respect to the spectrum was evaluated using the reduced χ² statistic, a weighted sum of squared errors χ²_red (≥ 1), where χ²_red = 1 indicates a perfect fit. The model produced spectra under flat-field irradiation that qualitatively agree with previous studies. The spectra generated with different model and geometry parameters allowed for understanding the effect of the parameters on the spectrum and on the correlation of the data. The agreement between the model and the MC data was very strong. The mean spectra at 90 keV and 140 kVp agreed exceptionally well: χ²_red values were 1.049 for the 90 keV data and 1.007 for the 140 kVp data. The degrees of cross talk (in terms of the relative increase from single-pixel irradiation to flat-field irradiation) were 22% at 90 keV and 19% at 140 kVp for the MC simulations, versus 21% and 17%, respectively, for the model. The covariance was in strong qualitative agreement, although it was overestimated. The noisy data generation was very efficient, taking less than a CPU minute as opposed to CPU hours for MC simulators. The authors have developed a novel, computationally efficient PCD model that takes into account double-counting and the resulting spatio-energetic correlation between PCD pixels. MC simulation validated its accuracy.
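    The double-counting mechanism at the heart of such a correlated Poisson generator can be illustrated with a toy model: each pixel records its own Poisson photons, and each photon is additionally registered in a neighbouring pixel with some probability. This is a simplified sketch of the correlation mechanism only, not the authors' detector model; the rate, sharing probability, and 1D pixel layout are assumptions made purely for illustration:

```python
import math
import random

def poisson(lam, rng):
    """Knuth's Poisson sampler (adequate for moderate rates)."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

def correlated_counts(lam, p_share, n_pixels, rng):
    """Counts per pixel in a 1D strip of pixels.

    Each pixel records its own Poisson(lam) photons; each photon is
    additionally double-counted in the right-hand neighbour with
    probability p_share (a crude stand-in for charge sharing near
    pixel boundaries). Marginals stay Poisson-like, but adjacent
    pixels become positively correlated.
    """
    own = [poisson(lam, rng) for _ in range(n_pixels)]
    counts = own[:]
    for i in range(n_pixels - 1):
        shared = sum(1 for _ in range(own[i]) if rng.random() < p_share)
        counts[i + 1] += shared
    return counts

rng = random.Random(12345)
trials = [correlated_counts(lam=10.0, p_share=0.2, n_pixels=2, rng=rng)
          for _ in range(4000)]
a = [t[0] for t in trials]
b = [t[1] for t in trials]
mean_a = sum(a) / len(a)
mean_b = sum(b) / len(b)
cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b)) / len(a)
print(round(cov, 2))  # positive, near p_share * lam = 2
```

    In the paper's model, sharing additionally splits the photon energy between the two pixels under the energy-conservation and mutual-exclusiveness assumptions; the sketch keeps only the count correlation.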

  9. [Prevalence of leisure-time physical activity and associated factors: a population-based study in São Paulo, Brazil, 2008-2009].

    PubMed

    Sousa, Clóvis Arlindo de; César, Chester Luiz Galvão; Barros, Marilisa Berti de Azevedo; Carandina, Luana; Goldbaum, Moisés; Marchioni, Dirce Maria Lobo; Fisberg, Regina Mara

    2013-02-01

    The purpose of this study was to ascertain the prevalence of self-reported leisure-time physical activity and related factors in the city of São Paulo, Brazil, 2008-2009. A population-based cross-sectional study interviewed 2,691 individuals of both sexes, aged 12 years or older. A two-stage cluster (census tract, household) random sample provided data via home interviews in 2008 and 2009. Leisure-time physical activity was measured with the long version of the IPAQ. Complex sample-adjusted descriptive statistics provided prevalence estimates, chi-square tests screened associations, and prevalence ratios (PR) expressed effects. Multiple Poisson regression was used to ascertain adjusted effects, and design effects were calculated. Of the interviewees, 16.4% (95%CI: 14.3-18.7) reported leisure-time physical activity. The findings indicate the importance of encouraging leisure-time physical activity, which was associated with male sex, higher income, younger age (12 to 29 years), not smoking, and not reporting frequent fatigue.

  10. Properties of the Bivariate Delayed Poisson Process

    DTIC Science & Technology

    1974-07-01

    and Lewis (1972) in their Berkeley Symposium paper and here their analysis of the bivariate Poisson processes (without Poisson noise) is carried... Poisson processes. They cannot, however, be independent Poisson processes because their events are associated in pairs by the displacement centres... process because its marginal processes for events of each type are themselves (univariate) Poisson processes. Cox and Lewis (1972) assumed a

  11. Time to burn: Modeling wildland arson as an autoregressive crime function

    Treesearch

    Jeffrey P. Prestemon; David T. Butry

    2005-01-01

    Six Poisson autoregressive models of order p [PAR(p)] of daily wildland arson ignition counts are estimated for five locations in Florida (1994-2001). In addition, a fixed effects time-series Poisson model of annual arson counts is estimated for all Florida counties (1995-2001). PAR(p) model estimates reveal highly significant arson ignition autocorrelation, lasting up...

  12. Impact of the Fano Factor on Position and Energy Estimation in Scintillation Detectors.

    PubMed

    Bora, Vaibhav; Barrett, Harrison H; Jha, Abhinav K; Clarkson, Eric

    2015-02-01

    The Fano factor of an integer-valued random variable is defined as the ratio of its variance to its mean. Light from various scintillation crystals has been reported to have Fano factors ranging from sub-Poisson (Fano factor < 1) to super-Poisson (Fano factor > 1). For a given mean, a smaller Fano factor implies a smaller variance and thus less noise. We investigated whether lower noise in the scintillation light results in better spatial and energy resolutions. The impact of the Fano factor on the estimation of the position of interaction and the energy deposited in simple gamma-camera geometries is assessed by two methods: calculating the Cramér-Rao bound and estimating the variance of a maximum likelihood estimator. The two methods are consistent with each other and indicate that, when estimating the position of interaction and the energy deposited by a gamma-ray photon, the Fano factor of a scintillator does not affect the spatial resolution. A smaller Fano factor does, however, result in a better energy resolution.
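    Estimating a Fano factor from samples is straightforward; a minimal sketch using generic stand-in distributions (not scintillator data) to place a Poisson case (Fano factor 1) next to a sub-Poisson case:

```python
import random
import statistics

def fano(samples):
    """Fano factor: variance-to-mean ratio of integer-valued samples."""
    return statistics.pvariance(samples) / statistics.mean(samples)

rng = random.Random(7)

# Poisson-distributed counts have a Fano factor of 1 by definition.
# (Simple sampler: count unit-rate exponential arrivals before time lam.)
def poisson(lam):
    t, k = 0.0, 0
    while True:
        t += rng.expovariate(1.0)
        if t > lam:
            return k
        k += 1

poisson_counts = [poisson(20.0) for _ in range(20000)]

# Binomial counts are sub-Poisson: variance n*p*(1-p) < mean n*p,
# so the Fano factor is 1 - p (here 0.5).
binom_counts = [sum(rng.random() < 0.5 for _ in range(100))
                for _ in range(20000)]

print(round(fano(poisson_counts), 2))  # close to 1.0
print(round(fano(binom_counts), 2))    # close to 0.5
```

    The binomial count serves as the sub-Poisson example precisely because its variance n·p·(1−p) falls below its mean n·p.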

  13. Cascades of Particles Moving at Finite Velocity in Hyperbolic Spaces

    NASA Astrophysics Data System (ADS)

    Cammarota, V.; Orsingher, E.

    2008-12-01

    A branching process of particles moving at finite velocity over the geodesic lines of the hyperbolic space (Poincaré half-plane and Poincaré disk) is examined. Each particle can split into two particles only once, at Poisson-spaced times, and deviates orthogonally when it splits. At time t, after N(t) Poisson events, there are N(t)+1 particles moving along different geodesic lines. We obtain the exact expression for the mean hyperbolic distance of the center of mass of the cloud of particles. We derive this mean hyperbolic distance in two different and independent ways and study the behavior of the relevant expression as t increases, for different values of the parameters c (hyperbolic velocity of motion) and λ (rate of reproduction). The mean hyperbolic distance of each moving particle is also examined, and a useful representation, as the distance of a randomly stopped particle moving along the main geodesic line, is presented.
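    The counting skeleton of such a process is simple: splitting events arrive at exponentially spaced times with rate λ, and after N(t) events there are N(t)+1 particles. A sketch of just that bookkeeping (the rate and time horizon are arbitrary choices; the hyperbolic geometry is omitted entirely):

```python
import random

def n_events(rate, t_end, rng):
    """Number of Poisson(rate) events in [0, t_end], generated from
    exponential inter-arrival times."""
    n, t = 0, rng.expovariate(rate)
    while t <= t_end:
        n += 1
        t += rng.expovariate(rate)
    return n

rng = random.Random(1)
lam, t = 2.0, 3.0
runs = [n_events(lam, t, rng) for _ in range(5000)]
mean_events = sum(runs) / len(runs)
print(round(mean_events, 1))  # close to lam * t = 6

particles = runs[0] + 1  # N(t) + 1 particles alive at time t
```

    Each run gives one realisation of N(t); the particle count at time t is always exactly one more than the number of splitting events so far.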

  14. Relative age effect in elite soccer: More early-born players, but no better valued, and no paragon clubs or countries

    PubMed Central

    Doyle, John R.

    2018-01-01

    The paper analyses two datasets of elite soccer players (the top 1000 professionals and the UEFA Under-19 Youth League). In both, we find a Relative Age Effect (RAE) for frequency, but not for value. That is, while there are more players born at the start of the competition year, their transfer values are no higher, nor are they given more game time. We use Poisson regression to derive a transparent index of the discrimination present in RAE. Also, because Poisson regression is valid for small frequency counts, it supports analysis at the disaggregated levels of country and club. From this, we conclude that there are no paragon clubs or countries immune to RAE, that is, clubs and countries do not differ systematically in the RAE they experience, and that Poisson regression is a powerful and flexible method of analysing RAE data. PMID:29420576

  15. Effect of noise on defect chaos in a reaction-diffusion model.

    PubMed

    Wang, Hongli; Ouyang, Qi

    2005-06-01

    The influence of noise on defect chaos due to the breakup of spiral waves through Doppler and Eckhaus instabilities is investigated numerically with a modified Fitzhugh-Nagumo model. By numerical simulations we show that noise can drastically enhance the creation and annihilation rates of topological defects. The noise-free probability distribution function for defects in this model is found not to fit the previously reported squared-Poisson distribution. Under the influence of noise, the distributions are flattened and can be fitted by the squared-Poisson or the modified-Poisson distribution. The defect lifetime and the diffusive properties of defects under the influence of noise are also examined in this model.

  16. Modeling species-abundance relationships in multi-species collections

    USGS Publications Warehouse

    Peng, S.; Yin, Z.; Ren, H.; Guo, Q.

    2003-01-01

    Species-abundance relationship is one of the most fundamental aspects of community ecology. Since Motomura first developed the geometric series model to describe the feature of community structure, ecologists have developed many other models to fit the species-abundance data in communities. These models can be classified into empirical and theoretical ones, including (1) statistical models, i.e., negative binomial distribution (and its extension), log-series distribution (and its extension), geometric distribution, lognormal distribution, Poisson-lognormal distribution, (2) niche models, i.e., geometric series, broken stick, overlapping niche, particulate niche, random assortment, dominance pre-emption, dominance decay, random fraction, weighted random fraction, composite niche, Zipf or Zipf-Mandelbrot model, and (3) dynamic models describing community dynamics and restrictive function of environment on community. These models have different characteristics and fit species-abundance data in various communities or collections. Among them, log-series distribution, lognormal distribution, geometric series, and broken stick model have been most widely used.

  17. Poisson-Nernst-Planck Equations for Simulating Biomolecular Diffusion-Reaction Processes II: Size Effects on Ionic Distributions and Diffusion-Reaction Rates

    PubMed Central

    Lu, Benzhuo; Zhou, Y.C.

    2011-01-01

    The effects of finite particle size on electrostatics, density profiles, and diffusion have been a long-standing topic in the study of ionic solutions. The previous size-modified Poisson-Boltzmann and Poisson-Nernst-Planck models are revisited in this article. In contrast to many previous works that can only treat particle species with a single uniform size or two sizes, we generalize the Borukhov model to obtain a size-modified Poisson-Nernst-Planck (SMPNP) model that is able to treat nonuniform particle sizes. The numerical tractability of the model is demonstrated as well. The main contributions of this study are as follows. 1) We show that an (arbitrarily) size-modified PB model is indeed implied by the SMPNP equations under certain boundary/interface conditions and can be reproduced through numerical solutions of the SMPNP. 2) The size effects in the SMPNP effectively reduce the densities of highly concentrated counterions around the biomolecule. 3) The SMPNP is applied to the diffusion-reaction process for the first time, to our knowledge. In the case of low substrate density near the enzyme reactive site, it is observed that the rate coefficients predicted by the SMPNP model are considerably larger than those predicted by the PNP model, suggesting that both ions and substrates are subject to finite size effects. 4) An accurate finite element method and a convergent Gummel iteration are developed for the numerical solution of the completely coupled nonlinear system of SMPNP equations. PMID:21575582

  18. Continuous Decision Support

    DTIC Science & Technology

    2015-12-24

    but the fundamental approach remains unchanged. We consider the case of a sports memorabilia shop whose owner is an avid personal collector of baseball...collector’s competition 15 days from now. Between now and then, as customers bring in antique baseball cards, he must decide which ones to purchase for his...purchased from the shop each day is a random variable that is Poisson distributed with λ_out = 2. • 20% of cards are 5.25 in², 10% are 9.97 in², and 70% are

  19. Diversity of Poissonian populations.

    PubMed

    Eliazar, Iddo I; Sokolov, Igor M

    2010-01-01

    Populations represented by collections of points scattered randomly on the real line are ubiquitous in science and engineering. The statistical modeling of such populations leads naturally to Poissonian populations: Poisson processes on the real line with a distinguished maximal point. Poissonian populations are infinite objects underlying key issues in statistical physics, probability theory, and random fractals. Due to their infiniteness, measuring the diversity of Poissonian populations depends on the lower-bound cut-off applied. This research characterizes the classes of Poissonian populations whose diversities are invariant with respect to the cut-off level applied and establishes an elemental connection between these classes and extreme-value theory. The measures of diversity considered are variance and dispersion, Simpson's index and inverse participation ratio, Shannon's entropy and Rényi's entropy, and Gini's index.
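    For a finite probability vector, the diversity measures named here are one-liners; a sketch for toy finite populations (the infinite Poissonian case with a lower-bound cut-off, which the paper actually treats, needs more care):

```python
import math

def shannon(p):
    """Shannon entropy (natural log) of a probability vector."""
    return -sum(x * math.log(x) for x in p if x > 0)

def simpson(p):
    """Simpson's index: probability that two random draws coincide."""
    return sum(x * x for x in p)

def ipr(p):
    """Inverse participation ratio: effective number of species."""
    return 1.0 / simpson(p)

def gini(p):
    """Gini index: 0 for a uniform vector, 1 - 1/n for a point mass."""
    n = len(p)
    mean = sum(p) / n
    mad = sum(abs(a - b) for a in p for b in p) / (n * n)
    return mad / (2 * mean)

uniform = [0.25] * 4          # maximally diverse over 4 species
point = [1.0, 0.0, 0.0, 0.0]  # a single dominant species

print(round(shannon(uniform), 3))  # log(4), about 1.386
print(simpson(uniform))            # 0.25
print(ipr(uniform))                # 4.0
print(gini(point))                 # 0.75, i.e. 1 - 1/4
```

    The two extreme vectors bracket the behaviour: diversity indices are maximal (and Gini zero) for the uniform vector, and collapse for the point mass.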

  20. The asymptotic behavior in a reversible random coagulation-fragmentation polymerization process with sub-exponential decay

    NASA Astrophysics Data System (ADS)

    Dong, Siqun; Zhao, Dianli

    2018-01-01

    This paper studies the subcritical, near-critical and supercritical asymptotic behavior of a reversible random coagulation-fragmentation polymerization process as N → ∞, with the number of distinct ways to form a k-cluster from k units satisfying f(k) = (1 + o(1)) c r^(−k) e^(−k^α) k^(−β), where 0 < α < 1 and β > 0. When the cluster size is small, its distribution is proved to converge to a Gaussian distribution. For medium clusters, the distribution converges to a Poisson distribution in the supercritical stage, and no large clusters exist in this stage. Furthermore, the largest polymer length in a system of size N is of order ln N in the subcritical stage when α ≤ 1/2.

  1. Hierarchical Bayesian spatial models for alcohol availability, drug "hot spots" and violent crime.

    PubMed

    Zhu, Li; Gorman, Dennis M; Horel, Scott

    2006-12-07

    Ecologic studies have shown a relationship between alcohol outlet densities, illicit drug use and violence. The present study examined this relationship in the City of Houston, Texas, using a sample of 439 census tracts. Neighborhood sociostructural covariates, alcohol outlet density, drug crime density and violent crime data were collected for the year 2000 and analyzed using hierarchical Bayesian models. Model selection was accomplished by applying the Deviance Information Criterion. The counts of violent crime in each census tract were modelled as having a conditional Poisson distribution. Four neighbourhood explanatory variables were identified using principal component analysis. The best-fitting model was the one including both unstructured and spatially dependent random effects. The results showed that drug-law violation explained a greater amount of variance in violent crime rates than alcohol outlet densities. The relative risk for drug-law violation was 2.49 and that for alcohol outlet density was 1.16. Of the neighbourhood sociostructural covariates, males aged 15 to 24 showed an effect on violence, with a 16% decrease in relative risk for each one-standard-deviation increase. Both the unstructured heterogeneity random effect and spatial dependence needed to be included in the model. The analysis presented suggests that activity around illicit drug markets is more strongly associated with violent crime than is alcohol outlet density. Unique among the ecological studies in this field, the present study not only shows the direction and magnitude of the impact of neighbourhood sociostructural covariates, as well as of alcohol and illicit drug activities, in a neighbourhood; it also reveals the importance of applying hierarchical Bayesian models in this research field, as both spatial dependence and heterogeneity random effects need to be considered simultaneously.

  2. Modeling motor vehicle crashes using Poisson-gamma models: examining the effects of low sample mean values and small sample size on the estimation of the fixed dispersion parameter.

    PubMed

    Lord, Dominique

    2006-07-01

    There has been considerable research conducted on the development of statistical models for predicting crashes on highway facilities. Despite numerous advancements made for improving the estimation tools of statistical models, the most common probabilistic structures used for modeling motor vehicle crashes remain the traditional Poisson and Poisson-gamma (or negative binomial) distributions; when crash data exhibit over-dispersion, the Poisson-gamma model is usually the model of choice among transportation safety modelers. Crash data collected for safety studies often have the unusual attribute of being characterized by low sample mean values. Studies have shown that the goodness-of-fit of statistical models produced from such datasets can be significantly affected. This issue has been defined as the "low mean problem" (LMP). Despite recent developments on methods to circumvent the LMP and test the goodness-of-fit of models developed using such datasets, no work has so far examined how the LMP affects the fixed dispersion parameter of Poisson-gamma models used for modeling motor vehicle crashes. The dispersion parameter plays an important role in many types of safety studies and should, therefore, be reliably estimated. The primary objective of this research project was to verify whether the LMP affects the estimation of the dispersion parameter and, if so, to determine the magnitude of the problem. The secondary objective consisted of determining the effects of an unreliably estimated dispersion parameter on common analyses performed in highway safety studies. To accomplish the objectives of the study, a series of Poisson-gamma distributions were simulated using different values describing the mean, the dispersion parameter, and the sample size. Three estimators commonly used by transportation safety modelers for estimating the dispersion parameter of Poisson-gamma models were evaluated: the method of moments, weighted regression, and the maximum likelihood method. To complement the outcome of the simulation study, Poisson-gamma models were fitted to crash data collected in Toronto, Ont., characterized by a low sample mean and small sample size. The study shows that a low sample mean combined with a small sample size can seriously affect the estimation of the dispersion parameter, no matter which estimator is used within the estimation process. The probability that the dispersion parameter is unreliably estimated increases significantly as the sample mean and sample size decrease. Consequently, the results show that an unreliably estimated dispersion parameter can significantly undermine empirical Bayes (EB) estimates as well as the estimation of confidence intervals for the gamma mean and predicted response. The paper ends with recommendations for minimizing the likelihood of producing Poisson-gamma models with an unreliable dispersion parameter for modeling motor vehicle crashes.
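    The instability the paper documents is easy to reproduce in a few lines: simulate Poisson-gamma (negative binomial) counts with mean μ and variance μ + αμ², then estimate the dispersion parameter by the method of moments, α̂ = (s² − x̄)/x̄². This is a toy reproduction of the phenomenon, not the paper's simulation design; the values of μ, α, and the sample sizes are invented:

```python
import math
import random
import statistics

def poisson(lam, rng):
    """Count unit-rate exponential arrivals before time lam."""
    t, k = 0.0, 0
    while True:
        t += rng.expovariate(1.0)
        if t > lam:
            return k
        k += 1

def nb_sample(mu, alpha, n, rng):
    """Poisson-gamma mixture: lam ~ Gamma(shape=1/alpha, scale=mu*alpha),
    y | lam ~ Poisson(lam); mean mu, variance mu + alpha*mu**2."""
    return [poisson(rng.gammavariate(1.0 / alpha, mu * alpha), rng)
            for _ in range(n)]

def mom_alpha(y):
    """Method-of-moments dispersion estimate: (s^2 - xbar) / xbar^2."""
    xbar = statistics.mean(y)
    if xbar == 0:
        return math.nan  # degenerate all-zero sample
    return (statistics.variance(y) - xbar) / xbar ** 2

rng = random.Random(42)

# Large sample, moderate mean: the estimate sits near the truth (0.5).
big = mom_alpha(nb_sample(mu=2.0, alpha=0.5, n=5000, rng=rng))
print(round(big, 2))

# Low mean, small samples: estimates are wildly unstable and can even
# go negative, an inadmissible value for a dispersion parameter.
small = [mom_alpha(nb_sample(mu=1.0, alpha=0.5, n=20, rng=rng))
         for _ in range(200)]
print(any(a < 0 for a in small))  # True: some estimates are negative
```

    Negative moment estimates under low means and small samples are exactly the kind of unreliability that then propagates into EB estimates and confidence intervals.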

  3. Analysis of bedside entertainment services' effect on post cardiac surgery physical activity: a prospective, randomised clinical trial.

    PubMed

    Papaspyros, Sotiris; Uppal, Shitansu; Khan, Shakeeb A; Paul, Sanjoy; O'Regan, David J

    2008-11-01

    A rising number of acute hospitals in the UK have been providing patients with bedside entertainment services (BES) since 1995. However, their effect on postoperative patient mobility has not been explored. The aim of this prospective randomised clinical trial was to compare the level of postoperative physical activity and the length of in-hospital stay of patients undergoing cardiac surgery depending on whether they had access to BES or not. One hundred patients requiring elective cardiac surgery were randomised to receive access to BES (52 patients) or not (48 patients). Pedometers were used to quantify postoperative physical activity for 5 days. To assess the significance of the effect of the intervention (TV off or on) on the pedometer counts over time, a mixed effect Poisson regression model was used, with the time-varying aspect as the random component. The potential influence of gender and age on pedometer counts was assessed by incorporating these two factors as covariates in the Poisson model. On average, patients with no access to BES walked more than those with BES access; this difference ranged between 192 and 609 steps in favour of the first group on each individual postoperative day. Patients with no access to BES were 84% more likely (risk ratio: 1.84, 95% CI: 1.29-2.63) to walk a higher number of steps than patients with access to BES. On average, participants with access to BES stayed longer in hospital (median of 7 days, interquartile range 6-7 days) than participants with no access to BES (median of 6 days, interquartile range 5-7 days); however, the difference did not reach statistical significance. We have demonstrated that bedside entertainment systems may have an adverse effect on post cardiac surgery patient ambulation and may contribute to an increase in hospital stay.

  4. The effect of dissipative inhomogeneous medium on the statistics of the wave intensity

    NASA Technical Reports Server (NTRS)

    Saatchi, Sasan S.

    1993-01-01

    One of the main theoretical points in the theory of wave propagation in random media is the derivation of closed form equations describing the statistics of the propagating waves. In particular, in one-dimensional problems, a closed form representation of the multiple scattering effects is important since it contributes to the understanding of problems such as wave localization, backscattering enhancement, and intensity fluctuations. In this paper the propagation of plane waves in a layer of one-dimensional dissipative random medium is considered. The medium is modeled by a complex permittivity whose imaginary part is a constant representing the absorption. The one-dimensional problem is mathematically equivalent to the analysis of a transmission line with randomly perturbed distributed parameters and of a single-mode lossy waveguide, and the results can be used to study the propagation of radio waves through the atmosphere and the remote sensing of geophysical media. It is assumed that the scattering medium consists of an ensemble of one-dimensional point scatterers randomly positioned in a layer of thickness L with diffuse boundaries. A Poisson impulse process with density lambda is used to model the positions of the scatterers in the medium. By employing the Markov properties of this process, an exact closed form equation of Kolmogorov-Feller type was obtained for the probability density of the reflection coefficient. This equation was solved by combining two limiting cases: (1) when the density of scatterers is small; and (2) when the medium is weakly dissipative. A two-variable perturbation method for small lambda was used to obtain solutions valid for thick layers. These solutions are then asymptotically evaluated for small dissipation. To show the effect of dissipation, the mean and fluctuations of the reflected power are obtained. The results were compared with a lossy homogeneous medium and with a lossless inhomogeneous medium, and the regions where the effect of absorption is not essential were discussed.

  5. Fractional Poisson Fields and Martingales

    NASA Astrophysics Data System (ADS)

    Aletti, Giacomo; Leonenko, Nikolai; Merzbach, Ely

    2018-02-01

    We present new properties for the Fractional Poisson process (FPP) and the Fractional Poisson field on the plane. A martingale characterization for FPPs is given. We extend this result to Fractional Poisson fields, obtaining some other characterizations. The fractional differential equations are studied. We consider a more general Mixed-Fractional Poisson process and show that this process is the stochastic solution of a system of fractional differential-difference equations. Finally, we give some simulations of the Fractional Poisson field on the plane.

  6. On a Poisson homogeneous space of bilinear forms with a Poisson-Lie action

    NASA Astrophysics Data System (ADS)

    Chekhov, L. O.; Mazzocco, M.

    2017-12-01

    Let 𝒜 be the space of bilinear forms on ℂ^N with defining matrices A, endowed with a quadratic Poisson structure of reflection equation type. The paper begins with a short description of previous studies of this structure, and then the structure is extended to systems of bilinear forms whose dynamics is governed by the natural action A ↦ B A Bᵀ of the GL_N Poisson-Lie group on 𝒜. A classification is given of all possible quadratic brackets on (B, A) ∈ GL_N × 𝒜 preserving the Poisson property of the action, thus endowing 𝒜 with the structure of a Poisson homogeneous space. Besides the product Poisson structure on GL_N × 𝒜, there are two other (mutually dual) structures, which (unlike the product Poisson structure) admit reductions by the Dirac procedure to a space of bilinear forms with block upper triangular defining matrices. Further generalisations of this construction are considered, to triples (B, C, A) ∈ GL_N × GL_N × 𝒜 with the Poisson action A ↦ B A Cᵀ, and it is shown that 𝒜 then acquires the structure of a Poisson symmetric space. Generalisations to chains of transformations and to the quantum and quantum affine algebras are investigated, as well as the relations between constructions of Poisson symmetric spaces and the Poisson groupoid. Bibliography: 30 titles.

  7. Conditional analysis of mixed Poisson processes with baseline counts: implications for trial design and analysis.

    PubMed

    Cook, Richard J; Wei, Wei

    2003-07-01

    The design of clinical trials is typically based on marginal comparisons of a primary response under two or more treatments. The considerable gains in efficiency afforded by models conditional on one or more baseline responses has been extensively studied for Gaussian models. The purpose of this article is to present methods for the design and analysis of clinical trials in which the response is a count or a point process, and a corresponding baseline count is available prior to randomization. The methods are based on a conditional negative binomial model for the response given the baseline count and can be used to examine the effect of introducing selection criteria on power and sample size requirements. We show that designs based on this approach are more efficient than those proposed by McMahon et al. (1994).

  8. Cracked rocks with positive and negative Poisson's ratio: real-crack properties extracted from pressure dependence of elastic-wave velocities

    NASA Astrophysics Data System (ADS)

    Zaitsev, Vladimir Y.; Radostin, Andrey V.; Dyskin, Arcady V.; Pasternak, Elena

    2017-04-01

    We report results of an analysis of literature data on P- and S-wave velocities of rocks subjected to variable hydrostatic pressure. Out of about 90 examined samples, in more than 40% of the samples the reconstructed Poisson's ratios are negative at the lowest confining pressure, with a gradual transition to the conventional positive values at higher pressure. The portion of rocks exhibiting negative Poisson's ratio appeared to be unexpectedly high. To understand the mechanism of negative Poisson's ratio, pressure dependences of P- and S-wave velocities were analyzed using an effective medium model in which the reduction in the elastic moduli due to cracks is described in terms of compliances with respect to shear and normal loading that are imparted to the rock by the presence of cracks. This is in contrast to widely used descriptions of effective cracked media based on a specific crack model (e.g., the penny-shaped crack) in which the ratio between the normal and shear compliances of a crack is strictly predetermined. The analysis of pressure dependences of the elastic wave velocities makes it possible to reveal the ratio between pure normal and shear compliances (called the q-ratio below) for real defects and to quantify their integral content in the rock. The examination performed demonstrates that a significant portion (over 50%) of cracks exhibit a q-ratio several times higher than that assumed for conventional penny-shaped cracks. This leads to a faster reduction of the Poisson's ratio with increasing crack concentration. Samples with negative Poisson's ratio are characterized by an elevated q-ratio and, simultaneously, elevated crack concentration. Our results clearly indicate that the traditional crack model is not adequate for a significant portion of rocks and that the interaction between the opposite crack faces, leading to domination of the normal compliance and reduced shear displacement discontinuity, can play an important role in the mechanical behavior of rocks.

  9. Distributional assumptions in food and feed commodities - development of fit-for-purpose sampling protocols.

    PubMed

    Paoletti, Claudia; Esbensen, Kim H

    2015-01-01

    Material heterogeneity influences the effectiveness of sampling procedures. Most sampling guidelines used for the assessment of food and/or feed commodities are based on classical statistical distribution requirements (the normal, binomial, and Poisson distributions) and almost universally rely on the assumption of randomness. However, this is unrealistic. The scientific food and feed community recognizes a strong preponderance of non-random distribution within commodity lots, which should be a more realistic prerequisite for the definition of effective sampling protocols. Nevertheless, these heterogeneity issues are overlooked when the prime focus is placed only on financial, time, equipment, and personnel constraints instead of mandating the acquisition of documented representative samples under realistic heterogeneity conditions. This study shows how the principles promulgated in the Theory of Sampling (TOS), practically tested over 60 years, provide an effective framework for dealing with the complete set of adverse aspects of both compositional and distributional heterogeneity (material sampling errors), as well as with the errors incurred by the sampling process itself. The results of an empirical European Union study on genetically modified soybean heterogeneity, the Kernel Lot Distribution Assessment, are summarized, as they have a strong bearing on proper sampling protocol development. TOS principles apply universally in the food and feed realm and must therefore be considered the only basis for the development of valid sampling protocols free from distributional constraints.

  10. Nearest-Neighbor Distances and Aggregative Effects in Turbulence

    NASA Astrophysics Data System (ADS)

    Lanerolle, Lyon W. J.; Rothschild, B. J.; Yeung, P. K.

    2000-11-01

    The dispersive nature of turbulence, which causes fluid elements to move apart on average, is well known. Here we study another facet of turbulent mixing relevant to marine population dynamics: how small organisms (approximated by fluid particles) are brought close to each other and allowed to interact. The crucial role played by the small scales in this process allows us to use direct numerical simulations of stationary isotropic turbulence, here with Taylor-scale Reynolds numbers (R_λ) from 38 to 91. We study the evolution of the Nearest-Neighbor Distances (NND) for collections of fluid particles initially located randomly in space according to Poisson-type distributions, with mean values from 0.5 to 2.0 Kolmogorov length scales. Our results show that as particles begin to disperse on average, some also begin to aggregate in space. In particular, we find that (i) a significant proportion of particles are closer to each other than if their NNDs were randomly distributed, (ii) aggregative effects become stronger with R_λ, and (iii) although the mean value of NND grows monotonically with time in Kolmogorov variables, the growth rates are slower at higher R_λ. These results may assist in explaining the ``patchiness'' in plankton distributions observed in biological oceanography. Further details are given in B. J. Rothschild et al., The Biophysical Interpretation of Spatial Effects of Small-scale Turbulent Flow in the Ocean (paper in prep.).
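    The Poisson baseline against which such aggregation is judged can be checked numerically. The sketch below (pure Python, with illustrative parameters, not the paper's DNS data) compares the mean nearest-neighbor distance of uniformly random points in a periodic unit cube with Chandrasekhar's analytical result for a 3D Poisson process:

```python
import math
import random

def mean_nnd(n=500, seed=42):
    """Mean nearest-neighbor distance of n uniformly random points
    in a periodic unit cube (a homogeneous Poisson pattern)."""
    rng = random.Random(seed)
    pts = [(rng.random(), rng.random(), rng.random()) for _ in range(n)]
    total = 0.0
    for i, p in enumerate(pts):
        best = float("inf")
        for j, q in enumerate(pts):
            if i == j:
                continue
            # minimum-image convention for periodic boundaries
            d2 = sum(min(abs(a - b), 1 - abs(a - b)) ** 2 for a, b in zip(p, q))
            best = min(best, d2)
        total += math.sqrt(best)
    return total / n

# Chandrasekhar's result for a 3D Poisson process of intensity n:
# <r> = Gamma(4/3) * (4*pi*n/3)**(-1/3) ~= 0.554 * n**(-1/3)
theory = math.gamma(4 / 3) * (4 * math.pi * 500 / 3) ** (-1 / 3)
empirical = mean_nnd()
```

    Particles whose NNDs fall systematically below this baseline are "aggregated" in the sense used above.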

  11. Therapeutic Exercise Training to Reduce Chronic Headache in Working Women: Design of a Randomized Controlled Trial.

    PubMed

    Rinne, Marjo; Garam, Sanna; Häkkinen, Arja; Ylinen, Jari; Kukkonen-Harjula, Katriina; Nikander, Riku

    2016-05-01

    Cervicogenic headache and migraine are common causes of visits to physicians and physical therapists. Few randomized trials utilizing active physical therapy and progressive therapeutic exercise have been previously published. The existing evidence on active treatment methods supports a moderate effect on cervicogenic headache. The aim of this study is to investigate whether a progressive, group-based therapeutic exercise program decreases the intensity and frequency of chronic headache among women compared with a control group receiving a sham dose of transcutaneous electrical nerve stimulation (TENS) and stretching exercises. A randomized controlled trial with 6-month intervention and follow-up was developed. The participants were randomly assigned to either a treatment group or a control group. The study is being conducted at 2 study centers. The participants are women aged 18 to 60 years with chronic cervicogenic headache or migraine. The treatment group's exercise program consisted of 6 progressive therapeutic exercise modules, including proprioceptive low-load progressive craniocervical and cervical exercises and high-load exercises for the neck muscles. The participants in the control group received 6 individually performed sham TENS treatment sessions. The primary outcome is the intensity of headache. The secondary outcomes are changes in frequency and duration of headache, neck muscle strength, neck and shoulder flexibility, impact of headache on daily life, neck disability, fear-avoidance beliefs, work ability, and quality of life. Between-group differences will be analyzed separately at 6, 12, and 24 months with generalized linear mixed models. In the case of count data (eg, frequency of headache), Poisson or negative binomial regression will be used. The therapists are not blinded. The effects of specific therapeutic exercises on frequency, intensity, and duration of chronic headache and migraine will be reported. 
© 2016 American Physical Therapy Association.

  12. Analysis of fluctuations in semiconductor devices

    NASA Astrophysics Data System (ADS)

    Andrei, Petru

    The random nature of ion implantation and diffusion processes, as well as inevitable tolerances in fabrication, results in random fluctuations of doping concentrations and oxide thickness in semiconductor devices. These fluctuations are especially pronounced in ultrasmall (nanoscale) semiconductor devices, where the spatial scale of doping and oxide thickness variations becomes comparable with the geometric dimensions of the devices. In this dissertation, the effects of these fluctuations on device characteristics are analyzed using a new technique for the analysis of random doping- and oxide-thickness-induced fluctuations. This technique is universal in the sense that it is applicable to any transport model (drift-diffusion, semiclassical transport, quantum transport, etc.) and can be naturally extended to take into account random fluctuations of the oxide (trapped) charges and channel length. The technique is based on linearization of the transport equations with respect to the fluctuating quantities. It is computationally much more efficient (by a few orders of magnitude) than the traditional Monte Carlo approach, and it yields information on the sensitivity of fluctuations of parameters of interest (e.g. threshold voltage, small-signal parameters, cut-off frequencies, etc.) to the locations of doping and oxide thickness fluctuations. For this reason, it can be very instrumental in the design of fluctuation-resistant structures of semiconductor devices. Quantum mechanical effects are taken into account by using the density-gradient model as well as through self-consistent Poisson-Schrödinger computations. Special attention is paid to presenting the technique in a form that is suitable for implementation in commercial device simulators. The numerical implementation of the technique is discussed in detail, and numerous computational results are presented and compared with those previously published in the literature.

  13. On the Singularity of the Vlasov-Poisson System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, Jian; Qin, Hong

    2013-04-26

    The Vlasov-Poisson system can be viewed as the collisionless limit of the corresponding Fokker-Planck-Poisson system. It is reasonable to expect that the result of Landau damping can also be obtained from the Fokker-Planck-Poisson system when the collision frequency ν approaches zero. However, we show that the collisionless Vlasov-Poisson system is a singular limit of the collisional Fokker-Planck-Poisson system, and Landau's result can be recovered only as ν approaches zero from the positive side.

  14. On the singularity of the Vlasov-Poisson system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, Jian; Qin, Hong; Plasma Physics Laboratory, Princeton University, Princeton, New Jersey 08550

    2013-09-15

    The Vlasov-Poisson system can be viewed as the collisionless limit of the corresponding Fokker-Planck-Poisson system. It is reasonable to expect that the result of Landau damping can also be obtained from the Fokker-Planck-Poisson system when the collision frequency ν approaches zero. However, we show that the collisionless Vlasov-Poisson system is a singular limit of the collisional Fokker-Planck-Poisson system, and Landau's result can be recovered only as ν approaches zero from the positive side.

  15. Auxetic Mechanical Metamaterials to Enhance Sensitivity of Stretchable Strain Sensors.

    PubMed

    Jiang, Ying; Liu, Zhiyuan; Matsuhisa, Naoji; Qi, Dianpeng; Leow, Wan Ru; Yang, Hui; Yu, Jiancan; Chen, Geng; Liu, Yaqing; Wan, Changjin; Liu, Zhuangjian; Chen, Xiaodong

    2018-03-01

    Stretchable strain sensors play a pivotal role in wearable devices, soft robotics, and the Internet-of-Things, yet these applications, which require subtle strain detection under various strains, are often limited by low sensitivity. This inadequate sensitivity stems from the Poisson effect in conventional strain sensors, where stretched elastomer substrates expand in the longitudinal direction but compress transversely. In stretchable strain sensors, expansion separates the active materials and contributes to the sensitivity, while Poisson compression squeezes active materials together and thus intrinsically limits the sensitivity. Alternatively, auxetic mechanical metamaterials undergo 2D expansion in both directions, owing to their negative structural Poisson's ratio. Herein, it is demonstrated that such auxetic metamaterials can be incorporated into stretchable strain sensors to significantly enhance the sensitivity: compared to conventional sensors, the sensitivity is elevated with a 24-fold improvement. This sensitivity enhancement is due to the synergistic effect of the reduced structural Poisson's ratio and strain concentration; the elongation of microcracks is verified as an underlying mechanism by both experiments and numerical simulations. This strategy of employing auxetic metamaterials can be further applied to other stretchable strain sensors with different constituent materials. Moreover, it paves the way for utilizing mechanical metamaterials in a broader library of stretchable electronics. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Bayesian semi-parametric analysis of Poisson change-point regression models: application to policy making in Cali, Colombia.

    PubMed

    Park, Taeyoung; Krafty, Robert T; Sánchez, Alvaro I

    2012-07-27

    A Poisson regression model with an offset assumes a constant baseline rate after accounting for measured covariates, which may lead to biased estimates of coefficients in an inhomogeneous Poisson process. To correctly estimate the effect of time-dependent covariates, we propose a Poisson change-point regression model with an offset that allows a time-varying baseline rate. When the nonconstant pattern of the log baseline rate is modeled with a nonparametric step function, the resulting semi-parametric model involves a model component of varying dimension and thus requires sophisticated varying-dimensional inference to obtain correct estimates of the model parameters of fixed dimension. To fit the proposed varying-dimensional model, we devise a state-of-the-art MCMC-type algorithm based on partial collapse. The proposed model and methods are used to investigate the association between daily homicide rates in Cali, Colombia and policies that restrict the hours during which the legal sale of alcoholic beverages is permitted. While simultaneously identifying latent changes in the baseline homicide rate that correspond to the incidence of sociopolitical events, we explore the effect of policies governing the sale of alcohol on homicide rates and seek a policy that balances the economic and cultural dependence on alcohol sales against public health.
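    The paper's fully Bayesian partial-collapse sampler is beyond a short example, but the core idea of a Poisson change point can be sketched with a simple profile-likelihood search over candidate change points (the synthetic data and rates below are illustrative assumptions, not the Cali homicide data):

```python
import math
import random

def poisson_sample(lam, rng):
    # Knuth's multiplication algorithm (fine for small lambda)
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def loglik(counts, lam):
    # Poisson log-likelihood up to the constant -sum(log k!)
    return sum(k * math.log(lam) - lam for k in counts)

def single_changepoint(counts):
    """Maximum-likelihood location of a single change in a Poisson rate."""
    best_tau, best_ll = None, -float("inf")
    for tau in range(1, len(counts)):
        left, right = counts[:tau], counts[tau:]
        ll = (loglik(left, max(sum(left) / len(left), 1e-9))
              + loglik(right, max(sum(right) / len(right), 1e-9)))
        if ll > best_ll:
            best_tau, best_ll = tau, ll
    return best_tau

rng = random.Random(1)
# rate jumps from 2 to 6 at day 60
data = [poisson_sample(2.0, rng) for _ in range(60)] + \
       [poisson_sample(6.0, rng) for _ in range(60)]
tau_hat = single_changepoint(data)
```

    The Bayesian step-function model generalizes this to an unknown number of such change points, which is what makes the inference varying-dimensional.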

  17. A Stabilized Finite Element Method for Modified Poisson-Nernst-Planck Equations to Determine Ion Flow Through a Nanopore

    PubMed Central

    Chaudhry, Jehanzeb Hameed; Comer, Jeffrey; Aksimentiev, Aleksei; Olson, Luke N.

    2013-01-01

    The conventional Poisson-Nernst-Planck equations do not account for the finite size of ions explicitly. This leads to solutions featuring unrealistically high ionic concentrations in regions subject to external potentials, in particular near highly charged surfaces. A modified form of the Poisson-Nernst-Planck equations accounts for steric effects and results in solutions with finite ion concentrations. Here, we evaluate numerical methods for solving the modified Poisson-Nernst-Planck equations by modeling electric field-driven transport of ions through a nanopore. We describe a novel, robust finite element solver that combines the application of Newton's method to the nonlinear Galerkin form of the equations with stabilization terms that appropriately handle the drift-diffusion processes. To make direct comparison with particle-based simulations possible, our method is specifically designed to produce solutions under periodic boundary conditions and to conserve the number of ions in the solution domain. We test our finite element solver on a set of challenging numerical experiments that include calculations of the ion distribution in a volume confined between two charged plates, calculations of the ionic current through a nanopore subject to an external electric field, and modeling the effect of a DNA molecule on the ion concentration and nanopore current. PMID:24363784

  18. Technical and biological variance structure in mRNA-Seq data: life in the real world

    PubMed Central

    2012-01-01

    Background mRNA expression data from next-generation sequencing platforms are obtained in the form of counts per gene or exon. Counts have classically been assumed to follow a Poisson distribution, in which the variance is equal to the mean. The Negative Binomial distribution, which allows for over-dispersion, i.e., for the variance to be greater than the mean, is commonly used to model count data as well. Results In mRNA-Seq data from 25 subjects, we found technical variation to generally follow a Poisson distribution, as has been reported previously, while biological variability was over-dispersed relative to the Poisson model. The mean-variance relationship across all genes was quadratic, in keeping with a Negative Binomial (NB) distribution. Over-dispersed Poisson and NB distributional assumptions demonstrated marked improvements in goodness-of-fit (GOF) over the standard Poisson model assumptions, but with evidence of over-fitting in some genes. Modeling of experimental effects improved GOF for high-variance genes but increased the over-fitting problem. Conclusions These conclusions will guide the development of analytical strategies for accurate modeling of variance structure in these data and for sample size determination, which in turn will aid in the identification of true biological signals that inform our understanding of biological systems. PMID:22769017
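    The quadratic mean-variance relation of the NB model can be reproduced by viewing the Negative Binomial as a gamma-mixed Poisson, Var = μ + φμ². A minimal stdlib-only sketch (the parameters mu and phi are illustrative, not estimates from the mRNA-Seq data):

```python
import math
import random
import statistics

def poisson_sample(lam, rng):
    # Knuth's multiplication algorithm (fine for small lambda)
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def nb_sample(mu, phi, rng):
    """Negative binomial as a gamma-mixed Poisson: the rate is drawn
    from Gamma(shape=1/phi, scale=mu*phi), so Var = mu + phi*mu**2."""
    rate = rng.gammavariate(1.0 / phi, mu * phi)
    return poisson_sample(rate, rng)

rng = random.Random(7)
mu, phi, n = 5.0, 0.5, 20000
counts = [nb_sample(mu, phi, rng) for _ in range(n)]
m = statistics.fmean(counts)      # ~5
v = statistics.pvariance(counts)  # Poisson predicts ~m; NB predicts ~m + phi*m**2
```

    Over-dispersion shows up as v substantially exceeding m, exactly the biological-variability signature reported above.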

  19. Measuring mouse retina response near the detection threshold to direct stimulation of photons with sub-poisson statistics

    NASA Astrophysics Data System (ADS)

    Tavala, Amir; Dovzhik, Krishna; Schicker, Klaus; Koschak, Alexandra; Zeilinger, Anton

    Probing the visual system of humans and animals in the very low photon rate regime has recently attracted the quantum optics community. In an experiment on isolated photoreceptor cells of Xenopus, the cell output signal was measured while stimulating with pulses of sub-Poisson distributed photons. The results showed a single-photon detection efficiency of 29 +/- 4.7% [1]. Another behavioral experiment on humans suggests a lower detection capability at the perception level, with a success chance of 0.516 +/- 0.01 (i.e., slightly better than a random guess) [2]. Although the species are different, both biological models and experimental observations with classical light stimuli suggest that a fraction of single photon responses is filtered somewhere within the retina network and/or during the neural processes in the brain. In this ongoing experiment, we look for a quantitative answer to this question by measuring the output signals of the last neural layer of WT mouse retina using microelectrode arrays. We use a heralded downconversion single-photon source. We stimulate the retina directly, since the eye lens (responsible for 20-50% of optical loss and scattering [2]) is removed. Here, we demonstrate our first results, which confirm the response to the sub-Poisson distributed pulses. This project was supported by the Austrian Academy of Sciences, SFB FoQuS F 4007-N23 funded by FWF, and ERC QIT4QAD 227844 funded by the EU Commission.

  20. Electro-osmosis of non-Newtonian fluids in porous media using lattice Poisson-Boltzmann method.

    PubMed

    Chen, Simeng; He, Xinting; Bertola, Volfango; Wang, Moran

    2014-12-15

    Electro-osmosis in porous media has many important applications in areas such as oil and gas exploitation and biomedical detection. Very often, the fluids relevant to these applications are non-Newtonian because of their shear-rate-dependent viscosity. The purpose of this study was to investigate the behaviors and physical mechanism of electro-osmosis of non-Newtonian fluids in porous media. Model porous microstructures (granular, fibrous, and network) were created by a random generation-growth method. The nonlinear governing equations of electro-kinetic transport for a power-law fluid were solved by the lattice Poisson-Boltzmann method (LPBM). The model results indicate that: (i) the electro-osmosis of non-Newtonian fluids exhibits distinctly nonlinear behaviors compared to that of Newtonian fluids; (ii) when the bulk ion concentration or zeta potential is high enough, shear-thinning fluids exhibit higher electro-osmotic permeability, while shear-thickening fluids lead to higher electro-osmotic permeability at very low bulk ion concentration or zeta potential; (iii) the effect of the porous medium structure depends significantly on the constitutive parameters: for fluids with large constitutive coefficients strongly dependent on the power-law index, the network structure shows the highest electro-osmotic permeability, while the granular structure exhibits the lowest permeability over the entire range of power-law indices considered; when the dependence of the constitutive coefficient on the power-law index is weaker, different behaviors can be observed, especially in the case of strong shear thinning. Copyright © 2014 Elsevier Inc. All rights reserved.
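    The shear-rate-dependent viscosity underlying these results is the power-law (Ostwald-de Waele) constitutive model. A minimal sketch (the consistency coefficient and indices are illustrative, unrelated to the paper's LPBM solver):

```python
def powerlaw_viscosity(m, n, shear_rate):
    """Apparent viscosity of a power-law (Ostwald-de Waele) fluid:
    mu_app = m * shear_rate**(n - 1)."""
    return m * shear_rate ** (n - 1)

# shear-thinning (n < 1): apparent viscosity falls as the shear rate grows
thin_lo = powerlaw_viscosity(1.0, 0.5, 1.0)
thin_hi = powerlaw_viscosity(1.0, 0.5, 100.0)

# shear-thickening (n > 1): apparent viscosity rises with the shear rate
thick_lo = powerlaw_viscosity(1.0, 1.5, 1.0)
thick_hi = powerlaw_viscosity(1.0, 1.5, 100.0)
```

    The power-law index n is the quantity varied in result (iii) above.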

  1. Wave-height hazard analysis in Eastern Coast of Spain - Bayesian approach using generalized Pareto distribution

    NASA Astrophysics Data System (ADS)

    Egozcue, J. J.; Pawlowsky-Glahn, V.; Ortego, M. I.

    2005-03-01

    Standard practice in wave-height hazard analysis often pays little attention to the uncertainty of assessed return periods and occurrence probabilities. This fact favors the opinion that, when large events happen, the hazard assessment should change accordingly. However, the uncertainty of the hazard estimates is normally able to hide the effect of those large events. This is illustrated using data from the Mediterranean coast of Spain, where the last years have been extremely disastrous; it is thus possible to compare the hazard assessment based on data prior to those years with the analysis including them. With our approach, no significant change is detected when the statistical uncertainty is taken into account. The hazard analysis is carried out with a standard model: the time-occurrence of events is assumed Poisson distributed, and the wave-height of each event is modelled as a random variable whose upper tail follows a Generalized Pareto Distribution (GPD). Moreover, wave-heights are assumed independent from event to event and independent of their occurrence in time. A threshold for excesses is assessed empirically. The other three parameters (the Poisson rate and the shape and scale parameters of the GPD) are jointly estimated using Bayes' theorem. The prior distribution accounts for physical features of ocean waves in the Mediterranean sea and experience with these phenomena. The posterior distribution of the parameters yields posterior distributions of derived quantities such as occurrence probabilities and return periods; predictive distributions are also available. Computations are carried out using the program BGPE v2.0.
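    The GPD tail model and the associated return-period calculation can be sketched directly from the inverse CDF. The parameters below are hypothetical, not the Bayesian estimates for the Spanish coast:

```python
import math
import random

def gpd_sample(xi, sigma, rng):
    """Inverse-CDF sample of an excess from a Generalized Pareto Distribution."""
    u = rng.random()
    if abs(xi) < 1e-12:                       # xi -> 0 reduces to exponential
        return -sigma * math.log(1.0 - u)
    return sigma / xi * ((1.0 - u) ** (-xi) - 1.0)

def return_period(z, rate, xi, sigma):
    """Mean return period (years) of an excess exceeding z, when threshold
    exceedances occur as a Poisson process with `rate` events per year."""
    tail = (1.0 + xi * z / sigma) ** (-1.0 / xi)   # P(X > z), xi != 0
    return 1.0 / (rate * tail)

rng = random.Random(3)
xi, sigma, rate = 0.1, 1.0, 4.0          # hypothetical shape, scale, event rate
excesses = [gpd_sample(xi, sigma, rng) for _ in range(50000)]

z = 3.0
p_emp = sum(x > z for x in excesses) / len(excesses)
p_th = (1.0 + xi * z / sigma) ** (-1.0 / xi)
rp = return_period(z, rate, xi, sigma)
```

    In the Bayesian setting described above, xi, sigma, and rate carry posterior uncertainty, so rp becomes a posterior distribution rather than a single number.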

  2. Nambu-Poisson gauge theory

    NASA Astrophysics Data System (ADS)

    Jurčo, Branislav; Schupp, Peter; Vysoký, Jan

    2014-06-01

    We generalize noncommutative gauge theory using Nambu-Poisson structures to obtain a new type of gauge theory with higher brackets and gauge fields. The approach is based on covariant coordinates and higher versions of the Seiberg-Witten map. We construct a covariant Nambu-Poisson gauge theory action, give its first order expansion in the Nambu-Poisson tensor and relate it to a Nambu-Poisson matrix model.

  3. Photon statistics in scintillation crystals

    NASA Astrophysics Data System (ADS)

    Bora, Vaibhav Joga Singh

    Scintillation-based gamma-ray detectors are widely used in medical imaging, high-energy physics, astronomy and national security. Scintillation gamma-ray detectors are field-tested, relatively inexpensive, and have good detection efficiency. Semiconductor detectors are gaining popularity because of their superior capability to resolve gamma-ray energies. However, they are relatively hard to manufacture and therefore, at this time, not available in as large formats and much more expensive than scintillation gamma-ray detectors. Scintillation gamma-ray detectors consist of a scintillator, a material that emits optical (scintillation) photons when it interacts with ionizing radiation, and an optical detector that detects the emitted scintillation photons and converts them into an electrical signal. Compared to semiconductor gamma-ray detectors, scintillation gamma-ray detectors have relatively poor capability to resolve gamma-ray energies. This is in large part attributed to the "statistical limit" on the number of scintillation photons. The origin of this statistical limit is the assumption that scintillation photons are either Poisson distributed or super-Poisson distributed. This statistical limit is often defined by the Fano factor. The Fano factor of an integer-valued random process is defined as the ratio of its variance to its mean; therefore, a Poisson process has a Fano factor of one. The classical theory of light limits the Fano factor of the number of photons to a value greater than or equal to one (the Poisson case). However, the quantum theory of light allows for Fano factors less than one. We used two methods to look at the correlations between two detectors observing the same scintillation pulse to estimate the Fano factor of the scintillation photons. The relationship between the Fano factor and the correlation between the integrals of the two detected signals was analytically derived, and the Fano factor was estimated from measurements for SrI2:Eu, YAP:Ce and CsI:Na. We also found an empirical relationship between the Fano factor and the covariance as a function of time between two detectors observing the same scintillation pulse. This empirical model was used to estimate the Fano factor of LaBr3:Ce and YAP:Ce using the experimentally measured timing covariance. The estimates of the Fano factor from the timing-covariance results were consistent with the estimates from the correlation between the integral signals. We found the scintillation light from some scintillators to be sub-Poisson. For the same mean number of total scintillation photons, sub-Poisson light has lower noise. We then conducted a simulation study to investigate whether this low-noise sub-Poisson light can be used to improve spatial resolution. We calculated the Cramer-Rao bound for different detector geometries, positions of interaction and Fano factors. The Cramer-Rao calculations were verified by generating simulated data and estimating the variance of the maximum likelihood estimator. We found that the Fano factor has no impact on the spatial resolution in gamma-ray imaging systems.
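    The Fano factor defined above is easy to estimate from simulated counts. A stdlib-only sketch contrasting Poisson light (Fano ≈ 1) with a sub-Poisson surrogate modeled as binomial counts (Fano = 1 − p); the parameters are illustrative, not measurements of any scintillator:

```python
import math
import random
import statistics

def poisson_sample(lam, rng):
    # Knuth's multiplication algorithm
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def fano(samples):
    """Fano factor: variance-to-mean ratio of an integer-valued process."""
    return statistics.pvariance(samples) / statistics.fmean(samples)

rng = random.Random(0)
n_trials = 20000

# Poisson light: Fano factor -> 1
poisson_counts = [poisson_sample(20.0, rng) for _ in range(n_trials)]

# Sub-Poisson light, modeled here as binomial counts: Fano -> 1 - p < 1
p_det, n_emit = 0.4, 50
binom_counts = [sum(rng.random() < p_det for _ in range(n_emit))
                for _ in range(n_trials)]

f_poisson = fano(poisson_counts)   # ~1.0
f_sub = fano(binom_counts)         # ~0.6
```

    The binomial surrogate illustrates the sub-Poisson regime that the classical theory of light forbids but the quantum theory permits.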

  4. A multiscale filter for noise reduction of low-dose cone beam projections.

    PubMed

    Yao, Weiguang; Farr, Jonathan B

    2015-08-21

    The Poisson or compound Poisson process governs the randomness of photon fluence in cone beam computed tomography (CBCT) imaging systems. The probability density function depends on the mean (noiseless) fluence at a given detector. This dependence indicates the natural requirement of multiscale filters that smooth noise while preserving structures of the imaged object in the low-dose cone beam projection. In this work, we used a Gaussian filter, exp(-x²/(2σ_f²)), as the multiscale filter to de-noise the low-dose cone beam projections. We analytically obtained the expression for σ_f, which represents the scale of the filter, by minimizing the local noise-to-signal ratio, and we analytically derived the variance of the residual noise from the Poisson or compound Poisson processes after Gaussian filtering. From the derived analytical form of the variance of the residual noise, the optimal σ_f² is proved to be proportional to the noiseless fluence, modulated by the local structure strength expressed as the linear fitting error of the structure. A strategy was used to obtain a reliable linear fitting error: smoothing the projection along the longitudinal direction to calculate the linear fitting error along the lateral direction, and vice versa. The performance of our multiscale filter was examined on low-dose cone beam projections of a Catphan phantom and a head-and-neck patient. After applying the filter to the Catphan phantom projections scanned with a pulse time of 4 ms, the number of visible line pairs was similar to that scanned with 16 ms, and the contrast-to-noise ratio of the inserts was on average about 64% higher than that scanned with 16 ms. For the simulated head-and-neck patient projections with a pulse time of 4 ms, the visibility of soft tissue structures was comparable to that scanned with 20 ms. The image processing took less than 0.5 s per projection with 1024 × 768 pixels.
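    The variance-reduction mechanism of Gaussian filtering of Poisson noise can be illustrated with a fixed-scale 1D version (the paper's filter adapts σ_f to the local fluence and structure; the fixed σ and fluence below are illustrative assumptions):

```python
import math
import random

def gaussian_kernel(sigma, radius):
    """Normalized discrete Gaussian weights on [-radius, radius]."""
    w = [math.exp(-(i * i) / (2 * sigma * sigma))
         for i in range(-radius, radius + 1)]
    s = sum(w)
    return [x / s for x in w]

def convolve(signal, kernel):
    """Valid-mode 1D convolution (no padding at the edges)."""
    r = len(kernel) // 2
    return [sum(signal[i + j - r] * kernel[j] for j in range(len(kernel)))
            for i in range(r, len(signal) - r)]

def poisson_sample(lam, rng):
    # Knuth's multiplication algorithm
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

rng = random.Random(5)
mean_fluence = 100.0   # flat noiseless fluence, so variance = mean (Poisson)
noisy = [poisson_sample(mean_fluence, rng) for _ in range(2000)]

kernel = gaussian_kernel(sigma=2.0, radius=6)
smooth = convolve(noisy, kernel)

# For uncorrelated noise, filtering scales the variance by sum(w_i**2) < 1
reduction = sum(w * w for w in kernel)
var_in = sum((x - mean_fluence) ** 2 for x in noisy) / len(noisy)
var_out = sum((x - mean_fluence) ** 2 for x in smooth) / len(smooth)
```

    In the multiscale version, σ (and hence the reduction factor) varies point by point with the estimated noiseless fluence and the local linear fitting error.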

  5. The process of learning in neural net models with Poisson and Gauss connectivities.

    PubMed

    Sivridis, L; Kotini, A; Anninos, P

    2008-01-01

    In this study we examined the dynamic behavior of isolated and non-isolated neural networks with chemical markers that follow a Poisson or Gauss distribution of connectivity. The Poisson distribution shows higher activity than the Gauss distribution, although the latter has more connections that are obliterated due to randomness. We examined 57 hematoxylin and eosin stained sections from an equal number of autopsy specimens with a diagnosis of "cerebral matter within normal limits". Neural counting was carried out in 5 continuous optic fields with the use of a simple optical microscope connected to a computer (software program Nikon ACT-1 ver. 2). The number of neurons counted corresponded to a surface of 0.15 mm². There was a gradual reduction in the number of neurons as age increased: a mean value of 45.8 neurons/0.15 mm² was observed within the age range 21-25 years, 33 neurons/0.15 mm² within 41-45 years, and 19.3 neurons/0.15 mm² within 56-60 years. After the age of 60 the number of neurons per unit area stopped decreasing. A correlation was observed between these experimental findings and the theoretical neural model developed by Professor Anninos and his colleagues. An equivalence was established between the mean numbers of neurons of the above-mentioned age groups and the highest possible number of synaptic connections per neuron (the highest number of synaptic connections corresponding to the age group 21-25). We then applied both inhibitory and excitatory post-synaptic potentials to the Poisson and Gauss distributions, while the neuron threshold was varied between 3 and 5. According to the obtained phase diagrams, the hysteresis loops decrease as age increases. These findings are significant because the hysteresis loops can be regarded as the basis for short-term memory.

  6. Poisson-event-based analysis of cell proliferation.

    PubMed

    Summers, Huw D; Wills, John W; Brown, M Rowan; Rees, Paul

    2015-05-01

    A protocol for the assessment of cell proliferation dynamics is presented. It is based on the measurement of cell division events and their subsequent analysis using Poisson probability statistics. Detailed analysis of proliferation dynamics in heterogeneous populations requires single cell resolution within a time series analysis and so is technically demanding to implement. Here, we show that by focusing on the events during which cells undergo division, rather than directly on the cells themselves, a simplified image acquisition and analysis protocol can be followed, which maintains single cell resolution and reports the key metrics of cell proliferation. The technique is demonstrated using a microscope with 1.3 μm spatial resolution to track mitotic events within A549 and BEAS-2B cell lines over a period of up to 48 h. Automated processing of the bright field images using standard algorithms within the ImageJ software toolkit yielded 87% accurate recording of the manually identified temporal and spatial positions of the mitotic event series. Analysis of the statistics of the interevent times (i.e., times between observed mitoses in a field of view) showed that cell division conformed to a nonhomogeneous Poisson process in which the rate of occurrence of mitotic events, λ, increased exponentially over time, and provided values of the mean intermitotic time of 21.1 ± 1.2 h for the A549 cells and 25.0 ± 1.1 h for the BEAS-2B cells. Comparison of the mitotic event series for the BEAS-2B cell line to that predicted by random Poisson statistics indicated that temporal synchronisation of the cell division process was occurring within 70% of the population, and that this could be increased to 85% through serum starvation of the cell culture. © 2015 International Society for Advancement of Cytometry.
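    The homogeneous special case of this event-based analysis can be checked in a few lines: interevent times are exponential with mean 1/λ, and counts in fixed windows have variance ≈ mean. The rate below is set to the reported A549 intermitotic time purely for illustration:

```python
import random
import statistics

rng = random.Random(11)
rate = 1.0 / 21.1   # illustrative event rate (events/hour), from the
                    # reported A549 mean intermitotic time

# Homogeneous Poisson process: interevent times are exponential, mean 1/rate
gaps = [rng.expovariate(rate) for _ in range(50000)]
mean_gap = statistics.fmean(gaps)           # ~21.1 h

# Cumulative event times
times, t = [], 0.0
for g in gaps:
    t += g
    times.append(t)

# Counts in fixed 24 h windows should be Poisson: variance ~ mean (Fano ~ 1)
window = 24.0
n_win = int(times[-1] // window)
counts = [0] * n_win
for s in times:
    i = int(s // window)
    if i < n_win:
        counts[i] += 1
fano = statistics.pvariance(counts) / statistics.fmean(counts)
```

    In the nonhomogeneous case reported above, λ grows exponentially in time, so the same window counts would show a systematic trend rather than a constant mean.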

  7. Statistical description of non-Gaussian samples in the F2 layer of the ionosphere during heliogeophysical disturbances

    NASA Astrophysics Data System (ADS)

    Sergeenko, N. P.

    2017-11-01

    An adequate statistical method should be developed in order to predict probabilistically the range of ionospheric parameters. This problem is addressed in this paper. Time series of the critical frequency of the F2 layer, foF2(t), were subjected to statistical processing. For the obtained samples {δfoF2}, statistical distributions and invariants up to the fourth order were calculated. The analysis shows that during disturbances the distributions differ from the Gaussian law: at sufficiently small probability levels there are arbitrarily large deviations from the normal-process model. Therefore, we attempt to describe the statistical samples {δfoF2} with a Poisson-based model. For the studied samples, an exponential characteristic function is selected under the assumption that the time series are a superposition of deterministic and random processes. Using the Fourier transform, the characteristic function is transformed into a nonholomorphic, excessive-asymmetric probability density function. The statistical distributions of the samples {δfoF2} calculated for the disturbed periods are compared with the obtained model distribution function. According to Kolmogorov's criterion, the probabilities of coincidence of the a posteriori distributions with the theoretical ones are P ≈ 0.7-0.9. This analysis supports the conclusion that a model based on a Poisson random process is applicable for the statistical description of the variations {δfoF2} and for probabilistic estimates during heliogeophysical disturbances.

  8. Fast and Precise Emulation of Stochastic Biochemical Reaction Networks With Amplified Thermal Noise in Silicon Chips.

    PubMed

    Kim, Jaewook; Woo, Sung Sik; Sarpeshkar, Rahul

    2018-04-01

    The analysis and simulation of complex interacting biochemical reaction pathways in cells is important throughout systems biology and medicine. Yet, the dynamics of even a modest number of noisy or stochastic coupled biochemical reactions is extremely time-consuming to simulate, in large part because of the high cost of random-number and Poisson-process generation and the presence of stiff, coupled, nonlinear differential equations. Here, we demonstrate that we can amplify inherent thermal noise in chips to emulate randomness physically, thus alleviating these costs significantly. Concurrently, molecular flux in thermodynamic biochemical reactions maps to thermodynamic electronic current in a transistor such that stiff nonlinear biochemical differential equations are emulated exactly in compact, digitally programmable, highly parallel analog "cytomorphic" transistor circuits. For even small-scale systems involving just 80 stochastic reactions, our 0.35-μm BiCMOS chips yield a 311× speedup in the simulation time of Gillespie's stochastic algorithm over COPASI, a fast biochemical-reaction software simulator that is widely used in computational biology; they yield a 15,500× speedup over equivalent MATLAB stochastic simulations. The chip emulation results are consistent with these software simulations over a large range of signal-to-noise ratios. Most importantly, our physical emulation of Poisson chemical dynamics does not involve any inherently sequential processes or updates, so that, unlike prior exact simulation approaches, it is parallelizable, asynchronous, and enables even more speedup for larger networks.
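
    Gillespie's stochastic simulation algorithm, the software baseline the chips are benchmarked against, is compact enough to sketch. The following toy implementation (a generic textbook version, not the cytomorphic hardware's emulation) simulates a birth-death reaction network with illustrative rate constants:

```python
import random

def gillespie(rates, stoich, state, t_max, rng):
    """Gillespie's stochastic simulation algorithm for a reaction network.
    rates[i](state) gives the propensity of reaction i; stoich[i] its state change."""
    t, traj = 0.0, [(0.0, list(state))]
    while t < t_max:
        props = [r(state) for r in rates]
        total = sum(props)
        if total == 0:
            break
        t += rng.expovariate(total)   # exponential waiting time to the next reaction
        u = rng.random() * total      # pick a reaction with probability ∝ propensity
        for p, delta in zip(props, stoich):
            if u < p:
                state = [s + d for s, d in zip(state, delta)]
                break
            u -= p
        traj.append((t, list(state)))
    return traj

# toy birth-death process: ∅ -> X at rate 5, X -> ∅ at rate 0.1 per molecule
rng = random.Random(6)
traj = gillespie(
    rates=[lambda s: 5.0, lambda s: 0.1 * s[0]],
    stoich=[[+1], [-1]],
    state=[0],
    t_max=100.0,
    rng=rng,
)
final_x = traj[-1][1][0]  # fluctuates around the steady state 5 / 0.1 = 50
```

    Note the inherently sequential structure (each step depends on the previous state), which is exactly what limits software implementations and what the parallel analog emulation avoids.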

  9. Recalculated probability of M ≥ 7 earthquakes beneath the Sea of Marmara, Turkey

    USGS Publications Warehouse

    Parsons, T.

    2004-01-01

    New earthquake probability calculations are made for the Sea of Marmara region and the city of Istanbul, providing a revised forecast and an evaluation of time-dependent interaction techniques. Calculations incorporate newly obtained bathymetric images of the North Anatolian fault beneath the Sea of Marmara [Le Pichon et al., 2001; Armijo et al., 2002]. Newly interpreted fault segmentation enables an improved regional A.D. 1500-2000 earthquake catalog and interevent model, which form the basis for time-dependent probability estimates. Calculations presented here also employ detailed models of coseismic and postseismic slip associated with the 17 August 1999 M = 7.4 Izmit earthquake to investigate effects of stress transfer on seismic hazard. Probability changes caused by the 1999 shock depend on Marmara Sea fault-stressing rates, which are calculated with a new finite element model. The combined 2004-2034 regional Poisson probability of M ≥ 7 earthquakes is ~38%, the regional time-dependent probability is 44 ± 18%, and incorporation of stress transfer raises it to 53 ± 18%. The most important effect of adding time dependence and stress transfer to the calculations is an increase in the 30 year probability of an M ≥ 7 earthquake affecting Istanbul. The 30 year Poisson probability at Istanbul is 21%, and the addition of time dependence and stress transfer raises it to 41 ± 14%. The ranges given on probability values are sensitivities of the calculations to input parameters determined by Monte Carlo analysis; 1000 calculations are made using parameters drawn at random from distributions. Sensitivities are large relative to mean probability values and enhancements caused by stress transfer, reflecting a poor understanding of large-earthquake aperiodicity.
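
    The time-independent (Poisson) probabilities quoted in this record follow from P = 1 − e^(−rt) for a constant annual rate r over a window of t years. A small sketch, with the annual rate back-solved from the ~21% 30-year Istanbul figure purely for illustration:

```python
import math

def poisson_prob(rate_per_year, years):
    """Probability of at least one event in a window, for a Poisson
    process with constant annual rate: P = 1 - exp(-rate * t)."""
    return 1.0 - math.exp(-rate_per_year * years)

# illustrative only: the annual rate implied by a 21% 30-year Poisson probability
rate = -math.log(1 - 0.21) / 30
print(round(poisson_prob(rate, 30), 2))  # 0.21 by construction
```

    Time-dependent models replace the exponential recurrence distribution with one concentrating probability around a characteristic interval, which is why their estimates differ from the Poisson baseline.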

  10. Meteorite Material Model for Structural Properties

    NASA Technical Reports Server (NTRS)

    Agrawal, Parul; Carlozzi, Alexander A.; Karajeh, Zaid S.; Bryson, Kathryn L.

    2017-01-01

    To assess the threat posed by an asteroid entering Earth's atmosphere, one must predict if, when, and how it fragments during entry. A comprehensive understanding of asteroid material properties is needed to achieve this objective. At present, meteorites found on Earth are the only objects from entering asteroids that can be used as representative material and tested in a laboratory setting. Due to their complex petrology, it is technically challenging and expensive to obtain reliable material properties by means of laboratory tests for a family of meteorites. In order to circumvent this challenge, meteorite unit models are developed to determine effective material properties, including Young's modulus, compressive and tensile strengths, and Poisson's ratio, that in turn help deduce the properties of asteroids. The meteorite unit is a representative volume that accounts for diverse minerals, porosity, cracks, and matrix composition. The Young's modulus and Poisson's ratio of the meteorite units are calculated by performing several hundred Monte Carlo simulations that randomly distribute the various phases inside these units. Once these values are obtained, cracks are introduced into the meteorite units. The size, orientation, and distribution of cracks are derived from extensive CT scans and visual scans of various meteorites from the same family. Subsequently, simulations are performed to obtain stress-strain relations, strength, and effective modulus values in the presence of these cracks. Meteorite unit models are presented for H, L, and LL ordinary chondrites, as well as for terrestrial basalt. In the case of the latter, data from the simulations are compared with experimental data to validate the methodology. These material models will subsequently be used in fragmentation modeling of full-scale asteroids.

  11. Beyond Poisson-Boltzmann: Fluctuation effects and correlation functions

    NASA Astrophysics Data System (ADS)

    Netz, R. R.; Orland, H.

    2000-02-01

    We formulate the exact non-linear field theory for a fluctuating counter-ion distribution in the presence of a fixed, arbitrary charge distribution. The Poisson-Boltzmann equation is obtained as the saddle-point of the field-theoretic action, and the effects of counter-ion fluctuations are included by a loop-wise expansion around this saddle point. The Poisson equation is obeyed at each order in this loop expansion. We explicitly give the expansion of the Gibbs potential up to two loops. We then apply our field-theoretic formalism to the case of a single impenetrable wall with counter ions only (in the absence of salt ions). We obtain the fluctuation corrections to the electrostatic potential and the counter-ion density to one-loop order without further approximations. The relative importance of fluctuation corrections is controlled by a single parameter, which is proportional to the cube of the counter-ion valency and to the surface charge density. The effective interactions and correlation functions between charged particles close to the charged wall are obtained on the one-loop level.

  12. Selecting the right statistical model for analysis of insect count data by using information theoretic measures.

    PubMed

    Sileshi, G

    2006-10-01

    Researchers and regulatory agencies often make statistical inferences from insect count data using modelling approaches that assume homogeneous variance. Such models do not allow for formal appraisal of variability, which in its different forms is the subject of interest in ecology. Therefore, the objectives of this paper were to (i) compare models suitable for handling variance heterogeneity and (ii) select optimal models to ensure valid statistical inferences from insect count data. The log-normal, standard Poisson, Poisson corrected for overdispersion, zero-inflated Poisson, negative binomial and zero-inflated negative binomial models were compared using six count datasets on foliage-dwelling insects and five families of soil-dwelling insects. Akaike's and Schwarz's Bayesian information criteria were used for comparing the various models. Over 50% of the counts were zeros, even in locally abundant species such as Ootheca bennigseni Weise, Mesoplatys ochroptera Stål and Diaecoderus spp. The Poisson model corrected for overdispersion and the standard negative binomial model provided a better description of the probability distributions of seven of the 11 insect groups than the log-normal, standard Poisson, zero-inflated Poisson or zero-inflated negative binomial models. It is concluded that excess zeros and variance heterogeneity are common phenomena in insect count data. If not properly modelled, these properties can invalidate normal-distribution assumptions, resulting in biased estimation of ecological effects and jeopardizing the integrity of the scientific inferences. It is therefore recommended that statistical models appropriate for handling these data properties be selected using objective criteria, to ensure efficient statistical inference.
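
    The model comparison described above ranks candidate count distributions by information criteria. A minimal sketch (synthetic overdispersed counts, not the paper's data) comparing Poisson and method-of-moments negative binomial fits by AIC:

```python
import math

def poisson_loglik(counts, lam):
    """Log-likelihood of counts under Poisson(lam)."""
    return sum(-lam + k * math.log(lam) - math.lgamma(k + 1) for k in counts)

def negbin_loglik(counts, r, p):
    """Log-likelihood under a negative binomial with size r and probability p."""
    return sum(math.lgamma(k + r) - math.lgamma(r) - math.lgamma(k + 1)
               + r * math.log(p) + k * math.log(1 - p) for k in counts)

# toy overdispersed counts with many zeros (illustrative, not real survey data)
counts = [0] * 30 + [1] * 8 + [2] * 5 + [5] * 4 + [9] * 3

m = sum(counts) / len(counts)                       # sample mean
v = sum((k - m) ** 2 for k in counts) / (len(counts) - 1)  # sample variance
r = m * m / (v - m)   # method-of-moments size parameter (requires v > m)
p = r / (r + m)

# AIC = 2 * (number of parameters) - 2 * log-likelihood; lower is better
aic_pois = 2 * 1 - 2 * poisson_loglik(counts, m)
aic_nb = 2 * 2 - 2 * negbin_loglik(counts, r, p)
```

    On data with variance far exceeding the mean, the negative binomial wins the AIC comparison despite its extra parameter, which is the pattern the paper reports for most of its insect datasets.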

  13. The Constitutive Modeling of Thin Films with Random Material Wrinkles

    NASA Technical Reports Server (NTRS)

    Murphey, Thomas W.; Mikulas, Martin M.

    2001-01-01

    Material wrinkles drastically alter the structural constitutive properties of thin films. Normally linear elastic materials, when wrinkled, become highly nonlinear and initially inelastic. Stiffness reductions of 99% and negative Poisson's ratios are typically observed. This paper presents an effective continuum constitutive model for the elastic effects of material wrinkles in thin films. The model considers general two-dimensional stress and strain states (simultaneous bi-axial and shear stress/strain) and neglects out-of-plane bending. The constitutive model is derived from a traditional mechanics analysis of an idealized physical model of random material wrinkles. Model parameters are the directly measurable wrinkle characteristics of amplitude and wavelength. For these reasons, the equations are mechanistic and deterministic. The model is compared with bi-axial tensile test data for wrinkled Kapton® HN and is shown to deterministically predict strain as a function of stress with an average RMS error of 22%. Fitting the model to test data yields an average RMS error of 1.2%.

  14. Smartphone based hand-held quantitative phase microscope using the transport of intensity equation method.

    PubMed

    Meng, Xin; Huang, Huachuan; Yan, Keding; Tian, Xiaolin; Yu, Wei; Cui, Haoyang; Kong, Yan; Xue, Liang; Liu, Cheng; Wang, Shouyu

    2016-12-20

    In order to realize high-contrast imaging with portable devices for potential mobile healthcare, we demonstrate a hand-held smartphone-based quantitative phase microscope using the transport of intensity equation method. With a cost-effective illumination source and compact microscope system, multi-focal images of samples can be captured by the smartphone's camera via manual focusing. Phase retrieval is performed using a self-developed Android application, which calculates sample phases from multi-plane intensities by solving the Poisson equation. We test the portable microscope using a random phase plate with known phases, and to further demonstrate its performance, a red blood cell smear, a Pap smear, and monocot root and broad bean epidermis sections are also successfully imaged. Considering its advantages as an accurate, high-contrast, cost-effective and field-portable device, the smartphone-based hand-held quantitative phase microscope is a promising tool for future adoption in remote healthcare and medical diagnosis.
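
    Phase retrieval by the transport-of-intensity method reduces to solving a 2-D Poisson equation for the phase. As a stand-in for the FFT-based solvers typically used in such applications (illustrative only, not the paper's Android implementation), here is a toy Jacobi iteration for the Poisson equation with zero boundary conditions:

```python
def solve_poisson_dirichlet(rhs, iters=2000):
    """Jacobi iteration for the 2-D Poisson equation laplacian(phi) = rhs
    with zero (Dirichlet) boundary values and unit grid spacing."""
    h, w = len(rhs), len(rhs[0])
    phi = [[0.0] * w for _ in range(h)]
    for _ in range(iters):
        new = [[0.0] * w for _ in range(h)]  # boundary rows/cols stay at zero
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                new[y][x] = 0.25 * (phi[y - 1][x] + phi[y + 1][x]
                                    + phi[y][x - 1] + phi[y][x + 1]
                                    - rhs[y][x])
        phi = new
    return phi

# toy right-hand side: a point source in the middle of a small grid
n = 17
rhs = [[0.0] * n for _ in range(n)]
rhs[n // 2][n // 2] = -1.0
phi = solve_poisson_dirichlet(rhs)
```

    In the TIE setting, the right-hand side is built from the measured axial intensity derivative; FFT-based solvers do the same job in O(N log N) rather than by iteration.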

  15. The Effects of a Skill-Based Intervention for Victims of Bullying in Brazil

    PubMed Central

    da Silva, Jorge Luiz; de Oliveira, Wanderlei Abadio; Braga, Iara Falleiros; Farias, Marilurdes Silva; da Silva Lizzi, Elisangela Aparecida; Gonçalves, Marlene Fagundes Carvalho; Pereira, Beatriz Oliveira; Silva, Marta Angélica Iossi

    2016-01-01

    This study’s objective was to verify whether improved social and emotional skills would reduce victimization among Brazilian 6th grade student victims of bullying. The targets of this intervention were victimized students; a total of 78 victims participated. A cognitive-behavioral intervention based on social and emotional skills was held in eight weekly sessions. The sessions focused on civility, the ability to make friends, self-control, emotional expressiveness, empathy, assertiveness, and interpersonal problem-solving capacity. Data were analyzed through Poisson regression models with random effects. Pre- and post-analyses reveal that both the intervention and comparison groups showed significantly reduced victimization by bullying. No significant improvement was found in regard to difficulties in practicing social skills. The reduction in victimization therefore cannot be attributed to the program. This study contributes to the incipient literature addressing anti-bullying interventions conducted in developing countries and highlights the need for approaches that do not exclusively focus on the students’ individual aspects. PMID:27792206

  16. A Three-dimensional Polymer Scaffolding Material Exhibiting a Zero Poisson's Ratio.

    PubMed

    Soman, Pranav; Fozdar, David Y; Lee, Jin Woo; Phadke, Ameya; Varghese, Shyni; Chen, Shaochen

    2012-05-14

    Poisson's ratio describes the degree to which a material contracts (expands) transversally when axially strained. A material with a zero Poisson's ratio does not deform transversally in response to an axial strain (stretching). In tissue engineering applications, scaffolding having a zero Poisson's ratio (ZPR) may be more suitable for emulating the behavior of native tissues and for accommodating and transmitting forces to the host tissue site during wound healing (or tissue regrowth). For example, scaffolding with a zero Poisson's ratio may be beneficial in the engineering of cartilage, ligament, corneal, and brain tissues, which are known to possess Poisson's ratios of nearly zero. Here, we report a 3D biomaterial constructed from polyethylene glycol (PEG) exhibiting in-plane Poisson's ratios of zero for large values of axial strain. We use digital micro-mirror device projection printing (DMD-PP) to create single- and double-layer scaffolds composed of semi re-entrant pores whose arrangement and deformation mechanisms contribute to the zero Poisson's ratio. Strain experiments confirm the zero-Poisson's-ratio behavior of the scaffolds and show that the addition of layers does not change the Poisson's ratio. Human mesenchymal stem cells (hMSCs) cultured on biomaterials with zero Poisson's ratio demonstrate the feasibility of utilizing these novel materials for biological applications which require little to no transverse deformation resulting from axial strains. The techniques used in this work allow Poisson's ratio to be both scale-independent and independent of the choice of strut material for strains in the elastic regime, and therefore ZPR behavior can be imparted to a variety of photocurable biomaterials.
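
    Poisson's ratio relates transverse to axial strain as ε_t = −ν·ε_a, so a ZPR material shows no lateral deformation under stretching. A one-line numerical illustration:

```python
def transverse_strain(axial_strain, poissons_ratio):
    """Transverse strain of a material under axial strain:
    eps_t = -nu * eps_a. A zero Poisson's ratio gives no lateral change."""
    return -poissons_ratio * axial_strain

print(transverse_strain(0.10, 0.5))  # -0.05 (incompressible, rubber-like)
print(transverse_strain(0.10, 0.0))  # -0.0  (zero-Poisson's-ratio scaffold)
```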

  17. From Loss of Memory to Poisson.

    ERIC Educational Resources Information Center

    Johnson, Bruce R.

    1983-01-01

    A way of presenting the Poisson process and deriving the Poisson distribution for upper-division courses in probability or mathematical statistics is presented. The main feature of the approach lies in the formulation of Poisson postulates with immediate intuitive appeal. (MNS)

  18. How happy is your web browsing? A model to quantify satisfaction of an Internet user searching for desired information

    NASA Astrophysics Data System (ADS)

    Banerji, Anirban; Magarkar, Aniket

    2012-09-01

    We feel happy when web browsing operations provide us with the information we need; otherwise, we feel bitter. How can this happiness (or bitterness) be measured? How does the profile of happiness grow and decay during the course of web browsing? We propose a probabilistic framework that models the evolution of user satisfaction on top of his/her continuous frustration at not finding the required information. It is found that the cumulative satisfaction profile of a web-searching individual can be modeled effectively as the sum of a random number of random terms, where each term is a mutually independent random variable originating from a 'memoryless' Poisson flow. The evolution of satisfaction over the entire time interval of a user's browsing was modeled using auto-correlation analysis. A utilitarian marker, whose magnitude exceeding unity describes happy web-searching operations, and an empirical limit that connects the user's satisfaction with his frustration level are also proposed. The presence of pertinent information in the very first page of a website and the magnitude of the decay parameter of user satisfaction (frustration, irritation, etc.) are found to be the two key aspects that dominate the web user's psychology. The proposed model employed different combinations of the decay parameter, searching time, and number of helpful websites. The obtained results are found to match the results from three real-life case studies.
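
    The "sum of a random number of random terms" structure described above is a compound Poisson sum S = X₁ + … + X_N with N ~ Poisson(λ). A minimal simulation (illustrative parameters, not the paper's model) verifying the identity E[S] = λ·E[X]:

```python
import random

def compound_poisson(rng, lam, term_mean, n_sims):
    """Simulate S = X_1 + ... + X_N with N ~ Poisson(lam) and
    i.i.d. exponential terms of mean term_mean, so E[S] = lam * term_mean."""
    totals = []
    for _ in range(n_sims):
        # draw N by counting unit-rate exponential arrivals before time lam
        n, t = 0, rng.expovariate(1.0)
        while t < lam:
            n += 1
            t += rng.expovariate(1.0)
        totals.append(sum(rng.expovariate(1.0 / term_mean) for _ in range(n)))
    return totals

rng = random.Random(2)
totals = compound_poisson(rng, lam=4.0, term_mean=2.0, n_sims=20000)
mean_s = sum(totals) / len(totals)  # should be near lam * term_mean = 8
```

    The Poisson draw uses the memoryless-arrival construction, matching the 'memoryless' Poisson flow assumed in the framework.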

  19. Anderson Localization in Quark-Gluon Plasma

    NASA Astrophysics Data System (ADS)

    Kovács, Tamás G.; Pittler, Ferenc

    2010-11-01

    At low temperature the low end of the QCD Dirac spectrum is well described by chiral random matrix theory. In contrast, at high temperature there is no similar statistical description of the spectrum. We show that at high temperature the lowest part of the spectrum consists of a band of statistically uncorrelated eigenvalues obeying essentially Poisson statistics and the corresponding eigenvectors are extremely localized. Going up in the spectrum the spectral density rapidly increases and the eigenvectors become more and more delocalized. At the same time the spectral statistics gradually crosses over to the bulk statistics expected from the corresponding random matrix ensemble. This phenomenon is reminiscent of Anderson localization in disordered conductors. Our findings are based on staggered Dirac spectra in quenched lattice simulations with the SU(2) gauge group.
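
    A standard numerical diagnostic for the Poisson-versus-random-matrix crossover described above is the mean ratio of consecutive level spacings: ⟨r⟩ = 2·ln 2 − 1 ≈ 0.386 for uncorrelated (Poisson) levels, versus roughly 0.53-0.60 for random-matrix statistics. A sketch with synthetic uncorrelated levels (not lattice Dirac data):

```python
import random

def spacing_ratio_mean(levels):
    """Mean ratio of consecutive level spacings,
    r_k = min(s_k, s_{k+1}) / max(s_k, s_{k+1})."""
    xs = sorted(levels)
    spacings = [b - a for a, b in zip(xs, xs[1:])]
    ratios = [min(a, b) / max(a, b) for a, b in zip(spacings, spacings[1:])]
    return sum(ratios) / len(ratios)

rng = random.Random(5)
# uncorrelated levels: i.i.d. uniform points give Poisson spacing statistics
poisson_levels = [rng.random() for _ in range(20000)]
r = spacing_ratio_mean(poisson_levels)  # close to 2*ln(2) - 1 ≈ 0.386
```

    The ratio statistic avoids the spectral unfolding needed for raw spacing distributions, which is why it is popular for localization studies of this kind.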

  20. Statistical model for speckle pattern optimization.

    PubMed

    Su, Yong; Zhang, Qingchuan; Gao, Zeren

    2017-11-27

    Image registration is the key technique of optical metrologies such as digital image correlation (DIC), particle image velocimetry (PIV), and speckle metrology. Its performance depends critically on the quality of the image pattern, and thus pattern optimization attracts extensive attention. In this article, a statistical model is built to optimize speckle patterns that are composed of randomly positioned speckles. It is found that the process of speckle pattern generation is essentially a filtered Poisson process. The dependence of measurement errors (including systematic errors, random errors, and overall errors) upon the speckle pattern generation parameters is characterized analytically. By minimizing the errors, formulas for the optimal speckle radius are presented. Although the primary motivation is from the field of DIC, we believe that scholars in other optical measurement communities, such as PIV and speckle metrology, will benefit from these discussions.
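
    A speckle pattern of the kind analyzed above can be generated as a filtered Poisson process: a Poisson-distributed number of speckles at uniform random positions, each filtered through a smooth intensity kernel. A small illustrative generator (Gaussian kernel and arbitrary parameters, not the paper's optimized settings):

```python
import math
import random

def speckle_image(width, height, density, radius, rng):
    """Filtered Poisson process: a Poisson number of speckle centers at
    uniform positions, each contributing a Gaussian intensity profile."""
    area = width * height
    # Poisson count via unit-rate exponential arrivals before time density*area
    n, t = 0, rng.expovariate(1.0)
    while t < density * area:
        n += 1
        t += rng.expovariate(1.0)
    centers = [(rng.uniform(0, width), rng.uniform(0, height)) for _ in range(n)]
    img = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            for cx, cy in centers:
                d2 = (x - cx) ** 2 + (y - cy) ** 2
                img[y][x] += math.exp(-d2 / (2 * radius ** 2))
    return img, n

rng = random.Random(3)
img, n = speckle_image(32, 32, density=0.02, radius=2.0, rng=rng)
```

    The speckle density and radius are exactly the generation parameters whose effect on registration error the paper characterizes analytically.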

  1. Physical mechanisms of timing jitter in photon detection by current-carrying superconducting nanowires

    NASA Astrophysics Data System (ADS)

    Sidorova, Mariia; Semenov, Alexej; Hübers, Heinz-Wilhelm; Charaev, Ilya; Kuzmin, Artem; Doerner, Steffen; Siegel, Michael

    2017-11-01

    We studied timing jitter in the appearance of photon counts in meandering nanowires with different fractional amounts of bends. Intrinsic timing jitter, the probability density function of the random time delay between photon absorption in a current-carrying superconducting nanowire and the appearance of the normal domain, reveals two different underlying physical mechanisms. In the deterministic regime, realized at large photon energies and large currents, jitter is controlled by the position-dependent detection threshold in the straight parts of the meanders; it decreases as the current increases. At small photon energies, jitter increases and its current dependence disappears. In this probabilistic regime, jitter is controlled by a Poisson process in which magnetic vortices jump randomly across the wire in areas adjacent to the bends.

  2. Dielectric Self-Energy in Poisson-Boltzmann and Poisson-Nernst-Planck Models of Ion Channels

    PubMed Central

    Corry, Ben; Kuyucak, Serdar; Chung, Shin-Ho

    2003-01-01

    We demonstrated previously that the two continuum theories widely used in modeling biological ion channels give unreliable results when the radius of the conduit is less than two Debye lengths. The reason for this failure is the neglect of surface charges on the protein wall induced by permeating ions. Here we attempt to improve the accuracy of the Poisson-Boltzmann and Poisson-Nernst-Planck theories, when applied to channel-like environments, by including a specific dielectric self-energy term to overcome spurious shielding effects inherent in these theories. By comparing results with Brownian dynamics simulations, we show that the inclusion of an additional term in the equations yields significant qualitative improvements. The modified theories perform well in very wide and very narrow channels, but are less successful at intermediate sizes. The situation is worse in multi-ion channels because of the inability of the continuum theories to handle the ion-to-ion interactions correctly. Thus, further work is required if these continuum theories are to be reliably salvaged for quantitative studies of biological ion channels in all situations. PMID:12770869

  3. Accuracy assessment of the linear Poisson-Boltzmann equation and reparametrization of the OBC generalized Born model for nucleic acids and nucleic acid-protein complexes.

    PubMed

    Fogolari, Federico; Corazza, Alessandra; Esposito, Gennaro

    2015-04-05

    The generalized Born model in the Onufriev, Bashford, and Case (Onufriev et al., Proteins: Struct Funct Genet 2004, 55, 383) implementation has emerged as one of the best compromises between accuracy and speed of computation. For simulations of nucleic acids, however, a number of issues should be addressed: (1) the generalized Born model is based on a linear model, and the linearization of the reference Poisson-Boltzmann equation may be questioned for highly charged systems such as nucleic acids; (2) although much attention has been given to potentials, solvation forces could be much less sensitive to linearization than the potentials; and (3) the accuracy of the Onufriev-Bashford-Case (OBC) model for nucleic acids depends on fine tuning of parameters. Here, we show that the linearization of the Poisson-Boltzmann equation has mild effects on computed forces, and that with an optimal choice of the OBC model parameters, solvation forces, essential for molecular dynamics simulations, agree well with those computed using the reference Poisson-Boltzmann model. © 2015 Wiley Periodicals, Inc.

  4. Determination of oral mucosal Poisson's ratio and coefficient of friction from in-vivo contact pressure measurements.

    PubMed

    Chen, Junning; Suenaga, Hanako; Hogg, Michael; Li, Wei; Swain, Michael; Li, Qing

    2016-01-01

    Despite their considerable importance to biomechanics, there are no existing methods available to directly measure the apparent Poisson's ratio and friction coefficient of oral mucosa. This study aimed to develop an inverse procedure to determine these two biomechanical parameters by combining in vivo measurements of the contact pressure between a partial denture and the underlying mucosa with nonlinear finite element (FE) analysis and surrogate response surface (RS) modelling. First, the in vivo denture-mucosa contact pressure was measured by a tactile electronic sensing sheet. Second, a 3D FE model was constructed based on the patient's CT images. Third, a range of apparent Poisson's ratios and coefficients of friction from the literature was considered as the design variables in a series of FE runs for constructing a RS surrogate model. Finally, the discrepancy between the computed in silico and measured in vivo results was minimized to identify the best-matching Poisson's ratio and coefficient of friction. The established non-invasive methodology was shown to be effective in identifying these biomechanical parameters of oral mucosa and can potentially be used to determine the material properties of other soft biological tissues.

  5. A chi-square goodness-of-fit test for non-identically distributed random variables: with application to empirical Bayes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Conover, W.J.; Cox, D.D.; Martz, H.F.

    1997-12-01

    When using parametric empirical Bayes estimation methods for estimating the binomial or Poisson parameter, the validity of the assumed beta or gamma conjugate prior distribution is an important diagnostic consideration. Chi-square goodness-of-fit tests of the beta or gamma prior hypothesis are developed for use when the binomial sample sizes or Poisson exposure times vary. Nine examples illustrate the application of the methods, using real data from such diverse applications as the loss of feedwater flow rates in nuclear power plants, the probability of failure to run on demand and the failure rates of the high pressure coolant injection systems at US commercial boiling water reactors, the probability of failure to run on demand of emergency diesel generators in US commercial nuclear power plants, the rate of failure of aircraft air conditioners, baseball batting averages, the probability of testing positive for toxoplasmosis, and the probability of tumors in rats. The tests are easily applied in practice by means of corresponding Mathematica® computer programs which are provided.
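
    The chi-square goodness-of-fit machinery can be sketched generically: bin the data, compute expected bin counts under the hypothesized model, and form the Pearson statistic. Below, a toy example against Poisson expectations with hypothetical counts (not the paper's reactor data, and without its variable-exposure refinements):

```python
import math

def chi_square_stat(observed, expected):
    """Pearson chi-square statistic over bins: sum of (O - E)^2 / E."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# illustrative: test observed counts against Poisson(lam) bin expectations
lam, n = 1.2, 200
pmf = lambda k: math.exp(-lam) * lam ** k / math.factorial(k)
bins = [0, 1, 2, 3]
# expected counts for k = 0..3 plus a tail bin for k >= 4
expected = [n * pmf(k) for k in bins] + [n * (1 - sum(pmf(k) for k in bins))]
observed = [62, 70, 41, 18, 9]  # hypothetical data summing to 200
chi2 = chi_square_stat(observed, expected)
```

    With 5 bins and one fitted parameter, the statistic would be referred to a chi-square distribution with 3 degrees of freedom; the paper's tests apply the same idea to the beta or gamma prior hypothesis.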

  6. Statistical guides to estimating the number of undiscovered mineral deposits: an example with porphyry copper deposits

    USGS Publications Warehouse

    Singer, Donald A.; Menzie, W.D.; Cheng, Qiuming; Bonham-Carter, G. F.

    2005-01-01

    Estimating numbers of undiscovered mineral deposits is a fundamental part of assessing mineral resources. Some statistical tools can act as guides to low variance, unbiased estimates of the number of deposits. The primary guide is that the estimates must be consistent with the grade and tonnage models. Another statistical guide is the deposit density (i.e., the number of deposits per unit area of permissive rock in well-explored control areas). Preliminary estimates and confidence limits of the number of undiscovered deposits in a tract of given area may be calculated using linear regression and refined using frequency distributions with appropriate parameters. A Poisson distribution leads to estimates having lower relative variances than the regression estimates and implies a random distribution of deposits. Coefficients of variation are used to compare uncertainties of negative binomial, Poisson, or MARK3 empirical distributions that have the same expected number of deposits as the deposit density. Statistical guides presented here allow simple yet robust estimation of the number of undiscovered deposits in permissive terranes. 
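
    The coefficient-of-variation comparison mentioned above is straightforward: for a Poisson count the CV is 1/√mean, while a negative binomial with the same mean always has a larger CV because of its extra variance term. An illustrative calculation (the mean and size values are arbitrary, not assessment results):

```python
import math

def cv_poisson(mean):
    """Coefficient of variation of a Poisson count: sd/mean = 1/sqrt(mean)."""
    return 1 / math.sqrt(mean)

def cv_negbin(mean, size):
    """CV of a negative binomial with the same mean and size parameter;
    variance = mean + mean^2/size, so the CV always exceeds the Poisson CV."""
    return math.sqrt(mean + mean * mean / size) / mean

m = 5.0  # illustrative expected number of undiscovered deposits
print(round(cv_poisson(m), 3))         # 0.447
print(round(cv_negbin(m, size=3), 3))  # 0.73
```

    This is why a Poisson deposit-count model yields lower relative variance than a negative binomial alternative with the same expected number of deposits.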

  7. Stochastic modeling for neural spiking events based on fractional superstatistical Poisson process

    NASA Astrophysics Data System (ADS)

    Konno, Hidetoshi; Tamura, Yoshiyasu

    2018-01-01

    In neural spike counting experiments, it is known that there are two main features: (i) the counting number has a fractional power-law growth with time and (ii) the waiting time (i.e., the inter-spike-interval) distribution has a heavy tail. The method of superstatistical Poisson processes (SSPPs) is examined to determine whether these main features are properly modeled. Although various mixed/compound Poisson processes can be generated by selecting a suitable distribution for the birth rate of spiking neurons, only the second feature (ii) can be modeled by the method of SSPPs; the first feature (i), associated with the effect of long memory, cannot be modeled properly. It is then shown that the two main features can be modeled successfully by a class of fractional SSPP (FSSPP).
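
    The superstatistical (mixed) Poisson construction draws a random rate per realization and then an exponential waiting time at that rate; marginally this produces the heavy-tailed inter-spike-interval distribution of feature (ii). A minimal simulation (gamma rate mixing with illustrative parameters, not the paper's FSSPP):

```python
import random

def mixed_waiting_times(rng, shape, scale, n):
    """Superstatistical (mixed) Poisson waiting times: draw a random rate
    from a gamma distribution, then an exponential waiting time at that rate.
    Marginally the waiting time is heavy-tailed (Pareto/Lomax type)."""
    times = []
    for _ in range(n):
        rate = rng.gammavariate(shape, scale)
        times.append(rng.expovariate(rate))
    return times

rng = random.Random(4)
mixed = mixed_waiting_times(rng, shape=1.5, scale=1.0, n=50000)
plain = [rng.expovariate(1.5) for _ in range(50000)]
# the mixed process produces far more extreme waiting times than a plain
# exponential with a comparable rate
tail_mixed = sum(t > 10 for t in mixed) / len(mixed)
tail_plain = sum(t > 10 for t in plain) / len(plain)
```

    For gamma mixing, P(T > 10) = (1 + 10)^(-1.5) ≈ 0.027, versus e^(-15) ≈ 3×10⁻⁷ for the plain exponential; the fractional extension in the paper additionally captures the power-law count growth of feature (i).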

  8. A new approach for handling longitudinal count data with zero-inflation and overdispersion: poisson geometric process model.

    PubMed

    Wan, Wai-Yin; Chan, Jennifer S K

    2009-08-01

    For time series of count data, correlated measurements, clustering as well as excessive zeros occur simultaneously in biomedical applications. Ignoring such effects might contribute to misleading treatment outcomes. A generalized mixture Poisson geometric process (GMPGP) model and a zero-altered mixture Poisson geometric process (ZMPGP) model are developed from the geometric process model, which was originally developed for modelling positive continuous data and was extended to handle count data. These models are motivated by evaluating the trend development of new tumour counts for bladder cancer patients as well as by identifying useful covariates which affect the count level. The models are implemented using Bayesian method with Markov chain Monte Carlo (MCMC) algorithms and are assessed using deviance information criterion (DIC).

  9. Effect of peer-based low back pain information and reassurance at the workplace on sick leave: a cluster randomized trial.

    PubMed

    Odeen, Magnus; Ihlebæk, Camilla; Indahl, Aage; Wormgoor, Marjon E A; Lie, Stein A; Eriksen, Hege R

    2013-06-01

    To evaluate whether information and reassurance about low back pain (LBP) given to employees at the workplace could reduce sick leave. A cluster randomized controlled trial with 135 work units of about 3,500 public sector employees in two Norwegian municipalities, randomized into two intervention groups, education and peer support (EPS; n = 45 units) and education with peer support and access to an outpatient clinic (EPSOC; n = 48 units), and a control group (n = 42 units). Both interventions consisted of educational meetings based on a "non-injury model" and a "peer adviser" appointed by colleagues. Employees in the EPSOC group had access to an outpatient clinic for medical examination and further education. The control group received no intervention. The main outcome was sick leave based on municipal records. Secondary outcomes were self-reported pain, pain-related fear of movement, coping, and beliefs about LBP from survey data of 1,746 employees (response rate about 50%). EPS reduced sick leave by 7% and EPSOC by 4% during the intervention year, while sick leave in the control group increased by 7% during the same period. Overall rate ratios (RR) were statistically significant for EPSOC (RR = 0.84, CI 0.71-0.99) but not for EPS (RR = 0.92, CI 0.78-1.09) in a mixed Poisson regression analysis. Faulty beliefs about LBP were reduced in both intervention groups. Educational meetings, combined with peer support and access to an outpatient clinic, were effective in reducing sick leave in public sector employees.
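
    The rate ratios reported above come from Poisson regression; for a simple two-group comparison, the estimate and its Wald confidence interval can be computed directly from event counts and exposures. A sketch with hypothetical counts chosen only to echo the EPSOC rate ratio of 0.84 (not the study's data):

```python
import math

def rate_ratio(events_a, time_a, events_b, time_b):
    """Rate ratio of two Poisson counts with exposures, with a Wald 95% CI
    on the log scale (the two-group special case of Poisson regression)."""
    rr = (events_a / time_a) / (events_b / time_b)
    se = math.sqrt(1 / events_a + 1 / events_b)  # SE of log(rr)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# hypothetical sick-leave episode counts per person-years of follow-up
rr, lo, hi = rate_ratio(84, 1000.0, 100, 1000.0)
```

    A mixed Poisson regression, as used in the study, additionally includes cluster-level random effects to account for the work-unit randomization; the point-estimate logic is the same.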

  10. Effectiveness of an intensive E-mail based intervention in smoking cessation (TABATIC study): study protocol for a randomized controlled trial.

    PubMed

    Díaz-Gete, Laura; Puigdomènech, Elisa; Briones, Elena Mercedes; Fàbregas-Escurriola, Mireia; Fernandez, Soraya; Del Val, Jose Luis; Ballvé, Jose Luis; Casajuana, Marc; Sánchez-Fondevila, Jessica; Clemente, Lourdes; Castaño, Carmen; Martín-Cantera, Carlos

    2013-04-18

    Intensive interventions on smoking cessation increase abstinence rates. However, few electronic mail (E-mail) based intensive interventions have been tested in smokers, and none in the primary care (PC) setting. The aim of the present study is to evaluate the effectiveness of an intensive E-mail based intervention in smokers attending PC services. Randomized controlled multicentric trial. 1060 smokers aged 18-70 years from Catalonia, Salamanca and Aragón (Spain) who have, and regularly check, an E-mail account. Patients will be randomly assigned to the control or intervention group. The six-phase intensive intervention comprises two face-to-face interviews and four automatically generated, personalized E-mails tracking each patient; additional E-mail contacts will be made if needed. The control group will receive brief advice on smoking cessation. The following will be measured at 6 and 12 months after the intervention: self-reported continuous abstinence (confirmed by co-oximetry), point prevalence abstinence, tobacco consumption, evolution of stage according to Prochaska and DiClemente's Stages of Change Model, length of visit, and costs for the patient to access the Primary Care Center. Descriptive and logistic and Poisson regression analyses will be performed on an intention-to-treat basis using SPSS v.17. The proposed intervention is an E-mail based intensive intervention in smokers attending primary care. Positive results could demonstrate a higher percentage of short- and long-term abstinence among smokers attended in PC in Spain who regularly use E-mail. Furthermore, this intervention could be helpful in all health services to help smokers quit. ClinicalTrials.gov Identifier: NCT01494246.

  11. Characterization of Nonhomogeneous Poisson Processes Via Moment Conditions.

    DTIC Science & Technology

    1986-08-01

    Poisson processes play an important role in many fields. The Poisson process is one of the simplest counting processes and is a building block for...place of independent increments. This provides a somewhat different viewpoint for examining Poisson processes . In addition, new characterizations for

  12. Constructions and classifications of projective Poisson varieties.

    PubMed

    Pym, Brent

    2018-01-01

    This paper is intended both as an introduction to the algebraic geometry of holomorphic Poisson brackets, and as a survey of results on the classification of projective Poisson manifolds that have been obtained in the past 20 years. It is based on the lecture series delivered by the author at the Poisson 2016 Summer School in Geneva. The paper begins with a detailed treatment of Poisson surfaces, including adjunction, ruled surfaces and blowups, and leading to a statement of the full birational classification. We then describe several constructions of Poisson threefolds, outlining the classification in the regular case, and the case of rank-one Fano threefolds (such as projective space). Following a brief introduction to the notion of Poisson subspaces, we discuss Bondal's conjecture on the dimensions of degeneracy loci on Poisson Fano manifolds. We close with a discussion of log symplectic manifolds with simple normal crossings degeneracy divisor, including a new proof of the classification in the case of rank-one Fano manifolds.

  13. Constructions and classifications of projective Poisson varieties

    NASA Astrophysics Data System (ADS)

    Pym, Brent

    2018-03-01

    This paper is intended both as an introduction to the algebraic geometry of holomorphic Poisson brackets, and as a survey of results on the classification of projective Poisson manifolds that have been obtained in the past 20 years. It is based on the lecture series delivered by the author at the Poisson 2016 Summer School in Geneva. The paper begins with a detailed treatment of Poisson surfaces, including adjunction, ruled surfaces and blowups, and leading to a statement of the full birational classification. We then describe several constructions of Poisson threefolds, outlining the classification in the regular case, and the case of rank-one Fano threefolds (such as projective space). Following a brief introduction to the notion of Poisson subspaces, we discuss Bondal's conjecture on the dimensions of degeneracy loci on Poisson Fano manifolds. We close with a discussion of log symplectic manifolds with simple normal crossings degeneracy divisor, including a new proof of the classification in the case of rank-one Fano manifolds.

  14. Disk Density Tuning of a Maximal Random Packing

    PubMed Central

    Ebeida, Mohamed S.; Rushdi, Ahmad A.; Awad, Muhammad A.; Mahmoud, Ahmed H.; Yan, Dong-Ming; English, Shawn A.; Owens, John D.; Bajaj, Chandrajit L.; Mitchell, Scott A.

    2016-01-01

    We introduce an algorithmic framework for tuning the spatial density of disks in a maximal random packing, without changing the sizing function or radii of disks. Starting from any maximal random packing such as a Maximal Poisson-disk Sampling (MPS), we iteratively relocate, inject (add), or eject (remove) disks, using a set of three successively more-aggressive local operations. We may achieve a user-defined density, either more dense or more sparse, almost up to the theoretical structured limits. The tuned samples are conflict-free, retain coverage maximality, and, except in the extremes, retain the blue noise randomness properties of the input. We change the density of the packing one disk at a time, maintaining the minimum disk separation distance and the maximum domain coverage distance required of any maximal packing. These properties are local, and we can handle spatially-varying sizing functions. Using fewer points to satisfy a sizing function improves the efficiency of some applications. We apply the framework to improve the quality of meshes, removing non-obtuse angles; and to more accurately model fiber reinforced polymers for elastic and failure simulations. PMID:27563162

  15. Disk Density Tuning of a Maximal Random Packing.

    PubMed

    Ebeida, Mohamed S; Rushdi, Ahmad A; Awad, Muhammad A; Mahmoud, Ahmed H; Yan, Dong-Ming; English, Shawn A; Owens, John D; Bajaj, Chandrajit L; Mitchell, Scott A

    2016-08-01

    We introduce an algorithmic framework for tuning the spatial density of disks in a maximal random packing, without changing the sizing function or radii of disks. Starting from any maximal random packing such as a Maximal Poisson-disk Sampling (MPS), we iteratively relocate, inject (add), or eject (remove) disks, using a set of three successively more-aggressive local operations. We may achieve a user-defined density, either more dense or more sparse, almost up to the theoretical structured limits. The tuned samples are conflict-free, retain coverage maximality, and, except in the extremes, retain the blue noise randomness properties of the input. We change the density of the packing one disk at a time, maintaining the minimum disk separation distance and the maximum domain coverage distance required of any maximal packing. These properties are local, and we can handle spatially-varying sizing functions. Using fewer points to satisfy a sizing function improves the efficiency of some applications. We apply the framework to improve the quality of meshes, removing non-obtuse angles; and to more accurately model fiber reinforced polymers for elastic and failure simulations.
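The minimum-separation property that any maximal packing must maintain can be illustrated with a minimal pure-Python dart-throwing sketch (function name and parameters are illustrative; the authors' framework additionally guarantees coverage maximality and supports the relocate/inject/eject operations described above):

```python
import math
import random

def dart_throw_packing(width, height, r, n_candidates=20000, seed=1):
    """Naive dart-throwing sketch of Poisson-disk sampling: accept a
    uniform candidate only if it keeps the minimum separation r from
    every previously accepted disk center.  True MPS also guarantees
    coverage maximality; this sketch only enforces the separation."""
    rng = random.Random(seed)
    centers = []
    for _ in range(n_candidates):
        p = (rng.uniform(0.0, width), rng.uniform(0.0, height))
        if all(math.dist(p, q) >= r for q in centers):
            centers.append(p)
    return centers
```

With enough candidates the result approaches maximality, but unlike true MPS there is no certificate that every uncovered gap has been filled.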

  16. Saddlepoint approximation to the distribution of the total distance of the continuous time random walk

    NASA Astrophysics Data System (ADS)

    Gatto, Riccardo

    2017-12-01

This article considers the random walk over R^p, with p ≥ 2, where a given particle starts at the origin and moves stepwise with uniformly distributed step directions and step lengths following a common distribution. Step directions and step lengths are independent. The case where the number of steps of the particle is fixed and the more general case where it follows an independent continuous time inhomogeneous counting process are considered. Saddlepoint approximations to the distribution of the distance from the position of the particle to the origin are provided. Despite the p-dimensional nature of the random walk, the computations of the saddlepoint approximations are one-dimensional and thus simple. Explicit formulae are derived with dimension p = 3: for uniformly and exponentially distributed step lengths, for fixed and for Poisson distributed number of steps. In these situations, the high accuracy of the saddlepoint approximations is illustrated by numerical comparisons with Monte Carlo simulation. Contribution to the "Topical Issue: Continuous Time Random Walk Still Trendy: Fifty-year History, Current State and Outlook", edited by Ryszard Kutner and Jaume Masoliver.
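The Monte Carlo side of the comparison mentioned in the abstract can be sketched for p = 3, with exponentially distributed step lengths and a Poisson number of steps (pure Python; `walk_distance` and `_rpois` are illustrative helper names, not from the paper):

```python
import math
import random

def _rpois(rng, lam):
    """Knuth's Poisson sampler (adequate for small lam)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def walk_distance(rng, step_rate, mean_len):
    """One realization: Poisson(step_rate) steps, uniform directions on
    the sphere (normalized Gaussians), exponential step lengths.
    Returns the distance from the final position to the origin."""
    x = y = z = 0.0
    for _ in range(_rpois(rng, step_rate)):
        gx, gy, gz = rng.gauss(0, 1), rng.gauss(0, 1), rng.gauss(0, 1)
        norm = math.sqrt(gx * gx + gy * gy + gz * gz) or 1.0
        step = rng.expovariate(1.0 / mean_len)
        x += step * gx / norm
        y += step * gy / norm
        z += step * gz / norm
    return math.sqrt(x * x + y * y + z * z)
```

Averaging many such realizations gives the empirical distribution against which a saddlepoint approximation would be checked.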

  17. A Novel Method for Preparing Auxetic Foam from Closed-cell Polymer Foam Based on Steam Penetration and Condensation (SPC) Process.

    PubMed

    Fan, Donglei; Li, Minggang; Qiu, Jian; Xing, Haiping; Jiang, Zhiwei; Tang, Tao

    2018-05-31

    Auxetic materials are a class of materials possessing negative Poisson's ratio. Here we establish a novel method for preparing auxetic foam from closed-cell polymer foam based on steam penetration and condensation (SPC) process. Using polyethylene (PE) closed-cell foam as an example, the resultant foams treated by SPC process present negative Poisson's ratio during stretching and compression testing. The effect of steam-treated temperature and time on the conversion efficiency of negative Poisson's ratio foam is investigated, and the mechanism of SPC method for forming re-entrant structure is discussed. The results indicate that the presence of enough steam within the cells is a critical factor for the negative Poisson's ratio conversion in the SPC process. The pressure difference caused by steam condensation is the driving force for the conversion from conventional closed-cell foam to the negative Poisson's ratio foam. Furthermore, the applicability of SPC process for fabricating auxetic foam is studied by replacing PE foam by polyvinyl chloride (PVC) foam with closed-cell structure or replacing water steam by ethanol steam. The results verify the universality of SPC process for fabricating auxetic foams from conventional foams with closed-cell structure. In addition, we explored potential application of the obtained auxetic foams by SPC process in the fabrication of shape memory polymer materials.

  18. Analyzing Propensity Matched Zero-Inflated Count Outcomes in Observational Studies

    PubMed Central

    DeSantis, Stacia M.; Lazaridis, Christos; Ji, Shuang; Spinale, Francis G.

    2013-01-01

Determining the effectiveness of different treatments from observational data, which are characterized by imbalance between groups due to lack of randomization, is challenging. Propensity matching is often used to rectify imbalances among prognostic variables. However, there are no guidelines on how to appropriately analyze group-matched data when the outcome is a zero-inflated count. In addition, there is debate over whether to account for correlation of responses induced by matching, and/or whether to adjust for variables used in generating the propensity score in the final analysis. The aim of this research is to compare covariate unadjusted and adjusted zero-inflated Poisson models that do and do not account for the correlation. A simulation study is conducted, demonstrating that it is necessary to adjust for potential residual confounding, but that accounting for correlation is less important. The methods are applied to a biomedical research data set. PMID:24298197
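A zero-inflated Poisson outcome of the kind studied here can be simulated in a few lines (illustrative sketch; `rzip` and its parameters are not from the paper):

```python
import math
import random

def _rpois(rng, lam):
    """Knuth's Poisson sampler (adequate for small lam)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def rzip(n, pi_zero, lam, seed=0):
    """Zero-inflated Poisson draws: with probability pi_zero emit a
    structural zero, otherwise an ordinary Poisson(lam) count."""
    rng = random.Random(seed)
    return [0 if rng.random() < pi_zero else _rpois(rng, lam)
            for _ in range(n)]
```

The resulting sample has strictly more zeros than a plain Poisson with the same rate, which is the feature a ZIP model is built to capture.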

  19. Effects of adaptive degrees of trust on coevolution of quantum strategies on scale-free networks.

    PubMed

    Li, Qiang; Chen, Minyou; Perc, Matjaž; Iqbal, Azhar; Abbott, Derek

    2013-10-15

    We study the impact of adaptive degrees of trust on the evolution of cooperation in the quantum prisoner's dilemma game. In addition to the strategies, links between players are also subject to evolution. Starting with a scale-free interaction network, players adjust trust towards their neighbors based on received payoffs. The latter governs the strategy adoption process, while trust governs the rewiring of links. As soon as the degree of trust towards a neighbor drops to zero, the link is rewired to another randomly chosen player within the network. We find that for small temptations to defect cooperators always dominate, while for intermediate and strong temptations a single quantum strategy is able to outperform all other strategies. In general, reciprocal trust remains within close relationships and favors the dominance of a single strategy. Due to coevolution, the power-law degree distributions transform to Poisson distributions.

  20. Effects of adaptive degrees of trust on coevolution of quantum strategies on scale-free networks

    NASA Astrophysics Data System (ADS)

    Li, Qiang; Chen, Minyou; Perc, Matjaž; Iqbal, Azhar; Abbott, Derek

    2013-10-01

    We study the impact of adaptive degrees of trust on the evolution of cooperation in the quantum prisoner's dilemma game. In addition to the strategies, links between players are also subject to evolution. Starting with a scale-free interaction network, players adjust trust towards their neighbors based on received payoffs. The latter governs the strategy adoption process, while trust governs the rewiring of links. As soon as the degree of trust towards a neighbor drops to zero, the link is rewired to another randomly chosen player within the network. We find that for small temptations to defect cooperators always dominate, while for intermediate and strong temptations a single quantum strategy is able to outperform all other strategies. In general, reciprocal trust remains within close relationships and favors the dominance of a single strategy. Due to coevolution, the power-law degree distributions transform to Poisson distributions.

  1. Testing prediction methods: Earthquake clustering versus the Poisson model

    USGS Publications Warehouse

    Michael, A.J.

    1997-01-01

    Testing earthquake prediction methods requires statistical techniques that compare observed success to random chance. One technique is to produce simulated earthquake catalogs and measure the relative success of predicting real and simulated earthquakes. The accuracy of these tests depends on the validity of the statistical model used to simulate the earthquakes. This study tests the effect of clustering in the statistical earthquake model on the results. Three simulation models were used to produce significance levels for a VLF earthquake prediction method. As the degree of simulated clustering increases, the statistical significance drops. Hence, the use of a seismicity model with insufficient clustering can lead to overly optimistic results. A successful method must pass the statistical tests with a model that fully replicates the observed clustering. However, a method can be rejected based on tests with a model that contains insufficient clustering. U.S. copyright. Published in 1997 by the American Geophysical Union.
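The clustering-free end of the simulation spectrum, a plain Poisson catalog, can be sketched by drawing event times with exponential inter-event intervals (illustrative only; the paper's point is that realistic simulated catalogs must add clustering on top of this baseline):

```python
import random

def poisson_catalog(rate, t_end, seed=0):
    """Simulate a Poisson (fully declustered) synthetic catalog:
    successive event times separated by independent exponential
    inter-event intervals with the given mean rate."""
    rng = random.Random(seed)
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate)
        if t > t_end:
            return times
        times.append(t)
```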

  2. Spatial modeling of cutaneous leishmaniasis in the Andean region of Colombia.

    PubMed

    Pérez-Flórez, Mauricio; Ocampo, Clara Beatriz; Valderrama-Ardila, Carlos; Alexander, Neal

    2016-06-27

    The objective of this research was to identify environmental risk factors for cutaneous leishmaniasis (CL) in Colombia and map high-risk municipalities. The study area was the Colombian Andean region, comprising 715 rural and urban municipalities. We used 10 years of CL surveillance: 2000-2009. We used spatial-temporal analysis - conditional autoregressive Poisson random effects modelling - in a Bayesian framework to model the dependence of municipality-level incidence on land use, climate, elevation and population density. Bivariable spatial analysis identified rainforests, forests and secondary vegetation, temperature, and annual precipitation as positively associated with CL incidence. By contrast, livestock agroecosystems and temperature seasonality were negatively associated. Multivariable analysis identified land use - rainforests and agro-livestock - and climate - temperature, rainfall and temperature seasonality - as best predictors of CL. We conclude that climate and land use can be used to identify areas at high risk of CL and that this approach is potentially applicable elsewhere in Latin America.

  3. Monitoring Poisson observations using combined applications of Shewhart and EWMA charts

    NASA Astrophysics Data System (ADS)

    Abujiya, Mu'azu Ramat

    2017-11-01

The Shewhart and exponentially weighted moving average (EWMA) charts for nonconformities are the most widely used procedures of choice for monitoring Poisson observations in modern industries. Individually, the Shewhart and EWMA charts are only sensitive to large and small shifts, respectively. To enhance the detection abilities of the two schemes in monitoring all kinds of shifts in Poisson count data, this study examines the performance of combined applications of the Shewhart and EWMA Poisson control charts. Furthermore, the study proposes modifications based on a well-structured statistical data collection technique, ranked set sampling (RSS), to detect shifts in the mean of a Poisson process more quickly. The relative performance of the proposed Shewhart-EWMA Poisson location charts is evaluated in terms of the average run length (ARL), standard deviation of the run length (SDRL), median run length (MRL), average ratio ARL (ARARL), average extra quadratic loss (AEQL) and performance comparison index (PCI). Consequently, all the new Poisson control charts based on the RSS method are generally superior to most of the existing schemes for monitoring Poisson processes. The use of these combined Shewhart-EWMA Poisson charts is illustrated with an example to demonstrate the practical implementation of the design procedure.
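A basic EWMA chart for Poisson counts, using the standard asymptotic control limits, can be sketched as follows (a minimal illustration of the EWMA component only; the paper's RSS-based combined schemes are more elaborate, and the names here are assumptions):

```python
import math

def poisson_ewma_signals(counts, mu0, lam=0.2, k=3.0):
    """EWMA chart sketch for Poisson counts: smooth the stream with
    z_i = lam*c_i + (1 - lam)*z_{i-1}, starting at z_0 = mu0, and flag
    any z_i outside the asymptotic limits
    mu0 +/- k*sqrt(lam*mu0/(2 - lam)).  Returns the signal indices."""
    half_width = k * math.sqrt(lam * mu0 / (2.0 - lam))
    lcl, ucl = mu0 - half_width, mu0 + half_width
    z, signals = float(mu0), []
    for i, c in enumerate(counts):
        z = lam * c + (1.0 - lam) * z
        if not (lcl <= z <= ucl):
            signals.append(i)
    return signals
```

A Shewhart chart would additionally flag any single count outside its own (wider) limits, which is what makes the combined scheme sensitive to both small and large shifts.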

  4. Comment on: 'A Poisson resampling method for simulating reduced counts in nuclear medicine images'.

    PubMed

    de Nijs, Robin

    2015-07-21

In order to be able to calculate half-count images from already acquired data, White and Lawson published their method based on Poisson resampling. They verified their method experimentally by measurements with a Co-57 flood source. In this comment their results are reproduced and confirmed by a direct numerical simulation in Matlab. Not only Poisson resampling, but also two direct redrawing methods were investigated. Redrawing methods were based on a Poisson and a Gaussian distribution. Mean, standard deviation, skewness and excess kurtosis half-count/full-count ratios were determined for all methods, and compared to the theoretical values for a Poisson distribution. Statistical parameters showed the same behavior as in the original note and showed the superiority of the Poisson resampling method. Rounding off before saving the half-count image had a severe impact on counting statistics for counts below 100. Only Poisson resampling was unaffected by this, while Gaussian redrawing was less affected than Poisson redrawing. Poisson resampling is the method of choice when simulating half-count (or less) images from full-count images. It correctly simulates the statistical properties, also in the case of rounding off of the images.
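The binomial-thinning mechanism that underlies Poisson resampling can be sketched in a few lines (pure Python, a deliberately naive per-event loop; names are illustrative rather than from the original note):

```python
import random

def half_count(counts, keep=0.5, seed=0):
    """Binomial-thinning sketch: keep each recorded event independently
    with probability `keep`.  If a pixel's count is Poisson(mu), the
    thinned count is exactly Poisson(keep*mu), so the half-count image
    retains correct Poisson statistics."""
    rng = random.Random(seed)
    return [sum(rng.random() < keep for _ in range(int(n)))
            for n in counts]
```

This is why resampling preserves the statistical properties, unlike redrawing a fresh Poisson or Gaussian variate from the estimated mean.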

  5. Effect of solid distribution on elastic properties of open-cell cellular solids using numerical and experimental methods.

    PubMed

    Zargarian, A; Esfahanian, M; Kadkhodapour, J; Ziaei-Rad, S

    2014-09-01

Effect of solid distribution between edges and vertices of three-dimensional cellular solid with an open-cell structure was investigated both numerically and experimentally. Finite element analysis (FEA) with continuum elements and appropriate periodic boundary condition was employed to calculate the elastic properties of cellular solids using tetrakaidecahedral (Kelvin) unit cell. Relative densities between 0.01 and 0.1 and various values of solid fractions were considered. In order to validate the numerical model, three scaffolds with the relative density of 0.08, but different amounts of solid in vertices, were fabricated via 3-D printing technique. Good agreement was observed between numerical simulation and experimental results. Results of numerical simulation showed that, at low relative densities (<0.03), Young's modulus increased by shifting materials away from edges to vertices at first and then decreased after reaching a critical point. However, for the high values of relative density, Young's modulus increased monotonically. Mechanisms of such a behavior were discussed in detail. Results also indicated that Poisson's ratio decreased by increasing relative density and solid fraction in vertices. By fitting a curve to the data obtained from the numerical simulation and considering the relative density and solid fraction in vertices, empirical relations were derived for Young's modulus and Poisson's ratio. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. Poisson-Nernst-Planck equations with steric effects - non-convexity and multiple stationary solutions

    NASA Astrophysics Data System (ADS)

    Gavish, Nir

    2018-04-01

    We study the existence and stability of stationary solutions of Poisson-Nernst-Planck equations with steric effects (PNP-steric equations) with two counter-charged species. We show that within a range of parameters, steric effects give rise to multiple solutions of the corresponding stationary equation that are smooth. The PNP-steric equation, however, is found to be ill-posed at the parameter regime where multiple solutions arise. Following these findings, we introduce a novel PNP-Cahn-Hilliard model, show that it is well-posed and that it admits multiple stationary solutions that are smooth and stable. The various branches of stationary solutions and their stability are mapped utilizing bifurcation analysis and numerical continuation methods.

  7. Systematic design of 3D auxetic lattice materials with programmable Poisson's ratio for finite strains

    NASA Astrophysics Data System (ADS)

    Wang, Fengwen

    2018-05-01

This paper presents a systematic approach for designing 3D auxetic lattice materials, which exhibit constant negative Poisson's ratios over large strain intervals. A unit cell model mimicking tensile tests is established and based on the proposed model, the secant Poisson's ratio is defined as the negative ratio between the lateral and the longitudinal engineering strains. The optimization problem for designing a material unit cell with a target Poisson's ratio is formulated to minimize the average lateral engineering stresses under the prescribed deformations. Numerical results demonstrate that 3D auxetic lattice materials with constant Poisson's ratios can be achieved by the proposed optimization formulation and that two sets of material architectures are obtained by imposing different symmetry on the unit cell. Moreover, inspired by the topology-optimized material architecture, a subsequent shape optimization is proposed by parametrizing material architectures using super-ellipsoids. By designing two geometrical parameters, simple optimized material microstructures with different target Poisson's ratios are obtained. By interpolating these two parameters as polynomial functions of Poisson's ratios, material architectures for any Poisson's ratio in the interval of ν ∈ [-0.78, 0.00] are explicitly presented. Numerical evaluations show that interpolated auxetic lattice materials exhibit constant Poisson's ratios in the target strain interval of [0.00, 0.20] and that 3D auxetic lattice material architectures with programmable Poisson's ratio are achievable.

  8. Developing small-area predictions for smoking and obesity prevalence in the United States for use in Environmental Public Health Tracking.

    PubMed

    Ortega Hinojosa, Alberto M; Davies, Molly M; Jarjour, Sarah; Burnett, Richard T; Mann, Jennifer K; Hughes, Edward; Balmes, John R; Turner, Michelle C; Jerrett, Michael

    2014-10-01

    Globally and in the United States, smoking and obesity are leading causes of death and disability. Reliable estimates of prevalence for these risk factors are often missing variables in public health surveillance programs. This may limit the capacity of public health surveillance to target interventions or to assess associations between other environmental risk factors (e.g., air pollution) and health because smoking and obesity are often important confounders. To generate prevalence estimates of smoking and obesity rates over small areas for the United States (i.e., at the ZIP code and census tract levels). We predicted smoking and obesity prevalence using a combined approach first using a lasso-based variable selection procedure followed by a two-level random effects regression with a Poisson link clustered on state and county. We used data from the Behavioral Risk Factor Surveillance System (BRFSS) from 1991 to 2010 to estimate the model. We used 10-fold cross-validated mean squared errors and the variance of the residuals to test our model. To downscale the estimates we combined the prediction equations with 1990 and 2000 U.S. Census data for each of the four five-year time periods in this time range at the ZIP code and census tract levels. Several sensitivity analyses were conducted using models that included only basic terms, that accounted for spatial autocorrelation, and used Generalized Linear Models that did not include random effects. The two-level random effects model produced improved estimates compared to the fixed effects-only models. Estimates were particularly improved for the two-thirds of the conterminous U.S. where BRFSS data were available to estimate the county level random effects. We downscaled the smoking and obesity rate predictions to derive ZIP code and census tract estimates. To our knowledge these smoking and obesity predictions are the first to be developed for the entire conterminous U.S. for census tracts and ZIP codes. 
Our estimates could have significant utility for public health surveillance. Copyright © 2014. Published by Elsevier Inc.
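The overdispersion induced by a group-level random intercept in a Poisson model, the key ingredient of the two-level specification described above, can be illustrated with a small simulation (a sketch only; function names and parameter values are assumptions, not the authors' model):

```python
import math
import random

def _rpois(rng, lam):
    """Knuth's Poisson sampler (adequate for moderate lam)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def clustered_counts(n_groups, n_per, beta0, sigma_u, seed=0):
    """Poisson counts with a group-level random intercept:
    log(mu_ij) = beta0 + u_i, u_i ~ Normal(0, sigma_u).  The shared u_i
    makes the marginal counts overdispersed relative to Poisson."""
    rng = random.Random(seed)
    counts = []
    for _ in range(n_groups):
        mu = math.exp(beta0 + rng.gauss(0.0, sigma_u))
        counts.extend(_rpois(rng, mu) for _ in range(n_per))
    return counts
```

A fixed-effects-only Poisson fit to such data understates the variance, which is one reason the random effects model performed better here.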

  9. Scaling Limits and Generic Bounds for Exploration Processes

    NASA Astrophysics Data System (ADS)

    Bermolen, Paola; Jonckheere, Matthieu; Sanders, Jaron

    2017-12-01

We consider exploration algorithms of the random sequential adsorption type both for homogeneous random graphs and random geometric graphs based on spatial Poisson processes. At each step, a vertex of the graph becomes active and its neighboring nodes become blocked. Given an initial number of vertices N growing to infinity, we study statistical properties of the proportion of explored (active or blocked) nodes in time using scaling limits. We obtain exact limits for homogeneous graphs and prove an explicit central limit theorem for the final proportion of active nodes, known as the jamming constant, through a diffusion approximation for the exploration process which can be described as a unidimensional process. We then focus on bounding the trajectories of such exploration processes on random geometric graphs, i.e., random sequential adsorption. As opposed to exploration processes on homogeneous random graphs, these do not allow for such a dimensional reduction. Instead we derive a fundamental relationship between the number of explored nodes and the discovered volume in the spatial process, and we obtain generic bounds for the fluid limit and jamming constant: bounds that are independent of the dimension of space and the detailed shape of the volume associated to the discovered node. Lastly, using coupling techniques, we give trajectorial interpretations of the generic bounds.
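The activate-and-block exploration dynamics described above, restricted to a finite graph, can be sketched as a greedy random sequential adsorption (illustrative; `rsa_jamming_fraction` and the adjacency-list representation are assumptions, not from the paper):

```python
import random

def rsa_jamming_fraction(adj, seed=0):
    """Random sequential adsorption sketch on a graph given as an
    adjacency list: visit vertices in random order; a vertex becomes
    active only if it has not been blocked, and activating it blocks
    all of its neighbors.  Returns the final fraction of active
    vertices (an empirical jamming constant)."""
    rng = random.Random(seed)
    order = list(range(len(adj)))
    rng.shuffle(order)
    active = [False] * len(adj)
    blocked = [False] * len(adj)
    for v in order:
        if not blocked[v]:
            active[v] = True
            for w in adj[v]:
                blocked[w] = True
    return sum(active) / len(adj)
```

The active set is always a maximal independent set, so on highly symmetric graphs the fraction is fixed regardless of the random order.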

  10. Dimensional study of the dynamical arrest in a random Lorentz gas.

    PubMed

    Jin, Yuliang; Charbonneau, Patrick

    2015-04-01

    The random Lorentz gas (RLG) is a minimal model for transport in heterogeneous media. Upon increasing the obstacle density, it exhibits a growing subdiffusive transport regime and then a dynamical arrest. Here, we study the dimensional dependence of the dynamical arrest, which can be mapped onto the void percolation transition for Poisson-distributed point obstacles. We numerically determine the arrest in dimensions d=2-6. Comparison of the results with standard mode-coupling theory reveals that the dynamical theory prediction grows increasingly worse with d. In an effort to clarify the origin of this discrepancy, we relate the dynamical arrest in the RLG to the dynamic glass transition of the infinite-range Mari-Kurchan-model glass former. Through a mixed static and dynamical analysis, we then extract an improved dimensional scaling form as well as a geometrical upper bound for the arrest. The results suggest that understanding the asymptotic behavior of the random Lorentz gas may be key to surmounting fundamental difficulties with the mode-coupling theory of glasses.

  11. Random parameter models for accident prediction on two-lane undivided highways in India.

    PubMed

    Dinu, R R; Veeraragavan, A

    2011-02-01

    Generalized linear modeling (GLM), with the assumption of Poisson or negative binomial error structure, has been widely employed in road accident modeling. A number of explanatory variables related to traffic, road geometry, and environment that contribute to accident occurrence have been identified and accident prediction models have been proposed. The accident prediction models reported in literature largely employ the fixed parameter modeling approach, where the magnitude of influence of an explanatory variable is considered to be fixed for any observation in the population. Similar models have been proposed for Indian highways too, which include additional variables representing traffic composition. The mixed traffic on Indian highways comes with a lot of variability within, ranging from difference in vehicle types to variability in driver behavior. This could result in variability in the effect of explanatory variables on accidents across locations. Random parameter models, which can capture some of such variability, are expected to be more appropriate for the Indian situation. The present study is an attempt to employ random parameter modeling for accident prediction on two-lane undivided rural highways in India. Three years of accident history, from nearly 200 km of highway segments, is used to calibrate and validate the models. The results of the analysis suggest that the model coefficients for traffic volume, proportion of cars, motorized two-wheelers and trucks in traffic, and driveway density and horizontal and vertical curvatures are randomly distributed across locations. The paper is concluded with a discussion on modeling results and the limitations of the present study. Copyright © 2010 Elsevier Ltd. All rights reserved.

  12. Unimodularity criteria for Poisson structures on foliated manifolds

    NASA Astrophysics Data System (ADS)

    Pedroza, Andrés; Velasco-Barreras, Eduardo; Vorobiev, Yury

    2018-03-01

    We study the behavior of the modular class of an orientable Poisson manifold and formulate some unimodularity criteria in the semilocal context, around a (singular) symplectic leaf. Our results generalize some known unimodularity criteria for regular Poisson manifolds related to the notion of the Reeb class. In particular, we show that the unimodularity of the transverse Poisson structure of the leaf is a necessary condition for the semilocal unimodular property. Our main tool is an explicit formula for a bigraded decomposition of modular vector fields of a coupling Poisson structure on a foliated manifold. Moreover, we also exploit the notion of the modular class of a Poisson foliation and its relationship with the Reeb class.

  13. Investigation of spectral analysis techniques for randomly sampled velocimetry data

    NASA Technical Reports Server (NTRS)

    Sree, Dave

    1993-01-01

It is well known that velocimetry (LV) generates individual realization velocity data that are randomly or unevenly sampled in time. Spectral analysis of such data to obtain the turbulence spectra, and hence turbulence scales information, requires special techniques. The 'slotting' technique of Mayo et al., also described by Roberts and Ajmani, and the 'Direct Transform' method of Gaster and Roberts are well known in the LV community. The slotting technique is computationally faster than the direct transform method. There are practical limitations, however, on how high in frequency an accurate estimate can be made for a given mean sampling rate. These high frequency estimates are important in obtaining the microscale information of turbulence structure. It was found from previous studies that reliable spectral estimates can be made up to about the mean sampling frequency (mean data rate) or less. If the data were evenly sampled, the frequency range would be half the sampling frequency (i.e., up to the Nyquist frequency); otherwise, an aliasing problem would occur. The mean data rate and the sample size (total number of points) basically limit the frequency range. Also, there are large variabilities or errors associated with the high frequency estimates from randomly sampled signals. Roberts and Ajmani proposed certain pre-filtering techniques to reduce these variabilities, but at the cost of low frequency estimates. The prefiltering acts as a high-pass filter. Further, Shapiro and Silverman showed theoretically that, for Poisson sampled signals, it is possible to obtain alias-free spectral estimates far beyond the mean sampling frequency. But the question is, how far?
During his tenure under the 1993 NASA-ASEE Summer Faculty Fellowship Program, the author found from his studies of spectral analysis techniques for randomly sampled signals that the spectral estimates can be enhanced or improved up to about 4-5 times the mean sampling frequency by using a suitable prefiltering technique. This increased bandwidth, however, comes at the cost of the lower frequency estimates. The studies further showed that large data sets on the order of 100,000 points or more, high data rates, and Poisson sampling are crucial for obtaining reliable spectral estimates from randomly sampled data, such as LV data. Some of the results of the current study are presented.
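
    The slotting idea can be sketched in a few lines: products of sample pairs are binned by their random time lag, and each bin average estimates the autocovariance at that lag. Below is a minimal, illustrative numpy sketch assuming a Poisson-sampled 5 Hz test sine; the data rate, slot width, and record length are arbitrary choices, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Poisson sampling: exponential inter-arrival times at a 100 Hz mean data rate
rate = 100.0
t = np.cumsum(rng.exponential(1.0 / rate, size=5000))
x = np.sin(2 * np.pi * 5.0 * t)        # 5 Hz test signal
x = x - x.mean()

def slotted_autocov(t, x, dtau, n_slots):
    """Mayo-style slotting: average the products x_i * x_j over all pairs
    whose random lag t_j - t_i falls in slot k of width dtau."""
    acov = np.zeros(n_slots)
    counts = np.zeros(n_slots, dtype=int)
    for i in range(len(t)):
        j = np.searchsorted(t, t[i] + n_slots * dtau)   # pairs inside window
        lags = t[i + 1:j] - t[i]
        k = (lags / dtau).astype(int)
        np.add.at(acov, k, x[i] * x[i + 1:j])
        np.add.at(counts, k, 1)
    return acov / np.maximum(counts, 1)

acov = slotted_autocov(t, x, dtau=0.002, n_slots=200)
# acov[0] ~ signal variance (0.5); at lag 0.1 s a 5 Hz sine gives ~ -0.5
```

    A Fourier transform of this slotted autocovariance then yields the spectral estimate; the variability of the high-frequency slots is exactly the error source discussed above.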

  14. A test of inflated zeros for Poisson regression models.

    PubMed

    He, Hua; Zhang, Hui; Ye, Peng; Tang, Wan

    2017-01-01

    Excessive zeros are common in practice and may cause overdispersion and invalidate inference when fitting Poisson regression models. There is a large body of literature on zero-inflated Poisson models. However, methods for testing whether there are excessive zeros are less well developed. The Vuong test comparing a Poisson and a zero-inflated Poisson model is commonly applied in practice. However, the type I error of the test often deviates seriously from the nominal level, casting serious doubt on the validity of the test in such applications. In this paper, we develop a new approach for testing inflated zeros under the Poisson model. Unlike the Vuong test for inflated zeros, our method does not require a zero-inflated Poisson model to perform the test. Simulation studies show that, compared with the Vuong test, our approach is not only better at controlling the type I error rate but also yields more power.
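
    The flavor of such a test can be illustrated without fitting any zero-inflated model: fit a plain Poisson, then ask whether the observed number of zeros is extreme under that fit. The parametric-bootstrap sketch below is an illustrative stand-in, not the authors' procedure, and all parameters are made up.

```python
import numpy as np

rng = np.random.default_rng(5)

def zero_excess_pvalue(y, n_boot=2000, rng=rng):
    """Fit a plain Poisson by its MLE (the sample mean), then compare the
    observed number of zeros with its parametric-bootstrap distribution
    under Poisson(lambda_hat). No zero-inflated model is fitted."""
    lam, n = y.mean(), len(y)
    obs_zeros = np.sum(y == 0)
    boot_zeros = np.sum(rng.poisson(lam, size=(n_boot, n)) == 0, axis=1)
    return np.mean(boot_zeros >= obs_zeros)      # one-sided p-value

# data with 30% structural zeros on top of Poisson(2)
y = rng.poisson(2.0, size=500) * (rng.uniform(size=500) > 0.3)
p = zero_excess_pvalue(y)
```

    With genuinely inflated data like this, the observed zero count sits far in the upper tail of the bootstrap distribution and the p-value is essentially zero.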

  15. A comparison of different ways of including baseline counts in negative binomial models for data from falls prevention trials.

    PubMed

    Zheng, Han; Kimber, Alan; Goodwin, Victoria A; Pickering, Ruth M

    2018-01-01

    A common design for a falls prevention trial is to assess falling at baseline, randomize participants into an intervention or control group, and ask them to record the number of falls they experience during a follow-up period of time. This paper addresses how best to include the baseline count in the analysis of the follow-up count of falls in negative binomial (NB) regression. We examine the performance of various approaches in simulated datasets where both counts are generated from a mixed Poisson distribution with shared random subject effect. Including the baseline count after log-transformation as a regressor in NB regression (NB-logged) or as an offset (NB-offset) resulted in greater power than including the untransformed baseline count (NB-unlogged). Cook and Wei's conditional negative binomial (CNB) model replicates the underlying process generating the data. In our motivating dataset, a statistically significant intervention effect resulted from the NB-logged, NB-offset, and CNB models, but not from NB-unlogged, and large, outlying baseline counts were overly influential in NB-unlogged but not in NB-logged. We conclude that there is little to lose by including the log-transformed baseline count in standard NB regression compared to CNB for moderate to larger sized datasets. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
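
    The simulated data-generating mechanism described, two counts sharing one random subject effect, can be sketched with a gamma frailty, which makes each marginal count negative binomial and the two counts positively correlated. The parameter values below are illustrative assumptions, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10000

# shared gamma "frailty" per participant (mean 1) makes both counts
# negative binomial marginally and positively correlated jointly
frailty = rng.gamma(shape=2.0, scale=0.5, size=n)
treat = rng.integers(0, 2, size=n)              # randomized arm
rr = 0.7                                        # assumed intervention rate ratio

baseline = rng.poisson(2.0 * frailty)           # falls before randomization
followup = rng.poisson(2.0 * frailty * np.where(treat == 1, rr, 1.0))

corr = np.corrcoef(baseline, followup)[0, 1]    # induced by the shared frailty
ratio = followup[treat == 1].mean() / followup[treat == 0].mean()
```

    NB regression of `followup` on `treat` plus some transform of `baseline` is then the modeling question the paper studies; the shared frailty is why the baseline count carries information about the follow-up count.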

  16. Calculation of the Poisson cumulative distribution function

    NASA Technical Reports Server (NTRS)

    Bowerman, Paul N.; Nolty, Robert G.; Scheuer, Ernest M.

    1990-01-01

    A method for calculating the Poisson cdf (cumulative distribution function) is presented. The method avoids computer underflow and overflow during the process. The computer program uses this technique to calculate the Poisson cdf for arbitrary inputs. An algorithm that determines the Poisson parameter required to yield a specified value of the cdf is presented.
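
    One common way to avoid underflow and overflow, likely in the spirit of the method described, is to form each pmf term in log space and sum with a log-sum-exp shift. This sketch is a generic illustration, not the paper's algorithm.

```python
import math

def poisson_cdf(k, lam):
    """P(X <= k) for X ~ Poisson(lam). Each pmf term is built in log space
    (i*log(lam) - lam - lgamma(i+1)) and the sum is taken with a
    log-sum-exp shift, so neither lam**k nor k! is ever formed directly."""
    if k < 0:
        return 0.0
    log_terms = [i * math.log(lam) - lam - math.lgamma(i + 1)
                 for i in range(int(k) + 1)]
    m = max(log_terms)
    return math.exp(m) * sum(math.exp(t - m) for t in log_terms)
```

    The inverse problem mentioned in the abstract, finding the Poisson parameter that yields a specified cdf value, could then be handled by bisection on `lam`, since the cdf is monotone decreasing in `lam` for fixed `k`.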

  17. Poisson's Ratio of a Hyperelastic Foam Under Quasi-static and Dynamic Loading

    DOE PAGES

    Sanborn, Brett; Song, Bo

    2018-06-03

    Poisson's ratio is a material constant representing compressibility of material volume. However, when soft, hyperelastic materials such as silicone foam are subjected to large deformation into densification, the Poisson's ratio may rather significantly change, which warrants careful consideration in modeling and simulation of impact/shock mitigation scenarios where foams are used as isolators. The evolution of Poisson's ratio of silicone foam materials has not yet been characterized, particularly under dynamic loading. In this study, radial and axial measurements of specimen strain are conducted simultaneously during quasi-static and dynamic compression tests to determine the Poisson's ratio of silicone foam. The Poisson's ratio of silicone foam exhibited a transition from compressible to nearly incompressible at a threshold strain that coincided with the onset of densification in the material. Poisson's ratio as a function of engineering strain was different at quasi-static and dynamic rates. Here, the Poisson's ratio behavior is presented and can be used to improve constitutive modeling of silicone foams subjected to a broad range of mechanical loading.

  18. Poisson's Ratio of a Hyperelastic Foam Under Quasi-static and Dynamic Loading

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanborn, Brett; Song, Bo

    Poisson's ratio is a material constant representing compressibility of material volume. However, when soft, hyperelastic materials such as silicone foam are subjected to large deformation into densification, the Poisson's ratio may rather significantly change, which warrants careful consideration in modeling and simulation of impact/shock mitigation scenarios where foams are used as isolators. The evolution of Poisson's ratio of silicone foam materials has not yet been characterized, particularly under dynamic loading. In this study, radial and axial measurements of specimen strain are conducted simultaneously during quasi-static and dynamic compression tests to determine the Poisson's ratio of silicone foam. The Poisson's ratio of silicone foam exhibited a transition from compressible to nearly incompressible at a threshold strain that coincided with the onset of densification in the material. Poisson's ratio as a function of engineering strain was different at quasi-static and dynamic rates. Here, the Poisson's ratio behavior is presented and can be used to improve constitutive modeling of silicone foams subjected to a broad range of mechanical loading.

  19. Investigation of the complex electroviscous effects on electrolyte (single and multiphase) flow in porous media.

    NASA Astrophysics Data System (ADS)

    Bolet, A. J. S.; Linga, G.; Mathiesen, J.

    2017-12-01

    Surface charge is an important control parameter for wall-bounded flow of electrolyte solutions. The electroviscous effect has been studied theoretically in model geometries such as infinite capillaries. In more complex geometries, however, quantifying the electroviscous effect is a non-trivial task due to strong non-linearities of the underlying equations, and in general one has to rely on numerical methods. Here we present numerical studies of the full three-dimensional steady-state Stokes-Poisson-Nernst-Planck problem in order to model electrolyte transport in artificial porous samples. The simulations are performed using the finite element method. From the simulations, we quantify how the electroviscous effect changes the overall flow permeability in complex three-dimensional porous media. The porous media we consider are mostly generated artificially by connecting randomly dispersed cylindrical pores. Furthermore, we present results for electrically driven two-phase immiscible flow in two dimensions. These simulations are performed by augmenting the above equations with a phase field model to handle and track the interaction between the two fluids (using parameters corresponding to oil-water interfaces, with the oil non-polar). In particular, we consider the electro-osmotic effect on imbibition due to charged walls and the electrolyte solution.

  20. Effects of greening and community reuse of vacant lots on crime

    PubMed Central

    Kondo, Michelle; Hohl, Bernadette; Han, SeungHoon; Branas, Charles

    2016-01-01

    The Youngstown Neighborhood Development Corporation initiated a ‘Lots of Green’ programme to reuse vacant land in 2010. We performed a difference-in-differences analysis of the effects of this programme on crime in and around newly treated lots, in comparison to crimes in and around randomly selected and matched, untreated vacant lot controls. The effects of two types of vacant lot treatments on crime were tested: a cleaning and greening ‘stabilisation’ treatment and a ‘community reuse’ treatment mostly involving community gardens. The combined effects of both types of vacant lot treatments were also tested. After adjustment for various sociodemographic factors, linear and Poisson regression models demonstrated statistically significant reductions in all crime classes for at least one lot treatment type. Regression models adjusted for spatial autocorrelation found the most consistent significant reductions in burglaries around stabilisation lots, and in assaults around community reuse lots. Spill-over crime reduction effects were found in contiguous areas around newly treated lots. Significant increases in motor vehicle thefts around both types of lots were also found after they had been greened. Community-initiated vacant lot greening may have a greater impact on reducing more serious, violent crimes. PMID:28529389

  1. A fast Poisson solver for unsteady incompressible Navier-Stokes equations on the half-staggered grid

    NASA Technical Reports Server (NTRS)

    Golub, G. H.; Huang, L. C.; Simon, H.; Tang, W. -P.

    1995-01-01

    In this paper, a fast Poisson solver for unsteady, incompressible Navier-Stokes equations with finite difference methods on the non-uniform, half-staggered grid is presented. To achieve this, new algorithms for diagonalizing a semi-definite pair are developed. Our fast solver can also be extended to the three dimensional case. The motivation and related issues in using this second kind of staggered grid are also discussed. Numerical testing has indicated the effectiveness of this algorithm.

  2. Map scale effects on estimating the number of undiscovered mineral deposits

    USGS Publications Warehouse

    Singer, D.A.; Menzie, W.D.

    2008-01-01

    Estimates of numbers of undiscovered mineral deposits, fundamental to assessing mineral resources, are affected by map scale. Where consistently defined deposits of a particular type are estimated, spatial and frequency distributions of deposits are linked in that some frequency distributions can be generated by processes randomly in space whereas others are generated by processes suggesting clustering in space. Possible spatial distributions of mineral deposits and their related frequency distributions are affected by map scale and associated inclusions of non-permissive or covered geological settings. More generalized map scales are more likely to cause inclusion of geologic settings that are not really permissive for the deposit type, or that include unreported cover over permissive areas, resulting in the appearance of deposit clustering. Thus, overly generalized map scales can cause deposits to appear clustered. We propose a model that captures the effects of map scale and the related inclusion of non-permissive geologic settings on numbers of deposits estimates, the zero-inflated Poisson distribution. Effects of map scale as represented by the zero-inflated Poisson distribution suggest that the appearance of deposit clustering should diminish as mapping becomes more detailed because the number of inflated zeros would decrease with more detailed maps. Based on observed worldwide relationships between map scale and areas permissive for deposit types, mapping at a scale with twice the detail should cut permissive area size of a porphyry copper tract to 29% and a volcanic-hosted massive sulfide tract to 50% of their original sizes. Thus some direct benefits of mapping an area at a more detailed scale are indicated by significant reductions in areas permissive for deposit types, increased deposit density and, as a consequence, reduced uncertainty in the estimate of number of undiscovered deposits. 
Exploration enterprises benefit from reduced areas requiring detailed and expensive exploration, and land-use planners benefit from reduced areas of concern. © 2008 International Association for Mathematical Geology.
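
    The zero-inflated Poisson model the authors propose has a simple closed form: a point mass at zero mixed with an ordinary Poisson. The sketch below uses illustrative parameters, not values from the assessment.

```python
import math

def zip_pmf(k, pi0, lam):
    """Zero-inflated Poisson: with probability pi0 the tract contributes a
    structural zero (e.g. a non-permissive setting swept into the map unit);
    otherwise the deposit count is Poisson(lam)."""
    pois = math.exp(-lam) * lam**k / math.factorial(k)
    return pi0 * (k == 0) + (1 - pi0) * pois

# coarser maps => more structural zeros => apparent clustering of deposits
p0_coarse = zip_pmf(0, pi0=0.3, lam=2.0)   # generalized map
p0_pois = zip_pmf(0, pi0=0.0, lam=2.0)     # plain Poisson baseline
```

    As mapping becomes more detailed, `pi0` shrinks toward zero and the count distribution approaches the plain Poisson, matching the paper's argument that apparent clustering should diminish with map detail.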

  3. Effect of Fiber Poisson Contraction on Matrix Multicracking Evolution of Fiber-Reinforced Ceramic-Matrix Composites

    NASA Astrophysics Data System (ADS)

    Longbiao, Li

    2015-12-01

    An analytical methodology has been developed to investigate the effect of fiber Poisson contraction on matrix multicracking evolution of fiber-reinforced ceramic-matrix composites (CMCs). The modified shear-lag model incorporated with the Coulomb friction law is adopted to solve the stress distribution in the interface slip region and intact region of the damaged composite. The critical matrix strain energy criterion which presupposes the existence of an ultimate or critical strain energy limit beyond which the matrix fails has been adopted to describe matrix multicracking of CMCs. As more energy is placed into the composite, matrix fractures and the interface debonding occurs to dissipate the extra energy. The interface debonded length under the process of matrix multicracking is obtained by treating the interface debonding as a particular crack propagation problem along the fiber/matrix interface. The effects of the interfacial frictional coefficient, fiber Poisson ratio, fiber volume fraction, interface debonded energy and cycle number on the interface debonding and matrix multicracking evolution have been analyzed. The theoretical results are compared with experimental data of unidirectional SiC/CAS, SiC/CAS-II and SiC/Borosilicate composites.

  4. Gaussian process-based Bayesian nonparametric inference of population size trajectories from gene genealogies.

    PubMed

    Palacios, Julia A; Minin, Vladimir N

    2013-03-01

    Changes in population size influence genetic diversity of the population and, as a result, leave a signature of these changes in individual genomes in the population. We are interested in the inverse problem of reconstructing past population dynamics from genomic data. We start with a standard framework based on the coalescent, a stochastic process that generates genealogies connecting randomly sampled individuals from the population of interest. These genealogies serve as a glue between the population demographic history and genomic sequences. It turns out that only the times of genealogical lineage coalescences contain information about population size dynamics. Viewing these coalescent times as a point process, estimating population size trajectories is equivalent to estimating a conditional intensity of this point process. Therefore, our inverse problem is similar to estimating an inhomogeneous Poisson process intensity function. We demonstrate how recent advances in Gaussian process-based nonparametric inference for Poisson processes can be extended to Bayesian nonparametric estimation of population size dynamics under the coalescent. We compare our Gaussian process (GP) approach to one of the state-of-the-art Gaussian Markov random field (GMRF) methods for estimating population trajectories. Using simulated data, we demonstrate that our method has better accuracy and precision. Next, we analyze two genealogies reconstructed from real sequences of hepatitis C and human influenza A viruses. In both cases, we recover more of the believed aspects of the viral demographic histories than the GMRF approach does. We also find that our GP method produces more reasonable uncertainty estimates than the GMRF method. Copyright © 2013, The International Biometric Society.

  5. A PET reconstruction formulation that enforces non-negativity in projection space for bias reduction in Y-90 imaging

    NASA Astrophysics Data System (ADS)

    Lim, Hongki; Dewaraja, Yuni K.; Fessler, Jeffrey A.

    2018-02-01

    Most existing PET image reconstruction methods impose a nonnegativity constraint in the image domain that is natural physically, but can lead to biased reconstructions. This bias is particularly problematic for Y-90 PET because of the low positron production probability and the high random coincidence fraction. This paper investigates a new PET reconstruction formulation that enforces nonnegativity of the projections instead of the voxel values. This formulation allows some negative voxel values, thereby potentially reducing bias. Unlike the previously reported NEG-ML approach that modifies the Poisson log-likelihood to allow negative values, the new formulation retains the classical Poisson statistical model. To relax the non-negativity constraint embedded in the standard methods for PET reconstruction, we used an alternating direction method of multipliers (ADMM). Because the choice of ADMM parameters can greatly influence the convergence rate, we applied an automatic parameter selection method to improve the convergence speed. We investigated the methods using lung-to-liver slices of the XCAT phantom. We simulated low true-coincidence count rates with high random fractions corresponding to typical values from patient imaging in Y-90 microsphere radioembolization. We compared our new method with standard reconstruction algorithms and with NEG-ML and a regularized version thereof. Both our new method and NEG-ML allow more accurate quantification in all volumes of interest while yielding lower noise than the standard method. The performance of NEG-ML can degrade when its user-defined parameter is tuned poorly, while the proposed algorithm is robust at any count level without requiring parameter tuning.

  6. A Martingale Characterization of Mixed Poisson Processes.

    DTIC Science & Technology

    1985-10-01

    Pfeifer, Dietmar (Technical University Aachen)

    Mixed Poisson processes play an important role in many branches of applied probability, for instance in insurance mathematics and physics (see Albrecht

  7. Generation of Non-Homogeneous Poisson Processes by Thinning: Programming Considerations and Comparision with Competing Algorithms.

    DTIC Science & Technology

    1978-12-01

    Poisson processes. The method is valid for Poisson processes with any given intensity function. The basic thinning algorithm is modified to exploit several refinements which reduce computer execution time by approximately one-third. The basic and modified thinning programs are compared with the Poisson-decomposition and gap-statistics algorithm, which is easily implemented for Poisson processes with intensity functions of the form exp(a0 + a1 t + a2 t^2). The thinning programs are competitive in both execution
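
    The basic thinning (Lewis-Shedler) algorithm is short enough to sketch: simulate a homogeneous process at a dominating rate, then accept each point with probability proportional to the target intensity. The intensity below uses the quadratic-exponential form mentioned in the abstract, with made-up coefficients.

```python
import numpy as np

def thin_nhpp(intensity, lam_max, T, rng):
    """Basic thinning: simulate a homogeneous Poisson process at the
    dominating rate lam_max on [0, T], then keep each point t with
    probability intensity(t) / lam_max."""
    n = rng.poisson(lam_max * T)
    t = np.sort(rng.uniform(0.0, T, size=n))
    keep = rng.uniform(size=n) < intensity(t) / lam_max
    return t[keep]

# intensity of the form exp(a0 + a1 t + a2 t^2), illustrative coefficients
a0, a1, a2 = 1.0, 0.5, -0.05
intensity = lambda t: np.exp(a0 + a1 * t + a2 * t**2)

rng = np.random.default_rng(2)
T = 10.0
lam_max = float(intensity(5.0))   # intensity peaks at t = -a1/(2*a2) = 5
events = thin_nhpp(intensity, lam_max, T, rng)
```

    The refinements mentioned in the report reduce the number of (expensive) intensity evaluations; the version above evaluates the intensity once per candidate point, which is the unrefined baseline.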

  8. Deformation mechanisms in negative Poisson's ratio materials - Structural aspects

    NASA Technical Reports Server (NTRS)

    Lakes, R.

    1991-01-01

    Poisson's ratio in materials is governed by the following aspects of the microstructure: the presence of rotational degrees of freedom, non-affine deformation kinematics, or anisotropic structure. Several structural models are examined. The non-affine kinematics are seen to be essential for the production of negative Poisson's ratios for isotropic materials containing central force linkages of positive stiffness. Non-central forces combined with pre-load can also give rise to a negative Poisson's ratio in isotropic materials. A chiral microstructure with non-central force interaction or non-affine deformation can also exhibit a negative Poisson's ratio. Toughness and damage resistance in these materials may be affected by the Poisson's ratio itself, as well as by generalized continuum aspects associated with the microstructure.

  9. Exact solution for the Poisson field in a semi-infinite strip.

    PubMed

    Cohen, Yossi; Rothman, Daniel H

    2017-04-01

    The Poisson equation is associated with many physical processes. Yet exact analytic solutions for the two-dimensional Poisson field are scarce. Here we derive an analytic solution for the Poisson equation with constant forcing in a semi-infinite strip. We provide a method that can be used to solve the field in other intricate geometries. We show that the Poisson flux exhibits an inverse square-root singularity at the tip of a slit, and identify a characteristic length scale within which a small perturbation, in the form of a new slit, is screened by the field. We suggest that this length scale expresses itself as a characteristic spacing between tips in real Poisson networks that grow in response to fluxes at tips.

  10. Probabilistic SSME blades structural response under random pulse loading

    NASA Technical Reports Server (NTRS)

    Shiao, Michael; Rubinstein, Robert; Nagpal, Vinod K.

    1987-01-01

    The purpose is to develop models of random impacts on a Space Shuttle Main Engine (SSME) turbopump blade and to predict the probabilistic structural response of the blade to these impacts. The random loading is caused by the impact of debris. The probabilistic structural response is characterized by distribution functions for stress and displacements as functions of the loading parameters which determine the random pulse model. These parameters include pulse arrival, amplitude, and location. The analysis can be extended to predict level crossing rates. This requires knowledge of the joint distribution of the response and its derivative. The model of random impacts chosen allows the pulse arrivals, pulse amplitudes, and pulse locations to be random. Specifically, the pulse arrivals are assumed to be governed by a Poisson process, which is characterized by a mean arrival rate. The pulse intensity is modelled as a normally distributed random variable with a zero mean chosen independently at each arrival. The standard deviation of the distribution is a measure of pulse intensity. Several different models were used for the pulse locations. For example, three points near the blade tip were chosen at which pulses were allowed to arrive with equal probability. Again, the locations were chosen independently at each arrival. The structural response was analyzed both by direct Monte Carlo simulation and by a semi-analytical method.
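
    The random pulse model described, Poisson arrivals, zero-mean normal amplitudes, and equiprobable impact locations, can be sketched directly. All numerical values below are illustrative assumptions, not parameters from the SSME study.

```python
import numpy as np

rng = np.random.default_rng(7)

T = 1.0          # observation window (s); all values here are illustrative
nu = 50.0        # mean arrival rate of debris impacts (Poisson process)
sigma = 2.0      # std. dev. of the zero-mean normal pulse intensity
tip_points = np.array([0, 1, 2])   # three equiprobable points near the tip

def simulate_impacts(rng):
    """One realization of the random pulse load: Poisson arrivals with
    independent normal amplitudes and equiprobable locations."""
    n = rng.poisson(nu * T)
    times = np.sort(rng.uniform(0.0, T, size=n))
    amps = rng.normal(0.0, sigma, size=n)
    locs = rng.choice(tip_points, size=n)
    return times, amps, locs

# Monte Carlo over many load histories; in the study each realization would
# be pushed through the structural model to build stress distributions
n_events = [len(simulate_impacts(rng)[0]) for _ in range(2000)]
```

    Conditioning on the number of arrivals and drawing uniform order statistics, as done here, is equivalent to simulating the Poisson arrival process directly.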

  11. A computer program to generate two-dimensional grids about airfoils and other shapes by the use of Poisson's equation

    NASA Technical Reports Server (NTRS)

    Sorenson, R. L.

    1980-01-01

    A method for generating two dimensional finite difference grids about airfoils and other shapes by the use of the Poisson differential equation is developed. The inhomogeneous terms are automatically chosen such that two important effects are imposed on the grid at both the inner and outer boundaries. The first effect is control of the spacing between mesh points along mesh lines intersecting the boundaries. The second effect is control of the angles with which mesh lines intersect the boundaries. A FORTRAN computer program has been written to use this method. A description of the program, a discussion of the control parameters, and a set of sample cases are included.

  12. Pattern analysis of community health center location in Surabaya using spatial Poisson point process

    NASA Astrophysics Data System (ADS)

    Kusumaningrum, Choriah Margareta; Iriawan, Nur; Winahju, Wiwiek Setya

    2017-11-01

    Community health centers (puskesmas) are among the closest health service facilities for the community, providing healthcare at the sub-district level as government-mandated community health clinics located across Indonesia. The increasing number of puskesmas does not directly ensure the fulfillment of the basic health services needed in a region. Ideally, a puskesmas should cover at most 30,000 people. The number of puskesmas in Surabaya indicates an unbalanced spread across the area. This research aims to analyze the spread of puskesmas in Surabaya using a spatial Poisson point process model in order to identify effective locations for Surabaya's puskesmas. The results of the analysis showed that the distribution pattern of puskesmas in Surabaya is a non-homogeneous Poisson process and is approached by a mixture Poisson model. Based on the model estimated using a Bayesian mixture model coupled with MCMC, some characteristics of each puskesmas have no significant influence as factors for deciding on the addition of a health center at a given location. Some factors related to the areas of sub-districts have to be considered as covariates when deciding whether to add puskesmas in Surabaya.

  13. Poisson-Gaussian Noise Analysis and Estimation for Low-Dose X-ray Images in the NSCT Domain.

    PubMed

    Lee, Sangyoon; Lee, Min Seok; Kang, Moon Gi

    2018-03-29

    The noise distribution of images obtained by X-ray sensors in low-dosage situations can be analyzed using the Poisson and Gaussian mixture model. Multiscale conversion is one of the most popular noise reduction methods used in recent years. Estimation of the noise distribution of each subband in the multiscale domain is the most important factor in performing noise reduction, with non-subsampled contourlet transform (NSCT) representing an effective method for scale and direction decomposition. In this study, we use artificially generated noise to analyze and estimate the Poisson-Gaussian noise of low-dose X-ray images in the NSCT domain. The noise distribution of the subband coefficients is analyzed using the noiseless low-band coefficients and the variance of the noisy subband coefficients. The noise-after-transform also follows a Poisson-Gaussian distribution, and the relationship between the noise parameters of the subband and the full-band image is identified. We then analyze noise of actual images to validate the theoretical analysis. Comparison of the proposed noise estimation method with an existing noise reduction method confirms that the proposed method outperforms traditional methods.

  14. SMPBS: Web server for computing biomolecular electrostatics using finite element solvers of size modified Poisson-Boltzmann equation.

    PubMed

    Xie, Yang; Ying, Jinyong; Xie, Dexuan

    2017-03-30

    SMPBS (Size Modified Poisson-Boltzmann Solvers) is a web server for computing biomolecular electrostatics using finite element solvers of the size modified Poisson-Boltzmann equation (SMPBE). SMPBE not only reflects ionic size effects but also includes the classic Poisson-Boltzmann equation (PBE) as a special case. Thus, its web server is expected to have a broader range of applications than a PBE web server. SMPBS is designed with a dynamic, mobile-friendly user interface, and features easily accessible help text, asynchronous data submission, and an interactive, hardware-accelerated molecular visualization viewer based on the 3Dmol.js library. In particular, the viewer allows computed electrostatics to be directly mapped onto an irregular triangular mesh of a molecular surface. Due to this functionality and the fast SMPBE finite element solvers, the web server is very efficient in the calculation and visualization of electrostatics. In addition, SMPBE is reconstructed using a new objective electrostatic free energy, clearly showing that the electrostatics and ionic concentrations predicted by SMPBE are optimal in the sense of minimizing the objective electrostatic free energy. SMPBS is available at the URL: smpbs.math.uwm.edu © 2017 Wiley Periodicals, Inc.

  15. Poisson-Gaussian Noise Reduction Using the Hidden Markov Model in Contourlet Domain for Fluorescence Microscopy Images

    PubMed Central

    Yang, Sejung; Lee, Byung-Uk

    2015-01-01

    In certain image acquisition processes, as in fluorescence microscopy or astronomy, only a limited number of photons can be collected due to various physical constraints. The resulting images suffer from signal-dependent noise, which can be modeled as a Poisson distribution, and a low signal-to-noise ratio. However, the majority of research on noise reduction algorithms focuses on signal-independent Gaussian noise. In this paper, we model noise as a combination of Poisson and Gaussian probability distributions to construct a more accurate model, and adopt the contourlet transform, which provides a sparse representation of the directional components in images. We also apply hidden Markov models with a framework that neatly describes the spatial and interscale dependencies that are properties of the transform coefficients of natural images. In this paper, an effective denoising algorithm for Poisson-Gaussian noise is proposed using the contourlet transform, hidden Markov models, and noise estimation in the transform domain. We supplement the algorithm with cycle spinning and Wiener filtering for further improvements. We finally show experimental results with simulations and fluorescence microscopy images that demonstrate the improved performance of the proposed approach. PMID:26352138
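
    The Poisson-Gaussian observation model referred to here, also used by the NSCT paper above, is easy to write down: signal-dependent shot noise plus additive read noise. This minimal numpy sketch uses an illustrative gain and noise level; on a flat patch the measured variance should follow gain * mean + sigma^2.

```python
import numpy as np

rng = np.random.default_rng(3)

def add_poisson_gaussian(img, gain=1.0, sigma=2.0, rng=rng):
    """Poisson-Gaussian observation model: signal-dependent shot noise
    (Poisson photon counts scaled by the sensor gain) plus additive
    zero-mean Gaussian read noise."""
    shot = gain * rng.poisson(img / gain)
    return shot + rng.normal(0.0, sigma, size=img.shape)

flat = np.full((512, 512), 100.0)      # flat patch at intensity 100
noisy = add_poisson_gaussian(flat)
# for a flat region: Var[noisy] ~ gain * mean + sigma**2  (here 100 + 4)
```

    Fitting that affine mean-variance relationship across patches is one standard way to estimate the two noise parameters from a single image.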

  16. Fuzzy classifier based support vector regression framework for Poisson ratio determination

    NASA Astrophysics Data System (ADS)

    Asoodeh, Mojtaba; Bagheripour, Parisa

    2013-09-01

    Poisson ratio is considered as one of the most important rock mechanical properties of hydrocarbon reservoirs. Determination of this parameter through laboratory measurement is time, cost, and labor intensive. Furthermore, laboratory measurements do not provide continuous data along the reservoir intervals. Hence, a fast, accurate, and inexpensive way of determining Poisson ratio which produces continuous data over the whole reservoir interval is desirable. For this purpose, support vector regression (SVR) method based on statistical learning theory (SLT) was employed as a supervised learning algorithm to estimate Poisson ratio from conventional well log data. SVR is capable of accurately extracting the implicit knowledge contained in conventional well logs and converting the gained knowledge into Poisson ratio data. Structural risk minimization (SRM) principle which is embedded in the SVR structure in addition to empirical risk minimization (EMR) principle provides a robust model for finding quantitative formulation between conventional well log data and Poisson ratio. Although satisfying results were obtained from an individual SVR model, it had flaws of overestimation in low Poisson ratios and underestimation in high Poisson ratios. These errors were eliminated through implementation of fuzzy classifier based SVR (FCBSVR). The FCBSVR significantly improved accuracy of the final prediction. This strategy was successfully applied to data from carbonate reservoir rocks of an Iranian Oil Field. Results indicated that SVR predicted Poisson ratio values are in good agreement with measured values.

  17. Performance of the modified Poisson regression approach for estimating relative risks from clustered prospective data.

    PubMed

    Yelland, Lisa N; Salter, Amy B; Ryan, Philip

    2011-10-15

    Modified Poisson regression, which combines a log Poisson regression model with robust variance estimation, is a useful alternative to log binomial regression for estimating relative risks. Previous studies have shown both analytically and by simulation that modified Poisson regression is appropriate for independent prospective data. This method is often applied to clustered prospective data, despite a lack of evidence to support its use in this setting. The purpose of this article is to evaluate the performance of the modified Poisson regression approach for estimating relative risks from clustered prospective data, by using generalized estimating equations to account for clustering. A simulation study is conducted to compare log binomial regression and modified Poisson regression for analyzing clustered data from intervention and observational studies. Both methods generally perform well in terms of bias, type I error, and coverage. Unlike log binomial regression, modified Poisson regression is not prone to convergence problems. The methods are contrasted by using example data sets from 2 large studies. The results presented in this article support the use of modified Poisson regression as an alternative to log binomial regression for analyzing clustered prospective data when clustering is taken into account by using generalized estimating equations.
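
    For independent data, the modified Poisson approach can be sketched from scratch: fit a log-link Poisson model to the binary outcome by IRLS, then replace the model-based variance with the robust sandwich estimator. The clustered/GEE extension studied in the paper is not shown, and the simulated data and coefficients below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(11)

# binary outcome with a log link: P(Y=1 | x) = exp(b0 + b1 * x)
n = 20000
x = rng.integers(0, 2, size=n).astype(float)
y = (rng.uniform(size=n) < np.exp(-1.5 + 0.4 * x)).astype(float)

X = np.column_stack([np.ones(n), x])
beta = np.zeros(2)
for _ in range(50):                        # IRLS for the log-Poisson model
    mu = np.exp(X @ beta)
    z = X @ beta + (y - mu) / mu           # working response
    beta = np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (mu * z))

# robust "sandwich" variance in place of the model-based Poisson variance
mu = np.exp(X @ beta)
bread = X.T @ (mu[:, None] * X)
meat = X.T @ (((y - mu) ** 2)[:, None] * X)
vcov = np.linalg.solve(bread, np.linalg.solve(bread, meat).T)

rr_hat = np.exp(beta[1])                   # relative risk estimate
se_log_rr = np.sqrt(vcov[1, 1])            # robust standard error
```

    The sandwich correction is what makes the Poisson working model valid for binary outcomes: the Poisson variance mu overstates the Bernoulli variance mu(1 - mu), and the robust estimator repairs the standard errors.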

  18. A generalized Poisson and Poisson-Boltzmann solver for electrostatic environments.

    PubMed

    Fisicaro, G; Genovese, L; Andreussi, O; Marzari, N; Goedecker, S

    2016-01-07

    The computational study of chemical reactions in complex, wet environments is critical for applications in many fields. It is often essential to study chemical reactions in the presence of applied electrochemical potentials, taking into account the non-trivial electrostatic screening coming from the solvent and the electrolytes. As a consequence, the electrostatic potential has to be found by solving the generalized Poisson and the Poisson-Boltzmann equations for neutral and ionic solutions, respectively. In the present work, solvers for both problems have been developed. A preconditioned conjugate gradient method has been implemented for the solution of the generalized Poisson equation and the linear regime of the Poisson-Boltzmann equation, allowing the minimization problem to be solved iteratively with about ten iterations of the ordinary Poisson equation solver. In addition, a self-consistent procedure enables us to solve the non-linear Poisson-Boltzmann problem. Both solvers exhibit very high accuracy and parallel efficiency and allow for the treatment of periodic, free, and slab boundary conditions. The solver has been integrated into the BigDFT and Quantum-ESPRESSO electronic-structure packages and will be released as an independent program, suitable for integration in other codes.
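    The core iteration named above — conjugate gradients applied to a Poisson operator — can be sketched in a few lines. This is a plain, matrix-free 1-D finite-difference example with Dirichlet boundaries, not the paper's generalized (dielectric) solver:

```python
import numpy as np

def cg(apply_A, b, tol=1e-10, maxiter=500):
    """Plain conjugate gradient for a symmetric positive-definite operator."""
    x = np.zeros_like(b)
    r = b - apply_A(x)
    p = r.copy()
    rs = r @ r
    for _ in range(maxiter):
        Ap = apply_A(p)
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

n = 199                              # interior grid points on (0, 1)
h = 1.0 / (n + 1)
xg = np.linspace(h, 1 - h, n)

def apply_A(u):                      # matrix-free -u'' with u = 0 at both ends
    Au = 2 * u.copy()
    Au[:-1] -= u[1:]                 # subtract right neighbor
    Au[1:] -= u[:-1]                 # subtract left neighbor
    return Au / h**2

f = np.pi**2 * np.sin(np.pi * xg)    # -u'' = f has exact solution sin(pi x)
u = cg(apply_A, f)
```

    The recovered solution matches sin(πx) up to the O(h²) discretization error of the 3-point Laplacian.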

  19. A Review of Multivariate Distributions for Count Data Derived from the Poisson Distribution.

    PubMed

    Inouye, David; Yang, Eunho; Allen, Genevera; Ravikumar, Pradeep

    2017-01-01

    The Poisson distribution has been widely studied and used for modeling univariate count-valued data. Multivariate generalizations of the Poisson distribution that permit dependencies, however, have been far less popular. Yet, real-world high-dimensional count-valued data found in word counts, genomics, and crime statistics, for example, exhibit rich dependencies, and motivate the need for multivariate distributions that can appropriately model this data. We review multivariate distributions derived from the univariate Poisson, categorizing these models into three main classes: 1) where the marginal distributions are Poisson, 2) where the joint distribution is a mixture of independent multivariate Poisson distributions, and 3) where the node-conditional distributions are derived from the Poisson. We discuss the development of multiple instances of these classes and compare the models in terms of interpretability and theory. Then, we empirically compare multiple models from each class on three real-world datasets that have varying data characteristics from different domains, namely traffic accident data, biological next generation sequencing data, and text data. These empirical experiments develop intuition about the comparative advantages and disadvantages of each class of multivariate distribution that was derived from the Poisson. Finally, we suggest new research directions as explored in the subsequent discussion section.
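    One classical construction in the first class above (Poisson marginals with built-in dependence) is the common-shock, or trivariate-reduction, model; its covariance equals the rate of the shared component and is easy to verify numerically (parameters here are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000
lam1, lam2, lam0 = 1.0, 1.5, 2.0     # lam0 is the shared ("common shock") rate

y0 = rng.poisson(lam0, n)            # common shock
x1 = rng.poisson(lam1, n) + y0       # marginally Poisson(lam1 + lam0)
x2 = rng.poisson(lam2, n) + y0       # marginally Poisson(lam2 + lam0)

cov = np.cov(x1, x2)[0, 1]           # theory: Cov(X1, X2) = lam0
```

    A known limitation, noted in reviews of this family, is that the construction can only produce non-negative correlation.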

  20. A generalized Poisson and Poisson-Boltzmann solver for electrostatic environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fisicaro, G., E-mail: giuseppe.fisicaro@unibas.ch; Goedecker, S.; Genovese, L.

    2016-01-07

    The computational study of chemical reactions in complex, wet environments is critical for applications in many fields. It is often essential to study chemical reactions in the presence of applied electrochemical potentials, taking into account the non-trivial electrostatic screening coming from the solvent and the electrolytes. As a consequence, the electrostatic potential has to be found by solving the generalized Poisson and the Poisson-Boltzmann equations for neutral and ionic solutions, respectively. In the present work, solvers for both problems have been developed. A preconditioned conjugate gradient method has been implemented for the solution of the generalized Poisson equation and the linear regime of the Poisson-Boltzmann equation, allowing the minimization problem to be solved iteratively with about ten iterations of the ordinary Poisson equation solver. In addition, a self-consistent procedure enables us to solve the non-linear Poisson-Boltzmann problem. Both solvers exhibit very high accuracy and parallel efficiency and allow for the treatment of periodic, free, and slab boundary conditions. The solver has been integrated into the BigDFT and Quantum-ESPRESSO electronic-structure packages and will be released as an independent program, suitable for integration in other codes.

  1. Brief Report: HIV Assisted Partner Services Among Those With and Without a History of Intimate Partner Violence in Kenya.

    PubMed

    Goyette, Marielle S; Mutiti, Peter M; Bukusi, David; Wamuti, Beatrice M; Otieno, Felix A; Cherutich, Peter; Golden, Matthew R; Spiegel, Hans; Richardson, Barbra A; Ngʼangʼa, Anne; Farquhar, Carey

    2018-05-01

    HIV assisted partner services (APS) are a notification and testing strategy for sex partners of HIV-infected index patients. This cluster-randomized controlled trial secondary data analysis investigated whether history of intimate partner violence (IPV) modified APS effectiveness and risk of relationship dissolution. Eighteen HIV testing and counseling sites in Kenya randomized to provide immediate APS (intervention) or APS delayed for 6 weeks (control). History of IPV was ascertained at study enrollment and defined as reporting ever experiencing physical or sexual IPV. Those reporting IPV in the month before enrollment were excluded. We tested whether history of IPV modified intervention effectiveness and risk of relationship dissolution using population-averaged Poisson and log-binomial generalized estimating equation models. Exploratory analyses investigated associations between history of IPV and events that occurred after HIV diagnosis using log-binomial generalized estimating equation models. The study enrolled 1119 index participants and 1286 partners. Among index participants, 81 (7%) had history of IPV. History of IPV did not modify APS effectiveness in testing, newly diagnosing, or linking partners to care. History of IPV did not modify the association between receiving immediate APS and relationship dissolution during the study. Among participants who had not experienced IPV in the last month but had experienced IPV in their lifetimes, our results suggest that APS is an effective and safe partner notification strategy in Kenya. As APS is scaled up in different contexts, these data support including those reporting past IPV and closely monitoring adverse events.

  2. An Artesunate-Containing Antimalarial Treatment Regimen Did Not Suppress Cytomegalovirus Viremia

    PubMed Central

    Gantt, Soren; Huang, Meei-Li; Magaret, Amalia; Bunts, Lisa; Selke, Stacy; Wald, Anna; Rosenthal, Philip J.; Dorsey, Grant; Casper, Corey

    2014-01-01

    Background Additional drugs are needed for the treatment of cytomegalovirus (CMV) infection. Artesunate is an antimalarial drug that has activity against CMV in vitro and in a rodent model. Only a small number of case reports are available describing the clinical effects of artesunate on CMV infection, and these yielded inconsistent results. Objective To evaluate the effect of artesunate on CMV infection, using blood samples collected from children who participated in malaria treatment trials. Study design Quantitative CMV DNA PCR was performed on dried blood spots collected from 494 Ugandan children, who were randomized either to artesunate plus amodiaquine or sulfadoxine-pyrimethamine plus amodiaquine for acute malaria infection. Poisson regression was used to compare treatment regimens with respect to the change in the frequency and quantity of CMV detected that occurred before and after treatment. Results CMV was detected in 11.4% of children immediately prior to treatment and 10.7% 3 days later (p=0.70). The average quantity of CMV was 0.30 log10 copies per million cells higher on day 3 than at treatment initiation (95% CI 0.01 to 0.58, p=0.041). There was no measurable difference in either the frequency or quantity of CMV detected in blood between children randomized to the two treatment arms. Conclusions A standard 3-day artesunate-containing antimalarial regimen had no detectable effect on CMV viremia in children with malaria. Longer treatment courses and/or higher doses of artesunate than those routinely used for malaria may be required for effective treatment of CMV infection. PMID:23827788

  3. Does progressive resistance and balance exercise reduce falls in residential aged care? Randomized controlled trial protocol for the SUNBEAM program.

    PubMed

    Hewitt, Jennifer; Refshauge, Kathryn M; Goodall, Stephen; Henwood, Timothy; Clemson, Lindy

    2014-01-01

    Falls are common among older adults. It is reported that approximately 60% of residents of aged care facilities fall each year. This is a major cause of morbidity and mortality, and a significant burden for health care providers and the health system. Among community dwelling older adults, exercise appears to be an effective countermeasure, but data are limited and inconsistent among studies in residents of aged care communities. This trial has been designed to evaluate whether the SUNBEAM program (Strength and Balance Exercise in Aged Care) reduces falls in residents of aged care facilities. Is the program more effective and cost-effective than usual care for the prevention of falls? Single-blinded, two group, cluster randomized trial. 300 residents, living in 20 aged care facilities. Progressive resistance and balance training under the guidance of a physiotherapist for 6 months, then facility-guided maintenance training for 6 months. Usual care. Number of falls, number of fallers, quality of life, mobility, balance, fear of falling, cognitive well-being, resource use, and cost-effectiveness. Measurements will be taken at baseline, 6 months, and 12 months. The number of falls will be analyzed using a Poisson mixed model. A logistic mixed model will be used to analyze the number of residents who fall during the study period. Intention-to-treat analysis will be used. This study addresses a significant shortcoming in aged care research, and has potential to impact upon a substantial health care problem. Outcomes will be used to inform care providers, and guide health care policies.

  4. Poisson Coordinates.

    PubMed

    Li, Xian-Ying; Hu, Shi-Min

    2013-02-01

    Harmonic functions are the critical points of a Dirichlet energy functional, the linear projections of conformal maps. They play an important role in computer graphics, particularly for gradient-domain image processing and shape-preserving geometric computation. We propose Poisson coordinates, a novel transfinite interpolation scheme based on the Poisson integral formula, as a rapid way to estimate a harmonic function on a certain domain with desired boundary values. Poisson coordinates are an extension of the Mean Value coordinates (MVCs) which inherit their linear precision, smoothness, and kernel positivity. We give explicit formulas for Poisson coordinates in both continuous and 2D discrete forms. Superior to MVCs, Poisson coordinates are proved to be pseudoharmonic (i.e., they reproduce harmonic functions on n-dimensional balls). Our experimental results show that Poisson coordinates have lower Dirichlet energies than MVCs on a number of typical 2D domains (particularly convex domains). As well as presenting a formula, our approach provides useful insights for further studies on coordinates-based interpolation and fast estimation of harmonic functions.
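    The Poisson integral formula that Poisson coordinates are built on can be checked directly on the unit disk, where the kernel has a closed form (this sketch is the disk case only, not the transfinite scheme for general domains):

```python
import numpy as np

def poisson_interp(f_boundary, r, theta, m=2000):
    """Evaluate the harmonic extension of boundary data on the unit disk
    via the Poisson integral formula (trapezoidal rule on the circle)."""
    phi = np.linspace(0.0, 2 * np.pi, m, endpoint=False)
    kernel = (1 - r**2) / (1 - 2 * r * np.cos(theta - phi) + r**2)
    return np.mean(kernel * f_boundary(phi))

# boundary values of the harmonic function u(r, theta) = r^2 cos(2 theta)
f = lambda phi: np.cos(2 * phi)
u = poisson_interp(f, r=0.5, theta=0.3)   # should reproduce 0.25 * cos(0.6)
```

    Reproducing harmonic functions exactly on balls is precisely the pseudoharmonic property the paper proves for Poisson coordinates.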

  5. An examination of sources of sensitivity of consumer surplus estimates in travel cost models.

    PubMed

    Blaine, Thomas W; Lichtkoppler, Frank R; Bader, Timothy J; Hartman, Travis J; Lucente, Joseph E

    2015-03-15

    We examine the sensitivity of estimates of recreation demand using the Travel Cost Method (TCM) to four factors. Three of the four have been routinely and widely discussed in the TCM literature: a) Poisson versus negative binomial regression; b) application of the Englin correction to account for endogenous stratification; c) truncation of the data set to eliminate outliers. A fourth issue we address has not been widely modeled: the potential effect on recreation demand of the interaction between income and travel cost. We provide a straightforward comparison of all four factors, analyzing the impact of each on regression parameters and consumer surplus estimates. Truncation has a modest effect on estimates obtained from the Poisson models but a radical effect on the estimates obtained by way of the negative binomial. Inclusion of an income-travel cost interaction term generally produces a more conservative, but not statistically significantly different, estimate of consumer surplus in both Poisson and negative binomial models. It also generates broader confidence intervals. Application of truncation, the Englin correction, and the income-travel cost interaction produced the most conservative estimates of consumer surplus and eliminated the statistical difference between the Poisson and the negative binomial. Use of the income-travel cost interaction term reveals that for visitors who face relatively low travel costs, the relationship between income and travel demand is negative, while it is positive for those who face high travel costs. This provides an explanation of the ambiguities in the findings regarding the role of income widely observed in the TCM literature. Our results suggest that policies that reduce access to publicly owned resources inordinately impact local low-income recreationists and are contrary to environmental justice. Copyright © 2014 Elsevier Ltd. All rights reserved.
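    The sign-flip logic of the income-travel cost interaction can be made concrete with hypothetical coefficients (illustrative values only, not estimates from the study): in a log-linear count model, the marginal effect of income on log-demand is the income coefficient plus the interaction coefficient times travel cost, so its sign changes at a threshold cost.

```python
import numpy as np

# hypothetical coefficients for log E[trips] = b0 + bc*cost + bi*inc + bx*cost*inc
b_inc, b_interact = -0.02, 0.004

# marginal effect of income on log-demand depends on travel cost
cost = np.array([1.0, 10.0])             # a low-cost and a high-cost visitor
effect = b_inc + b_interact * cost       # negative at low cost, positive at high

threshold = -b_inc / b_interact          # cost at which the effect changes sign
```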

  6. Fedosov’s formal symplectic groupoids and contravariant connections

    NASA Astrophysics Data System (ADS)

    Karabegov, Alexander V.

    2006-10-01

    Using Fedosov's approach, we give a geometric construction of a formal symplectic groupoid over any Poisson manifold endowed with a torsion-free Poisson contravariant connection. In the case of Kähler-Poisson manifolds this construction provides, in particular, the formal symplectic groupoids with separation of variables. We show that the dual of a semisimple Lie algebra does not admit torsion-free Poisson contravariant connections.

  7. Complete synchronization of the global coupled dynamical network induced by Poisson noises.

    PubMed

    Guo, Qing; Wan, Fangyi

    2017-01-01

    Poisson noise-induced complete synchronization of the globally coupled dynamical network is investigated. Based on the stability theory of stochastic differential equations driven by Poisson processes, we prove that Poisson noises can induce synchronization, and sufficient conditions are established to achieve complete synchronization with probability 1. Furthermore, numerical examples are provided to show the agreement between theoretical and numerical analysis.
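    A minimal illustration of the mechanism (not the paper's model): two linearly coupled scalar systems driven by a common Poisson jump process. Their difference contracts whenever the deterministic decay rate outpaces the average jump amplification, which is the shape of the sufficient conditions described above. Parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
dt, T = 1e-3, 10.0
steps = int(T / dt)
k, lam, s = 2.0, 1.0, 1.0          # coupling strength, jump rate, jump size

x1, x2 = 1.0, -1.0                 # distinct initial states
for _ in range(steps):
    jump = rng.random() < lam * dt         # common Poisson driving event
    dx1 = (-x1 + k * (x2 - x1)) * dt       # Euler step, node 1
    dx2 = (-x2 + k * (x1 - x2)) * dt       # Euler step, node 2
    x1 += dx1
    x2 += dx2
    if jump:                               # identical multiplicative kick
        x1 += s * x1
        x2 += s * x2

err = abs(x1 - x2)                 # synchronization error
```

    Here the error obeys de = -(1 + 2k)e dt between jumps and doubles at each jump, so it vanishes almost surely when 1 + 2k exceeds the jump rate times ln(1 + s).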

  8. Measured iron-gallium alloy tensile properties under magnetic fields

    NASA Astrophysics Data System (ADS)

    Yoo, Jin-Hyeong; Flatau, Alison B.

    2004-07-01

    Tension testing is used to identify Galfenol material properties under low-level DC magnetic bias fields. Dog-bone-shaped specimens of single crystal Fe100-xGax, where 17<=x<=33, underwent tensile testing along two crystallographic axis orientations, [110] and [100]. The material properties investigated and calculated from measured quantities are Young's modulus and Poisson's ratio. Data are presented that demonstrate the dependence of these material properties on applied magnetic field levels and provide a preliminary assessment of the trends in material properties for performance under varied operating conditions. The elastic properties of Fe-Ga alloys were observed to be increasingly anisotropic with rising Ga content for the stoichiometries examined. The largest elastic anisotropies were manifested in [110] Poisson's ratios of as low as -0.63 in one specimen. This negative Poisson's ratio creates a significant in-plane auxetic behavior that could be exploited in applications that capitalize on unique area effects produced under uniaxial loading.

  9. A comparative study of count models: application to pedestrian-vehicle crashes along Malaysia federal roads.

    PubMed

    Hosseinpour, Mehdi; Pour, Mehdi Hossein; Prasetijo, Joewono; Yahaya, Ahmad Shukri; Ghadiri, Seyed Mohammad Reza

    2013-01-01

    The objective of this study was to examine the effects of various roadway characteristics on the incidence of pedestrian-vehicle crashes by developing a set of crash prediction models on 543 km of Malaysia federal roads over a 4-year time span between 2007 and 2010. Four count models including the Poisson, negative binomial (NB), hurdle Poisson (HP), and hurdle negative binomial (HNB) models were developed and compared to model the number of pedestrian crashes. The results indicated the presence of overdispersion in the pedestrian crashes (PCs) and showed that it is due to excess zeros rather than variability in the crash data. To handle the issue, the hurdle Poisson model was found to be the best of the considered models in terms of comparative measures. Moreover, the variables average daily traffic, heavy vehicle traffic, speed limit, land use, and area type were significantly associated with PCs.
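    The hurdle Poisson model named above combines a point mass at zero with a zero-truncated Poisson for positive counts; a small sketch of its probability mass function (parameter values arbitrary):

```python
from math import exp, factorial

def hurdle_poisson_pmf(k, pi0, lam):
    """P(0) = pi0; positive counts follow a zero-truncated Poisson(lam)."""
    if k == 0:
        return pi0
    pois = exp(-lam) * lam**k / factorial(k)
    return (1 - pi0) * pois / (1 - exp(-lam))   # renormalize over k >= 1

pmf = [hurdle_poisson_pmf(k, pi0=0.4, lam=2.5) for k in range(100)]
```

    Because the zero probability is a free parameter, the model accommodates the excess zeros driving the overdispersion reported in this study; its mean is (1 - pi0) * lam / (1 - exp(-lam)).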

  10. Simulation of the Formation of DNA Double Strand Breaks and Chromosome Aberrations in Irradiated Cells

    NASA Technical Reports Server (NTRS)

    Plante, Ianik; Ponomarev, Artem L.; Wu, Honglu; Blattnig, Steve; George, Kerry

    2014-01-01

    The formation of DNA double-strand breaks (DSBs) and chromosome aberrations is an important consequence of ionizing radiation. To simulate DNA double-strand breaks and the formation of chromosome aberrations, we have recently merged the codes RITRACKS (Relativistic Ion Tracks) and NASARTI (NASA Radiation Track Image). The program RITRACKS is a stochastic code developed to simulate detailed event-by-event radiation track structure. It is used to calculate the dose in voxels of 20 nm in a volume containing simulated chromosomes. The number of tracks in the volume is calculated for each simulation by sampling a Poisson distribution, with the distribution parameter obtained from the irradiation dose, ion type, and energy. The program NASARTI generates the chromosomes present in a cell nucleus by random walks with 20 nm steps, corresponding to the size of the dose voxels. The generated chromosomes are located within domains which may intertwine, and each segment of the random walks corresponds to approximately 2,000 DNA base pairs. NASARTI uses the pre-calculated dose at each voxel to calculate the probability of DNA damage at each random walk segment. Using the locations of double-strand breaks, possible rejoining between damaged segments is evaluated. This yields various types of chromosome aberrations, including deletions, inversions, exchanges, etc. By performing the calculations using various types of radiation, it will be possible to obtain relative biological effectiveness (RBE) values for several types of chromosome aberrations.
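    The Poisson sampling step above (drawing the number of tracks per volume from a mean set by the dose) is simple to sketch; the dose-to-track conversion factor below is a hypothetical placeholder, not a value from the paper:

```python
import numpy as np

rng = np.random.default_rng(5)

dose_gy = 1.0             # absorbed dose (Gy)
tracks_per_gy = 4.0       # HYPOTHETICAL conversion factor (depends on ion, energy, geometry)
mu = dose_gy * tracks_per_gy

# one Poisson draw of the track count per simulated cell nucleus
n_tracks = rng.poisson(mu, size=10_000)
```

    The Poisson form reflects the physical assumption that individual ion traversals of the nucleus are independent events at fixed fluence.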

  11. A multiscale filter for noise reduction of low-dose cone beam projections

    NASA Astrophysics Data System (ADS)

    Yao, Weiguang; Farr, Jonathan B.

    2015-08-01

    The Poisson or compound Poisson process governs the randomness of photon fluence in cone beam computed tomography (CBCT) imaging systems. The probability density function depends on the mean (noiseless) fluence at a given detector. This dependence indicates the natural requirement of multiscale filters to smooth noise while preserving structures of the imaged object on the low-dose cone beam projection. In this work, we used a Gaussian filter, exp(−x²/(2σ_f²)), as the multiscale filter to de-noise the low-dose cone beam projections. We analytically obtained the expression for σ_f, which represents the scale of the filter, by minimizing the local noise-to-signal ratio. We analytically derived the variance of the residual noise from the Poisson or compound Poisson processes after Gaussian filtering. From the derived analytical form of the variance of the residual noise, the optimal σ_f² is proved to be proportional to the noiseless fluence and modulated by the local structure strength, expressed as the linear fitting error of the structure. A strategy was used to obtain a reliable linear fitting error: smoothing the projection along the longitudinal direction to calculate the linear fitting error along the lateral direction, and vice versa. The performance of our multiscale filter was examined on low-dose cone beam projections of a Catphan phantom and a head-and-neck patient. After performing the filter on the Catphan phantom projections scanned with pulse time 4 ms, the number of visible line pairs was similar to that scanned with 16 ms, and the contrast-to-noise ratio of the inserts was on average about 64% higher than that scanned with 16 ms. For the simulated head-and-neck patient projections with pulse time 4 ms, the visibility of soft tissue structures in the patient was comparable to that scanned with 20 ms. The image processing took less than 0.5 s per projection with 1024 × 768 pixels.
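    A toy 1-D version of the idea: a Gaussian filter whose scale grows with the (estimated) fluence, so bright, noisier regions get smoothed more. The linear σ map here is an illustrative assumption, not the paper's derived optimum:

```python
import numpy as np

def gaussian_smooth_varying(signal, sigmas):
    """Smooth each sample with its own Gaussian scale (weights normalized)."""
    n = len(signal)
    idx = np.arange(n)
    out = np.empty(n)
    for i in range(n):
        w = np.exp(-0.5 * ((idx - i) / sigmas[i]) ** 2)
        out[i] = (w * signal).sum() / w.sum()
    return out

rng = np.random.default_rng(2)
clean = np.where(np.arange(200) < 100, 10.0, 100.0)   # step in "fluence"
noisy = rng.poisson(clean).astype(float)              # Poisson photon noise

# ILLUSTRATIVE sigma map: larger scale where the observed fluence is larger
sigmas = 0.5 + 0.02 * noisy
den = gaussian_smooth_varying(noisy, sigmas)
```

    Because the weights are normalized, a constant signal passes through unchanged; in the bright flat region the residual noise variance drops well below the raw Poisson variance.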

  12. Vitamin D3 Supplementation and Childhood Diarrhea: A Randomized Controlled Trial

    PubMed Central

    Maroof, Zabihullah; Chandramohan, Daniel; Bruce, Jane; Mughal, M. Zulf; Bhutta, Zulfiqar; Walraven, Gijs; Masher, Mohammad I.; Ensink, Jeroen H.J.; Manaseki-Holland, Semira

    2013-01-01

    OBJECTIVE: To investigate the effect of vitamin D3 supplementation on the incidence and risk for first and recurrent diarrheal illnesses among children in Kabul, Afghanistan. METHODS: This double-blind placebo-controlled trial randomized 3046 high-risk 1- to 11-month-old infants to receive 6 quarterly doses of oral vitamin D3 (cholecalciferol 100 000 IU) or placebo in inner city Kabul. Data on diarrheal episodes (≥3 loose/liquid stools in 24 hours) was gathered through active and passive surveillance over 18 months of follow-up. Time to first diarrheal illness was analyzed by using Kaplan-Meier plots. Incidence rates and hazard ratios (HRs) were calculated by using recurrent event Poisson regression models. RESULTS: No significant difference existed in survival time to first diarrheal illness (log rank P = .55). The incidences of diarrheal episodes were 3.43 (95% confidence interval [CI], 3.28–3.59) and 3.59 per child-year (95% CI, 3.44–3.76) in the placebo and intervention arms, respectively. Vitamin D3 supplementation was found to have no effect on the risk for recurrent diarrheal disease in either intention-to-treat (HR, 1.05; 95% CI, 0.98–1.17; P = .15) or per protocol (HR, 1.05; 95% CI, 0.98–1.12; P = .14) analyses. The lack of preventive benefit remained when the randomized population was stratified by age groups, nutritional status, and seasons. CONCLUSIONS: Quarterly supplementation with vitamin D3 conferred no reduction on time to first illness or on the risk for recurrent diarrheal disease in this study. Similar supplementation to comparable populations is not recommended. Additional research in alternative settings may be helpful in elucidating the role of vitamin D3 supplementation for prevention of diarrheal diseases. PMID:24019420

  13. Effectiveness of an intensive E-mail based intervention in smoking cessation (TABATIC study): study protocol for a randomized controlled trial

    PubMed Central

    2013-01-01

    Background Intensive interventions on smoking cessation increase abstinence rates. However, few electronic mail (E-mail) based intensive interventions have been tested in smokers and none in the primary care (PC) setting. The aim of the present study is to evaluate the effectiveness of an intensive E-mail based intervention in smokers attending PC services. Methods/design Randomized controlled multicenter trial. Study population: 1060 smokers aged 18–70 years from Catalonia, Salamanca and Aragón (Spain) who have an E-mail account and check it regularly. Patients will be randomly assigned to the control or intervention group. Intervention: a six-phase intensive intervention with two face-to-face interviews and four automatically generated, personalized tracking E-mails; additional E-mail contacts will be made if needed. The control group will receive brief advice on smoking cessation. Outcome measures, assessed at 6 and 12 months after the intervention: self-reported continuous abstinence (confirmed by co-oximetry), point prevalence abstinence, tobacco consumption, evolution of stage according to Prochaska and DiClemente's Stages of Change Model, length of visit, and costs for the patient to access the Primary Care Center. Statistical analysis: descriptive analysis and logistic and Poisson regression on an intention-to-treat basis using SPSS v.17. Discussion The proposed intervention is an E-mail based intensive intervention in smokers attending primary care. Positive results could demonstrate a higher percentage of short- and long-term abstinence among smokers attended in PC in Spain who regularly use E-mail. Furthermore, this intervention could be helpful in all health services to help smokers quit. Trial Registration ClinicalTrials.gov Identifier: NCT01494246. PMID:23597262

  14. Application of the Conway-Maxwell-Poisson generalized linear model for analyzing motor vehicle crashes.

    PubMed

    Lord, Dominique; Guikema, Seth D; Geedipally, Srinivas Reddy

    2008-05-01

    This paper documents the application of the Conway-Maxwell-Poisson (COM-Poisson) generalized linear model (GLM) for modeling motor vehicle crashes. The COM-Poisson distribution, originally developed in 1962, has recently been re-introduced by statisticians for analyzing count data subject to over- and under-dispersion. This innovative distribution is an extension of the Poisson distribution. The objectives of this study were to evaluate the application of the COM-Poisson GLM for analyzing motor vehicle crashes and compare the results with the traditional negative binomial (NB) model. The comparison analysis was carried out using the most common functional forms employed by transportation safety analysts, which link crashes to the entering flows at intersections or on segments. To accomplish the objectives of the study, several NB and COM-Poisson GLMs were developed and compared using two datasets. The first dataset contained crash data collected at signalized four-legged intersections in Toronto, Ont. The second dataset included data collected for rural four-lane divided and undivided highways in Texas. Several methods were used to assess the statistical fit and predictive performance of the models. The results of this study show that COM-Poisson GLMs perform as well as NB models in terms of goodness-of-fit (GOF) statistics and predictive performance. Given that the COM-Poisson distribution can also handle under-dispersed data (while the NB distribution cannot, or has difficulty converging), which has sometimes been observed in crash databases, the COM-Poisson GLM offers a better alternative to the NB model for modeling motor vehicle crashes, especially given the important limitations recently documented in the safety literature about the latter type of model.
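    The COM-Poisson pmf itself is simple to compute and exhibits the flexibility claimed above: ν = 1 recovers the ordinary Poisson, ν > 1 gives under-dispersion, ν < 1 over-dispersion. A numpy sketch with a truncated normalizing sum (λ and ν values arbitrary):

```python
import numpy as np
from math import lgamma

def com_poisson_pmf(lam, nu, kmax=200):
    """P(k) proportional to lam^k / (k!)^nu, normalized over 0..kmax."""
    k = np.arange(kmax + 1)
    logw = k * np.log(lam) - nu * np.array([lgamma(i + 1) for i in k])
    w = np.exp(logw - logw.max())           # shift for numerical stability
    return w / w.sum()

def mean_var(p):
    k = np.arange(len(p))
    m = (k * p).sum()
    return m, ((k - m) ** 2 * p).sum()

m1, v1 = mean_var(com_poisson_pmf(4.0, 1.0))   # nu = 1: ordinary Poisson(4)
m2, v2 = mean_var(com_poisson_pmf(4.0, 1.5))   # nu > 1: under-dispersed
m3, v3 = mean_var(com_poisson_pmf(4.0, 0.7))   # nu < 1: over-dispersed
```

    The normalizing constant has no closed form, which is why GLM implementations truncate or approximate the series as done here.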

  15. Conditional Poisson models: a flexible alternative to conditional logistic case cross-over analysis.

    PubMed

    Armstrong, Ben G; Gasparrini, Antonio; Tobias, Aurelio

    2014-11-24

    The time stratified case cross-over approach is a popular alternative to conventional time series regression for analysing associations between time series of environmental exposures (air pollution, weather) and counts of health outcomes. These are almost always analyzed using conditional logistic regression on data expanded to case-control (case crossover) format, but this has some limitations. In particular, adjusting for overdispersion and auto-correlation in the counts is not possible. It has been established that a Poisson model for counts with stratum indicators gives identical estimates to those from conditional logistic regression and does not have these limitations, but it is little used, probably because of the overheads in estimating many stratum parameters. The conditional Poisson model avoids estimating stratum parameters by conditioning on the total event count in each stratum, thus simplifying the computing and increasing the number of strata for which fitting is feasible compared with the standard unconditional Poisson model. Unlike the conditional logistic model, the conditional Poisson model does not require expanding the data, and can adjust for overdispersion and auto-correlation. It is available in Stata, R, and other packages. By applying the models to real data and using simulations, we demonstrate that conditional Poisson models were simpler to code and shorter to run than conditional logistic analyses and can be fitted to larger data sets than is possible with standard Poisson models. Allowing for overdispersion or autocorrelation was possible with the conditional Poisson model, but when not required this model gave identical estimates to those from conditional logistic regression. Conditional Poisson regression models provide an alternative to case crossover analysis of stratified time series data with some advantages. The conditional Poisson model can also be used in other contexts in which primary control for confounding is by fine stratification.
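    A sketch of the conditional Poisson likelihood described above: conditioning on each stratum's total count turns the stratum contribution into a multinomial term, so the stratum intercepts drop out and only the exposure coefficient is estimated. The data and effect size below are simulated assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
n_strata, per = 20, 8
strata = np.repeat(np.arange(n_strata), per)
x = rng.normal(size=n_strata * per)            # exposure series
alpha = rng.normal(size=n_strata)              # nuisance stratum effects
y = rng.poisson(np.exp(alpha[strata] + 0.5 * x))   # true beta = 0.5

def conditional_poisson_beta(y, x, strata, iters=50):
    """1-D Newton-Raphson on the conditional (multinomial) log-likelihood:
    sum_s [ sum_i y_i x_i b - (sum_i y_i) log sum_i exp(x_i b) ]."""
    b = 0.0
    for _ in range(iters):
        g, h = 0.0, 0.0
        for s in np.unique(strata):
            m = strata == s
            w = np.exp(b * x[m])
            w /= w.sum()                       # within-stratum weights
            xbar = (w * x[m]).sum()
            g += (y[m] * x[m]).sum() - y[m].sum() * xbar       # score
            h -= y[m].sum() * (w * (x[m] - xbar) ** 2).sum()   # Hessian
        b -= g / h
    return b

b_cond = conditional_poisson_beta(y, x, strata)   # no stratum parameters fitted
```

    Note that no stratum intercepts appear anywhere in the fit, which is the computational saving the abstract describes.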

  16. Phase-space networks of geometrically frustrated systems.

    PubMed

    Han, Yilong

    2009-11-01

    We illustrate a network approach to the phase-space study by using two geometrical frustration models: antiferromagnet on triangular lattice and square ice. Their highly degenerated ground states are mapped as discrete networks such that the quantitative network analysis can be applied to phase-space studies. The resulting phase spaces share some common features and establish a class of complex networks with unique Gaussian spectral densities. Although phase-space networks are heterogeneously connected, the systems are still ergodic due to the random Poisson processes. This network approach can be generalized to phase spaces of some other complex systems.

  17. Boundaries, kinetic properties, and final domain structure of plane discrete uniform Poisson-Voronoi tessellations with von Neumann neighborhoods.

    PubMed

    Korobov, A

    2009-03-01

    Discrete random tessellations appear not infrequently in describing nucleation and growth transformations. Generally, several non-Euclidean metrics are possible in this case. Previously [A. Korobov, Phys. Rev. B 76, 085430 (2007)] continual analogs of such tessellations have been studied. Here one of the simplest discrete varieties of the Kolmogorov-Johnson-Mehl-Avrami model, namely, the model with von Neumann neighborhoods, has been examined per se, i.e., without continualization. The tessellation is uniform in the sense that domain boundaries consist of tiles. Similarities and distinctions between discrete and continual models are discussed.

  18. Absorbing metasurface created by diffractionless disordered arrays of nanoantennas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chevalier, Paul; Minao, Laboratoire de Photonique et Nanostructures; Bouchon, Patrick, E-mail: patrick.bouchon@onera.fr

    2015-12-21

    We study disordered arrays of metal-insulator-metal nanoantennas in order to create a diffractionless metasurface able to absorb light in the 3–5 μm spectral range. This study is conducted with angle-resolved reflectivity measurements obtained with a Fourier transform infrared spectrometer. A first design is based on a perturbation of a periodic arrangement, leading to a significant reduction of the radiative losses. Then, a random assembly of nanoantennas is built following a Poisson-disk distribution of given density, in order to obtain a nearly perfect cluttered assembly with optical properties of a homogeneous material.
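    A Poisson-disk distribution of a given density can be generated by simple dart throwing: candidate points are accepted only if they keep a minimum spacing to all previously accepted points. This is a minimal sketch, not the fabrication layout procedure of the paper:

```python
import numpy as np

def poisson_disk(n_target, r_min, box=1.0, max_tries=20000, seed=0):
    """Dart-throwing Poisson-disk sampler: reject points closer than r_min."""
    rng = np.random.default_rng(seed)
    pts = []
    for _ in range(max_tries):
        p = rng.uniform(0.0, box, 2)
        if all((p[0] - q[0])**2 + (p[1] - q[1])**2 >= r_min**2 for q in pts):
            pts.append(p)
            if len(pts) == n_target:
                break
    return np.array(pts)

pts = poisson_disk(n_target=100, r_min=0.05)   # 100 points, min spacing 0.05
```

    The enforced minimum spacing is what suppresses the short-range clustering of a plain Poisson point process while keeping the arrangement aperiodic, hence diffractionless.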

  19. Characterizing the performance of the Conway-Maxwell Poisson generalized linear model.

    PubMed

    Francis, Royce A; Geedipally, Srinivas Reddy; Guikema, Seth D; Dhavala, Soma Sekhar; Lord, Dominique; LaRocca, Sarah

    2012-01-01

    Count data are pervasive in many areas of risk analysis; deaths, adverse health outcomes, infrastructure system failures, and traffic accidents are all recorded as count events, for example. Risk analysts often wish to estimate the probability distribution for the number of discrete events as part of doing a risk assessment. Traditional count data regression models of the type often used in risk assessment for this problem suffer from limitations due to the assumed variance structure. A more flexible model based on the Conway-Maxwell Poisson (COM-Poisson) distribution was recently proposed, a model that has the potential to overcome the limitations of the traditional model. However, the statistical performance of this new model has not yet been fully characterized. This article assesses the performance of a maximum likelihood estimation method for fitting the COM-Poisson generalized linear model (GLM). The objectives of this article are to (1) characterize the parameter estimation accuracy of the MLE implementation of the COM-Poisson GLM, and (2) estimate the prediction accuracy of the COM-Poisson GLM using simulated data sets. The results of the study indicate that the COM-Poisson GLM is flexible enough to model under-, equi-, and overdispersed data sets with different sample mean values. The results also show that the COM-Poisson GLM yields accurate parameter estimates. The COM-Poisson GLM provides a promising and flexible approach for performing count data regression. © 2011 Society for Risk Analysis.
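
    The COM-Poisson pmf is P(Y = y) ∝ λ^y / (y!)^ν, which adds a dispersion parameter ν to the Poisson (recovered at ν = 1). A minimal numerical sketch, truncating the normalizing series and working in log space to avoid overflow (truncation point and parameter values are illustrative):

```python
import math

def com_poisson_pmf(lam, nu, ymax=100):
    """COM-Poisson weights lam^y / (y!)^nu, normalized by truncating the
    infinite series at ymax; computed in log space to avoid overflow."""
    logw = [y * math.log(lam) - nu * math.lgamma(y + 1) for y in range(ymax + 1)]
    top = max(logw)
    w = [math.exp(lw - top) for lw in logw]
    z = sum(w)
    return [wi / z for wi in w]

def mean_var(pmf):
    mean = sum(y * p for y, p in enumerate(pmf))
    var = sum((y - mean) ** 2 * p for y, p in enumerate(pmf))
    return mean, var

m_eq, v_eq = mean_var(com_poisson_pmf(4.0, 1.0))   # nu = 1: Poisson, var = mean
m_ov, v_ov = mean_var(com_poisson_pmf(4.0, 0.5))   # nu < 1: overdispersed
m_un, v_un = mean_var(com_poisson_pmf(4.0, 2.0))   # nu > 1: underdispersed
```

    The three cases illustrate the under-, equi-, and overdispersion flexibility the article evaluates.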

  20. Requirements of frictional debonding at fiber/matrix interfaces for tough ceramic composites

    NASA Astrophysics Data System (ADS)

    Hsueh, Chun-Hway

    1992-11-01

    Optimum toughening of fiber-reinforced ceramic composites requires debonding at fiber/matrix interfaces and subsequent frictional sliding between the fibers and the matrix as the main crack extends through the composite. Criteria of both interfacial debonding vs fiber fracture, and frictional debonding vs frictionless debonding, are illustrated. To achieve interfacial debonding, the ratio of the fiber strength to the interfacial shear strength must exceed a critical value; to achieve a frictional interface after interfacial debonding, the ratio of the interfacial residual clamping stress to the interfacial shear strength must also exceed a critical value. While interfacial debonding is not sensitive to Poisson's effect, the frictional interface is sensitive to Poisson's effect.

  1. Nonlocal and nonlinear electrostatics of a dipolar Coulomb fluid.

    PubMed

    Buyukdagli, Sahin; Blossey, Ralf

    2014-07-16

    We study a model Coulomb fluid consisting of dipolar solvent molecules of finite extent which generalizes the point-like dipolar Poisson-Boltzmann model (DPB) previously introduced by Coalson and Duncan (1996 J. Phys. Chem. 100 2612) and Abrashkin et al (2007 Phys. Rev. Lett. 99 077801). We formulate a nonlocal Poisson-Boltzmann equation (NLPB) and study both linear and nonlinear dielectric response in this model for the case of a single plane geometry. Our results shed light on the relevance of nonlocal versus nonlinear effects in continuum models of material electrostatics.

  2. Effect of non-Poisson samples on turbulence spectra from laser velocimetry

    NASA Technical Reports Server (NTRS)

    Sree, Dave; Kjelgaard, Scott O.; Sellers, William L., III

    1994-01-01

    Spectral analysis of laser velocimetry (LV) data plays an important role in characterizing a turbulent flow and in estimating the associated turbulence scales, which can be helpful in validating theoretical and numerical turbulence models. The determination of turbulence scales is critically dependent on the accuracy of the spectral estimates. Spectral estimations from 'individual realization' laser velocimetry data are typically based on the assumption of a Poisson sampling process. This Note demonstrates that the actual sampling distribution must be considered before spectral estimates are used to infer turbulence scales.

  3. A Method of Poisson's Ratio Imaging Within a Material Part

    NASA Technical Reports Server (NTRS)

    Roth, Don J. (Inventor)

    1994-01-01

    The present invention is directed to a method of displaying the Poisson's ratio image of a material part. In the present invention, longitudinal data is produced using a longitudinal wave transducer and shear wave data is produced using a shear wave transducer. The respective data is then used to calculate the Poisson's ratio for the entire material part. The Poisson's ratio approximations are then used to display the data.

  4. Method of Poisson's ratio imaging within a material part

    NASA Technical Reports Server (NTRS)

    Roth, Don J. (Inventor)

    1996-01-01

    The present invention is directed to a method of displaying the Poisson's ratio image of a material part. In the present invention longitudinal data is produced using a longitudinal wave transducer and shear wave data is produced using a shear wave transducer. The respective data is then used to calculate the Poisson's ratio for the entire material part. The Poisson's ratio approximations are then used to display the image.

  5. A Review of Multivariate Distributions for Count Data Derived from the Poisson Distribution

    PubMed Central

    Inouye, David; Yang, Eunho; Allen, Genevera; Ravikumar, Pradeep

    2017-01-01

    The Poisson distribution has been widely studied and used for modeling univariate count-valued data. Multivariate generalizations of the Poisson distribution that permit dependencies, however, have been far less popular. Yet, real-world high-dimensional count-valued data found in word counts, genomics, and crime statistics, for example, exhibit rich dependencies, and motivate the need for multivariate distributions that can appropriately model this data. We review multivariate distributions derived from the univariate Poisson, categorizing these models into three main classes: 1) where the marginal distributions are Poisson, 2) where the joint distribution is a mixture of independent multivariate Poisson distributions, and 3) where the node-conditional distributions are derived from the Poisson. We discuss the development of multiple instances of these classes and compare the models in terms of interpretability and theory. Then, we empirically compare multiple models from each class on three real-world datasets that have varying data characteristics from different domains, namely traffic accident data, biological next generation sequencing data, and text data. These empirical experiments develop intuition about the comparative advantages and disadvantages of each class of multivariate distribution that was derived from the Poisson. Finally, we suggest new research directions as explored in the subsequent discussion section. PMID:28983398
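
    The first class (Poisson marginals) includes the classical common-shock construction: X₁ = Y₁ + Z and X₂ = Y₂ + Z with independent Poisson components, so each margin stays Poisson and Cov(X₁, X₂) = θ₀ ≥ 0. A simulation sketch with illustrative rates:

```python
import numpy as np

def bivariate_poisson(theta1, theta2, theta0, n, seed=0):
    """Common-shock bivariate Poisson: margins are Poisson(theta_i + theta0)
    and the shared count Z ~ Poisson(theta0) induces Cov = theta0."""
    rng = np.random.default_rng(seed)
    z = rng.poisson(theta0, n)
    return rng.poisson(theta1, n) + z, rng.poisson(theta2, n) + z

x1, x2 = bivariate_poisson(2.0, 3.0, 1.0, 50000)
cov = np.cov(x1, x2)[0, 1]   # ≈ theta0 = 1
```

    The construction only generates nonnegative dependence, one of the interpretability trade-offs the review compares across classes.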

  6. Noise, chaos, and (ɛ, τ)-entropy per unit time

    NASA Astrophysics Data System (ADS)

    Gaspard, Pierre; Wang, Xiao-Jing

    1993-12-01

    The degree of dynamical randomness of different time processes is characterized in terms of the (ε, τ)-entropy per unit time. The (ε, τ)-entropy is the amount of information generated per unit time, at different scales τ of time and ε of the observables. This quantity generalizes the Kolmogorov-Sinai entropy per unit time from deterministic chaotic processes, to stochastic processes such as fluctuations in mesoscopic physico-chemical phenomena or strong turbulence in macroscopic spacetime dynamics. The random processes that are characterized include chaotic systems, Bernoulli and Markov chains, Poisson and birth-and-death processes, Ornstein-Uhlenbeck and Yaglom noises, fractional Brownian motions, different regimes of hydrodynamical turbulence, and the Lorentz-Boltzmann process of nonequilibrium statistical mechanics. We also extend the (ε, τ)-entropy to spacetime processes like cellular automata, Conway's game of life, lattice gas automata, coupled maps, spacetime chaos in partial differential equations, as well as the ideal, the Lorentz, and the hard sphere gases. Through these examples it is demonstrated that the (ε, τ)-entropy provides a unified quantitative measure of dynamical randomness to both chaos and noises, and a method to detect transitions between dynamical states of different degrees of randomness as a parameter of the system is varied.

  7. Predictors for the Number of Warning Information Sources During Tornadoes.

    PubMed

    Cong, Zhen; Luo, Jianjun; Liang, Daan; Nejat, Ali

    2017-04-01

    People may receive tornado warnings from multiple information sources, but little is known about factors that affect the number of warning information sources (WISs). This study examined predictors for the number of WISs with a telephone survey on randomly sampled residents in Tuscaloosa, Alabama, and Joplin, Missouri, approximately 1 year after both cities were struck by violent tornadoes (EF4 and EF5) in 2011. The survey included 1006 finished interviews and the working sample included 903 respondents. Poisson regression and Zero-Inflated Poisson regression showed that older age and having an emergency plan predicted more WISs in both cities. Education, marital status, and gender affected the likelihood of receiving warnings and the number of WISs either in Joplin or in Tuscaloosa. The findings suggest that social disparity affects the access to warnings not only with respect to the likelihood of receiving any warnings but also with respect to the number of WISs. In addition, historical and social contexts are important for examining predictors for the number of WISs. We recommend that the number of WISs should be regarded as an important measure to evaluate access to warnings in addition to the likelihood of receiving warnings. (Disaster Med Public Health Preparedness. 2017;11:168-172).
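
    The plain Poisson regression used for the WIS counts models log E[y] = Xβ. A minimal Fisher-scoring (IRLS) fit on synthetic data, assuming illustrative coefficients and sample size (in practice one would use statsmodels or R, and a zero-inflated variant for the excess-zero case):

```python
import numpy as np

def poisson_irls(X, y, n_iter=50):
    """Fit log-linear Poisson regression E[y] = exp(X @ beta) by
    iteratively reweighted least squares (Fisher scoring)."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta)
        w = mu                          # Poisson working weights
        z = X @ beta + (y - mu) / mu    # working response
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * z))
    return beta

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 5000)
X = np.column_stack([np.ones_like(x), x])
y = rng.poisson(np.exp(0.5 + 1.2 * x))  # true intercept 0.5, slope 1.2
beta = poisson_irls(X, y)
```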

  8. Short-range correlations control the G/K and Poisson ratios of amorphous solids and metallic glasses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zaccone, Alessio; Terentjev, Eugene M.

    2014-01-21

    The bulk modulus of many amorphous materials, such as metallic glasses, behaves nearly in agreement with the assumption of affine deformation, namely that the atoms are displaced just by the amount prescribed by the applied strain. In contrast, the shear modulus behaves as for nonaffine deformations, with additional displacements due to the structural disorder which induce a marked material softening to shear. The consequence is an anomalously large ratio of the bulk modulus to the shear modulus for disordered materials characterized by dense atomic packing, but not for random networks with point atoms. We explain this phenomenon with a microscopic derivation of the elastic moduli of amorphous solids accounting for the interplay of nonaffinity and short-range particle correlations due to excluded volume. Short-range order is responsible for a reduction of the nonaffinity which is much stronger under compression, where the geometric coupling between nonaffinity and the deformation field is strong, whilst under shear this coupling is weak. Predictions of the Poisson ratio based on this model allow us to rationalize the trends as a function of coordination and atomic packing observed with many amorphous materials.
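
    The G/K ratio fixes the Poisson ratio through the standard isotropic-elasticity relation ν = (3K − 2G) / (2(3K + G)); a quick check of the limits discussed above (the K/G value for a "typical" metallic glass is an illustrative round number):

```python
def poisson_ratio(K, G):
    """Isotropic elasticity relation: nu = (3K - 2G) / (2 * (3K + G))."""
    return (3 * K - 2 * G) / (2 * (3 * K + G))

nu_glass = poisson_ratio(2.4, 1.0)  # K/G roughly 2.4, dense-packing regime
nu_limit = poisson_ratio(1e6, 1.0)  # K >> G: nu approaches the 0.5 limit
```

    A large K (affine, excluded-volume stiffened) paired with a disorder-softened G pushes ν toward the incompressible limit, which is the anomaly the derivation explains.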

  9. Notes on testing equality and interval estimation in Poisson frequency data under a three-treatment three-period crossover trial.

    PubMed

    Lui, Kung-Jong; Chang, Kuang-Chao

    2016-10-01

    When the frequency of event occurrences follows a Poisson distribution, we develop procedures for testing equality of treatments and interval estimators for the ratio of mean frequencies between treatments under a three-treatment three-period crossover design. Using Monte Carlo simulations, we evaluate the performance of these test procedures and interval estimators in various situations. We note that all test procedures developed here can perform well with respect to Type I error even when the number of patients per group is moderate. We further note that the two weighted-least-squares (WLS) test procedures derived here are generally preferable to the other two commonly used test procedures in the contingency table analysis. We also demonstrate that both interval estimators based on the WLS method and interval estimators based on Mantel-Haenszel (MH) approach can perform well, and are essentially of equal precision with respect to the average length. We use a double-blind randomized three-treatment three-period crossover trial comparing salbutamol and salmeterol with a placebo with respect to the number of exacerbations of asthma to illustrate the use of these test procedures and estimators. © The Author(s) 2014.
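
    For a ratio of two Poisson mean frequencies, one standard exact construction (a textbook alternative, not the paper's WLS or MH estimators) conditions on the total count, which turns the problem into a binomial one: Clopper-Pearson limits for p = t₁λ₁/(t₁λ₁ + t₂λ₂) map directly to limits for the rate ratio. The counts below are illustrative.

```python
from scipy.stats import beta

def poisson_rate_ratio_ci(x1, x2, t1=1.0, t2=1.0, level=0.95):
    """Exact CI for lambda1/lambda2: conditional on x1 + x2, x1 is binomial,
    so Clopper-Pearson limits for p transform to the rate ratio."""
    a = 1.0 - level
    lo_p = beta.ppf(a / 2, x1, x2 + 1) if x1 > 0 else 0.0
    hi_p = beta.ppf(1 - a / 2, x1 + 1, x2) if x2 > 0 else 1.0
    to_ratio = lambda p: (p / (1.0 - p)) * (t2 / t1) if p < 1.0 else float("inf")
    return to_ratio(lo_p), to_ratio(hi_p)

lo, hi = poisson_rate_ratio_ci(30, 20)  # e.g. 30 vs 20 exacerbation counts
```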

  10. Non-linear properties of metallic cellular materials with a negative Poisson's ratio

    NASA Technical Reports Server (NTRS)

    Choi, J. B.; Lakes, R. S.

    1992-01-01

    Negative Poisson's ratio copper foam was prepared and characterized experimentally. The transformation into re-entrant foam was accomplished by applying sequential permanent compressions above the yield point to achieve a triaxial compression. The Poisson's ratio of the re-entrant foam depended on strain and attained a relative minimum at strains near zero. Poisson's ratio as small as -0.8 was achieved. The strain dependence of properties occurred over a narrower range of strain than in the polymer foams studied earlier. Annealing of the foam resulted in a slightly greater magnitude of negative Poisson's ratio and greater toughness at the expense of a decrease in the Young's modulus.

  11. Estimating effectiveness in HIV prevention trials with a Bayesian hierarchical compound Poisson frailty model

    PubMed Central

    Coley, Rebecca Yates; Brown, Elizabeth R.

    2016-01-01

    Inconsistent results in recent HIV prevention trials of pre-exposure prophylactic interventions may be due to heterogeneity in risk among study participants. Intervention effectiveness is most commonly estimated with the Cox model, which compares event times between populations. When heterogeneity is present, this population-level measure underestimates intervention effectiveness for individuals who are at risk. We propose a likelihood-based Bayesian hierarchical model that estimates the individual-level effectiveness of candidate interventions by accounting for heterogeneity in risk with a compound Poisson-distributed frailty term. This model reflects the mechanisms of HIV risk and allows for the possibility that some participants are not exposed to HIV and, therefore, have no risk of seroconversion during the study. We assess model performance via simulation and apply the model to data from an HIV prevention trial. PMID:26869051
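
    The key property of a compound Poisson frailty is its point mass at zero: Z = Σᵢ Gᵢ with N ~ Poisson(λ) gamma summands satisfies P(Z = 0) = e^(−λ), which is what represents never-exposed participants. A simulation sketch with illustrative parameters:

```python
import numpy as np

def compound_poisson_gamma(lam, shape, scale, n, seed=0):
    """Frailties Z_j = sum of N_j ~ Poisson(lam) i.i.d. Gamma(shape, scale)
    terms; Z_j = 0 exactly when N_j = 0, with probability exp(-lam)."""
    rng = np.random.default_rng(seed)
    counts = rng.poisson(lam, n)
    return np.array([rng.gamma(shape, scale, c).sum() for c in counts])

z = compound_poisson_gamma(0.7, 2.0, 1.0, 20000)
frac_never_at_risk = (z == 0).mean()  # theory: exp(-0.7) ≈ 0.497
```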

  12. Logistic regression for dichotomized counts.

    PubMed

    Preisser, John S; Das, Kalyan; Benecha, Habtamu; Stamm, John W

    2016-12-01

    Sometimes there is interest in a dichotomized outcome indicating whether a count variable is positive or zero. Under this scenario, the application of ordinary logistic regression may result in efficiency loss, which is quantifiable under an assumed model for the counts. In such situations, a shared-parameter hurdle model is investigated for more efficient estimation of regression parameters relating to overall effects of covariates on the dichotomous outcome, while handling count data with many zeroes. One model part provides a logistic regression containing marginal log odds ratio effects of primary interest, while an ancillary model part describes the mean count of a Poisson or negative binomial process in terms of nuisance regression parameters. Asymptotic efficiency of the logistic model parameter estimators of the two-part models is evaluated with respect to ordinary logistic regression. Simulations are used to assess the properties of the models with respect to power and Type I error, the latter investigated under both misspecified and correctly specified models. The methods are applied to data from a randomized clinical trial of three toothpaste formulations to prevent incident dental caries in a large population of Scottish schoolchildren. © The Author(s) 2014.
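
    A two-part hurdle likelihood of this general shape can be fit by maximum likelihood. The sketch below is deliberately simplified relative to the paper's shared-parameter model: the logistic part and the zero-truncated Poisson part have separate, unlinked coefficients, and all data and coefficient values are illustrative.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit, gammaln

def hurdle_nll(params, X, y):
    """Hurdle negative log-likelihood: logistic model for P(y > 0) plus a
    zero-truncated Poisson for the positive counts."""
    k = X.shape[1]
    g, b = params[:k], params[k:]
    p_pos = expit(X @ g)                 # P(y > 0 | x)
    mu = np.exp(X @ b)                   # Poisson mean of the count part
    pos = y > 0
    ll = np.log1p(-p_pos[~pos]).sum()
    yp, mup = y[pos], mu[pos]
    ll += (np.log(p_pos[pos]) + yp * np.log(mup) - mup
           - gammaln(yp + 1.0) - np.log(-np.expm1(-mup))).sum()
    return -ll

rng = np.random.default_rng(0)
n = 4000
X = np.column_stack([np.ones(n), rng.uniform(-1.0, 1.0, n)])
g_true, b_true = np.array([0.2, 1.0]), np.array([0.6, 0.8])
pos = rng.random(n) < expit(X @ g_true)
mu = np.exp(X @ b_true)
y = np.zeros(n, dtype=int)
for i in np.where(pos)[0]:
    d = rng.poisson(mu[i])
    while d == 0:                        # rejection draw: zero-truncated Poisson
        d = rng.poisson(mu[i])
    y[i] = d

fit = minimize(hurdle_nll, np.zeros(4), args=(X, y), method="BFGS")
g_hat, b_hat = fit.x[:2], fit.x[2:]
```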

  13. A cross-comparison of different techniques for modeling macro-level cyclist crashes.

    PubMed

    Guo, Yanyong; Osama, Ahmed; Sayed, Tarek

    2018-04-01

    Despite the recognized benefits of cycling as a sustainable mode of transportation, cyclists are considered vulnerable road users and there are concerns about their safety. Therefore, it is essential to investigate the factors affecting cyclist safety. The goal of this study is to evaluate and compare different approaches of modeling macro-level cyclist safety as well as investigating factors that contribute to cyclist crashes using a comprehensive list of covariates. Data from 134 traffic analysis zones (TAZs) in the City of Vancouver were used to develop macro-level crash models (CM) incorporating variables related to actual traffic exposure, socio-economics, land use, built environment, and bike network. Four types of CMs were developed under a full Bayesian framework: Poisson lognormal model (PLN), random intercepts PLN model (RIPLN), random parameters PLN model (RPPLN), and spatial PLN model (SPLN). The SPLN model had the best goodness of fit, and the results highlighted the significant effects of spatial correlation. The models showed that the cyclist crashes were positively associated with bike and vehicle exposure measures, households, commercial area density, and signal density. On the other hand, negative associations were found between cyclist crashes and some bike network indicators such as average edge length, average zonal slope, and off-street bike links. Copyright © 2018 Elsevier Ltd. All rights reserved.
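
    The Poisson lognormal structure shared by all four models places a normal random effect on the log mean; the mixing inflates the variance above the mean, which is what suits PLN to overdispersed crash counts. A quick check with illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.6                                   # sd of the lognormal random effect
log_mu = 1.0 + rng.normal(0.0, sigma, 100_000)
y = rng.poisson(np.exp(log_mu))               # Poisson lognormal counts

m, v = y.mean(), y.var()
# theory: E[y] = exp(1 + sigma^2 / 2) and Var[y] = m + m^2 * (exp(sigma^2) - 1) > m
```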

  14. Radio pulsar glitches as a state-dependent Poisson process

    NASA Astrophysics Data System (ADS)

    Fulgenzi, W.; Melatos, A.; Hughes, B. D.

    2017-10-01

    Gross-Pitaevskii simulations of vortex avalanches in a neutron star superfluid are limited computationally to ≲10² vortices and ≲10² avalanches, making it hard to study the long-term statistics of radio pulsar glitches in realistically sized systems. Here, an idealized, mean-field model of the observed Gross-Pitaevskii dynamics is presented, in which vortex unpinning is approximated as a state-dependent, compound Poisson process in a single random variable, the spatially averaged crust-superfluid lag. Both the lag-dependent Poisson rate and the conditional distribution of avalanche-driven lag decrements are inputs into the model, which is solved numerically (via Monte Carlo simulations) and analytically (via a master equation). The output statistics are controlled by two dimensionless free parameters: α, the glitch rate at a reference lag, multiplied by the critical lag for unpinning, divided by the spin-down rate; and β, the minimum fraction of the lag that can be restored by a glitch. The system evolves naturally to a self-regulated stationary state, whose properties are determined by α/αc(β), where αc(β) ≈ β^(−1/2) is a transition value. In the regime α ≳ αc(β), one recovers qualitatively the power-law size and exponential waiting-time distributions observed in many radio pulsars and Gross-Pitaevskii simulations. For α ≪ αc(β), the size and waiting-time distributions are both power-law-like, and a correlation emerges between size and waiting time until the next glitch, contrary to what is observed in most pulsars. Comparisons with astrophysical data are restricted by the small sample sizes available at present, with ≤35 events observed per pulsar.
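
    The mean-field dynamics can be mimicked with a toy state-dependent Poisson simulation: the lag x rises at the spin-down rate, glitches fire with a lag-dependent hazard, and each glitch restores a random fraction of the lag. The hazard form α/(1 − x) and uniform reset fraction below are illustrative stand-ins for the paper's model inputs, with the critical lag normalized to 1.

```python
import random

def simulate_glitches(alpha, beta_min, t_max, dt=1e-3, seed=0):
    """Toy state-dependent Poisson process for the crust-superfluid lag x:
    x grows linearly, glitches fire with hazard alpha / (1 - x), and each
    glitch removes a random fraction u ~ Uniform(beta_min, 1) of the lag."""
    rng = random.Random(seed)
    x = t = last = 0.0
    sizes, waits = [], []
    while t < t_max:
        x = min(x + dt, 0.999)                       # spin-down drives the lag up
        if rng.random() < alpha / (1.0 - x) * dt:    # state-dependent hazard
            u = rng.uniform(beta_min, 1.0)
            sizes.append(x * u)
            waits.append(t - last)
            last = t
            x *= 1.0 - u
        t += dt
    return sizes, waits

sizes, waits = simulate_glitches(alpha=1.0, beta_min=0.3, t_max=50.0)
```

    The hazard diverging as x approaches the critical lag is what makes the process self-regulating: long quiet stretches raise the lag and so raise the glitch rate.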

  15. Muscle Activity Map Reconstruction from High Density Surface EMG Signals With Missing Channels Using Image Inpainting and Surface Reconstruction Methods.

    PubMed

    Ghaderi, Parviz; Marateb, Hamid R

    2017-07-01

    The aim of this study was to reconstruct low-quality high-density surface EMG (HDsEMG) signals, recorded with 2-D electrode arrays, using image inpainting and surface reconstruction methods. It is common that some fraction of the electrodes may provide low-quality signals. We used a variety of image inpainting methods, based on partial differential equations (PDEs), and surface reconstruction methods to reconstruct the time-averaged or instantaneous muscle activity maps of those outlier channels. Two novel reconstruction algorithms were also proposed. HDsEMG signals were recorded from the biceps femoris and brachial biceps muscles during low-to-moderate-level isometric contractions, and some of the channels (5-25%) were randomly marked as outliers. The root-mean-square error (RMSE) between the original and reconstructed maps was then calculated. Overall, the proposed Poisson and wave PDE methods outperformed the other methods (average RMSE 8.7 μV rms ± 6.1 μV rms and 7.5 μV rms ± 5.9 μV rms ) for the time-averaged single-differential and monopolar map reconstruction, respectively. Biharmonic Spline, the discrete cosine transform, and the Poisson PDE outperformed the other methods for the instantaneous map reconstruction. The running time of the proposed Poisson and wave PDE methods, implemented using a Vectorization package, was 4.6 ± 5.7 ms and 0.6 ± 0.5 ms, respectively, for each signal epoch or time sample in each channel. The proposed reconstruction algorithms could be promising new tools for reconstructing muscle activity maps in real-time applications. Proper reconstruction methods could recover the information of low-quality recorded channels in HDsEMG signals.
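
    The PDE-inpainting idea can be sketched in its simplest (homogeneous, i.e. Laplace) form: relax each outlier channel to the mean of its four neighbours until the missing patch is a harmonic fill of the surrounding map. This is far simpler than the paper's Poisson and wave PDE methods; the grid, outlier fraction, and synthetic "activity map" below are illustrative.

```python
import numpy as np

def inpaint_laplace(grid, mask, n_iter=2000):
    """Fill masked (outlier) channels by Jacobi relaxation of Laplace's
    equation: each missing entry converges to the mean of its 4-neighbours."""
    g = grid.astype(float).copy()
    g[mask] = g[~mask].mean()                 # neutral initial guess
    for _ in range(n_iter):
        avg = 0.25 * (np.roll(g, 1, 0) + np.roll(g, -1, 0)
                      + np.roll(g, 1, 1) + np.roll(g, -1, 1))
        g[mask] = avg[mask]                   # update only the outlier channels
    return g

yy, xx = np.mgrid[0:16, 0:16]
truth = np.sin(xx / 5.0) + np.cos(yy / 7.0)   # smooth synthetic activity map
rng = np.random.default_rng(1)
mask = rng.random(truth.shape) < 0.15         # ~15% channels marked as outliers
mask[0, :] = mask[-1, :] = mask[:, 0] = mask[:, -1] = False  # keep border known
rec = inpaint_laplace(truth, mask)
rmse = np.sqrt(np.mean((rec[mask] - truth[mask]) ** 2))
```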

  16. M≥7 Earthquake rupture forecast and time-dependent probability for the Sea of Marmara region, Turkey

    USGS Publications Warehouse

    Murru, Maura; Akinci, Aybige; Falcone, Giuseppe; Pucci, Stefano; Console, Rodolfo; Parsons, Thomas E.

    2016-01-01

    We forecast time-independent and time-dependent earthquake ruptures in the Marmara region of Turkey for the next 30 years using a new fault-segmentation model. We also augment time-dependent Brownian Passage Time (BPT) probability with static Coulomb stress changes (ΔCFF) from interacting faults. We calculate Mw > 6.5 probability from 26 individual fault sources in the Marmara region. We also consider a multisegment rupture model that allows higher-magnitude ruptures over some segments of the Northern branch of the North Anatolian Fault Zone (NNAF) beneath the Marmara Sea. A total of 10 different Mw=7.0 to Mw=8.0 multisegment ruptures are combined with the other regional faults at rates that balance the overall moment accumulation. We use Gaussian random distributions to treat parameter uncertainties (e.g., aperiodicity, maximum expected magnitude, slip rate, and consequently mean recurrence time) of the statistical distributions associated with each fault source. We then estimate uncertainties of the 30-year probability values for the next characteristic event obtained from three different models (Poisson, BPT, and BPT+ΔCFF) using a Monte Carlo procedure. The Gerede fault segment located at the eastern end of the Marmara region shows the highest 30-yr probability, with a Poisson value of 29%, and a time-dependent interaction probability of 48%. We find an aggregated 30-yr Poisson probability of M >7.3 earthquakes at Istanbul of 35%, which increases to 47% if time dependence and stress transfer are considered. We calculate a 2-fold probability gain (ratio time-dependent to time-independent) on the southern strands of the North Anatolian Fault Zone.
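
    The time-dependent BPT probability has a closed form via the inverse-Gaussian CDF: P(event in (t, t+w] | quiet through t). A sketch using scipy, where BPT(μ, α) is inverse Gaussian with mean μ and shape λ = μ/α² (in scipy's parameterization λ is passed as `scale` and `mu` = α²). The recurrence time, aperiodicity, and elapsed time below are illustrative, not the Marmara study's fault parameters.

```python
import math
from scipy.stats import invgauss

def bpt_conditional_prob(mean_rec, alpha, elapsed, window):
    """Conditional BPT probability of an event in the next `window` years,
    given `elapsed` quiet years since the last event."""
    lam = mean_rec / alpha ** 2
    dist = invgauss(alpha ** 2, scale=lam)   # mean = alpha^2 * lam = mean_rec
    return (dist.cdf(elapsed + window) - dist.cdf(elapsed)) / dist.sf(elapsed)

p_bpt = bpt_conditional_prob(250.0, 0.5, 200.0, 30.0)
p_poisson = 1.0 - math.exp(-30.0 / 250.0)    # time-independent reference
```

    Late in the cycle the renewal probability exceeds the time-independent Poisson value, which is the pattern the study reports for the Marmara faults.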

  17. Simulation methods with extended stability for stiff biochemical kinetics.

    PubMed

    Rué, Pau; Villà-Freixa, Jordi; Burrage, Kevin

    2010-08-11

    With increasing computer power, simulating the dynamics of complex systems in chemistry and biology is becoming increasingly routine. The modelling of individual reactions in (bio)chemical systems involves a large number of random events that can be simulated by the stochastic simulation algorithm (SSA). The key quantity is the step size, or waiting time, tau, whose value inversely depends on the size of the propensities of the different channel reactions and which needs to be re-evaluated after every firing event. Such a discrete event simulation may be extremely expensive, in particular for stiff systems where tau can be very short due to the fast kinetics of some of the channel reactions. Several alternative methods have been put forward to increase the integration step size. The so-called tau-leap approach takes a larger step size by allowing all the reactions to fire, from a Poisson or Binomial distribution, within that step. Although the expected value for the different species in the reactive system is maintained with respect to more precise methods, the variance at steady state can suffer from large errors as tau grows. In this paper we extend Poisson tau-leap methods to a general class of Runge-Kutta (RK) tau-leap methods. We show that with the proper selection of the coefficients, the variance of the extended tau-leap can be well-behaved, leading to significantly larger step sizes. The benefit of adapting the extended method to the use of RK frameworks is clear in terms of speed of calculation, as the number of evaluations of the Poisson distribution is still one set per time step, as in the original tau-leap method. The approach paves the way to explore new multiscale methods to simulate (bio)chemical systems.
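
    The plain Poisson tau-leap, the baseline that the RK extension builds on, advances each channel by a Poisson-distributed number of firings per step instead of simulating one SSA event at a time. A sketch for a single decay channel A → ∅ with propensity c·x (rates and step size are illustrative):

```python
import numpy as np

def tau_leap_decay(x0, c, t_end, tau, rng):
    """Poisson tau-leap for A -> 0: per step, the number of firings is a
    single Poisson(c * x * tau) draw rather than one SSA event at a time."""
    x, t = x0, 0.0
    while t < t_end and x > 0:
        x = max(x - rng.poisson(c * x * tau), 0)
        t += tau
    return x

rng = np.random.default_rng(0)
finals = [tau_leap_decay(1000, 0.1, 10.0, 0.1, rng) for _ in range(200)]
mean_final = sum(finals) / len(finals)   # analytic mean: 1000 * exp(-1) ≈ 368
```

    The mean is preserved well even with a coarse tau; it is the steady-state variance that degrades as tau grows, which motivates the RK tau-leap construction.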

  18. Assessing model uncertainty using hexavalent chromium and ...

    EPA Pesticide Factsheets

    Introduction: The National Research Council recommended quantitative evaluation of uncertainty in effect estimates for risk assessment. This analysis considers uncertainty across model forms and model parameterizations with hexavalent chromium [Cr(VI)] and lung cancer mortality as an example. The objective of this analysis is to characterize model uncertainty by evaluating the variance in estimates across several epidemiologic analyses. Methods: This analysis compared 7 publications analyzing two different chromate production sites in Ohio and Maryland. The Ohio cohort consisted of 482 workers employed from 1940-72, while the Maryland site employed 2,357 workers from 1950-74. Cox and Poisson models were the only model forms considered by study authors to assess the effect of Cr(VI) on lung cancer mortality. All models adjusted for smoking and included a 5-year exposure lag; however, other latency periods and model covariates such as age and race were considered. Published effect estimates were standardized to the same units and normalized by their variances to produce a standardized metric to compare variability in estimates across and within model forms. A total of 7 similarly parameterized analyses were considered across model forms, and 23 analyses with alternative parameterizations were considered within model form (14 Cox; 9 Poisson). Results: Across Cox and Poisson model forms, adjusted cumulative exposure coefficients for 7 similar analyses ranged from 2.47

  19. Steric effects in the dynamics of electrolytes at large applied voltages. II. Modified Poisson-Nernst-Planck equations.

    PubMed

    Kilic, Mustafa Sabri; Bazant, Martin Z; Ajdari, Armand

    2007-02-01

    In situations involving large potentials or surface charges, the Poisson-Boltzmann (PB) equation has shortcomings because it neglects ion-ion interactions and steric effects. This has been widely recognized by the electrochemistry community, leading to the development of various alternative models resulting in different sets of "modified PB equations," which have had at least qualitative success in predicting equilibrium ion distributions. On the other hand, the literature is scarce in terms of descriptions of concentration dynamics in these regimes. Here, adapting strategies developed to modify the PB equation, we propose a simple modification of the widely used Poisson-Nernst-Planck (PNP) equations for ionic transport, which at least qualitatively accounts for steric effects. We analyze numerical solutions of these modified PNP equations on the model problem of the charging of a simple electrolyte cell, and compare the outcome to that of the standard PNP equations. Finally, we repeat the asymptotic analysis of Bazant, Thornton, and Ajdari [Phys. Rev. E 70, 021506 (2004)] for this new system of equations to further document the interest and limits of validity of the simpler equivalent electrical circuit models introduced in Part I [Kilic, Bazant, and Ajdari, Phys. Rev. E 75, 021502 (2007)] for such problems.

  20. Soft Wall Ion Channel in Continuum Representation with Application to Modeling Ion Currents in α-Hemolysin

    PubMed Central

    Simakov, Nikolay A.

    2010-01-01

    A soft repulsion (SR) model of short range interactions between mobile ions and protein atoms is introduced in the framework of continuum representation of the protein and solvent. The Poisson-Nernst-Planck (PNP) theory of ion transport through biological channels is modified to incorporate this soft wall protein model. Two sets of SR parameters are introduced: the first is parameterized for all essential amino acid residues using all-atom molecular dynamics simulations; the second is a truncated Lennard-Jones potential. We have further designed an energy-based algorithm for the determination of the ion accessible volume, which is appropriate for a particular system discretization. The effects of these models of short-range interaction were tested by computing current-voltage characteristics of the α-hemolysin channel. The introduced SR potentials significantly improve prediction of channel selectivity. In addition, we studied the effect of choice of some space-dependent diffusion coefficient distributions on the predicted current-voltage properties. We conclude that the diffusion coefficient distributions largely affect total currents and have little effect on rectifications, selectivity or reversal potential. The PNP-SR algorithm is implemented in a new efficient parallel Poisson, Poisson-Boltzmann, and PNP equation solver, also incorporated in a graphical molecular modeling package HARLEM. PMID:21028776

  1. Non-Poisson Processes: Regression to Equilibrium Versus Equilibrium Correlation Functions

    DTIC Science & Technology

    2004-07-07

    Physica A 347 (2005) 268–288. PACS: 05.40.−a; 89.75.−k; 02.50.Ey. Keywords: Stochastic processes; non-Poisson processes; Liouville and Liouville-like equations; Correlation functions. Abstract fragment: "… which is not legitimate with renewal non-Poisson processes, is a correct property if the deviation from the exponential relaxation is obtained by time …"

  2. Multiple imputation for assessment of exposures to drinking water contaminants: evaluation with the Atrazine Monitoring Program.

    PubMed

    Jones, Rachael M; Stayner, Leslie T; Demirtas, Hakan

    2014-10-01

    Drinking water may contain pollutants that harm human health. Pollutant monitoring may occur quarterly, annually, or less frequently, depending upon the pollutant, the pollutant concentration, and community water system. However, birth and other health outcomes are associated with narrow time-windows of exposure. Infrequent monitoring impedes linkage between water quality and health outcomes for epidemiological analyses. To evaluate the performance of multiple imputation to fill in water quality values between measurements in community water systems (CWSs). The multiple imputation method was implemented in a simulated setting using data from the Atrazine Monitoring Program (AMP, 2006-2009 in five Midwestern states). Values were deleted from the AMP data to leave one measurement per month. Four patterns reflecting drinking water monitoring regulations were used to delete months of data in each CWS: three patterns were missing at random and one pattern was missing not at random. Synthetic health outcome data were created using a linear and a Poisson exposure-response relationship with five levels of hypothesized association, respectively. The multiple imputation method was evaluated by comparing the exposure-response relationships estimated based on multiply imputed data with the hypothesized association. The four patterns deleted 65-92% months of atrazine observations in AMP data. Even with these high rates of missing information, our procedure was able to recover most of the missing information when the synthetic health outcome was included for missing at random patterns and for missing not at random patterns with low-to-moderate exposure-response relationships. Multiple imputation appears to be an effective method for filling in water quality values between measurements. Copyright © 2014 Elsevier Inc. All rights reserved.
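
    The core of the procedure, stochastic regression imputation repeated m times and then pooled across completed data sets (Rubin's rules), can be sketched on toy data. The real AMP imputation model is more elaborate and includes the health outcome; the data, missingness rate, and m below are illustrative.

```python
import numpy as np

def impute_once(x, y, miss, rng):
    """One stochastic regression imputation: fit y ~ x on observed cases,
    fill missing y with the prediction plus residual-scale noise."""
    slope, intercept = np.polyfit(x[~miss], y[~miss], 1)
    resid_sd = np.std(y[~miss] - (slope * x[~miss] + intercept))
    y_imp = y.copy()
    y_imp[miss] = slope * x[miss] + intercept + rng.normal(0, resid_sd, miss.sum())
    return y_imp

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 2000)                    # e.g. a co-measured covariate
y = 2.0 * x + rng.normal(0, 0.3, 2000)         # e.g. atrazine concentration
miss = rng.random(2000) < 0.4                  # 40% missing completely at random
# m = 10 imputations; Rubin's rule pools point estimates by averaging
means = [impute_once(x, y, miss, rng)[miss].mean() for _ in range(10)]
pooled = float(np.mean(means))
true_mean = y[miss].mean()
```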

  3. Encapsulation of single cells on a microfluidic device integrating droplet generation with fluorescence-activated droplet sorting.

    PubMed

    Wu, Liang; Chen, Pu; Dong, Yingsong; Feng, Xiaojun; Liu, Bi-Feng

    2013-06-01

    Encapsulation of single cells is a challenging task in droplet microfluidics due to the random compartmentalization of cells dictated by Poisson statistics. In this paper, a microfluidic device was developed to improve the single-cell encapsulation rate by integrating droplet generation with fluorescence-activated droplet sorting. After cells were loaded into aqueous droplets by hydrodynamic focusing, an on-flight fluorescence-activated sorting process was conducted to isolate droplets containing one cell. Encapsulation of fluorescent polystyrene beads was investigated to evaluate the developed method. A single-bead encapsulation rate of more than 98 % was achieved under the optimized conditions. Application to encapsulate single HeLa cells was further demonstrated with a single-cell encapsulation rate of 94.1 %, which is about 200 % higher than those obtained by random compartmentalization. We expect this new method to provide a useful platform for encapsulating single cells, facilitating the development of high-throughput cell-based assays.

  4. Spatiotemporal reconstruction of list-mode PET data.

    PubMed

    Nichols, Thomas E; Qi, Jinyi; Asma, Evren; Leahy, Richard M

    2002-04-01

    We describe a method for computing a continuous time estimate of tracer density using list-mode positron emission tomography data. The data in each voxel are modeled as an inhomogeneous Poisson process whose rate function can be represented using a cubic B-spline basis. The rate functions are estimated by maximizing the likelihood of the arrival times of detected photon pairs over the control vertices of the spline, modified by quadratic spatial and temporal smoothness penalties and a penalty term to enforce nonnegativity. Randoms rate functions are estimated by assuming independence between the spatial and temporal randoms distributions. Similarly, scatter rate functions are estimated by assuming spatiotemporal independence and that the temporal distribution of the scatter is proportional to the temporal distribution of the trues. A quantitative evaluation was performed using simulated data and the method is also demonstrated in a human study using 11C-raclopride.

  5. Probabilistic track coverage in cooperative sensor networks.

    PubMed

    Ferrari, Silvia; Zhang, Guoxian; Wettergren, Thomas A

    2010-12-01

    The quality of service of a network performing cooperative track detection is represented by the probability of obtaining multiple elementary detections over time along a target track. Recently, two different lines of research, namely, distributed-search theory and geometric transversals, have been used in the literature for deriving the probability of track detection as a function of random and deterministic sensors' positions, respectively. In this paper, we prove that these two approaches are equivalent under the same problem formulation. Also, we present a new performance function that is derived by extending the geometric-transversal approach to the case of random sensors' positions using Poisson flats. As a result, a unified approach for addressing track detection in both deterministic and probabilistic sensor networks is obtained. The new performance function is validated through numerical simulations and is shown to bring about considerable computational savings for both deterministic and probabilistic sensor networks.

  6. On the theory of Lorentz gases with long range interactions

    NASA Astrophysics Data System (ADS)

    Nota, Alessia; Simonella, Sergio; Velázquez, Juan J. L.

    We construct and study the stochastic force field generated by a Poisson distribution of sources at finite density, x1, x2, …, in ℝ³, each of them yielding a long range potential Q_i Φ(x - x_i) with possibly different charges Q_i ∈ ℝ. The potential Φ is assumed to behave typically as |x|^{-s} for large |x|, with s > 1/2. We will denote the resulting random field as “generalized Holtsmark field”. We then consider the dynamics of one tagged particle in such random force fields, in several scaling limits where the mean free path is much larger than the average distance between the scatterers. We estimate the diffusive time scale and identify conditions for the vanishing of correlations. These results are used to obtain appropriate kinetic descriptions in terms of a linear Boltzmann or Landau evolution equation depending on the specific choices of the interaction potential.

  7. Poisson-type inequalities for growth properties of positive superharmonic functions.

    PubMed

    Luan, Kuan; Vieira, John

    2017-01-01

    In this paper, we present new Poisson-type inequalities for Poisson integrals with continuous data on the boundary. The obtained inequalities are used to obtain growth properties at infinity of positive superharmonic functions in a smooth cone.

  8. Combined analysis of magnetic and gravity anomalies using normalized source strength (NSS)

    NASA Astrophysics Data System (ADS)

    Li, L.; Wu, Y.

    2017-12-01

    Gravity and magnetic fields are potential fields, which leads to inherent non-uniqueness in their interpretation. Combined analysis of magnetic and gravity anomalies based on Poisson's relation is used to determine homologous gravity and magnetic anomalies and decrease this ambiguity. The traditional combined analysis uses the linear regression of the reduction-to-pole (RTP) magnetic anomaly against the first-order vertical derivative of the gravity anomaly, and provides a quantitative or semi-quantitative interpretation by calculating the correlation coefficient, slope and intercept. In this calculation, due to the effect of remanent magnetization, the RTP anomaly still contains the effect of oblique magnetization; in this case, homologous gravity and magnetic anomalies can appear uncorrelated in the linear regression. The normalized source strength (NSS), which can be computed from the magnetic tensor matrix, is insensitive to remanence. Here we present a new combined analysis using the NSS. Based on Poisson's relation, the gravity tensor matrix can be transformed into the pseudomagnetic tensor matrix for magnetization along the geomagnetic field direction under the homologous condition. The NSS of the pseudomagnetic tensor matrix and of the original magnetic tensor matrix are calculated and a linear regression analysis is carried out. The calculated correlation coefficient, slope and intercept indicate the homology level, the Poisson's ratio and the distribution of remanence, respectively. We test the approach on a synthetic model under complex magnetization; the results show that it can still distinguish a common source under strong remanence and recover the Poisson's ratio. Finally, the approach is applied to field data from China, and the results demonstrate that it is feasible.

  9. Auxetic behaviour from rotating rigid units

    NASA Astrophysics Data System (ADS)

    Grima, J. N.; Alderson, A.; Evans, K. E.

    2005-03-01

    Auxetic materials exhibit the unexpected feature of becoming fatter when stretched and narrower when compressed, in other words, they exhibit a negative Poisson's ratio. This counter-intuitive behaviour imparts many beneficial effects on the material's macroscopic properties that make auxetics superior to conventional materials in many commercial applications. Recent research suggests that auxetic behaviour generally results from a cooperative effect between the material's internal structure (geometry setup) and the deformation mechanism it undergoes when submitted to a stress. Auxetic behaviour is also known to be scale-independent, and thus, the same geometry/deformation mechanism may operate at the macro-, micro- and nano- (molecular) level. A considerable amount of research has been focused on the re-entrant honeycomb structure which exhibits auxetic behaviour if deformed through hinging at the joints or flexure of the ribs, and it was proposed that this re-entrant geometry plays an important role in generating auxetic behaviour in various forms of materials ranging from nanostructured polymers to foams. This paper discusses an alternative mode of deformation involving rotating rigid units which also results in negative Poisson's ratios. In its most ideal form, this mechanism may be constructed in two dimensions using rigid polygons connected together through hinges at their vertices. On application of uniaxial loads, these rigid polygons rotate with respect to each other to form a more open structure hence giving rise to a negative Poisson's ratio. This paper also discusses the role that rotating rigid units are thought to have in various classes of materials to give rise to negative Poisson's ratios.

  10. Information transmission using non-Poisson regular firing.

    PubMed

    Koyama, Shinsuke; Omi, Takahiro; Kass, Robert E; Shinomoto, Shigeru

    2013-04-01

    In many cortical areas, neural spike trains do not follow a Poisson process. In this study, we investigate a possible benefit of non-Poisson spiking for information transmission by studying the minimal rate fluctuation that can be detected by a Bayesian estimator. The idea is that an inhomogeneous Poisson process may make it difficult for downstream decoders to resolve subtle changes in rate fluctuation, but by using a more regular non-Poisson process, the nervous system can make rate fluctuations easier to detect. We evaluate the degree to which regular firing reduces the rate fluctuation detection threshold. We find that the threshold for detection is reduced in proportion to the coefficient of variation of interspike intervals.
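
    As a quick illustration of the regularity measure involved here (a sketch with arbitrary parameters, not code from the paper), the snippet below contrasts the coefficient of variation (CV) of interspike intervals for a Poisson spike train (exponential ISIs, CV = 1) with a more regular gamma-renewal train of the same mean rate (CV = 1/√k):

```python
import math
import random

def isi_cv(intervals):
    """Coefficient of variation (std / mean) of interspike intervals."""
    n = len(intervals)
    mean = sum(intervals) / n
    var = sum((x - mean) ** 2 for x in intervals) / n
    return math.sqrt(var) / mean

random.seed(0)
rate = 10.0  # spikes per second (illustrative)

# Poisson spiking: exponential ISIs, CV = 1.
poisson_isis = [random.expovariate(rate) for _ in range(20000)]

# Regular, non-Poisson spiking: gamma-renewal ISIs with shape k = 4
# and the same mean rate, so CV = 1 / sqrt(4) = 0.5.
k = 4
gamma_isis = [random.gammavariate(k, 1.0 / (k * rate)) for _ in range(20000)]

cv_poisson = isi_cv(poisson_isis)  # close to 1.0
cv_gamma = isi_cv(gamma_isis)      # close to 0.5
```

    Smaller CV means the decoder sees less interval variability at a given rate, which is what lowers the rate-fluctuation detection threshold in the study above.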

  11. Spatial variation of natural radiation and childhood leukaemia incidence in Great Britain.

    PubMed

    Richardson, S; Monfort, C; Green, M; Draper, G; Muirhead, C

    This paper describes an analysis of the geographical variation of childhood leukaemia incidence in Great Britain over a 15 year period in relation to natural radiation (gamma and radon). Data at the level of the 459 district level local authorities in England, Wales and regional districts in Scotland are analysed in two complementary ways: first, by Poisson regressions with the inclusion of environmental covariates and a smooth spatial structure; secondly, by a hierarchical Bayesian model in which extra-Poisson variability is modelled explicitly in terms of spatial and non-spatial components. From this analysis, we deduce a strong indication that a main part of the variability is accounted for by a local neighbourhood 'clustering' structure. This structure is furthermore relatively stable over the 15 year period for the lymphocytic leukaemias which make up the majority of observed cases. We found no evidence of a positive association of childhood leukaemia incidence with outdoor or indoor gamma radiation levels. There is no consistent evidence of any association with radon levels. Indeed, in the Poisson regressions, a significant positive association was only observed for one 5-year period, a result which is not compatible with a stable environmental effect. Moreover, this positive association became clearly non-significant when over-dispersion relative to the Poisson distribution was taken into account.

  12. Statistical modeling of dental unit water bacterial test kit performance.

    PubMed

    Cohen, Mark E; Harte, Jennifer A; Stone, Mark E; O'Connor, Karen H; Coen, Michael L; Cullum, Malford E

    2007-01-01

    While it is important to monitor dental water quality, it is unclear whether in-office test kits provide bacterial counts comparable to the gold standard method (R2A). Studies were conducted on specimens with known bacterial concentrations, and from dental units, to evaluate test kit accuracy across a range of bacterial types and loads. Colony forming units (CFU) were counted for samples from each source, using R2A and two types of test kits, and conformity to Poisson distribution expectations was evaluated. Poisson regression was used to test for effects of source and device, and to estimate rate ratios for kits relative to R2A. For all devices, distributions were Poisson for low CFU/mL when only beige-pigmented bacteria were considered. For higher counts, R2A remained Poisson, but kits exhibited over-dispersion. Both kits undercounted relative to R2A, but the degree of undercounting was reasonably stable. Kits did not grow pink-pigmented bacteria from dental-unit water identified as Methylobacterium rhodesianum. Only one of the test kits provided results with adequate reliability at higher bacterial concentrations. Undercount bias could be estimated for this device and used to adjust test kit results. Insensitivity to methylobacteria spp. is problematic.
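
    A minimal sketch of the kind of conformity check described above, using the variance-to-mean (dispersion) index: pure Poisson counts give an index near 1, while gamma-mixed (over-dispersed) counts give an index well above 1. The rates and sample sizes are illustrative, not from the study:

```python
import math
import random

random.seed(1)

def poisson_sample(lam):
    # Knuth's method; adequate for the modest rates used here.
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def dispersion_index(counts):
    """Variance / mean; ~1 under a Poisson distribution."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / n
    return var / mean

# Pure Poisson counts (e.g. well-mixed CFU plating): index ~= 1.
pure = [poisson_sample(8.0) for _ in range(5000)]

# Gamma-mixed Poisson counts (over-dispersed): index well above 1.
shape = 2.0  # smaller shape -> more over-dispersion
mixed = [poisson_sample(random.gammavariate(shape, 8.0 / shape))
         for _ in range(5000)]
```
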

  13. Universal Poisson Statistics of mRNAs with Complex Decay Pathways.

    PubMed

    Thattai, Mukund

    2016-01-19

    Messenger RNA (mRNA) dynamics in single cells are often modeled as a memoryless birth-death process with a constant probability per unit time that an mRNA molecule is synthesized or degraded. This predicts a Poisson steady-state distribution of mRNA number, in close agreement with experiments. This is surprising, since mRNA decay is known to be a complex process. The paradox is resolved by realizing that the Poisson steady state generalizes to arbitrary mRNA lifetime distributions. A mapping between mRNA dynamics and queueing theory highlights an identifiability problem: a measured Poisson steady state is consistent with a large variety of microscopic models. Here, I provide a rigorous and intuitive explanation for the universality of the Poisson steady state. I show that the mRNA birth-death process and its complex decay variants all take the form of the familiar Poisson law of rare events, under a nonlinear rescaling of time. As a corollary, not only steady-states but also transients are Poisson distributed. Deviations from the Poisson form occur only under two conditions, promoter fluctuations leading to transcriptional bursts or nonindependent degradation of mRNA molecules. These results place severe limits on the power of single-cell experiments to probe microscopic mechanisms, and they highlight the need for single-molecule measurements. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
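
    The queueing correspondence can be illustrated with a toy M/G/∞ simulation (an assumption-laden sketch with made-up parameters, not the paper's derivation): transcripts are born in a Poisson process and each lives for a random, decidedly non-exponential lifetime, yet the copy number observed at a fixed time is still Poisson with mean rate × mean lifetime:

```python
import random

random.seed(2)

lam = 2.0   # synthesis rate (illustrative)
T = 10.0    # observation time; lifetimes below are bounded by 3 < T,
            # so no transcript born before time 0 can still be present

def lifetime():
    return random.uniform(0.0, 3.0)   # non-exponential decay; E[L] = 1.5

def count_alive():
    """Number of transcripts present at time T (one M/G/inf realization)."""
    n, t = 0, 0.0
    while True:
        t += random.expovariate(lam)  # next synthesis event
        if t > T:
            return n
        if t + lifetime() > T:        # still undegraded at time T
            n += 1

counts = [count_alive() for _ in range(4000)]
mean = sum(counts) / len(counts)                       # ~ lam * E[L] = 3.0
var = sum((c - mean) ** 2 for c in counts) / len(counts)
fano = var / mean                                      # ~1 for a Poisson law
```

    Despite the uniform (memoryful) lifetimes, the copy-number statistics are indistinguishable from the exponential-decay case, which is the identifiability problem the abstract highlights.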

  14. Collective Poisson process with periodic rates: applications in physics from micro-to nanodevices.

    PubMed

    da Silva, Roberto; Lamb, Luis C; Wirth, Gilson Inacio

    2011-01-28

    Continuous reductions in the dimensions of semiconductor devices have led to an increasing number of noise sources, including random telegraph signals (RTS) due to the capture and emission of electrons by traps at random positions between oxide and semiconductor. The models traditionally used for microscopic devices become of limited validity in nano- and mesoscale systems since, in such systems, distributed quantities such as electron and trap densities, and concepts like electron mobility, become inadequate to model electrical behaviour. In addition, current experimental works have shown that RTS in semiconductor devices based on carbon nanotubes lead to giant current fluctuations. Therefore, the physics of this phenomenon and techniques to decrease the amplitudes of RTS need to be better understood. This problem can be described as a collective Poisson process under different, but time-independent, rates, τ(c) and τ(e), that control the capture and emission of electrons by traps distributed over the oxide. Thus, models that consider calculations performed under time-dependent periodic capture and emission rates should be of interest in order to model more efficient devices. We show a complete theoretical description of a model that is capable of showing a noise reduction of current fluctuations in the time domain, and a reduction of the power spectral density in the frequency domain, in semiconductor devices as predicted by previous experimental work. We do so through numerical integrations and a novel Monte Carlo Markov chain (MCMC) algorithm based on microscopic discrete values. The proposed model also handles the ballistic regime, relevant in nano- and mesoscale devices. Finally, we show that the ballistic regime leads to nonlinearity in the electrical behaviour.

  15. Effect of latitude on the rate of change in incidence of Lyme disease in the United States

    PubMed Central

    Tuite, Ashleigh R.; Greer, Amy L.

    2013-01-01

    Background Tick-borne illnesses represent an important class of emerging zoonoses, with climate change projected to increase the geographic range within which tick-borne zoonoses might become endemic. We evaluated the impact of latitude on the rate of change in the incidence of Lyme disease in the United States, using publicly available data. Methods We estimated state-level year-on-year incidence rate ratios (IRRs) for Lyme disease for the period 1993 to 2007 using Poisson regression methods. We evaluated between-state heterogeneity in IRRs using a random-effects meta-analytic approach. We identified state-level characteristics associated with increasing incidence using random-effects meta-regression. Results The incidence of Lyme disease in the US increased by about 80% between 1993 and 2007 (IRR per year 1.049, 95% CI [confidence interval] 1.048 to 1.050). There was marked between-state heterogeneity in the average incidence of Lyme disease, ranging from 0.008 per 100 000 person-years in Colorado to 75 per 100 000 in Connecticut, and significant between-state heterogeneity in temporal trends (p < 0.001). In multivariable meta-regression models, increasing incidence showed a linear association with state latitude and population density. These 2 factors explained 27% of the between-state variation in IRRs. No independent association was identified for other state-level characteristics. Interpretation Lyme disease incidence increased in the US as a whole during the study period, but the changes were not uniform. Marked increases were identified in northern-most states, whereas southern states experienced stable or declining rates of Lyme disease. PMID:25077101
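
    A year-on-year IRR of the kind reported above comes from a log-linear Poisson regression, E[count] = exp(a + b·year), with IRR = exp(b). The sketch below, with made-up counts and a hypothetical true trend of 5% per year, fits the two-parameter model by Newton-Raphson and exponentiates the slope:

```python
import math
import random

random.seed(3)

def poisson_sample(lam):
    # Knuth's method; adequate for the rates used here.
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

# Hypothetical yearly case counts with a log-linear trend.
a_true, b_true = 4.0, 0.05        # true IRR per year = exp(0.05) ~ 1.051
years = list(range(15))
counts = [poisson_sample(math.exp(a_true + b_true * x)) for x in years]

# Newton-Raphson on the two-parameter Poisson log-likelihood.
a, b = math.log(sum(counts) / len(counts)), 0.0
for _ in range(25):
    mu = [math.exp(a + b * x) for x in years]
    ga = sum(y - m for y, m in zip(counts, mu))                  # score wrt a
    gb = sum(x * (y - m) for x, y, m in zip(years, counts, mu))  # score wrt b
    haa = sum(mu)                                  # Fisher information entries
    hab = sum(x * m for x, m in zip(years, mu))
    hbb = sum(x * x * m for x, m in zip(years, mu))
    det = haa * hbb - hab * hab
    a += (hbb * ga - hab * gb) / det
    b += (haa * gb - hab * ga) / det

irr = math.exp(b)   # estimated year-on-year incidence rate ratio
```
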

  16. The solution of large multi-dimensional Poisson problems

    NASA Technical Reports Server (NTRS)

    Stone, H. S.

    1974-01-01

    The Buneman algorithm for solving Poisson problems can be adapted to solve large Poisson problems on computers with a rotating drum memory so that the computation is done with very little time lost due to rotational latency of the drum.
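
    For a sense of what a direct Poisson solve involves (a minimal 1D sketch, not the Buneman algorithm itself, which applies cyclic reduction to the block-tridiagonal 2D system), the tridiagonal finite-difference system for -u'' = f can be solved in O(N) with the Thomas algorithm:

```python
import math

def solve_poisson_1d(f, n):
    """Solve -u'' = f on (0,1), u(0) = u(1) = 0, on n interior grid points
    via the Thomas algorithm for the tridiagonal finite-difference system."""
    h = 1.0 / (n + 1)
    rhs = [f((i + 1) * h) * h * h for i in range(n)]
    c = [0.0] * n          # modified superdiagonal coefficients
    d = [0.0] * n          # modified right-hand side
    c[0] = -0.5
    d[0] = rhs[0] / 2.0
    for i in range(1, n):
        denom = 2.0 + c[i - 1]            # pivot after eliminating subdiagonal -1
        c[i] = -1.0 / denom
        d[i] = (rhs[i] + d[i - 1]) / denom
    u = [0.0] * n
    u[-1] = d[-1]
    for i in range(n - 2, -1, -1):        # back substitution
        u[i] = d[i] - c[i] * u[i + 1]
    return u

# Manufactured solution: with f = pi^2 sin(pi x) the exact solution is sin(pi x).
u = solve_poisson_1d(lambda x: math.pi ** 2 * math.sin(math.pi * x), 99)
```
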

  17. Dynamical influence processes on networks: general theory and applications to social contagion.

    PubMed

    Harris, Kameron Decker; Danforth, Christopher M; Dodds, Peter Sheridan

    2013-08-01

    We study binary state dynamics on a network where each node acts in response to the average state of its neighborhood. By allowing varying amounts of stochasticity in both the network and node responses, we find different outcomes in random and deterministic versions of the model. In the limit of a large, dense network, however, we show that these dynamics coincide. We construct a general mean-field theory for random networks and show this predicts that the dynamics on the network is a smoothed version of the average response function dynamics. Thus, the behavior of the system can range from steady state to chaotic depending on the response functions, network connectivity, and update synchronicity. As a specific example, we model the competing tendencies of imitation and nonconformity by incorporating an off-threshold into standard threshold models of social contagion. In this way, we attempt to capture important aspects of fashions and societal trends. We compare our theory to extensive simulations of this "limited imitation contagion" model on Poisson random graphs, finding agreement between the mean-field theory and stochastic simulations.
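
    The "Poisson random graphs" mentioned at the end are Erdős-Rényi graphs G(n, p), whose degree distribution is approximately Poisson with mean np for large n and small p. A small stdlib-only check (with arbitrary n and p):

```python
import random

random.seed(6)

# Erdos-Renyi G(n, p): each of the n*(n-1)/2 possible edges is present
# independently with probability p; expected degree is ~ n * p.
n, p = 1000, 0.008
degree = [0] * n
for i in range(n):
    for j in range(i + 1, n):
        if random.random() < p:
            degree[i] += 1
            degree[j] += 1

mean_deg = sum(degree) / n                                   # ~ 8
var_deg = sum((d - mean_deg) ** 2 for d in degree) / n
fano = var_deg / mean_deg   # ~1, as for a Poisson degree distribution
```
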

  18. Effects of learning climate and registered nurse staffing on medication errors.

    PubMed

    Chang, Yunkyung; Mark, Barbara

    2011-01-01

    Despite increasing recognition of the significance of learning from errors, little is known about how learning climate contributes to error reduction. The purpose of this study was to investigate whether learning climate moderates the relationship between error-producing conditions and medication errors. A cross-sectional descriptive study was done using data from 279 nursing units in 146 randomly selected hospitals in the United States. Error-producing conditions included work environment factors (work dynamics and nurse mix), team factors (communication with physicians and nurses' expertise), personal factors (nurses' education and experience), patient factors (age, health status, and previous hospitalization), and medication-related support services. Poisson models with random effects were used with the nursing unit as the unit of analysis. A significant negative relationship was found between learning climate and medication errors. It also moderated the relationship between nurse mix and medication errors: When learning climate was negative, having more registered nurses was associated with fewer medication errors. However, no relationship was found between nurse mix and medication errors at either positive or average levels of learning climate. Learning climate did not moderate the relationship between work dynamics and medication errors. The way nurse mix affects medication errors depends on the level of learning climate. Nursing units with fewer registered nurses and frequent medication errors should examine their learning climate. Future research should be focused on the role of learning climate as related to the relationships between nurse mix and medication errors.

  19. Exploiting negative Poisson's ratio to design 3D-printed composites with enhanced mechanical properties

    DOE PAGES

    Li, Tiantian; Chen, Yanyu; Hu, Xiaoyi; ...

    2018-02-03

    Auxetic materials exhibiting a negative Poisson's ratio are shown to have better indentation resistance, impact shielding capability, and enhanced toughness. Here, we report a class of high-performance composites in which auxetic lattice structures are used as the reinforcements and the nearly incompressible soft material is employed as the matrix. This coupled geometry and material design concept is enabled by the state-of-the-art additive manufacturing technique. Guided by experimental tests and finite element analyses, we systematically study the compressive behavior of the 3D printed auxetics reinforced composites and achieve a significant enhancement of their stiffness and energy absorption. This improved mechanical performance is due to the negative Poisson's ratio effect of the auxetic reinforcements, which makes the matrix in a state of biaxial compression and hence provides additional support. This mechanism is further supported by the investigation of the effect of auxetic degree on the stiffness and energy absorption capability. The findings reported here pave the way for developing a new class of auxetic composites that significantly expand their design space and possible applications through a combination of rational design and 3D printing.

  1. Elastic properties of gas hydrate-bearing sediments

    USGS Publications Warehouse

    Lee, M.W.; Collett, T.S.

    2001-01-01

    Downhole-measured compressional- and shear-wave velocities acquired in the Mallik 2L-38 gas hydrate research well, northwestern Canada, reveal that the dominant effect of gas hydrate on the elastic properties of gas hydrate-bearing sediments is as a pore-filling constituent. As opposed to high elastic velocities predicted from a cementation theory, whereby a small amount of gas hydrate in the pore space significantly increases the elastic velocities, the velocity increase from gas hydrate saturation in the sediment pore space is small. Both the effective medium theory and a weighted equation predict a slight increase of velocities from gas hydrate concentration, similar to the field-observed velocities; however, the weighted equation more accurately describes the compressional- and shear-wave velocities of gas hydrate-bearing sediments. A decrease of Poisson's ratio with an increase in the gas hydrate concentration is similar to a decrease of Poisson's ratio with a decrease in the sediment porosity. Poisson's ratios greater than 0.33 for gas hydrate-bearing sediments imply the unconsolidated nature of gas hydrate-bearing sediments at this well site. The seismic characteristics of gas hydrate-bearing sediments at this site can be used to compare and evaluate other gas hydrate-bearing sediments in the Arctic.
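
    Poisson's ratio can be computed directly from the measured wave velocities via ν = (Vp² - 2Vs²) / (2(Vp² - Vs²)). A small helper (with illustrative numbers, not the Mallik well data) shows why ratios above 0.33 correspond to high Vp/Vs, as expected for unconsolidated sediments:

```python
def poisson_ratio(vp, vs):
    """Poisson's ratio from compressional (vp) and shear (vs) wave velocities."""
    r2 = (vp / vs) ** 2
    return (r2 - 2.0) / (2.0 * (r2 - 1.0))

# A "Poisson solid" (vp/vs = sqrt(3)) has ratio exactly 0.25; a velocity
# ratio of 2 or more pushes the ratio to 1/3 and above, the unconsolidated
# range noted in the abstract.
consolidated = poisson_ratio(3.0 ** 0.5, 1.0)   # 0.25
unconsolidated = poisson_ratio(2.0, 1.0)        # 1/3
```
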

  2. Efficient three-dimensional Poisson solvers in open rectangular conducting pipe

    NASA Astrophysics Data System (ADS)

    Qiang, Ji

    2016-06-01

    The three-dimensional (3D) Poisson solver plays an important role in the study of space-charge effects on charged particle beam dynamics in particle accelerators. In this paper, we propose three new 3D Poisson solvers for a charged particle beam in an open rectangular conducting pipe. These three solvers include a spectral integrated Green function (IGF) solver, a 3D spectral solver, and a 3D integrated Green function solver. These solvers effectively handle the longitudinal open boundary condition using a finite computational domain that contains the beam itself. This saves the computational cost of using an extra larger longitudinal domain in order to set up an appropriate finite boundary condition. Using an integrated Green function also avoids the need to resolve rapid variation of the Green function inside the beam. The numerical operational cost of the spectral IGF solver and the 3D IGF solver scales as O(N log(N)), where N is the number of grid points. The cost of the 3D spectral solver scales as O(Nn N), where Nn is the maximum longitudinal mode number. We compare these three solvers using several numerical examples and discuss the advantageous regime of each solver in the physical application.
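
    A stripped-down 1D analogue of a spectral Poisson solver (an illustrative sketch, not the authors' 3D implementation): expand the right-hand side in sine modes satisfying the homogeneous Dirichlet boundary conditions, divide each coefficient by the mode's eigenvalue, and sum the series:

```python
import math

def solve_poisson_spectral(f, n, modes):
    """Spectral sine-series solution of -u'' = f on (0,1) with u(0) = u(1) = 0,
    evaluated on n interior grid points (naive O(n * modes) transforms)."""
    h = 1.0 / (n + 1)
    xs = [(i + 1) * h for i in range(n)]
    u = [0.0] * n
    for k in range(1, modes + 1):
        # Sine coefficient of f by the trapezoidal rule (boundary terms vanish).
        fk = 2.0 * h * sum(f(x) * math.sin(k * math.pi * x) for x in xs)
        uk = fk / (k * math.pi) ** 2      # invert -d^2/dx^2 mode by mode
        for i, x in enumerate(xs):
            u[i] += uk * math.sin(k * math.pi * x)
    return u

# With f = pi^2 sin(pi x), the exact solution sin(pi x) is recovered.
u = solve_poisson_spectral(lambda x: math.pi ** 2 * math.sin(math.pi * x), 63, 20)
```

    A production solver would replace the naive mode sums with fast sine transforms, which is where the O(N log N) scaling quoted above comes from.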

  3. On the Determination of Poisson Statistics for Haystack Radar Observations of Orbital Debris

    NASA Technical Reports Server (NTRS)

    Stokely, Christopher L.; Benbrook, James R.; Horstman, Matt

    2007-01-01

    A convenient and powerful method is used to determine if radar detections of orbital debris are observed according to Poisson statistics. This is done by analyzing the time interval between detection events. For Poisson statistics, the probability distribution of the time interval between events is shown to be an exponential distribution. This distribution is a special case of the Erlang distribution that is used in estimating traffic loads on telecommunication networks. Poisson statistics form the basis of many orbital debris models but the statistical basis of these models has not been clearly demonstrated empirically until now. Interestingly, during the fiscal year 2003 observations with the Haystack radar in a fixed staring mode, there are no statistically significant deviations observed from that expected with Poisson statistics, either independent or dependent of altitude or inclination. One would potentially expect some significant clustering of events in time as a result of satellite breakups, but the presence of Poisson statistics indicates that such debris disperse rapidly with respect to Haystack's very narrow radar beam. An exception to Poisson statistics is observed in the months following the intentional breakup of the Fengyun satellite in January 2007.
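
    The interval test described above can be sketched as follows (illustrative parameters, not Haystack data): compute the Kolmogorov-Smirnov distance between the empirical distribution of inter-detection times and the exponential distribution implied by Poisson statistics; clustered detections stand out clearly:

```python
import math
import random

random.seed(4)

def ks_exponential(intervals):
    """KS distance of the intervals from Exp(mean = sample mean)."""
    n = len(intervals)
    mean = sum(intervals) / n
    xs = sorted(intervals)
    d = 0.0
    for i, x in enumerate(xs):
        cdf = 1.0 - math.exp(-x / mean)
        d = max(d, abs(cdf - i / n), abs(cdf - (i + 1) / n))
    return d

# Poisson detections: exponential waiting times -> small KS distance.
poisson_gaps = [random.expovariate(1.0) for _ in range(2000)]

# Clustered detections (e.g. breakup debris crossing the beam in bursts):
# a mixture of short gaps and long quiet periods -> large KS distance.
clustered_gaps = [random.expovariate(20.0) if random.random() < 0.8
                  else random.expovariate(0.25) for _ in range(2000)]
```
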

  4. Spatio-temporal modelling of climate-sensitive disease risk: Towards an early warning system for dengue in Brazil

    NASA Astrophysics Data System (ADS)

    Lowe, Rachel; Bailey, Trevor C.; Stephenson, David B.; Graham, Richard J.; Coelho, Caio A. S.; Sá Carvalho, Marilia; Barcellos, Christovam

    2011-03-01

    This paper considers the potential for using seasonal climate forecasts in developing an early warning system for dengue fever epidemics in Brazil. In the first instance, a generalised linear model (GLM) is used to select climate and other covariates which are both readily available and prove significant in prediction of confirmed monthly dengue cases based on data collected across the whole of Brazil for the period January 2001 to December 2008 at the microregion level (typically consisting of one large city and several smaller municipalities). The covariates explored include temperature and precipitation data on a 2.5°×2.5° longitude-latitude grid with time lags relevant to dengue transmission, an El Niño Southern Oscillation index and other relevant socio-economic and environmental variables. A negative binomial model formulation is adopted in this model selection to allow for extra-Poisson variation (overdispersion) in the observed dengue counts caused by unknown/unobserved confounding factors and possible correlations in these effects in both time and space. Subsequently, the selected global model is refined in the context of the South East region of Brazil, where dengue predominates, by reverting to a Poisson framework and explicitly modelling the overdispersion through a combination of unstructured and spatio-temporal structured random effects. The resulting spatio-temporal hierarchical model (or GLMM—generalised linear mixed model) is implemented via a Bayesian framework using Markov Chain Monte Carlo (MCMC). Dengue predictions are found to be enhanced both spatially and temporally when using the GLMM and the Bayesian framework allows posterior predictive distributions for dengue cases to be derived, which can be useful for developing a dengue alert system. Using this model, we conclude that seasonal climate forecasts could have potential value in helping to predict dengue incidence months in advance of an epidemic in South East Brazil.
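
    The negative binomial formulation used in the model selection above can be motivated with a small simulation (illustrative parameters only): a Poisson count whose rate is itself gamma-distributed has variance mu + mu²/shape, reproducing the extra-Poisson variation that the random effects are meant to absorb:

```python
import math
import random

random.seed(5)

def poisson_sample(lam):
    # Knuth's method; adequate for the modest rates used here.
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

# Gamma-mixed Poisson = negative binomial: var = mu + mu^2 / shape,
# so over-dispersion grows as the gamma shape parameter shrinks.
mu, shape = 6.0, 3.0
draws = [poisson_sample(random.gammavariate(shape, mu / shape))
         for _ in range(20000)]
m = sum(draws) / len(draws)
v = sum((d - m) ** 2 for d in draws) / len(draws)
predicted_var = mu + mu * mu / shape   # 6 + 12 = 18, well above the mean
```
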

  5. Extended generalized geometry and a DBI-type effective action for branes ending on branes

    NASA Astrophysics Data System (ADS)

    Jurčo, Branislav; Schupp, Peter; Vysoký, Jan

    2014-08-01

    Starting from the Nambu-Goto bosonic membrane action, we develop a geometric description suitable for p-brane backgrounds. With tools of generalized geometry we derive the pertinent generalization of the string open-closed relations to the p-brane case. Nambu-Poisson structures are used in this context to generalize the concept of semi-classical noncommutativity of D-branes governed by a Poisson tensor. We find a natural description of the correspondence of recently proposed commutative and noncommutative versions of an effective action for p-branes ending on a p'-brane. We calculate the power series expansion of the action in background independent gauge. Leading terms in the double scaling limit are given by a generalization of a (semi-classical) matrix model.

  6. Simulation Methods for Poisson Processes in Nonstationary Systems.

    DTIC Science & Technology

    1978-08-01

    A new and relatively efficient method for simulation of one-dimensional and two-dimensional nonhomogeneous Poisson processes is described. An algorithm for simulation of nonhomogeneous Poisson processes with a log-linear rate function is stated. The method is based on an identity relating the...
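The report's full text is not reproduced here, but the standard Lewis-Shedler thinning approach for a nonhomogeneous Poisson process with a log-linear rate can be sketched as follows. The parameter values (a, b, t_end) are hypothetical illustrations, not taken from the report:

```python
import math
import numpy as np

def sample_nhpp(rate, rate_max, t_end, rng):
    """Simulate one path of a nonhomogeneous Poisson process on [0, t_end]
    by thinning a homogeneous process of rate rate_max >= rate(t)."""
    events, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / rate_max)      # next candidate point
        if t > t_end:
            return events
        if rng.random() < rate(t) / rate_max:     # keep with prob rate(t)/rate_max
            events.append(t)

# Log-linear rate lambda(t) = exp(a + b t); values here are illustrative.
a, b, t_end = 1.0, 0.5, 4.0
rate = lambda t: math.exp(a + b * t)
rate_max = math.exp(a + b * t_end)                # rate is increasing on [0, t_end]

rng = np.random.default_rng(0)
counts = [len(sample_nhpp(rate, rate_max, t_end, rng)) for _ in range(400)]
avg = float(np.mean(counts))

# The mean event count equals the integrated rate: (e^{a+b*t_end} - e^a) / b
expected = (math.exp(a + b * t_end) - math.exp(a)) / b
```

Thinning is exact but can be wasteful when rate(t) is far below rate_max over most of the interval; the report's identity-based method addresses efficiency for this rate family.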

  7. Poisson geometry from a Dirac perspective

    NASA Astrophysics Data System (ADS)

    Meinrenken, Eckhard

    2018-03-01

    We present proofs of classical results in Poisson geometry using techniques from Dirac geometry. This article is based on mini-courses at the Poisson summer school in Geneva, June 2016, and at the workshop Quantum Groups and Gravity at the University of Waterloo, April 2016.

  8. Identification of a Class of Filtered Poisson Processes.

    DTIC Science & Technology

    1981-01-01

    AD-A135 371. Identification of a Class of Filtered Poisson Processes. De Brucq, Denis; Gualtierotti, Antonio. North Carolina Univ. at Chapel Hill, Dept. of Statistics, 1981. A class of filtered Poisson processes is introduced: the amplitude has a law which is spherically invariant and the filter is real, linear and causal. It is shown how such a model can be identified from experimental data.

  9. Interactive Graphic Simulation of Rolling Element Bearings. Phase I. Low Frequency Phenomenon and RAPIDREB Development.

    DTIC Science & Technology

    1981-11-01

    [OCR residue from the RAPIDREB Fortran input-card listing; recoverable fields: HOUSING ELASTIC MODULUS (F/L**2), HOUSING POISSON'S RATIO, HOUSING MATERIAL DENSITY (MA/L**3); CAGE POISSON'S RATIO, CAGE MATERIAL DENSITY (MA/L**3).]

  10. Cumulative Poisson Distribution Program

    NASA Technical Reports Server (NTRS)

    Bowerman, Paul N.; Scheuer, Ernest M.; Nolty, Robert

    1990-01-01

    Overflow and underflow in sums prevented. Cumulative Poisson Distribution Program, CUMPOIS, one of two computer programs that make calculations involving cumulative Poisson distributions. Both programs, CUMPOIS (NPO-17714) and NEWTPOIS (NPO-17715), used independently of one another. CUMPOIS determines cumulative Poisson distribution, used to evaluate cumulative distribution function (cdf) for gamma distributions with integer shape parameters and cdf for χ² (chi-squared) distributions with even degrees of freedom. Used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. Written in C.
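The identities CUMPOIS exploits are standard: the cumulative Poisson probability equals a regularized upper incomplete gamma function, and hence a chi-squared tail probability. A minimal check (values of mu and k are arbitrary illustrations):

```python
from math import exp, factorial
from scipy.special import gammaincc
from scipy.stats import chi2

mu, k = 7.3, 4                           # Poisson mean and cutoff (illustrative)

# Direct sum of the Poisson pmf up to k
cdf = sum(exp(-mu) * mu**i / factorial(i) for i in range(k + 1))

# Same value via the regularized upper incomplete gamma function, i.e. the
# survival function of a Gamma(k+1, 1) variable evaluated at mu ...
cdf_gamma = gammaincc(k + 1, mu)

# ... and via a chi-squared tail with 2(k+1) degrees of freedom at 2*mu
cdf_chi2 = chi2.sf(2 * mu, 2 * (k + 1))
```

For large mu the naive sum over/underflows, which is why a dedicated routine such as CUMPOIS works through the incomplete gamma function instead.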

  11. Poly-symplectic Groupoids and Poly-Poisson Structures

    NASA Astrophysics Data System (ADS)

    Martinez, Nicolas

    2015-05-01

    We introduce poly-symplectic groupoids, which are natural extensions of symplectic groupoids to the context of poly-symplectic geometry, and define poly-Poisson structures as their infinitesimal counterparts. We present equivalent descriptions of poly-Poisson structures, including one related with AV-Dirac structures. We also discuss symmetries and reduction in the setting of poly-symplectic groupoids and poly-Poisson structures, and use our viewpoint to revisit results and develop new aspects of the theory initiated in Iglesias et al. (Lett Math Phys 103:1103-1133, 2013).

  12. Trends in Mortality After Primary Cytoreductive Surgery for Ovarian Cancer: A Systematic Review and Metaregression of Randomized Clinical Trials and Observational Studies.

    PubMed

    Di Donato, Violante; Kontopantelis, Evangelos; Aletti, Giovanni; Casorelli, Assunta; Piacenti, Ilaria; Bogani, Giorgio; Lecce, Francesca; Benedetti Panici, Pierluigi

    2017-06-01

    Primary cytoreductive surgery (PDS) followed by platinum-based chemotherapy is the cornerstone of treatment and the absence of residual tumor after PDS is universally considered the most important prognostic factor. The aim of the present analysis was to evaluate trend and predictors of 30-day mortality in patients undergoing primary cytoreduction for ovarian cancer. Literature was searched for records reporting 30-day mortality after PDS. All cohorts were rated for quality. Simple and multiple Poisson regression models were used to quantify the association between 30-day mortality and the following: overall or severe complications, proportion of patients with stage IV disease, median age, year of publication, and weighted surgical complexity index. Using the multiple regression model, we calculated the risk of perioperative mortality at different levels for statistically significant covariates of interest. Simple regression identified median age and proportion of patients with stage IV disease as statistically significant predictors of 30-day mortality. When included in the multiple Poisson regression model, both remained statistically significant, with an incidence rate ratio of 1.087 for median age and 1.017 for stage IV disease. Disease stage was a strong predictor, with the risk estimated to increase from 2.8% (95% confidence interval 2.02-3.66) for stage III to 16.1% (95% confidence interval 6.18-25.93) for stage IV, for a cohort with a median age of 65 years. Metaregression demonstrated that increased age and advanced clinical stage were independently associated with an increased risk of mortality, and the combined effects of both factors greatly increased the risk.

  13. Fractional poisson--a simple dose-response model for human norovirus.

    PubMed

    Messner, Michael J; Berger, Philip; Nappier, Sharon P

    2014-10-01

    This study utilizes old and new Norovirus (NoV) human challenge data to model the dose-response relationship for human NoV infection. The combined data set is used to update estimates from a previously published beta-Poisson dose-response model that includes parameters for virus aggregation and for a beta-distribution that describes variable susceptibility among hosts. The quality of the beta-Poisson model is examined and a simpler model is proposed. The new model (fractional Poisson) characterizes hosts as either perfectly susceptible or perfectly immune, requiring a single parameter (the fraction of perfectly susceptible hosts) in place of the two-parameter beta-distribution. A second parameter is included to account for virus aggregation in the same fashion as it is added to the beta-Poisson model. Infection probability is simply the product of the probability of nonzero exposure (at least one virus or aggregate is ingested) and the fraction of susceptible hosts. The model is computationally simple and appears to be well suited to the data from the NoV human challenge studies. The model's deviance is similar to that of the beta-Poisson, but with one parameter, rather than two. As a result, the Akaike information criterion favors the fractional Poisson over the beta-Poisson model. At low, environmentally relevant exposure levels (<100), estimation error is small for the fractional Poisson model; however, caution is advised because no subjects were challenged at such a low dose. New low-dose data would be of great value to further clarify the NoV dose-response relationship and to support improved risk assessment for environmentally relevant exposures. © 2014 Society for Risk Analysis Published 2014. This article is a U.S. Government work and is in the public domain for the U.S.A.
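Ignoring the aggregation parameter, the fractional Poisson dose-response described above reduces to a one-parameter formula: infection probability is the susceptible fraction times the probability of ingesting at least one virus. A hedged sketch; the susceptible fraction 0.72 and the doses are hypothetical, not the paper's estimates:

```python
import math

def fractional_poisson(mean_dose, f_susceptible):
    """P(infection) = (fraction of fully susceptible hosts) x
    P(ingesting at least one organism), with exposure ~ Poisson(mean_dose).
    Aggregation is ignored here; the paper adds a parameter for it."""
    return f_susceptible * (1.0 - math.exp(-mean_dose))

# Low-dose behaviour: P ~= f * mean_dose, i.e. approximately linear in dose
p_low = fractional_poisson(0.01, 0.72)
# High dose: P saturates at the susceptible fraction f, never at 1
p_high = fractional_poisson(1e4, 0.72)
```

The saturation below 1 is the model's signature: hosts are either perfectly susceptible or perfectly immune, so even unbounded doses infect only the fraction f.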

  14. Modeling animal-vehicle collisions using diagonal inflated bivariate Poisson regression.

    PubMed

    Lao, Yunteng; Wu, Yao-Jan; Corey, Jonathan; Wang, Yinhai

    2011-01-01

    Two types of animal-vehicle collision (AVC) data are commonly adopted for AVC-related risk analysis research: reported AVC data and carcass removal data. One issue with these two data sets is that they were found to have significant discrepancies by previous studies. In order to model these two types of data together and provide a better understanding of highway AVCs, this study adopts a diagonal inflated bivariate Poisson regression method, an inflated version of bivariate Poisson regression model, to fit the reported AVC and carcass removal data sets collected in Washington State during 2002-2006. The diagonal inflated bivariate Poisson model not only can model paired data with correlation, but also handle under- or over-dispersed data sets as well. Compared with three other types of models, double Poisson, bivariate Poisson, and zero-inflated double Poisson, the diagonal inflated bivariate Poisson model demonstrates its capability of fitting two data sets with remarkable overlapping portions resulting from the same stochastic process. Therefore, the diagonal inflated bivariate Poisson model provides researchers a new approach to investigating AVCs from a different perspective involving the three distribution parameters (λ(1), λ(2) and λ(3)). The modeling results show the impacts of traffic elements, geometric design and geographic characteristics on the occurrences of both reported AVC and carcass removal data. It is found that the increase of some associated factors, such as speed limit, annual average daily traffic, and shoulder width, will increase the numbers of reported AVCs and carcass removals. Conversely, the presence of some geometric factors, such as rolling and mountainous terrain, will decrease the number of reported AVCs. Published by Elsevier Ltd.
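The three-parameter structure (λ1, λ2, λ3) mentioned above comes from the trivariate-reduction construction of the bivariate Poisson, which can be simulated in a few lines. The rates below are arbitrary illustrations, and this sketch omits the diagonal inflation component of the paper's model:

```python
import numpy as np

rng = np.random.default_rng(7)
lam1, lam2, lam3 = 3.0, 2.0, 1.5        # hypothetical rates
n = 200_000

# Trivariate reduction: X1 = A + C and X2 = B + C share the common
# component C, which induces Cov(X1, X2) = lam3.
a = rng.poisson(lam1, n)
b = rng.poisson(lam2, n)
c = rng.poisson(lam3, n)
x1, x2 = a + c, b + c

cov = np.cov(x1, x2)[0, 1]              # should be close to lam3
```

The shared component C is what lets the model capture the overlap between reported-AVC and carcass-removal counts generated by the same underlying collisions.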

  15. A constitutive law for degrading bioresorbable polymers.

    PubMed

    Samami, Hassan; Pan, Jingzhe

    2016-06-01

    This paper presents a constitutive law that predicts the changes in elastic moduli, Poisson's ratio and ultimate tensile strength of bioresorbable polymers due to biodegradation. During biodegradation, long polymer chains are cleaved by hydrolysis reaction. For semi-crystalline polymers, the chain scissions also lead to crystallisation. Treating each scission as a cavity and each new crystal as a solid inclusion, a degrading semi-crystalline polymer can be modelled as a continuum solid containing randomly distributed cavities and crystal inclusions. The effective elastic properties of a degrading polymer are calculated using existing theories for such solid and the tensile strength of the degrading polymer is predicted using scaling relations that were developed for porous materials. The theoretical model for elastic properties and the scaling law for strength form a complete constitutive relation for the degrading polymers. It is shown that the constitutive law can capture the trend of the experimental data in the literature for a range of biodegradable polymers fairly well. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Spatial modeling of cutaneous leishmaniasis in the Andean region of Colombia

    PubMed Central

    Pérez-Flórez, Mauricio; Ocampo, Clara Beatriz; Valderrama-Ardila, Carlos; Alexander, Neal

    2016-01-01

    The objective of this research was to identify environmental risk factors for cutaneous leishmaniasis (CL) in Colombia and map high-risk municipalities. The study area was the Colombian Andean region, comprising 715 rural and urban municipalities. We used 10 years of CL surveillance: 2000-2009. We used spatial-temporal analysis - conditional autoregressive Poisson random effects modelling - in a Bayesian framework to model the dependence of municipality-level incidence on land use, climate, elevation and population density. Bivariable spatial analysis identified rainforests, forests and secondary vegetation, temperature, and annual precipitation as positively associated with CL incidence. By contrast, livestock agroecosystems and temperature seasonality were negatively associated. Multivariable analysis identified land use - rainforests and agro-livestock - and climate - temperature, rainfall and temperature seasonality - as best predictors of CL. We conclude that climate and land use can be used to identify areas at high risk of CL and that this approach is potentially applicable elsewhere in Latin America. PMID:27355214

  17. Identification d’une Classe de Processus de Poisson Filtres (Identification of a Class of Filtered Poisson Processes).

    DTIC Science & Technology

    1983-05-20

    A class of filtered Poisson processes is introduced: the amplitude has a law which is spherically invariant and the filter is real, linear and causal. It is shown how such a model can be identified from experimental data. (Author)

  18. Does progressive resistance and balance exercise reduce falls in residential aged care? Randomized controlled trial protocol for the SUNBEAM program

    PubMed Central

    Hewitt, Jennifer; Refshauge, Kathryn M; Goodall, Stephen; Henwood, Timothy; Clemson, Lindy

    2014-01-01

    Introduction Falls are common among older adults. It is reported that approximately 60% of residents of aged care facilities fall each year. This is a major cause of morbidity and mortality, and a significant burden for health care providers and the health system. Among community dwelling older adults, exercise appears to be an effective countermeasure, but data are limited and inconsistent among studies in residents of aged care communities. This trial has been designed to evaluate whether the SUNBEAM program (Strength and Balance Exercise in Aged Care) reduces falls in residents of aged care facilities. Research question Is the program more effective and cost-effective than usual care for the prevention of falls? Design Single-blinded, two group, cluster randomized trial. Participants and setting 300 residents, living in 20 aged care facilities. Intervention Progressive resistance and balance training under the guidance of a physiotherapist for 6 months, then facility-guided maintenance training for 6 months. Control Usual care. Measurements Number of falls, number of fallers, quality of life, mobility, balance, fear of falling, cognitive well-being, resource use, and cost-effectiveness. Measurements will be taken at baseline, 6 months, and 12 months. Analysis The number of falls will be analyzed using a Poisson mixed model. A logistic mixed model will be used to analyze the number of residents who fall during the study period. Intention-to-treat analysis will be used. Discussion This study addresses a significant shortcoming in aged care research, and has potential to impact upon a substantial health care problem. Outcomes will be used to inform care providers, and guide health care policies. PMID:24591821
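Why a Poisson *mixed* model rather than a plain Poisson regression for fall counts? Residents within a facility share unmeasured risk factors, so counts are clustered. A simulation sketch loosely following the trial's design (20 facilities; all numbers below, including the assumed rate ratio of 0.7 and the frailty spread, are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(123)

# 20 facilities, 15 residents each, with a facility-level log-normal frailty
# standing in for the random intercept of a Poisson mixed model.
n_fac, n_res = 20, 15
base_rate, irr = 2.0, 0.7               # falls/year; assumed treatment effect
arm = np.repeat([0, 1], n_fac // 2)     # 10 control, 10 intervention facilities
frailty = rng.lognormal(0.0, 0.3, n_fac)

rates = base_rate * frailty * np.where(arm == 1, irr, 1.0)
falls = rng.poisson(np.repeat(rates, n_res))   # one count per resident

# Crude marginal rate ratio; a Poisson mixed model would additionally
# account for the facility-level clustering when computing standard errors.
crude_irr = falls[arm.repeat(n_res) == 1].mean() / falls[arm.repeat(n_res) == 0].mean()
```

With only 20 clusters, the facility frailty makes the crude estimate noisy around the true 0.7, which is precisely why the protocol specifies a mixed model with intention-to-treat analysis.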

  19. Slow-release L-Cysteine (Acetium®) Lozenge Is an Effective New Method in Smoking Cessation. A Randomized, Double-blind, Placebo-controlled Intervention.

    PubMed

    Syrjänen, Kari; Eronen, Katja; Hendolin, Panu; Paloheimo, Lea; Eklund, Carita; Bäckström, Anna; Suovaniemi, Osmo

    2017-07-01

    Because of the major health problems and annual economic burden caused by cigarette smoking, effective new tools for smoking intervention are urgently needed. Our previous randomized controlled trial (RCT) provided promising results on the efficacy of a slow-release L-cysteine lozenge in smoking intervention, but the study was not adequately powered. To confirm in an adequately powered study the results of the previous RCT implicating that effective elimination of acetaldehyde in saliva by slow-release L-cysteine (Acetium® lozenge, Biohit Oyj, Helsinki) would assist in smoking cessation by reducing acetaldehyde-enhanced nicotine addiction, we undertook a double-blind, randomized, placebo-controlled trial comparing the Acetium® lozenge and placebo in smoking intervention. A cohort of 1,998 cigarette smokers was randomly allocated to the intervention (n=996) and placebo arms (n=1,002). At baseline, smoking history was recorded by questionnaire, with nicotine dependence tested according to the Fagerström scale (FTND). The subjects kept a smoking diary recording the daily numbers of cigarettes and lozenges and their subjective sensations of smoking. The data were analysed separately for the point prevalence of abstinence (PPA) and prolonged abstinence (PA) endpoints. Altogether, 753 study subjects completed the trial per protocol (PP), 944 completed it with violations (mITT), and the rest (n=301) were lost to follow-up (LTF). During the 6-month intervention, 331 subjects stopped smoking: 181 (18.2%) in the intervention arm and 150 (15.0%) in the placebo arm (OR=1.43; 95%CI=1.09-1.88; p=0.010). In the PP group, 170 (45.3%) quit smoking in the intervention arm compared to 134 (35.4%) in the placebo arm (OR=1.51; 95%CI=1.12-2.02; p=0.006). In a multivariate (Poisson regression) model, a decreased level of smoking pleasure (p=0.010) and "smoking sensations changed" were powerful independent predictors of quit events (IRR=12.01; 95%CI=1.5-95.6). 
The Acetium® lozenge, herein confirmed in an adequately powered study to be an effective aid to smoking cessation, represents a major breakthrough in the development of smoking intervention methods, because slow-release L-cysteine is non-toxic, with no side-effects or limitations of use. Copyright© 2017, International Institute of Anticancer Research (Dr. George J. Delinasios), All rights reserved.

  20. Poisson pre-processing of nonstationary photonic signals: Signals with equality between mean and variance.

    PubMed

    Poplová, Michaela; Sovka, Pavel; Cifra, Michal

    2017-01-01

    Photonic signals are broadly exploited in communication and sensing and they typically exhibit Poisson-like statistics. In a common scenario where the intensity of the photonic signals is low and one needs to remove a nonstationary trend of the signals for any further analysis, one faces an obstacle: due to the dependence between the mean and variance typical for a Poisson-like process, information about the trend remains in the variance even after the trend has been subtracted, possibly yielding artifactual results in further analyses. Commonly available detrending or normalizing methods cannot cope with this issue. To alleviate this issue we developed a suitable pre-processing method for the signals that originate from a Poisson-like process. In this paper, a Poisson pre-processing method for nonstationary time series with Poisson distribution is developed and tested on computer-generated model data and experimental data of chemiluminescence from human neutrophils and mung seeds. The presented method transforms a nonstationary Poisson signal into a stationary signal with a Poisson distribution while preserving the type of photocount distribution and phase-space structure of the signal. The importance of the suggested pre-processing method is shown in Fano factor and Hurst exponent analysis of both computer-generated model signals and experimental photonic signals. It is demonstrated that our pre-processing method is superior to standard detrending-based methods whenever further signal analysis is sensitive to variance of the signal.
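The obstacle the paper describes, that subtracting a trend from a Poisson-like signal leaves the trend hidden in the variance, is easy to demonstrate. A small simulation (the ramp from 2 to 20 counts/bin is an arbitrary illustration, not the paper's chemiluminescence data):

```python
import numpy as np

rng = np.random.default_rng(1)

# Nonstationary Poisson signal: rate ramps linearly from 2 to 20 counts/bin.
n = 20_000
trend = np.linspace(2.0, 20.0, n)
signal = rng.poisson(trend).astype(float)

# Naive detrending: subtract the (here, exactly known) trend. The mean of
# the residual is now flat, but because Var = mean for a Poisson process,
# the trend survives in the residual's variance.
resid = signal - trend
var_lo = resid[: n // 2].var()           # ~ mean trend in the low half  (~6.5)
var_hi = resid[n // 2 :].var()           # ~ mean trend in the high half (~15.5)
```

Any downstream statistic sensitive to variance (e.g. the Fano factor or Hurst exponent analyses mentioned above) would misread this residual nonstationarity, which is the gap the paper's Poisson pre-processing method is designed to close.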
