#### Sample records for age Poisson regression

1. Understanding Poisson regression.

PubMed

Hayat, Matthew J; Higgins, Melinda

2014-04-01

Nurse investigators often collect study data in the form of counts. Traditional methods of data analysis have historically approached analysis of count data either as if the count data were continuous and normally distributed or with dichotomization of the counts into the categories of occurred or did not occur. These outdated methods have been replaced with more appropriate statistical methods that make use of the Poisson probability distribution. The purpose of this article is to provide an overview of the Poisson distribution and its use in Poisson regression. Assumption violations for the standard Poisson regression model are addressed with alternative approaches, including addition of an overdispersion parameter or negative binomial regression. An illustrative example is presented with an application from the ENSPIRE study, and regression modeling of comorbidity data is included for illustrative purposes.

2. Estimation of count data using mixed Poisson, generalized Poisson and finite Poisson mixture regression models

Zamani, Hossein; Faroughi, Pouya; Ismail, Noriszura

2014-06-01

This study relates the Poisson, mixed Poisson (MP), generalized Poisson (GP) and finite Poisson mixture (FPM) regression models through the mean-variance relationship, and suggests the application of these models for overdispersed count data. As an illustration, the regression models are fitted to the US skin care count data. The results indicate that the FPM regression model is the best model since it provides the largest log-likelihood and the smallest AIC, followed by the Poisson-inverse Gaussian (PIG), GP and negative binomial (NB) regression models. The results also show that the NB, PIG and GP regression models provide similar results.

3. Modelling of filariasis in East Java with Poisson regression and generalized Poisson regression models

Darnah

2016-04-01

Poisson regression is used when the response variable is count data that follow the Poisson distribution. The Poisson distribution assumes equidispersion, but in practice count data are often over- or under-dispersed. In that case Poisson regression is inappropriate because it may underestimate the standard errors and overstate the significance of the regression parameters, giving misleading inferences. This paper suggests the generalized Poisson regression model to handle over- and under-dispersion in the Poisson regression model. The Poisson regression model and the generalized Poisson regression model are applied to the number of filariasis cases in East Java. Based on the Poisson regression model, the factors that influence filariasis are the percentage of families who do not practice clean and healthy living and the percentage of families who do not have a healthy house. The Poisson regression model exhibits overdispersion, so generalized Poisson regression is used instead. The best generalized Poisson regression model shows that the factor influencing filariasis is the percentage of families who do not have a healthy house: each additional percentage point of families without a healthy house adds one filariasis patient.

4. Poisson Mixture Regression Models for Heart Disease Prediction.

PubMed

Mufudza, Chipo; Erol, Hamza

2016-01-01

Early heart disease control can be achieved by efficient disease prediction and diagnosis. This paper focuses on the use of model-based clustering techniques to predict and diagnose heart disease via Poisson mixture regression models. Analysis and application of Poisson mixture regression models are addressed under two different classes: standard and concomitant variable mixture regression models. Results show that a two-component concomitant variable Poisson mixture regression model predicts heart disease better than both the standard Poisson mixture regression model and the ordinary generalized linear Poisson regression model, owing to its low Bayesian Information Criterion value. Furthermore, a zero-inflated Poisson mixture regression model turned out to be the best model overall for heart disease prediction, as it both clusters individuals into high- or low-risk categories and predicts the rate of heart disease componentwise given the available clusters. It is deduced that heart disease prediction can be done effectively by identifying the major risks componentwise using a Poisson mixture regression model.

5. Poisson Regression Analysis of Illness and Injury Surveillance Data

SciTech Connect

Frome E.L., Watkins J.P., Ellis E.D.

2012-12-12

The Department of Energy (DOE) uses illness and injury surveillance to monitor morbidity and assess the overall health of the work force. Data collected from each participating site include health events and a roster file with demographic information. The source data files are maintained in a relational database and are used to obtain stratified tables of health event counts and person time at risk that serve as the starting point for Poisson regression analysis. The explanatory variables that define these tables are age, gender, occupational group, and time. Typical response variables of interest are the number of absences due to illness or injury, i.e., the response variable is a count. Poisson regression methods are used to describe the effect of the explanatory variables on the health event rates using a log-linear main effects model. Results of fitting the main effects model are summarized in tabular and graphical form, and interpretation of model parameters is provided. An analysis of deviance table is used to evaluate the importance of each of the explanatory variables on the event rate of interest and to determine whether interaction terms should be considered in the analysis. Although Poisson regression methods are widely used in the analysis of count data, there are situations in which over-dispersion occurs. This could be due to lack-of-fit of the regression model, extra-Poisson variation, or both. A score test statistic and regression diagnostics are used to identify over-dispersion. A quasi-likelihood method of moments procedure is used to evaluate and adjust for extra-Poisson variation when necessary. Two examples are presented using respiratory disease absence rates at two DOE sites to illustrate the methods and interpretation of the results. In the first example the Poisson main effects model is adequate. In the second example the score test indicates considerable over-dispersion, and a more detailed analysis attributes the over-dispersion to extra-Poisson variation.

6. Effect of Nutritional Habits on Dental Caries in Permanent Dentition among Schoolchildren Aged 10–12 Years: A Zero-Inflated Generalized Poisson Regression Model Approach

PubMed Central

2016-01-01

Background: The aim of this study was to assess the associations between nutrition and dental caries in permanent dentition among schoolchildren. Methods: A cross-sectional survey was undertaken on 698 schoolchildren aged 10 to 12 yr from a random sample of primary schools in Kermanshah, western Iran, in 2014. The study was based on data obtained from a questionnaire containing information on nutritional habits and the outcome of the decayed/missing/filled teeth (DMFT) index. The association between predictors and dental caries was modeled using the zero-inflated generalized Poisson (ZIGP) regression model. Results: Fourteen percent of the children were caries free. The model showed that the odds of girls being in the caries-susceptible subgroup were 1.23 (95% CI: 1.08–1.51) times those of boys (P=0.041). Additionally, the mean caries count in children who consumed fizzy soft beverages or sweet biscuits more than once daily was, respectively, 1.41 (95% CI: 1.19–1.63) and 1.27 (95% CI: 1.18–1.37) times that of children who consumed them less than three times a week or never. Conclusions: Girls were at a higher risk of caries than boys. Since our study showed that nutritional status may have a significant effect on caries in permanent teeth, we recommend that health promotion activities in schools emphasize healthful eating practices, especially limiting sugar-containing beverages to only occasional consumption between meals. PMID:27141498

7. Background stratified Poisson regression analysis of cohort data.

PubMed

Richardson, David B; Langholz, Bryan

2012-03-01

Background stratified Poisson regression is an approach that has been used in the analysis of data derived from a variety of epidemiologically important studies of radiation-exposed populations, including uranium miners, nuclear industry workers, and atomic bomb survivors. We describe a novel approach to fit Poisson regression models that adjust for a set of covariates through background stratification while directly estimating the radiation-disease association of primary interest. The approach makes use of an expression for the Poisson likelihood that treats the coefficients for stratum-specific indicator variables as 'nuisance' variables and avoids the need to explicitly estimate the coefficients for these stratum-specific parameters. Log-linear models, as well as other general relative rate models, are accommodated. This approach is illustrated using data from the Life Span Study of Japanese atomic bomb survivors and data from a study of underground uranium miners. The point estimate and confidence interval obtained from this 'conditional' regression approach are identical to the values obtained using unconditional Poisson regression with model terms for each background stratum. Moreover, it is shown that the proposed approach allows estimation of background stratified Poisson regression models of non-standard form, such as models that parameterize latency effects, as well as regression models in which the number of strata is large, thereby overcoming the limitations of previously available statistical software for fitting background stratified Poisson regression models.
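The equivalence claimed here (conditional estimates identical to an unconditional fit with stratum indicator terms) can be illustrated from the unconditional side. This simulated sketch fits the stratum dummies explicitly, which is exactly the bookkeeping the conditional approach avoids; all names and parameters are illustrative:

```python
# Hedged sketch: background-stratified Poisson regression fitted unconditionally,
# i.e. with explicit stratum indicator variables (simulated data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(21)
n = 2000
stratum = rng.integers(0, 5, size=n)                  # background strata
dose = rng.uniform(0, 4, size=n)                      # exposure of primary interest
base = np.array([0.5, 0.8, 1.0, 1.3, 1.6])[stratum]   # stratum-specific baselines
y = rng.poisson(base * np.exp(0.3 * dose))

df = pd.DataFrame({"y": y, "dose": dose, "stratum": stratum})
fit = smf.poisson("y ~ dose + C(stratum)", data=df).fit(disp=False)
print(fit.params["dose"])    # dose effect, near 0.3
```

With many strata this dummy-variable fit becomes unwieldy, which is the limitation the paper's conditional formulation removes while returning the identical dose estimate.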

8. Modeling the number of car theft using Poisson regression

Zulkifli, Malina; Ling, Agnes Beh Yen; Kasim, Maznah Mat; Ismail, Noriszura

2016-10-01

Regression analysis is among the most popular statistical methods used to express the relationship between a response variable and covariates. The aim of this paper is to evaluate the factors that influence the number of car thefts using a Poisson regression model, focusing on car thefts that occurred in districts of Peninsular Malaysia. Two groups of factors are considered, namely district descriptive factors and socio-demographic factors. The results show that Bumiputera composition, Chinese composition, other ethnic composition, foreign migration, the number of residents aged 25 to 64, the number of employed persons and the number of unemployed persons are the factors that most influence car theft cases. This information is very useful for law enforcement departments, insurance companies and car owners in order to reduce and limit car theft cases in Peninsular Malaysia.

9. Regression models for mixed Poisson and continuous longitudinal data.

PubMed

Yang, Ying; Kang, Jian; Mao, Kai; Zhang, Jie

2007-09-10

In this article we develop regression models that are flexible in two respects: they evaluate the influence of covariates on mixed Poisson and continuous responses, and they evaluate how the correlation between the Poisson response and the continuous response changes over time. A scenario is proposed for regression models of mixed continuous and Poisson responses when heterogeneous variance and correlation change over time. Our general approach is first to build a joint marginal model and to check whether the variance and correlation change over time via a likelihood ratio test. If they do, we apply a suitable data transformation to properly evaluate the influence of the covariates on the mixed responses. The proposed methods are applied to the Interstitial Cystitis Data Base (ICDB) cohort study, where we find that the positive correlations change significantly over time, suggesting that heterogeneous variances should not be ignored in modelling and inference.

10. Testing approaches for overdispersion in Poisson regression versus the generalized Poisson model.

PubMed

Yang, Zhao; Hardin, James W; Addy, Cheryl L; Vuong, Quang H

2007-08-01

Overdispersion is a common phenomenon in Poisson modeling, and the negative binomial (NB) model is frequently used to account for it. Testing approaches (Wald test, likelihood ratio test (LRT), and score test) for overdispersion in Poisson regression versus the NB model are available. Because the generalized Poisson (GP) model is similar to the NB model, we consider it as an alternative model for overdispersed count data. The score test has an advantage over the LRT and the Wald test in that it only requires the parameter of interest to be estimated under the null hypothesis. This paper proposes a score test for overdispersion based on the GP model and compares the power of the test with the LRT and Wald tests. A simulation study indicates that the score test based on the asymptotic standard normal distribution is more appropriate in practice because of its higher empirical power; however, it underestimates the nominal significance level, especially in small samples. Examples illustrate the results of comparing the candidate tests between the Poisson and GP models. A bootstrap test is also proposed to adjust for the underestimation of the nominal level by the score statistic when the sample size is small. The simulation study indicates that the bootstrap test has a significance level closer to the nominal size and has uniformly greater power than the score test based on the asymptotic standard normal distribution. From a practical perspective, we suggest that if the score test gives even a weak indication that the Poisson model is inappropriate, say at the 0.10 significance level, the more accurate bootstrap procedure should be used to test whether the GP model is more appropriate than the Poisson model. Finally, the Vuong test is illustrated as a way to choose between the GP and NB2 models for the same dataset.

11. Reducing Poisson noise and baseline drift in X-ray spectral images with bootstrap Poisson regression and robust nonparametric regression.

PubMed

Zhu, Feng; Qin, Binjie; Feng, Weiyue; Wang, Huajian; Huang, Shaosen; Lv, Yisong; Chen, Yong

2013-03-21

X-ray spectral imaging provides quantitative imaging of trace elements in a biological sample with high sensitivity. We propose a novel algorithm to improve the signal-to-noise ratio (SNR) of X-ray spectral images that have low photon counts. First, we estimate the image data area that belongs to the homogeneous parts through confidence interval testing. Then, we apply Poisson regression through its maximum likelihood estimation on this area to estimate the true photon counts from the Poisson noise corrupted data. Unlike other denoising methods based on regression analysis, we use the bootstrap resampling method to ensure the accuracy of the regression estimation. Finally, we use a robust local nonparametric regression method to estimate the baseline and subsequently subtract it from the X-ray spectral data to further improve the SNR. Experiments on several real samples show that the proposed method performs better than some state-of-the-art approaches in ensuring accuracy and precision for quantitative analysis of the different trace elements in a standard reference biological sample.

12. Mixed-effects Poisson regression analysis of adverse event reports

PubMed Central

Gibbons, Robert D.; Segawa, Eisuke; Karabatsos, George; Amatya, Anup K.; Bhaumik, Dulal K.; Brown, C. Hendricks; Kapur, Kush; Marcus, Sue M.; Hur, Kwan; Mann, J. John

2008-01-01

SUMMARY A new statistical methodology is developed for the analysis of spontaneous adverse event (AE) reports from post-marketing drug surveillance data. The method involves both empirical Bayes (EB) and fully Bayes estimation of rate multipliers for each drug within a class of drugs, for a particular AE, based on a mixed-effects Poisson regression model. Both parametric and semiparametric models for the random-effect distribution are examined. The method is applied to data from Food and Drug Administration (FDA)’s Adverse Event Reporting System (AERS) on the relationship between antidepressants and suicide. We obtain point estimates and 95 per cent confidence (posterior) intervals for the rate multiplier for each drug (e.g. antidepressants), which can be used to determine whether a particular drug has an increased risk of association with a particular AE (e.g. suicide). Confidence (posterior) intervals that do not include 1.0 provide evidence for either significant protective or harmful associations of the drug and the adverse effect. We also examine EB, parametric Bayes, and semiparametric Bayes estimators of the rate multipliers and associated confidence (posterior) intervals. Results of our analysis of the FDA AERS data revealed that newer antidepressants are associated with lower rates of suicide adverse event reports compared with older antidepressants. We recommend improvements to the existing AERS system, which are likely to improve its public health value as an early warning system. PMID:18404622

13. Performance of the modified Poisson regression approach for estimating relative risks from clustered prospective data.

PubMed

Yelland, Lisa N; Salter, Amy B; Ryan, Philip

2011-10-15

Modified Poisson regression, which combines a log Poisson regression model with robust variance estimation, is a useful alternative to log binomial regression for estimating relative risks. Previous studies have shown both analytically and by simulation that modified Poisson regression is appropriate for independent prospective data. This method is often applied to clustered prospective data, despite a lack of evidence to support its use in this setting. The purpose of this article is to evaluate the performance of the modified Poisson regression approach for estimating relative risks from clustered prospective data, by using generalized estimating equations to account for clustering. A simulation study is conducted to compare log binomial regression and modified Poisson regression for analyzing clustered data from intervention and observational studies. Both methods generally perform well in terms of bias, type I error, and coverage. Unlike log binomial regression, modified Poisson regression is not prone to convergence problems. The methods are contrasted by using example data sets from 2 large studies. The results presented in this article support the use of modified Poisson regression as an alternative to log binomial regression for analyzing clustered prospective data when clustering is taken into account by using generalized estimating equations.

14. A comparison between Poisson and zero-inflated Poisson regression models with an application to number of black spots in Corriedale sheep.

PubMed

Naya, Hugo; Urioste, Jorge I; Chang, Yu-Mei; Rodrigues-Motta, Mariana; Kremer, Roberto; Gianola, Daniel

2008-01-01

Dark spots in the fleece area are often associated with dark fibres in wool, which limits its competitiveness with other textile fibres. Field data from a sheep experiment in Uruguay revealed an excess number of zeros for dark spots. We compared the performance of four Poisson and zero-inflated Poisson (ZIP) models under four simulation scenarios. All models performed reasonably well under the same scenario for which the data were simulated. The deviance information criterion favoured a Poisson model with residual, while the ZIP model with a residual gave estimates closer to their true values under all simulation scenarios. Both Poisson and ZIP models with an error term at the regression level performed better than their counterparts without such an error. Field data from Corriedale sheep were analysed with Poisson and ZIP models with residuals. Parameter estimates were similar for both models. Although the posterior distribution of the sire variance was skewed due to a small number of rams in the dataset, the median of this variance suggested a scope for genetic selection. The main environmental factor was the age of the sheep at shearing. In summary, age related processes seem to drive the number of dark spots in this breed of sheep.

15. A comparison between Poisson and zero-inflated Poisson regression models with an application to number of black spots in Corriedale sheep

PubMed Central

Naya, Hugo; Urioste, Jorge I; Chang, Yu-Mei; Rodrigues-Motta, Mariana; Kremer, Roberto; Gianola, Daniel

2008-01-01

Dark spots in the fleece area are often associated with dark fibres in wool, which limits its competitiveness with other textile fibres. Field data from a sheep experiment in Uruguay revealed an excess number of zeros for dark spots. We compared the performance of four Poisson and zero-inflated Poisson (ZIP) models under four simulation scenarios. All models performed reasonably well under the same scenario for which the data were simulated. The deviance information criterion favoured a Poisson model with residual, while the ZIP model with a residual gave estimates closer to their true values under all simulation scenarios. Both Poisson and ZIP models with an error term at the regression level performed better than their counterparts without such an error. Field data from Corriedale sheep were analysed with Poisson and ZIP models with residuals. Parameter estimates were similar for both models. Although the posterior distribution of the sire variance was skewed due to a small number of rams in the dataset, the median of this variance suggested a scope for genetic selection. The main environmental factor was the age of the sheep at shearing. In summary, age related processes seem to drive the number of dark spots in this breed of sheep. PMID:18558072

16. Poisson regression analysis of mortality among male workers at a thorium-processing plant

SciTech Connect

Liu, Zhiyuan; Lee, Tze-San; Kotek, T.J.

1991-12-31

Analyses of mortality among a cohort of 3119 male workers employed between 1915 and 1973 at a thorium-processing plant were updated to the end of 1982. Of the whole group, 761 men were deceased and 2161 men were still alive, while 197 men were lost to follow-up. A total of 250 deaths was added to the 511 deaths observed in the previous study. The standardized mortality ratio (SMR) for all causes of death was 1.12 with 95% confidence interval (CI) of 1.05-1.21. The SMRs were also significantly increased for all malignant neoplasms (SMR = 1.23, 95% CI = 1.04-1.43) and lung cancer (SMR = 1.36, 95% CI = 1.02-1.78). Poisson regression analysis was employed to evaluate the joint effects of job classification, duration of employment, time since first employment, age and year at first employment on mortality of all malignant neoplasms and lung cancer. A comparison of internal and external analyses with the Poisson regression model was also conducted and showed no obvious difference in fitting the data on lung cancer mortality of the thorium workers. The results of the multivariate analysis showed that there was no significant effect of all the study factors on mortality due to all malignant neoplasms and lung cancer. Therefore, further study is needed for the former thorium workers.

17. Poisson regression for modeling count and frequency outcomes in trauma research.

PubMed

Gagnon, David R; Doron-LaMarca, Susan; Bell, Margret; O'Farrell, Timothy J; Taft, Casey T

2008-10-01

The authors describe how the Poisson regression method for analyzing count or frequency outcome variables can be applied in trauma studies. The outcome of interest in trauma research may represent a count of the number of incidents of behavior occurring in a given time interval, such as acts of physical aggression or substance abuse. Traditional linear regression approaches assume a normally distributed outcome variable with equal variances over the range of predictor variables, and may not be optimal for modeling count outcomes. An application of Poisson regression is presented using data from a study of intimate partner aggression among male patients in an alcohol treatment program and their female partners. Results of Poisson regression and linear regression models are compared.

18. Fuzzy classifier based support vector regression framework for Poisson ratio determination

Asoodeh, Mojtaba; Bagheripour, Parisa

2013-09-01

Poisson ratio is considered one of the most important rock mechanical properties of hydrocarbon reservoirs. Determining this parameter through laboratory measurement is time-, cost-, and labor-intensive, and laboratory measurements do not provide continuous data along the reservoir intervals. Hence, a fast, accurate, and inexpensive way of determining Poisson ratio that produces continuous data over the whole reservoir interval is desirable. For this purpose, the support vector regression (SVR) method based on statistical learning theory (SLT) was employed as a supervised learning algorithm to estimate Poisson ratio from conventional well log data. SVR is capable of accurately extracting the implicit knowledge contained in conventional well logs and converting the gained knowledge into Poisson ratio data. The structural risk minimization (SRM) principle, which is embedded in the SVR structure in addition to the empirical risk minimization (ERM) principle, provides a robust model for finding a quantitative formulation between conventional well log data and Poisson ratio. Although satisfying results were obtained from an individual SVR model, it overestimated low Poisson ratios and underestimated high ones. These errors were eliminated through implementation of a fuzzy classifier based SVR (FCBSVR), which significantly improved the accuracy of the final prediction. This strategy was successfully applied to data from carbonate reservoir rocks of an Iranian oil field. Results indicate that SVR-predicted Poisson ratio values are in good agreement with measured values.

19. Maximum Entropy Discrimination Poisson Regression for Software Reliability Modeling.

PubMed

Chatzis, Sotirios P; Andreou, Andreas S

2015-11-01

Reliably predicting software defects is one of the most significant tasks in software engineering. Two of the major components of modern software reliability modeling approaches are: 1) extraction of salient features for software system representation, based on appropriately designed software metrics and 2) development of intricate regression models for count data, to allow effective software reliability data modeling and prediction. Surprisingly, research in the latter frontier of count data regression modeling has been rather limited. More specifically, a lack of simple and efficient algorithms for posterior computation has made the Bayesian approaches appear unattractive, and thus underdeveloped in the context of software reliability modeling. In this paper, we try to address these issues by introducing a novel Bayesian regression model for count data, based on the concept of max-margin data modeling, effected in the context of a fully Bayesian model treatment with simple and efficient posterior distribution updates. Our novel approach yields a more discriminative learning technique, making more effective use of our training data during model inference. In addition, it allows better handling of uncertainty in the modeled data, which can be a significant problem when the training data are limited. We derive elegant inference algorithms for our model under the mean-field paradigm and exhibit its effectiveness using publicly available benchmark data sets.

20. Modeling animal-vehicle collisions using diagonal inflated bivariate Poisson regression.

PubMed

Lao, Yunteng; Wu, Yao-Jan; Corey, Jonathan; Wang, Yinhai

2011-01-01

Two types of animal-vehicle collision (AVC) data are commonly adopted for AVC-related risk analysis research: reported AVC data and carcass removal data. One issue with these two data sets is that they were found to have significant discrepancies by previous studies. In order to model these two types of data together and provide a better understanding of highway AVCs, this study adopts a diagonal inflated bivariate Poisson regression method, an inflated version of bivariate Poisson regression model, to fit the reported AVC and carcass removal data sets collected in Washington State during 2002-2006. The diagonal inflated bivariate Poisson model not only can model paired data with correlation, but also handle under- or over-dispersed data sets as well. Compared with three other types of models, double Poisson, bivariate Poisson, and zero-inflated double Poisson, the diagonal inflated bivariate Poisson model demonstrates its capability of fitting two data sets with remarkable overlapping portions resulting from the same stochastic process. Therefore, the diagonal inflated bivariate Poisson model provides researchers a new approach to investigating AVCs from a different perspective involving the three distribution parameters (λ(1), λ(2) and λ(3)). The modeling results show the impacts of traffic elements, geometric design and geographic characteristics on the occurrences of both reported AVC and carcass removal data. It is found that the increase of some associated factors, such as speed limit, annual average daily traffic, and shoulder width, will increase the numbers of reported AVCs and carcass removals. Conversely, the presence of some geometric factors, such as rolling and mountainous terrain, will decrease the number of reported AVCs.
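The bivariate Poisson structure underlying this model can be illustrated by trivariate reduction, in which a shared Poisson component induces the correlation between the two count series (here, reported AVCs and carcass removals). A simulation sketch with illustrative parameters:

```python
# Hedged sketch: simulating bivariate Poisson counts by trivariate reduction.
# X1 = Z1 + Z3 and X2 = Z2 + Z3 with independent Z_k ~ Poisson(lam_k),
# so Cov(X1, X2) = lam3 while E[X1] = lam1 + lam3 and E[X2] = lam2 + lam3.
import numpy as np

rng = np.random.default_rng(9)
lam1, lam2, lam3 = 2.0, 3.0, 1.5
n = 100_000
z1 = rng.poisson(lam1, n)
z2 = rng.poisson(lam2, n)
z3 = rng.poisson(lam3, n)        # shared component induces the correlation
x1, x2 = z1 + z3, z2 + z3

cov12 = np.cov(x1, x2)[0, 1]
print(x1.mean(), x2.mean(), cov12)   # near 3.5, 4.5, and 1.5
```

The diagonal inflation in the paper additionally boosts the probability of equal pairs (x1 == x2), which handles the overlapping portions of the two data sets.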

21. Effect of air pollution on lung cancer: A Poisson regression model based on vital statistics

SciTech Connect

Tango, Toshiro

1994-11-01

This article describes a Poisson regression model for time trends of mortality to detect the long-term effects of common levels of air pollution on lung cancer, in which adjustment for cigarette smoking is not always necessary. The main hypothesis to be tested is that if long-term, common-level air pollution had an effect on lung cancer, the death rate from lung cancer could be expected to increase gradually at a higher rate in regions with relatively high levels of air pollution than in regions with low levels, and that this trend would not be expected for other control diseases for which cigarette smoking is a risk factor. Using this approach, we analyzed the trend of mortality in females aged 40 to 79 from lung cancer and two control diseases, ischemic heart disease and cerebrovascular disease, based on vital statistics in 23 wards of the Tokyo metropolitan area for 1972 to 1988. Ward-specific mean levels per day of SO2 and NO2 from 1974 through 1976 estimated by Makino (1978) were used as the ward-specific exposure measure of air pollution. No data on tobacco consumption in each ward were available. Our analysis supported the existence of long-term effects of air pollution on lung cancer.

22. A marginalized zero-inflated Poisson regression model with overall exposure effects.

PubMed

Long, D Leann; Preisser, John S; Herring, Amy H; Golin, Carol E

2014-12-20

The zero-inflated Poisson (ZIP) regression model is often employed in public health research to examine the relationships between exposures of interest and a count outcome exhibiting many zeros, in excess of the amount expected under sampling from a Poisson distribution. The regression coefficients of the ZIP model have latent class interpretations, which correspond to a susceptible subpopulation at risk for the condition with counts generated from a Poisson distribution and a non-susceptible subpopulation that provides the extra or excess zeros. The ZIP model parameters, however, are not well suited for inference targeted at marginal means, specifically, in quantifying the effect of an explanatory variable in the overall mixture population. We develop a marginalized ZIP model approach for independent responses to model the population mean count directly, allowing straightforward inference for overall exposure effects and empirical robust variance estimation for overall log-incidence density ratios. Through simulation studies, the performance of maximum likelihood estimation of the marginalized ZIP model is assessed and compared with other methods of estimating overall exposure effects. The marginalized ZIP model is applied to a recent study of a motivational interviewing-based safer sex counseling intervention, designed to reduce unprotected sexual act counts.

3. Extension of the modified Poisson regression model to prospective studies with correlated binary data.

PubMed

Zou, G Y; Donner, Allan

2013-12-01

The Poisson regression model using a sandwich variance estimator has become a viable alternative to the logistic regression model for the analysis of prospective studies with independent binary outcomes. The primary advantage of this approach is that it readily provides covariate-adjusted risk ratios and associated standard errors. In this article, the model is extended to studies with correlated binary outcomes as arise in longitudinal or cluster randomization studies. The key step involves a cluster-level grouping strategy for the computation of the middle term in the sandwich estimator. For a single binary exposure variable without covariate adjustment, this approach results in risk ratio estimates and standard errors that are identical to those found in the survey sampling literature. Simulation results suggest that it is reliable for studies with correlated binary data, provided the total number of clusters is at least 50. Data from observational and cluster randomized studies are used to illustrate the methods.
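
The core of the modified Poisson approach can be sketched in numpy on simulated data (the paper's extension replaces the middle term of the sandwich with cluster-level sums; this sketch shows only the independent-outcome case). With a single binary exposure and an intercept, the Poisson MLE reproduces the sample risk ratio exactly, and the sandwich estimator gives a valid standard error despite the outcome being binary rather than Poisson.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4000
exposed = rng.random(n) < 0.5
# binary outcome: risk 0.10 unexposed, 0.20 exposed (true risk ratio 2.0)
p = np.where(exposed, 0.20, 0.10)
y = (rng.random(n) < p).astype(float)

X = np.column_stack([np.ones(n), exposed.astype(float)])

# Poisson (log-link) fit by Newton-Raphson
beta = np.zeros(2)
for _ in range(25):
    mu = np.exp(X @ beta)
    beta += np.linalg.solve(X.T * mu @ X, X.T @ (y - mu))

# sandwich (robust) variance: A^{-1} B A^{-1}
mu = np.exp(X @ beta)
A = X.T * mu @ X
B = X.T * (y - mu) ** 2 @ X
V = np.linalg.inv(A) @ B @ np.linalg.inv(A)

risk_ratio = np.exp(beta[1])    # equals the sample risk ratio exactly here
se_log_rr = np.sqrt(V[1, 1])    # robust SE of the log risk ratio
```

The model-based (non-robust) variance would be invalid because Var(Y) = p(1 − p) ≠ p for binary data; the sandwich correction is what makes the approach reliable.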

4. A Spline-Based Lack-Of-Fit Test for Independent Variable Effect in Poisson Regression.

PubMed

Li, Chin-Shang; Tu, Wanzhu

2007-05-01

In regression analysis of count data, independent variables are often modeled by their linear effects under the assumption of log-linearity. In reality, the validity of such an assumption is rarely tested, and its use is at times unjustifiable. A lack-of-fit test is proposed for the adequacy of a postulated functional form of an independent variable within the framework of semiparametric Poisson regression models based on penalized splines. It offers added flexibility in accommodating the potentially non-log-linear effect of the independent variable. A likelihood ratio test is constructed for the adequacy of the postulated parametric form, for example log-linearity, of the independent variable effect. Simulations indicate that the proposed test performs well, and that a misspecified parametric model has much-reduced power. An example is given.

5. Modeling both of the number of pausibacillary and multibacillary leprosy patients by using bivariate poisson regression

Winahju, W. S.; Mukarromah, A.; Putri, S.

2015-03-01

Leprosy is a chronic infectious disease caused by the leprosy bacterium (Mycobacterium leprae). Leprosy is an important public health issue in Indonesia because its morbidity is quite high. According to 2014 WHO data, in 2012 Indonesia had the highest number of new leprosy patients after India and Brazil, contributing 18,994 people (8.7% of the world total). This places Indonesia as the country with the highest leprosy morbidity among ASEAN countries. The province that contributes most to the number of leprosy patients in Indonesia is East Java. There are two kinds of leprosy: paucibacillary and multibacillary. The morbidity of multibacillary leprosy is higher than that of paucibacillary leprosy. This paper discusses modelling the numbers of multibacillary and paucibacillary leprosy patients as response variables. These responses are count variables, so modelling is conducted using the bivariate Poisson regression method. The observation units are in East Java, and the predictors involved are environment, demography, and poverty. The model uses data from 2012, and the result indicates that all predictors are statistically significant.
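
Bivariate Poisson models for two correlated counts are commonly built by the trivariate reduction ("common shock") construction; the abstract does not state which construction the authors use, so treat this numpy sketch as a generic illustration with made-up rates. Each response is the sum of an independent Poisson component and a shared Poisson shock, which induces a positive covariance equal to the shared rate.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 300_000
lam1, lam2, lam3 = 2.0, 3.0, 1.0   # lam3 is the shared ("common shock") rate

z = rng.poisson(lam3, n)           # common component shared by both counts
y1 = rng.poisson(lam1, n) + z      # marginally Poisson(lam1 + lam3)
y2 = rng.poisson(lam2, n) + z      # marginally Poisson(lam2 + lam3)

cov = np.cov(y1, y2)[0, 1]         # ≈ lam3: the dependence between the two counts
```

In a regression version, lam1, lam2 (and possibly lam3) are modeled as log-linear functions of covariates such as environment, demography, and poverty.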

6. Bayesian semi-parametric analysis of Poisson change-point regression models: application to policy making in Cali, Colombia

PubMed Central

Park, Taeyoung; Krafty, Robert T.; Sánchez, Alvaro I.

2012-01-01

A Poisson regression model with an offset assumes a constant baseline rate after accounting for measured covariates, which may lead to biased estimates of coefficients in an inhomogeneous Poisson process. To correctly estimate the effect of time-dependent covariates, we propose a Poisson change-point regression model with an offset that allows a time-varying baseline rate. When the nonconstant pattern of a log baseline rate is modeled with a nonparametric step function, the resulting semi-parametric model involves a model component of varying dimension and thus requires a sophisticated varying-dimensional inference to obtain correct estimates of model parameters of fixed dimension. To fit the proposed varying-dimensional model, we devise a state-of-the-art MCMC-type algorithm based on partial collapse. The proposed model and methods are used to investigate an association between daily homicide rates in Cali, Colombia and policies that restrict the hours during which the legal sale of alcoholic beverages is permitted. While simultaneously identifying the latent changes in the baseline homicide rate which correspond to the incidence of sociopolitical events, we explore the effect of policies governing the sale of alcohol on homicide rates and seek a policy that balances the economic and cultural dependencies on alcohol sales to the health of the public. PMID:23393408
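
Setting aside the Bayesian varying-dimensional machinery, the notion of a change point in a Poisson baseline rate can be sketched with a simple profile-likelihood search over candidate change points (simulated data with an assumed single change, not the Cali homicide series).

```python
import numpy as np

rng = np.random.default_rng(3)
T, true_cp = 120, 60
rates = np.where(np.arange(T) < true_cp, 5.0, 15.0)   # baseline rate jumps at t=60
y = rng.poisson(rates)

def seg_loglik(seg):
    lam = seg.mean()                            # segment-wise MLE of the rate
    return (seg * np.log(lam) - lam).sum()      # Poisson log-likelihood up to a constant

# profile the likelihood over candidate change points, keeping segments non-trivial
candidates = range(5, T - 5)
cp_hat = max(candidates, key=lambda c: seg_loglik(y[:c]) + seg_loglik(y[c:]))
```

The paper's semi-parametric step-function baseline generalizes this to an unknown number of change points, with covariate effects estimated jointly.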

7. Predictors of the number of under-five malnourished children in Bangladesh: application of the generalized poisson regression model

PubMed Central

2013-01-01

Background Malnutrition is one of the principal causes of child mortality in developing countries including Bangladesh. To our knowledge, most of the available studies that addressed the issue of malnutrition among under-five children considered categorical (dichotomous/polychotomous) outcome variables and applied logistic regression (binary/multinomial) to find their predictors. In this study the malnutrition variable (i.e. outcome) is defined as the number of under-five malnourished children in a family, which is a non-negative count variable. The purposes of the study are (i) to demonstrate the applicability of the generalized Poisson regression (GPR) model as an alternative to other statistical methods and (ii) to find some predictors of this outcome variable. Methods The data are extracted from the Bangladesh Demographic and Health Survey (BDHS) 2007. Briefly, this survey employs a nationally representative sample based on a two-stage stratified sample of households. A total of 4,460 under-five children are analysed using various statistical techniques, namely the Chi-square test and the GPR model. Results The GPR model (as compared to standard Poisson regression and negative binomial regression) is found to be justified for studying the above-mentioned outcome variable because of its under-dispersion (variance < mean) property. Our study also identifies several significant predictors of the outcome variable, namely mother’s education, father’s education, wealth index, sanitation status, source of drinking water, and total number of children ever born to a woman. Conclusions Consistency of our findings with many other studies suggests that the GPR model is an ideal alternative to other statistical models for analysing the number of under-five malnourished children in a family. Strategies based on significant predictors may improve the nutritional status of children in Bangladesh. PMID:23297699
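
The under-dispersion property that motivates the GPR model here can be checked directly from the generalized Poisson (Consul) probability mass function, P(Y=y) = θ(θ+λy)^(y−1) e^−(θ+λy)/y!, whose mean is θ/(1−λ) and variance θ/(1−λ)³; a negative λ gives variance < mean. This numpy sketch uses illustrative parameters, not BDHS estimates.

```python
import numpy as np
from math import exp, factorial

theta, lam = 5.0, -0.2          # lam < 0 -> under-dispersion
ys = np.arange(0, 25)           # support truncated where theta + lam*y > 0
pmf = np.array([
    theta * (theta + lam * y) ** (y - 1) * exp(-(theta + lam * y)) / factorial(int(y))
    for y in ys
])
pmf /= pmf.sum()                # renormalise the (slightly) truncated pmf

mean = (ys * pmf).sum()         # theory: theta / (1 - lam)  ≈ 4.17
var = ((ys - mean) ** 2 * pmf).sum()   # theory: theta / (1 - lam)**3 ≈ 2.89 < mean
```

Standard Poisson regression forces variance = mean and negative binomial forces variance > mean, so neither can accommodate this pattern; GPR handles both directions through the sign of λ.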

8. The analysis of incontinence episodes and other count data in patients with overactive bladder by Poisson and negative binomial regression.

PubMed

Martina, R; Kay, R; van Maanen, R; Ridder, A

2015-01-01

Clinical studies in overactive bladder have traditionally used analysis of covariance or nonparametric methods to analyse the number of incontinence episodes and other count data. It is known that if the underlying distributional assumptions of a particular parametric method do not hold, an alternative parametric method may be more efficient than a nonparametric one, which makes no assumptions regarding the underlying distribution of the data. Therefore, there are advantages in using methods based on the Poisson distribution or extensions of that method, which incorporate specific features that provide a modelling framework for count data. One challenge with count data is overdispersion, but methods are available that can account for this through the introduction of random effect terms in the modelling, and it is this modelling framework that leads to the negative binomial distribution. These models can also provide clinicians with a clearer and more appropriate interpretation of treatment effects in terms of rate ratios. In this paper, the previously used parametric and non-parametric approaches are contrasted with those based on Poisson regression and various extensions in trials evaluating solifenacin and mirabegron in patients with overactive bladder. In these applications, negative binomial models are seen to fit the data well.
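
The random-effects route to the negative binomial mentioned above can be made concrete: mixing a Poisson count over a gamma-distributed subject-specific rate yields a marginal negative binomial with variance μ + μ²/k. This is a generic numpy sketch with assumed parameters, not trial data.

```python
import numpy as np

rng = np.random.default_rng(11)
n, mu, k = 400_000, 3.0, 2.0       # k: gamma shape (inverse of the dispersion)

# subject-specific episode rates vary around mu (the random effect)
lam = rng.gamma(shape=k, scale=mu / k, size=n)
y = rng.poisson(lam)               # marginally negative binomial

sample_mean = y.mean()             # ≈ mu
sample_var = y.var()               # ≈ mu + mu**2 / k = 7.5 > mu (overdispersion)
```

Treatment effects from such a model are naturally reported as rate ratios, exp(β), which is the interpretational advantage the abstract highlights.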

9. Does attitude matter in computer use in Australian general practice? A zero-inflated Poisson regression analysis.

PubMed

2011-01-01

The purpose of this study was to explore factors that facilitate or hinder effective use of computers in Australian general medical practice. This study is based on data extracted from a national telephone survey of 480 general practitioners (GPs) across Australia. Clinical functions performed by GPs using computers were examined using zero-inflated Poisson (ZIP) regression modelling. About 17% of GPs were not using a computer for any clinical function, while 18% reported using computers for all clinical functions. The ZIP model showed that computer anxiety was negatively associated with effective computer use, while practitioners' belief in the usefulness of computers was positively associated with effective computer use. Being a female GP or working in a partnership or group practice increased the odds of effectively using computers for clinical functions. To fully capitalise on the benefits of computer technology, GPs need to be convinced that this technology is useful and can make a difference.

10. Modelling the influence of temperature and rainfall on malaria incidence in four endemic provinces of Zambia using semiparametric Poisson regression.

PubMed

Shimaponda-Mataa, Nzooma M; Tembo-Mwase, Enala; Gebreslasie, Michael; Achia, Thomas N O; Mukaratirwa, Samson

2017-02-01

Although malaria morbidity and mortality are greatly reduced globally owing to intense control efforts, the disease remains a major contributor to morbidity and mortality. In Zambia, all provinces are malaria endemic. However, transmission intensities vary, mainly depending on environmental factors as they interact with the vectors. Generally in Africa, possibly owing to the varying perspectives and methods used, there is variation in the relative importance of malaria risk determinants. In Zambia, the role climatic factors play in malaria case rates has not been determined jointly over space and time using robust modelling methods. This is critical considering the reversal in malaria reduction after the year 2010 and the variation by transmission zones. Using a geoadditive or structured additive semiparametric Poisson regression model, we determined the influence of climatic factors on malaria incidence in four endemic provinces of Zambia. We demonstrate a strong positive association between malaria incidence and precipitation as well as minimum temperature. The risk of malaria was 95% lower in Lusaka (ARR=0.05, 95% CI=0.04-0.06) and 68% lower in the Western Province (ARR=0.31, 95% CI=0.25-0.41) compared to Luapula Province. North-western Province did not vary from Luapula Province. The effects of geographical region are clearly demonstrated by the unique behaviour and effects of minimum and maximum temperatures in the four provinces. Environmental factors such as landscape in urbanised places may also be playing a role.

11. Longitudinal Poisson regression to evaluate the epidemiology of Cryptosporidium, Giardia, and fecal indicator bacteria in coastal California wetlands.

PubMed

Hogan, Jennifer N; Daniels, Miles E; Watson, Fred G; Conrad, Patricia A; Oates, Stori C; Miller, Melissa A; Hardin, Dane; Byrne, Barbara A; Dominik, Clare; Melli, Ann; Jessup, David A; Miller, Woutrina A

2012-05-01

Fecal pathogen contamination of watersheds worldwide is increasingly recognized, and natural wetlands may have an important role in mitigating fecal pathogen pollution flowing downstream. Given that waterborne protozoa, such as Cryptosporidium and Giardia, are transported within surface waters, this study evaluated associations between fecal protozoa and various wetland-specific and environmental risk factors. This study focused on three distinct coastal California wetlands: (i) a tidally influenced slough bordered by urban and agricultural areas, (ii) a seasonal wetland adjacent to a dairy, and (iii) a constructed wetland that receives agricultural runoff. Wetland type, seasonality, rainfall, and various water quality parameters were evaluated using longitudinal Poisson regression to model effects on concentrations of protozoa and indicator bacteria (Escherichia coli and total coliform). Among wetland types, the dairy wetland exhibited the highest protozoal and bacterial concentrations, and despite significant reductions in microbe concentrations, the wetland could still be seen to influence water quality in the downstream tidal wetland. Additionally, recent rainfall events were associated with higher protozoal and bacterial counts in wetland water samples across all wetland types. Notably, detection of E. coli concentrations greater than 400 most probable number (MPN) per 100 ml was associated with higher Cryptosporidium oocyst and Giardia cyst concentrations. These findings show that natural wetlands draining agricultural and livestock operation runoff into human-utilized waterways should be considered potential sources of pathogens and that wetlands can be instrumental in reducing pathogen loads to downstream waters.

12. Misspecified poisson regression models for large-scale registry data: inference for 'large n and small p'.

PubMed

Grøn, Randi; Gerds, Thomas A; Andersen, Per K

2016-03-30

Poisson regression is an important tool in register-based epidemiology where it is used to study the association between exposure variables and event rates. In this paper, we will discuss the situation with 'large n and small p', where n is the sample size and p is the number of available covariates. Specifically, we are concerned with modeling options when there are time-varying covariates that can have time-varying effects. One problem is that tests of the proportional hazards assumption, of no interactions between exposure and other observed variables, or of other modeling assumptions have large power due to the large sample size and will often indicate statistical significance even for numerically small deviations that are unimportant for the subject matter. Another problem is that information on important confounders may be unavailable. In practice, this situation may lead to simple working models that are then likely misspecified. To support and improve conclusions drawn from such models, we discuss methods for sensitivity analysis, for estimation of average exposure effects using aggregated data, and a semi-parametric bootstrap method to obtain robust standard errors. The methods are illustrated using data from the Danish national registries investigating the diabetes incidence for individuals treated with antipsychotics compared with the general unexposed population.
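
One point mentioned above, estimating exposure effects from aggregated data, rests on a convenient property of the Poisson likelihood: it depends on the data only through sums of counts and exposure times within covariate patterns, so individual-level and aggregated fits give identical estimates. A numpy sketch on simulated registry-style data (illustrative rates, not the Danish registry):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 2000
x = (rng.random(n) < 0.4).astype(float)    # exposure indicator
t = rng.uniform(0.5, 5.0, n)               # person-years at risk
true_rate = np.exp(-1.0 + 0.7 * x)         # true log rate ratio 0.7
y = rng.poisson(true_rate * t)

def fit_poisson(X, y, offset):
    """Poisson regression with log-exposure offset, by Newton-Raphson."""
    beta = np.zeros(X.shape[1])
    for _ in range(30):
        mu = np.exp(X @ beta + offset)
        beta += np.linalg.solve(X.T * mu @ X, X.T @ (y - mu))
    return beta

# individual-level fit
Xi = np.column_stack([np.ones(n), x])
b_ind = fit_poisson(Xi, y, np.log(t))

# aggregate to one row per covariate pattern: same likelihood kernel, same MLE
Xa = np.array([[1.0, 0.0], [1.0, 1.0]])
ya = np.array([y[x == 0].sum(), y[x == 1].sum()])
ta = np.array([t[x == 0].sum(), t[x == 1].sum()])
b_agg = fit_poisson(Xa, ya, np.log(ta))    # identical to b_ind
```

This collapsibility is what makes Poisson regression practical for 'large n' registry settings, since models can be fitted to event/person-year tables rather than individual records.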

13. Existence and uniqueness, attraction for stochastic age-structured population systems with diffusion and Poisson jump

Chen, Huabin

2013-08-01

In this paper, the problems of existence and uniqueness and of attraction for the strong solution of stochastic age-structured population systems with diffusion and Poisson jumps are considered. Under a non-Lipschitz condition, with the Lipschitz condition as a special case, existence and uniqueness for such systems is first proved using the Burkholder-Davis-Gundy inequality (B-D-G inequality) and Itô's formula. Then, using a novel inequality technique, some sufficient conditions ensuring the existence of a domain of attraction are established. As a by-product, the exponential stability in mean-square moment of the strong solution for such systems is also discussed.

14. Analyzing Seasonal Variations in Suicide With Fourier Poisson Time-Series Regression: A Registry-Based Study From Norway, 1969-2007.

PubMed

Bramness, Jørgen G; Walby, Fredrik A; Morken, Gunnar; Røislien, Jo

2015-08-01

Seasonal variation in the number of suicides has long been acknowledged. It has been suggested that this seasonality has declined in recent years, but studies have generally used statistical methods incapable of confirming this. We examined all suicides occurring in Norway during 1969-2007 (more than 20,000 suicides in total) to establish whether seasonality decreased over time. Fitting additive Fourier Poisson time-series regression models allowed for formal testing of a possible linear decrease in seasonality, or a reduction at a specific point in time, while adjusting for a possible smooth nonlinear long-term change without having to categorize time into discrete yearly units. The models were compared using Akaike's Information Criterion and analysis of variance. A model with a seasonal pattern was significantly superior to a model without one. There was a reduction in seasonality during the period. Both the model assuming a linear decrease in seasonality and the model assuming a change at a specific point in time were superior to a model assuming constant seasonality, thus confirming by formal statistical testing that the magnitude of the seasonality in suicides has diminished. The additive Fourier Poisson time-series regression model would also be useful for studying other temporal phenomena with seasonal components.
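
The Fourier Poisson idea is to represent seasonality with sine/cosine harmonics inside a Poisson log-linear model. A minimal numpy sketch on simulated monthly counts (assumed amplitude and level, not the Norwegian data; a real analysis would add trend terms and possibly time-varying amplitude):

```python
import numpy as np

rng = np.random.default_rng(9)
months = np.arange(12 * 30)                 # 30 years of monthly counts
season = 2 * np.pi * months / 12
true = np.exp(3.0 + 0.3 * np.cos(season))   # seasonal rate, amplitude 0.3 on log scale
y = rng.poisson(true)

# design: intercept + one Fourier harmonic
X = np.column_stack([np.ones_like(season), np.cos(season), np.sin(season)])

# Poisson regression by Newton-Raphson, intercept started at log mean
beta = np.array([np.log(y.mean()), 0.0, 0.0])
for _ in range(25):
    mu = np.exp(X @ beta)
    beta += np.linalg.solve(X.T * mu @ X, X.T @ (y - mu))

amplitude = np.hypot(beta[1], beta[2])      # recovered seasonal amplitude ≈ 0.3
```

Formal tests of declining seasonality then compare this model against one where the amplitude is allowed to decrease linearly or change at a given time point, e.g. by likelihood ratio or AIC.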

15. Change with age in regression construction of fat percentage for BMI in school-age children.

PubMed

Fujii, Katsunori; Mishima, Takaaki; Watanabe, Eiji; Seki, Kazuyoshi

2011-01-01

In this study, curvilinear regression was applied to the relationship between BMI and body fat percentage, and an analysis was done to see whether there are characteristic changes in that curvilinear regression from elementary to middle school. Then, by simultaneously investigating the changes with age in BMI and body fat percentage, the essential differences between BMI and body fat percentage were demonstrated. The subjects were 789 boys and girls (469 boys, 320 girls) aged 7.5 to 14.5 years from all parts of Japan who participated in regular sports activities. Body weight, total body water (TBW), soft lean mass (SLM), body fat percentage, and fat mass were measured with a body composition analyzer (Tanita BC-521 Inner Scan), using segmental and multi-frequency bioelectrical impedance analysis. Height was measured with a digital height measurer. Body mass index (BMI) was calculated as body weight (kg) divided by the square of height (m). The results for the validity of regression polynomials of body fat percentage against BMI showed that, for both boys and girls, first-order polynomials were valid in all school years. With regard to changes with age in BMI and body fat percentage, the results showed a temporary drop at 9 years in the aging distance curve in boys, followed by an increasing trend. Peaks were seen in the velocity curve at 9.7 and 11.9 years, but the MPV was presumed to be at 11.9 years. Among girls, a decreasing trend was seen in the aging distance curve, which was opposite to the changes in the aging distance curve for body fat percentage.

16. Poisson Coordinates.

PubMed

Li, Xian-Ying; Hu, Shi-Min

2013-02-01

Harmonic functions are the critical points of a Dirichlet energy functional, the linear projections of conformal maps. They play an important role in computer graphics, particularly for gradient-domain image processing and shape-preserving geometric computation. We propose Poisson coordinates, a novel transfinite interpolation scheme based on the Poisson integral formula, as a rapid way to estimate a harmonic function on a certain domain with desired boundary values. Poisson coordinates are an extension of the Mean Value coordinates (MVCs) which inherit their linear precision, smoothness, and kernel positivity. We give explicit formulas for Poisson coordinates in both continuous and 2D discrete forms. Superior to MVCs, Poisson coordinates are proved to be pseudoharmonic (i.e., they reproduce harmonic functions on n-dimensional balls). Our experimental results show that Poisson coordinates have lower Dirichlet energies than MVCs on a number of typical 2D domains (particularly convex domains). As well as presenting a formula, our approach provides useful insights for further studies on coordinates-based interpolation and fast estimation of harmonic functions.
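
The Poisson integral formula underlying Poisson coordinates can be illustrated numerically on the unit disk, where the kernel P(r, θ−φ) = (1 − r²)/(1 − 2r cos(θ−φ) + r²) reproduces any harmonic function from its boundary values. This is a generic numpy quadrature sketch of the formula itself, not the paper's transfinite-coordinate construction on general domains.

```python
import numpy as np

def poisson_integral(g, x, y, n=4000):
    """Estimate a harmonic function at (x, y) inside the unit disk from its
    boundary values g(phi), by quadrature of the Poisson integral formula."""
    r = np.hypot(x, y)
    theta = np.arctan2(y, x)
    phi = np.linspace(0.0, 2 * np.pi, n, endpoint=False)
    kernel = (1 - r**2) / (1 - 2 * r * np.cos(theta - phi) + r**2)
    # mean over equispaced phi approximates (1 / 2*pi) * integral
    return (kernel * g(phi)).mean()

# u(x, y) = x is harmonic with boundary values g(phi) = cos(phi),
# so the integral should reproduce the x-coordinate of the query point
u = poisson_integral(np.cos, 0.3, 0.2)
```

The pseudoharmonic property cited in the abstract is exactly this reproduction of harmonic functions on balls, which mean value coordinates do not share.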

17. Relative age and birthplace effect in Japanese professional sports: a quantitative evaluation using a Bayesian hierarchical Poisson model.

PubMed

Ishigami, Hideaki

2016-01-01

Relative age effect (RAE) in sports has been well documented. Recent studies investigate the effect of birthplace in addition to the RAE. The first objective of this study was to show the magnitude of the RAE in two major professional sports in Japan, baseball and soccer. Second, we examined the birthplace effect and compared its magnitude with that of the RAE. The effect sizes were estimated using a Bayesian hierarchical Poisson model with the number of players as dependent variable. The RAEs were 9.0% and 7.7% per month for soccer and baseball, respectively. These estimates imply that children born in the first month of a school year have about three times greater chance of becoming a professional player than those born in the last month of the year. Over half of the difference in likelihoods of becoming a professional player between birthplaces was accounted for by weather conditions, with the likelihood decreasing by 1% per snow day. An effect of population size was not detected in the data. By investigating different samples, we demonstrated that using quarterly data leads to underestimation and that the age range of sampled athletes should be set carefully.

18. Age Regression in the Treatment of Anger in a Prison Setting.

ERIC Educational Resources Information Center

Eisel, Harry E.

1988-01-01

Incorporated hypnotherapy with age regression into cognitive therapeutic approach with prisoners having history of anger. Technique involved age regression to establish first significant event causing current anger, catharsis of feelings for original event, and reorientation of event while under hypnosis. Results indicated decrease in acting-out…

19. Integrated analysis of transcriptomic and proteomic data of Desulfovibrio vulgaris: Zero-Inflated Poisson regression models to predict abundance of undetected proteins

SciTech Connect

Nie, Lei; Wu, Gang; Brockman, Fred J.; Zhang, Weiwen

2006-05-04

Advances in DNA microarray and proteomics technologies have enabled high-throughput measurement of mRNA expression and protein abundance. Parallel profiling of mRNA and protein on a global scale and integrative analysis of these two data types could provide additional insight into the metabolic mechanisms underlying complex biological systems. However, because protein abundance and mRNA expression are affected by many cellular and physical processes, there have been conflicting results on the correlation of these two measurements. In addition, as current proteomic methods can detect only a small fraction of proteins present in cells, no correlation study of these two data types has been done thus far at the whole-genome level. In this study, we describe a novel data-driven statistical model to integrate whole-genome microarray and proteomic data collected from Desulfovibrio vulgaris grown under three different conditions. Based on the Poisson distribution pattern of proteomic data and the fact that a large number of proteins were undetected (excess zeros), zero-inflated Poisson models were used to define the correlation pattern of mRNA and protein abundance. The models assumed that there is a probability mass at zero representing some of the undetected proteins because of technical limitations. The models thus use abundance measurements of transcripts and proteins experimentally detected as input to generate predictions of protein abundances as output for all genes in the genome. We demonstrated the statistical models by comparatively analyzing D. vulgaris grown on lactate-based versus formate-based media. Increased expression of the Ech hydrogenase and of the alcohol dehydrogenase (Adh)-periplasmic Fe-only hydrogenase (Hyd) pathway for ATP synthesis was predicted for D. vulgaris grown on formate.

20. Short-Term Effects of Climatic Variables on Hand, Foot, and Mouth Disease in Mainland China, 2008–2013: A Multilevel Spatial Poisson Regression Model Accounting for Overdispersion

PubMed Central

Yang, Fang; Yang, Min; Hu, Yuehua; Zhang, Juying

2016-01-01

Background Hand, Foot, and Mouth Disease (HFMD) is a worldwide infectious disease. In China, many provinces have reported HFMD cases, especially the southern and southwestern provinces. Many studies have found a strong association between the incidence of HFMD and climatic factors such as temperature, rainfall, and relative humidity. However, few studies have analyzed cluster effects between various geographical units. Methods The nonlinear relationships and lag effects between weekly HFMD cases and climatic variables were estimated for the period 2008–2013 using a polynomial distributed lag model. The extra-Poisson multilevel spatial polynomial model was used to model the exact relationship between weekly HFMD incidence and climatic variables after considering cluster effects, the provincial correlated structure of HFMD incidence, and overdispersion. Smoothing spline methods were used to detect threshold effects between climatic factors and HFMD incidence. Results HFMD incidence was spatially heterogeneous among provinces, and the scale measurement of overdispersion was 548.077. After controlling for long-term trends, spatial heterogeneity and overdispersion, temperature was highly associated with HFMD incidence. Weekly average temperature and weekly temperature difference showed approximately inverse-“V”-shaped and “V”-shaped relationships, respectively, with HFMD incidence. The lag effects for weekly average temperature and weekly temperature difference were 3 weeks and 2 weeks. Highly spatially correlated HFMD incidence was detected in northern, central and southern provinces. Temperature explained most of the variation in HFMD incidence in southern and northeastern provinces. After adjustment for temperature, eastern and northern provinces still showed high variation in HFMD incidence. Conclusion We found a relatively strong association between weekly HFMD incidence and weekly average temperature. The association between HFMD incidence and climatic

1. A novel strategy for forensic age prediction by DNA methylation and support vector regression model

PubMed Central

Xu, Cheng; Qu, Hongzhu; Wang, Guangyu; Xie, Bingbing; Shi, Yi; Yang, Yaran; Zhao, Zhao; Hu, Lan; Fang, Xiangdong; Yan, Jiangwei; Feng, Lei

2015-01-01

High deviations resulting from prediction model, gender and population differences have limited the application of DNA methylation markers to age estimation. Here we identified 2,957 novel age-associated DNA methylation sites (P < 0.01 and R2 > 0.5) in blood of eight pairs of Chinese Han female monozygotic twins. Among them, nine novel sites (false discovery rate < 0.01), along with three other reported sites, were further validated in 49 unrelated female volunteers aged 20–80 years by Sequenom MassARRAY. A total of 95 CpGs were covered in the PCR products, and 11 of them were used to build age prediction models. After comparing four different models, including multivariate linear regression, multivariate nonlinear regression, back-propagation neural network and support vector regression (SVR), SVR was identified as the most robust model, with the least mean absolute deviation from real chronological age (2.8 years) and an average accuracy of 4.7 years predicted by only six of the 11 loci, as well as a smaller cross-validated error compared with the linear regression model. Our novel strategy provides an accurate measurement that is highly useful in estimating individual age in forensic practice as well as in tracking the aging process in other related applications. PMID:26635134

2. Restoration of Eidetic Imagery via Hypnotic Age Regression: A Preliminary Report

ERIC Educational Resources Information Center

Walker, Neil S.; And Others

1976-01-01

Eidetic imagery involves the ability to examine a visual stimulus briefly and later project onto a neutral surface an image that represents an exact duplication of the original. This study uses the differential frequency of eidetic imagery ability between children and adults as a basis for testing the validity of hypnotic age regression.…

3. Random regression model of growth during the first three months of age in Spanish Merino sheep.

PubMed

2007-11-01

A total of 88,727 individual BW records of Spanish Merino lambs, obtained from 30,214 animals between 2 and 92 d of age, were analyzed using a random regression model (RRM). These animals were progeny of 546 rams and 15,586 ewes raised in 30 flocks, between 1992 and 2002, with a total of 45,941 animals in the pedigree. The contemporary groups (animals of the same flock, year, and season, with 452 levels), the lambing number (11 levels), the combination sex of lambs with type of litter (4 levels), and a fixed regression coefficient of age on BW were included as fixed effects. A total of 7 RRM were compared, and the best fit was obtained for a model of order 3 for the direct and maternal genetic effects and for the individual permanent environmental effect. For the maternal permanent environmental effect the best model had an order 2. The residual variance was assumed to be heterogeneous with 10 age classes; the covariance between both genetic effects was included. According to the results of the selected RRM, the heritability for both genetic effects (h(a)2 and h(m)2) increased with age, with estimates of 0.123 to 0.186 for h(a)2 and of 0.059 to 0.108 for h(m)2. The correlations between direct and genetic maternal effects were -0.619 to -0.387 during the first 45 d of age and decreased as age increased, until reaching values from -0.366 to -0.275 between 45 to 75 d of age. Important changes in ranking of the animals were found based on the breeding value estimation with the current method and with the random regression procedure. The use of RRM to analyze the genetic trajectory of growth in this population of Merino sheep is highly recommended.

4. Predicting Hospital Admissions With Poisson Regression Analysis

DTIC Science & Technology

2009-06-01

East and Four West. Four East is where bariatric, general, neurologic, otolaryngology (ENT), ophthalmologic, orthopedic, and plastic surgery ...where care is provided for cardiovascular, thoracic, and vascular surgery patients. Figure 1 shows a bar graph for each unit, giving the proportion of...provided at NMCSD, or a study could be conducted on the amount of time that patients generally wait for elective surgeries. There is also the

5. Alternatives for logistic regression in cross-sectional studies: an empirical comparison of models that directly estimate the prevalence ratio

PubMed Central

Barros, Aluísio JD; Hirakata, Vânia N

2003-01-01

Background Cross-sectional studies with binary outcomes analyzed by logistic regression are frequent in the epidemiological literature. However, the odds ratio can importantly overestimate the prevalence ratio, the measure of choice in these studies. Also, controlling for confounding is not equivalent for the two measures. In this paper we explore alternatives for modeling data of such studies with techniques that directly estimate the prevalence ratio. Methods We compared Cox regression with constant time at risk, Poisson regression and log-binomial regression against the standard Mantel-Haenszel estimators. Models with robust variance estimators in Cox and Poisson regressions and variance corrected by the scale parameter in Poisson regression were also evaluated. Results Three outcomes, from a cross-sectional study carried out in Pelotas, Brazil, with different levels of prevalence were explored: weight-for-age deficit (4%), asthma (31%) and mother in a paid job (52%). Unadjusted Cox/Poisson regression and Poisson regression with scale parameter adjusted by deviance performed worst in terms of interval estimates. Poisson regression with scale parameter adjusted by χ2 showed variable performance depending on the outcome prevalence. Cox/Poisson regression with robust variance, and log-binomial regression performed equally well when the model was correctly specified. Conclusions Cox or Poisson regression with robust variance and log-binomial regression provide correct estimates and are a better alternative for the analysis of cross-sectional studies with binary outcomes than logistic regression, since the prevalence ratio is more interpretable and easier to communicate to non-specialists than the odds ratio. However, precautions are needed to avoid estimation problems in specific situations. PMID:14567763
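
The overestimation of the prevalence ratio by the odds ratio, the motivation of the paper above, is easy to see numerically for a common outcome. This is a made-up 2×2 table chosen for round numbers, not data from the Pelotas study.

```python
# hypothetical 2x2 table for a common outcome
a, b = 40, 60    # exposed:   40 cases out of 100 (prevalence 0.40)
c, d = 20, 80    # unexposed: 20 cases out of 100 (prevalence 0.20)

prevalence_ratio = (a / (a + b)) / (c / (c + d))   # 0.40 / 0.20 = 2.0
odds_ratio = (a / b) / (c / d)                     # (40/60) / (20/80) ≈ 2.67
```

The odds ratio (2.67) overstates the prevalence ratio (2.0), and the gap grows with outcome prevalence; Poisson or log-binomial regression estimates the prevalence ratio directly, with robust variance correcting the Poisson model's misspecified variance for binary data.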

6. Does Scapular Motion Regress with Aging and is It Restricted in Patients with Idiopathic Frozen Shoulder?

PubMed Central

Endo, Kazuhiro; Hamada, Junichiro; Suzuki, Kazuaki; Hagiwara, Yoshihiro; Muraki, Takayuki; Karasuno, Hiroshi

2016-01-01

Purposes: It has been reported that the amount of posterior tilt and upward rotation in scapular motions decreases with aging. The purposes of the current study were to investigate age-related scapular motion regression and scapular restriction in patients with idiopathic frozen shoulder (IFS). Methods: The groups were recruited as follows: two groups of 50 asymptomatic subjects aged in their twenties and fifties, and 56 patients with IFS. We passively moved the scapula toward 8 directions: elevation/depression; upward/downward rotation; external/internal rotation; and anterior/posterior tilt. Scapular motion was graded from 0 to 3 (3, normal; and 0, severe restriction) and the score for each direction and the total aggregated score for all directions were calculated. Results: Scapular restriction was present in 3 subjects (6%) in the normal 20s group, 10 (14%) in the 50s group, and 51 (91%) in the IFS group. The total score between the normal 20s and 50s groups did not differ significantly; however, a significant difference was present between the normal 50s group and the IFS group (p < 0.01). There were statistically significant differences in depression (p < 0.01), downward rotation (p < 0.01), and posterior tilt (p < 0.01) among the 3 groups. Conclusion: Depression, downward rotation, and posterior tilt substantially regress with aging. Scapular motions towards depression, downward rotation, external rotation, and posterior tilt are severely restricted in the IFS group. PMID:27733880

7. Investigation of the association between the test day milk fat-protein ratio and clinical mastitis using a Poisson regression approach for analysis of time-to-event data.

PubMed

Zoche-Golob, V; Heuwieser, W; Krömker, V

2015-09-01

The objective of the present study was to investigate the association between the milk fat-protein ratio and the incidence rate of clinical mastitis, including repeated cases of clinical mastitis, to determine the usefulness of this association to monitor metabolic disorders as risk factors for udder health. Herd records from 10 dairy herds of Holstein cows in Saxony, Germany, from September 2005 to 2011 (36,827 lactations of 17,657 cows) were used for statistical analysis. A mixed Poisson regression model with the weekly incidence rate of clinical mastitis as outcome variable was fitted. The model included repeated events of the outcome, time-varying covariates and multilevel clustering. Because the recording of clinical mastitis might have been imperfect, a probabilistic bias analysis was conducted to assess the impact of the misclassification of clinical mastitis on the conventional results. The lactational incidence of clinical mastitis was 38.2%. In 36.2% and 34.9% of the lactations, there was at least one dairy herd test day with a fat-protein ratio of <1.0 or >1.5, respectively. Misclassification of clinical mastitis was assumed to have resulted in bias towards the null. A clinical mastitis case increased the incidence rate of subsequent cases in the same cow. Fat-protein ratios of <1.0 and >1.5 were associated with higher incidence rates of clinical mastitis depending on week in milk. The effect of a fat-protein ratio >1.5 on the incidence rate of clinical mastitis increased considerably over the course of lactation, whereas the effect of a fat-protein ratio <1.0 decreased. Fat-protein ratios <1.0 or >1.5 on the preceding test days of all cows, irrespective of their time in milk, seemed to be better predictors of clinical mastitis than the first test day results per lactation.

8. High Adherence to Iron/Folic Acid Supplementation during Pregnancy Time among Antenatal and Postnatal Care Attendant Mothers in Governmental Health Centers in Akaki Kality Sub City, Addis Ababa, Ethiopia: Hierarchical Negative Binomial Poisson Regression

PubMed Central

2017-01-01

9. The impact of health insurance for children under age 6 in Vietnam: A regression discontinuity approach.

PubMed

Palmer, Michael; Mitra, Sophie; Mont, Daniel; Groce, Nora

2015-11-01

Accessing health services at an early age is important to future health and life outcomes. Yet, little is currently known on the role of health insurance in facilitating access to care for children. Exploiting a regression discontinuity design made possible through a policy to provide health insurance to pre-school aged children in Vietnam, this paper evaluates the impact of health insurance on the health care utilization outcomes of children at the eligibility threshold of six years. Using three rounds of the Vietnam Household Living Standards Survey, the study finds a positive impact on inpatient and outpatient visits and no significant impact on expenditures per visit at public facilities. We find moderately high use of private outpatient services and no evidence of a switch from private to covered public facilities under insurance. Results suggest that adopting public health insurance programs for children under age 6 may be an important vehicle to improving service utilization in a low- and middle-income country context. Challenges remain in providing adequate protections from the costs and other barriers to care.

10. The use of a random regression model to account for change in racing speed of German trotters with increasing age.

PubMed

Bugislaus, A-E; Roehe, R; Willms, F; Kalm, E

2006-08-01

In a genetic analysis of German trotters, the performance trait racing time per km was analysed by using a random regression model on six different age classes (2-, 3-, 4-, 5- and 6-year-old and older trotters; the age class of 3-year-old trotters was additionally divided by birth months of horses into two seasons). The best-fitting random regression model for the trait racing time per km on six age classes included as fixed effects sex, race track, condition of race track (fitted as second-order polynomial on age), distance of race and each driver (fitted as first-order polynomial on age) as well as the year-season (fitted independent of age). The random additive genetic and permanent environmental effects were fitted as second-order polynomials on age. Data consisted of 138,620 performance observations from 2,373 trotters and the pedigree data contained 9,952 horses from a four-generation pedigree. Heritabilities for racing time per km increased from 0.01 to 0.18 at age classes from 2- to 4-year-old trotters, then decreased slightly for 5-year-old and substantially for 6-year-old horses. Genetic correlations of racing time per km among the six age classes were very high (rg = 0.82-0.99). Heritability was h2 = 0.13 when using a repeatability animal model for racing time per km considering the six age classes as fixed effect. Breeding values using repeatability analysis over all and within age classes resulted in a slightly different ranking of trotters than those using random regression analysis. When using random regression analysis, almost no reranking of trotters over time took place. Generally, the analyses showed that using a random regression model improved the accuracy of selection of trotters over age classes.

11. Cumulative Poisson Distribution Program

NASA Technical Reports Server (NTRS)

Bowerman, Paul N.; Scheuer, Ernest M.; Nolty, Robert

1990-01-01

Overflow and underflow in sums prevented. Cumulative Poisson Distribution Program, CUMPOIS, one of two computer programs that make calculations involving cumulative Poisson distributions. Both programs, CUMPOIS (NPO-17714) and NEWTPOIS (NPO-17715), used independently of one another. CUMPOIS determines cumulative Poisson distribution, used to evaluate cumulative distribution function (cdf) for gamma distributions with integer shape parameters and cdf for X (sup2) distributions with even degrees of freedom. Used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. Written in C.
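The numerical problem the program guards against, terms too small or too large for floating point, can be handled by accumulating the sum in log space. A minimal sketch of that idea (not the CUMPOIS algorithm itself):

```python
import math

def poisson_cdf(n, lam):
    """P(X <= n) for X ~ Poisson(lam), accumulated in log space so that
    individual terms neither underflow nor overflow."""
    # log P(X = k) = -lam + k*log(lam) - log(k!)
    log_terms = [-lam + k * math.log(lam) - math.lgamma(k + 1)
                 for k in range(n + 1)]
    m = max(log_terms)                      # log-sum-exp trick
    return math.exp(m + math.log(sum(math.exp(t - m) for t in log_terms)))

print(poisson_cdf(5, 5))       # ~0.6160
print(poisson_cdf(100, 1000))  # tiny but finite; the naive sum would underflow
```

As the record notes, the same sum also evaluates gamma cdfs with integer shape: P(Gamma(n+1, 1) > x) equals P(Poisson(x) <= n).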

12. Scaling the Poisson Distribution

ERIC Educational Resources Information Center

Farnsworth, David L.

2014-01-01

We derive the additive property of Poisson random variables directly from the probability mass function. An important application of the additive property to quality testing of computer chips is presented.
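The additive property can also be verified numerically: convolving the mass functions of two independent Poisson variables reproduces the Poisson mass function with the summed mean (the binomial theorem makes the identity exact).

```python
import math

def pmf(k, lam):
    """Poisson mass function, computed via logs for numerical stability."""
    return math.exp(-lam + k * math.log(lam) - math.lgamma(k + 1))

lam1, lam2 = 2.0, 3.0
for n in range(15):
    # P(X + Y = n) by convolution of the two independent mass functions...
    conv = sum(pmf(k, lam1) * pmf(n - k, lam2) for k in range(n + 1))
    # ...equals the Poisson(lam1 + lam2) mass at n
    assert abs(conv - pmf(n, lam1 + lam2)) < 1e-12
```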

13. Random regression models on Legendre polynomials to estimate genetic parameters for weights from birth to adult age in Canchim cattle.

PubMed

Baldi, F; Albuquerque, L G; Alencar, M M

2010-08-01

The objective of this work was to estimate covariance functions for direct and maternal genetic effects, animal and maternal permanent environmental effects, and subsequently, to derive relevant genetic parameters for growth traits in Canchim cattle. Data comprised 49,011 weight records on 2435 females from birth to adult age. The model of analysis included fixed effects of contemporary groups (year and month of birth and at weighing) and age of dam as quadratic covariable. Mean trends were taken into account by a cubic regression on orthogonal polynomials of animal age. Residual variances were allowed to vary and were modelled by a step function with 1, 4 or 11 classes based on animal's age. The model fitting four classes of residual variances was the best. A total of 12 random regression models from second to seventh order were used to model direct and maternal genetic effects, animal and maternal permanent environmental effects. The model with direct and maternal genetic effects, animal and maternal permanent environmental effects fitted by quadratic, cubic, quintic and linear Legendre polynomials, respectively, was the most adequate to describe the covariance structure of the data. Estimates of direct and maternal heritability obtained by multi-trait (seven traits) and random regression models were very similar. Selection for higher weight at any age, especially after weaning, will produce an increase in mature cow weight. The possibility to modify the growth curve in Canchim cattle to obtain animals with rapid growth at early ages and moderate to low mature cow weight is limited.
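The Legendre-polynomial machinery used in such random regression models can be sketched with numpy: standardize age to [-1, 1], build the polynomial design matrix, and turn a coefficient covariance matrix into a covariance function over ages. The ages and the matrix K below are illustrative placeholders, not the paper's estimates:

```python
import numpy as np

# Hypothetical ages in days from birth to maturity (illustrative values only).
age = np.array([1, 240, 420, 550, 730, 1095, 1460, 2190], dtype=float)

# Legendre polynomials are defined on [-1, 1], so standardize age first.
t = 2.0 * (age - age.min()) / (age.max() - age.min()) - 1.0

# Design matrix with columns P0(t), P1(t), P2(t): a quadratic order, as used
# for the direct genetic effect in the model described above.
Phi = np.polynomial.legendre.legvander(t, 2)

# An illustrative (not estimated) 3x3 coefficient covariance matrix K yields
# a covariance function over the listed ages via Phi @ K @ Phi.T.
K = np.array([[4.0, 1.0, 0.2],
              [1.0, 2.0, 0.1],
              [0.2, 0.1, 0.5]])
G = Phi @ K @ Phi.T   # (co)variances among the listed ages
```

The appeal of the approach is that a small coefficient covariance matrix K smoothly defines variances and covariances at every age, rather than one parameter per age class.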

14. Regression: A Bibliography.

ERIC Educational Resources Information Center

Pedrini, D. T.; Pedrini, Bonnie C.

Regression, another mechanism studied by Sigmund Freud, has had much research, e.g., hypnotic regression, frustration regression, schizophrenic regression, and infra-human-animal regression (often directly related to fixation). Many investigators worked with hypnotic age regression, which has a long history, going back to Russian reflexologists.…

15. Poisson Structures:. Towards a Classification

Grabowski, J.; Marmo, G.; Perelomov, A. M.

In the present note we give an explicit description of certain class of Poisson structures. The methods lead to a classification of Poisson structures in low dimensions and suggest a possible approach for higher dimensions.

16. Radiologic assessment of third molar tooth and spheno-occipital synchondrosis for age estimation: a multiple regression analysis study.

PubMed

Demirturk Kocasarac, Husniye; Sinanoglu, Alper; Noujeim, Marcel; Helvacioglu Yigit, Dilek; Baydemir, Canan

2016-05-01

For forensic age estimation, radiographic assessment of third molar mineralization is important between 14 and 21 years, which coincides with the legal age in most countries. The spheno-occipital synchondrosis (SOS) is an important growth site during development, and its use for age estimation is beneficial when combined with other markers. In this study, we aimed to develop a regression model to estimate and narrow the age range based on radiologic assessment of the third molars and the SOS in a Turkish subpopulation. Panoramic radiographs and cone beam CT scans of 349 subjects (182 males, 167 females) aged between 8 and 25 years were evaluated. A four-stage system was used to evaluate the fusion degree of the SOS, and Demirjian's eight stages of calcification were used for the third molars. The Pearson correlation indicated a strong positive relationship between age and third molar calcification for both sexes (r = 0.850 for females, r = 0.839 for males, P < 0.001) and also between age and SOS fusion for females (r = 0.814), but a moderate relationship was found for males (r = 0.599, P < 0.001). Based on the results obtained, an age determination formula using these scores was established.

17. Branes in Poisson sigma models

SciTech Connect

Falceto, Fernando

2010-07-28

In this review we discuss possible boundary conditions (branes) for the Poisson sigma model. We show how to carry out the perturbative quantization in the presence of a general pre-Poisson brane and how this is related to the deformation quantization of Poisson structures. We conclude with an open problem: the perturbative quantization of the system when the boundary has several connected components and we use a different pre-Poisson brane in every component.

18. [Study on the Recognition of Liquor Age of Gujing Based on Raman Spectra and Support Vector Regression].

PubMed

Wang, Guo-xiang; Wang, Hai-yan; Wang, Hu; Zhang, Zheng-yong; Liu, Jun

2016-03-01

It is an important and difficult research problem in the field of liquor analysis to recognize the age of Chinese liquor rapidly and exactly, which is also of great significance to the healthy development of the liquor industry and to protecting the legitimate rights and interests of consumers. Spectroscopy together with pattern recognition technology is a preferred method for achieving rapid identification of wine quality, among which Raman spectroscopy is promising because it is little affected by water and requires little or no sample pretreatment. In this paper, Raman spectra and support vector regression (SVR) are therefore used to recognize different ages, and different storing times of liquor of the same age. The innovation of this paper lies mainly in three aspects. First, the application of Raman spectroscopy to liquor analysis has rarely been reported until now. Second, it concentrates on recognizing liquor age, whereas most studies focus on specific components of liquor, and studies using pattern recognition methods focus more on identifying brands or different types of base wine. Third, it applies a regression analysis framework, which can be used not only to identify liquors of different years but also to analyze different storing times, which has theoretical and practical significance for the research and quality control of liquor. Three kinds of experiments are conducted in this paper. Firstly, SVR is used to recognize different ages of 5, 8, 16 and 26 years of the Gujing Liquor; secondly, SVR is also used to classify the storing time of the 8-year liquor; thirdly, a certain group of training data is deleted from the training set and put into the test set to simulate the actual situation of liquor age recognition. Results show that the SVR model has good training and prediction performance in these experiments, and better performance than other non-linear regression methods such

19. Poisson-Riemannian geometry

Beggs, Edwin J.; Majid, Shahn

2017-04-01

We study noncommutative bundles and Riemannian geometry at the semiclassical level of first order in a deformation parameter λ, using a functorial approach. This leads us to field equations of 'Poisson-Riemannian geometry' between the classical metric, the Poisson bracket and a certain Poisson-compatible connection needed as initial data for the quantisation of the differential structure. We use such data to define a functor Q to O(λ2) from the monoidal category of all classical vector bundles equipped with connections to the monoidal category of bimodules equipped with bimodule connections over the quantised algebra. This is used to 'semiquantise' the wedge product of the exterior algebra and in the Riemannian case, the metric and the Levi-Civita connection in the sense of constructing a noncommutative geometry to O(λ2) . We solve our field equations for the Schwarzschild black-hole metric under the assumption of spherical symmetry and classical dimension, finding a unique solution and the necessity of nonassociativity at order λ2, which is similar to previous results for quantum groups. The paper also includes a nonassociative hyperboloid, nonassociative fuzzy sphere and our previously algebraic bicrossproduct model.

20. Impact of population aging on trends in diabetes prevalence: A meta-regression analysis of 160,000 Japanese adults

PubMed Central

Charvat, Hadrien; Goto, Atsushi; Goto, Maki; Inoue, Machiko; Heianza, Yoriko; Arase, Yasuji; Sone, Hirohito; Nakagami, Tomoko; Song, Xin; Qiao, Qing; Tuomilehto, Jaakko; Tsugane, Shoichiro; Noda, Mitsuhiko; Inoue, Manami

2015-01-01

Aims/Introduction To provide age- and sex-specific trends, age-standardized trends, and projections of diabetes prevalence through the year 2030 in the Japanese adult population. Materials and Methods In the present meta-regression analysis, we included 161,087 adults from six studies and nine national health surveys carried out between 1988 and 2011 in Japan. We assessed the prevalence of diabetes using a recorded history of diabetes or, for the population of individuals without known diabetes, either a glycated hemoglobin level of ≥6.5% (48 mmol/mol) or the 1999 World Health Organization criteria (i.e., a fasting plasma glucose level of ≥126 mg/dL and/or 2-h glucose level of ≥200 mg/dL in the 75-g oral glucose tolerance test). Results For both sexes, prevalence appeared to remain unchanged over the years in all age categories except for men aged 70 years or older, in whom a significant increase in prevalence with time was observed. Age-standardized diabetes prevalence estimates based on the Japanese population of the corresponding year showed marked increasing trends: diabetes prevalence was 6.1% among women (95% confidence interval [CI] 5.5–6.7), 9.9% (95% CI 9.2–10.6) among men, and 7.9% (95% CI 7.5–8.4) among the total population in 2010, and was expected to rise by 2030 to 6.7% (95% CI 5.2–9.2), 13.1% (95% CI 10.9–16.7) and 9.8% (95% CI 8.5–12.0), respectively. In contrast, the age-standardized diabetes prevalence using a fixed population appeared to remain unchanged. Conclusions This large-scale meta-regression analysis shows that a substantial increase in diabetes prevalence is expected in Japan during the next few decades, mainly as a result of the aging of the adult population. PMID:26417410

1. The performance of functional methods for correcting non-Gaussian measurement error within Poisson regression: corrected excess risk of lung cancer mortality in relation to radon exposure among French uranium miners.

PubMed

Allodji, Rodrigue S; Thiébaut, Anne C M; Leuraud, Klervi; Rage, Estelle; Henry, Stéphane; Laurier, Dominique; Bénichou, Jacques

2012-12-30

A broad variety of methods for measurement error (ME) correction have been developed, but these methods have rarely been applied, possibly because their ability to correct ME is poorly understood. We carried out a simulation study to assess the performance of three error-correction methods: two variants of regression calibration (the substitution method and the estimation calibration method) and the simulation extrapolation (SIMEX) method. Features of the simulated cohorts were borrowed from the French Uranium Miners' Cohort, in which exposure to radon had been documented from 1946 to 1999. In the absence of ME correction, we observed a severe attenuation of the true effect of radon exposure, with a negative relative bias of the order of 60% on the excess relative risk of lung cancer death. In the main scenario considered, that is, when ME characteristics previously determined as most plausible from the French Uranium Miners' Cohort were used both to generate exposure data and to correct for ME at the analysis stage, all three error-correction methods showed a noticeable but partial reduction of the attenuation bias, with a slight advantage for the SIMEX method. However, the performance of the three correction methods highly depended on the accurate determination of the characteristics of ME. In particular, we encountered severe overestimation in some scenarios with the SIMEX method, and we observed lack of correction with the three methods in some other scenarios. For illustration, we also applied and compared the proposed methods on the real data set from the French Uranium Miners' Cohort study.

2. Algorithm Calculates Cumulative Poisson Distribution

NASA Technical Reports Server (NTRS)

Bowerman, Paul N.; Nolty, Robert C.; Scheuer, Ernest M.

1992-01-01

Algorithm calculates accurate values of cumulative Poisson distribution under conditions where other algorithms fail because numbers are so small (underflow) or so large (overflow) that computer cannot process them. Factors inserted temporarily to prevent underflow and overflow. Implemented in CUMPOIS computer program described in "Cumulative Poisson Distribution Program" (NPO-17714).

3. Poisson Spot with Magnetic Levitation

ERIC Educational Resources Information Center

Hoover, Matthew; Everhart, Michael; D'Arruda, Jose

2010-01-01

In this paper we describe a unique method for obtaining the famous Poisson spot without adding obstacles to the light path, which could interfere with the effect. A Poisson spot is the interference effect from parallel rays of light diffracting around a solid spherical object, creating a bright spot in the center of the shadow.

4. Regression of Some High-risk Features of Age-related Macular Degeneration (AMD) in Patients Receiving Intensive Statin Treatment

PubMed Central

Vavvas, Demetrios G.; Daniels, Anthony B.; Kapsala, Zoi G.; Goldfarb, Jeremy W.; Ganotakis, Emmanuel; Loewenstein, John I.; Young, Lucy H.; Gragoudas, Evangelos S.; Eliott, Dean; Kim, Ivana K.; Tsilimbaris, Miltiadis K.; Miller, Joan W.

2016-01-01

Importance Age-related macular degeneration (AMD) remains the leading cause of blindness in developed countries, and affects more than 150 million worldwide. Despite effective anti-angiogenic therapies for the less prevalent neovascular form of AMD, treatments are lacking for the more prevalent dry form. Similarities in risk factors and pathogenesis between AMD and atherosclerosis have led investigators to study the effects of statins on AMD incidence and progression with mixed results. A limitation of these studies has been the heterogeneity of AMD disease and the lack of standardization in statin dosage. Objective We were interested in studying the effects of high-dose statins, similar to those showing regression of atherosclerotic plaques, in AMD. Design Pilot multicenter open-label prospective clinical study of 26 patients with diagnosis of AMD and the presence of many large, soft drusenoid deposits. Patients received 80 mg of atorvastatin daily and were monitored at baseline and every 3 months with complete ophthalmologic exam, best corrected visual acuity (VA), fundus photographs, optical coherence tomography (OCT), and blood work (AST, ALT, CPK, total cholesterol, TSH, creatinine, as well as a pregnancy test for premenopausal women). Results Twenty-three subjects completed a minimum follow-up of 12 months. High-dose atorvastatin resulted in regression of drusen deposits associated with vision gain (+ 3.3 letters, p = 0.06) in 10 patients. No subjects progressed to advanced neovascular AMD. Conclusions High-dose statins may result in resolution of drusenoid pigment epithelial detachments (PEDs) and improvement in VA, without atrophy or neovascularization in a high-risk subgroup of AMD patients. Confirmation from larger studies is warranted. PMID:27077128

5. Differences in geriatric anthropometric data between DXA-based subject-specific estimates and non-age-specific traditional regression models.

PubMed

Chambers, April J; Sukits, Alison L; McCrory, Jean L; Cham, Rakie

2011-08-01

Age, obesity, and gender can have a significant impact on the anthropometrics of adults aged 65 and older. The aim of this study was to investigate differences in body segment parameters derived using two methods: (1) a dual-energy x-ray absorptiometry (DXA) subject-specific method (Chambers et al., 2010) and (2) traditional regression models (de Leva, 1996). The impact of aging, gender, and obesity on the potential differences between these methods was examined. Eighty-three healthy older adults were recruited for participation. Participants underwent a whole-body DXA scan (Hologic QDR 1000/W). Mass, length, center of mass, and radius of gyration were determined for each segment. In addition, traditional regressions were used to estimate these parameters (de Leva, 1996). A mixed linear regression model was performed (α = 0.05). Method type was significant in every variable of interest except forearm segment mass. The obesity and gender differences that we observed translate into differences associated with using traditional regressions to predict anthropometric variables in an aging population. Our data point to a need to consider age, obesity, and gender when utilizing anthropometric data sets and to develop regression models that accurately predict body segment parameters in the geriatric population, considering gender and obesity.

6. Newton/Poisson-Distribution Program

NASA Technical Reports Server (NTRS)

Bowerman, Paul N.; Scheuer, Ernest M.

1990-01-01

NEWTPOIS, one of two computer programs making calculations involving cumulative Poisson distributions. NEWTPOIS (NPO-17715) and CUMPOIS (NPO-17714) used independently of one another. NEWTPOIS determines Poisson parameter for given cumulative probability, from which one obtains percentiles for gamma distributions with integer shape parameters and percentiles for X(sup2) distributions with even degrees of freedom. Used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. Program written in C.

7. How does Poisson kriging compare to the popular BYM model for mapping disease risks?

PubMed Central

Goovaerts, Pierre; Gebreab, Samson

2008-01-01

Background Geostatistical techniques are now available to account for spatially varying population sizes and spatial patterns in the mapping of disease rates. At first glance, Poisson kriging represents an attractive alternative to increasingly popular Bayesian spatial models in that: 1) it is easier to implement and less CPU intensive, and 2) it accounts for the size and shape of geographical units, avoiding the limitations of conditional auto-regressive (CAR) models commonly used in Bayesian algorithms while allowing for the creation of isopleth risk maps. Both approaches, however, have never been compared in simulation studies, and there is a need to better understand their merits in terms of accuracy and precision of disease risk estimates. Results Besag, York and Mollie's (BYM) model and Poisson kriging (point and area-to-area implementations) were applied to age-adjusted lung and cervix cancer mortality rates recorded for white females in two contrasted county geographies: 1) state of Indiana that consists of 92 counties of fairly similar size and shape, and 2) four states in the Western US (Arizona, California, Nevada and Utah) forming a set of 118 counties that are vastly different geographical units. The spatial support (i.e. point versus area) has a much smaller impact on the results than the statistical methodology (i.e. geostatistical versus Bayesian models). Differences between methods are particularly pronounced in the Western US dataset: BYM model yields smoother risk surface and prediction variance that changes mainly as a function of the predicted risk, while the Poisson kriging variance increases in large sparsely populated counties. Simulation studies showed that the geostatistical approach yields smaller prediction errors, more precise and accurate probability intervals, and allows a better discrimination between counties with high and low mortality risks. The benefit of area-to-area Poisson kriging increases as the county geography becomes more

8. Relaxed Poisson cure rate models.

PubMed

Rodrigues, Josemar; Cordeiro, Gauss M; Cancho, Vicente G; Balakrishnan, N

2016-03-01

The purpose of this article is to make the standard promotion cure rate model (Yakovlev and Tsodikov, ) more flexible by assuming that the number of lesions or altered cells after a treatment follows a fractional Poisson distribution (Laskin, ). It is proved that the well-known Mittag-Leffler relaxation function (Berberan-Santos, ) is a simple way to obtain a new cure rate model that is a compromise between the promotion and geometric cure rate models allowing for superdispersion. So, the relaxed cure rate model developed here can be considered as a natural and less restrictive extension of the popular Poisson cure rate model at the cost of an additional parameter, but a competitor to negative-binomial cure rate models (Rodrigues et al., ). Some mathematical properties of a proper relaxed Poisson density are explored. A simulation study and an illustration of the proposed cure rate model from the Bayesian point of view are finally presented.

9. Poisson's spot and Gouy phase

da Paz, I. G.; Soldati, Rodolfo; Cabral, L. A.; de Oliveira, J. G. G.; Sampaio, Marcos

2016-12-01

Recently there have been experimental results on Poisson spot matter-wave interferometry followed by theoretical models describing the relative importance of the wave and particle behaviors for the phenomenon. We propose an analytical theoretical model for Poisson's spot with matter waves based on the Babinet principle, in which we use the results for free propagation and single-slit diffraction. We take into account effects of loss of coherence and finite detection area using the propagator for a quantum particle interacting with an environment. We observe that the matter-wave Gouy phase plays a role in the existence of the central peak and thus corroborates the predominantly wavelike character of the Poisson's spot. Our model shows remarkable agreement with the experimental data for deuterium (D2) molecules.

10. NEWTPOIS- NEWTON POISSON DISTRIBUTION PROGRAM

NASA Technical Reports Server (NTRS)

Bowerman, P. N.

1994-01-01

The cumulative Poisson distribution program, NEWTPOIS, is one of two programs which make calculations involving cumulative Poisson distributions. Both programs, NEWTPOIS (NPO-17715) and CUMPOIS (NPO-17714), can be used independently of one another. NEWTPOIS determines percentiles for gamma distributions with integer shape parameters and calculates percentiles for chi-square distributions with even degrees of freedom. It can be used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. NEWTPOIS determines the Poisson parameter (lambda), that is, the mean (or expected) number of events occurring in a given unit of time, area, or space. Given that the user already knows the cumulative probability for a specific number of occurrences (n), it is usually a simple matter of substitution into the Poisson distribution summation to arrive at lambda. However, direct calculation of the Poisson parameter becomes difficult for small positive values of n and unmanageable for large values. NEWTPOIS uses Newton's iteration method to extract lambda from the initial value condition of the Poisson distribution where n=0, taking successive estimations until some user-specified error term (epsilon) is reached. The NEWTPOIS program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly on most C compilers. The program format is interactive, accepting epsilon, n, and the cumulative probability of the occurrence of n as inputs. It has been implemented under DOS 3.2 and has a memory requirement of 30K. NEWTPOIS was developed in 1988.
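The iteration described, Newton's method on the cumulative distribution seeded from the exact n = 0 solution lambda = -ln(p), can be sketched as follows (an illustration of the idea, not the NEWTPOIS source):

```python
import math

def poisson_pmf(k, lam):
    return math.exp(-lam + k * math.log(lam) - math.lgamma(k + 1))

def poisson_cdf(n, lam):
    return sum(poisson_pmf(k, lam) for k in range(n + 1))

def poisson_lambda(n, p, eps=1e-10):
    """Find lam such that P(X <= n; lam) = p, by Newton's method.
    Since d/dlam P(X <= n; lam) = -P(X = n; lam), each step is
    lam <- lam + (cdf - p) / pmf."""
    if n == 0:
        return -math.log(p)              # exact: P(X = 0) = exp(-lam)
    lam = float(n) + 1.0                 # rough starting value
    for _ in range(100):
        step = (poisson_cdf(n, lam) - p) / poisson_pmf(n, lam)
        lam_new = lam + step
        if lam_new <= 0:                 # keep the iterate in the valid range
            lam_new = lam / 2.0
        if abs(lam_new - lam) < eps:
            return lam_new
        lam = lam_new
    return lam
```

For example, `poisson_lambda(5, 0.615961)` recovers lambda close to 5, since P(X <= 5) is about 0.61596 when lambda = 5.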

11. Characterizing the performance of the Conway-Maxwell Poisson generalized linear model.

PubMed

Francis, Royce A; Geedipally, Srinivas Reddy; Guikema, Seth D; Dhavala, Soma Sekhar; Lord, Dominique; LaRocca, Sarah

2012-01-01

Count data are pervasive in many areas of risk analysis; deaths, adverse health outcomes, infrastructure system failures, and traffic accidents are all recorded as count events, for example. Risk analysts often wish to estimate the probability distribution for the number of discrete events as part of doing a risk assessment. Traditional count data regression models of the type often used in risk assessment for this problem suffer from limitations due to the assumed variance structure. A more flexible model based on the Conway-Maxwell Poisson (COM-Poisson) distribution was recently proposed, a model that has the potential to overcome the limitations of the traditional model. However, the statistical performance of this new model has not yet been fully characterized. This article assesses the performance of a maximum likelihood estimation method for fitting the COM-Poisson generalized linear model (GLM). The objectives of this article are to (1) characterize the parameter estimation accuracy of the MLE implementation of the COM-Poisson GLM, and (2) estimate the prediction accuracy of the COM-Poisson GLM using simulated data sets. The results of the study indicate that the COM-Poisson GLM is flexible enough to model under-, equi-, and overdispersed data sets with different sample mean values. The results also show that the COM-Poisson GLM yields accurate parameter estimates. The COM-Poisson GLM provides a promising and flexible approach for performing count data regression.
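For readers unfamiliar with the COM-Poisson distribution underlying the GLM, a minimal pmf evaluation can be sketched as follows. The function name `com_poisson_pmf` and the series-truncation rule are our own simplifications for illustration, not the article's fitting method.

```python
import math

def com_poisson_pmf(y, lam, nu, tol=1e-12):
    """COM-Poisson pmf: P(Y=y) proportional to lam**y / (y!)**nu.

    nu > 1 gives underdispersion, nu < 1 overdispersion, and nu = 1
    recovers the ordinary Poisson distribution."""
    def log_term(j):
        return j * math.log(lam) - nu * math.lgamma(j + 1)
    # Normalizing constant Z(lam, nu): sum terms until they are negligible
    # and we are safely past the mode (near lam**(1/nu)).
    z, j = 0.0, 0
    while True:
        t = math.exp(log_term(j))
        z += t
        j += 1
        if t < tol and j > lam ** (1.0 / nu):
            break
    return math.exp(log_term(y)) / z
```

Setting nu = 1 reproduces the ordinary Poisson pmf, which makes for a convenient sanity check.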

12. Graded geometry and Poisson reduction

SciTech Connect

Cattaneo, A. S.; Zambon, M.

2009-02-02

The main result extends the Marsden-Ratiu reduction theorem in Poisson geometry, and is proven by means of graded geometry. In this note we provide the background material about graded geometry necessary for the proof. Further, we provide an alternative algebraic proof for the main result.

13. Sparse Poisson noisy image deblurring.

PubMed

Carlavan, Mikael; Blanc-Féraud, Laure

2012-04-01

Deblurring noisy Poisson images has recently been a subject of an increasing amount of works in many areas such as astronomy and biological imaging. In this paper, we focus on confocal microscopy, which is a very popular technique for 3-D imaging of biological living specimens that gives images with a very good resolution (several hundreds of nanometers), although degraded by both blur and Poisson noise. Deconvolution methods have been proposed to reduce these degradations, and in this paper, we focus on techniques that promote the introduction of an explicit prior on the solution. One difficulty of these techniques is to set the value of the parameter, which weights the tradeoff between the data term and the regularizing term. Only few works have been devoted to the research of an automatic selection of this regularizing parameter when considering Poisson noise; therefore, it is often set manually such that it gives the best visual results. We present here two recent methods to estimate this regularizing parameter, and we first propose an improvement of these estimators, which takes advantage of confocal images. Following these estimators, we secondly propose to express the problem of the deconvolution of Poisson noisy images as the minimization of a new constrained problem. The proposed constrained formulation is well suited to this application domain since it is directly expressed using the antilog likelihood of the Poisson distribution and therefore does not require any approximation. We show how to solve the unconstrained and constrained problems using the recent alternating-direction technique, and we present results on synthetic and real data using well-known priors, such as total variation and wavelet transforms. Among these wavelet transforms, we specially focus on the dual-tree complex wavelet transform and on the dictionary composed of curvelets and an undecimated wavelet transform.

14. Calculation of the Poisson cumulative distribution function

NASA Technical Reports Server (NTRS)

Bowerman, Paul N.; Nolty, Robert G.; Scheuer, Ernest M.

1990-01-01

A method for calculating the Poisson cdf (cumulative distribution function) is presented. The method avoids computer underflow and overflow during the process. The computer program uses this technique to calculate the Poisson cdf for arbitrary inputs. An algorithm that determines the Poisson parameter required to yield a specified value of the cdf is presented.
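The underflow/overflow issue the abstract alludes to can be illustrated by contrasting the textbook summation with a log-space variant; this is a schematic reconstruction under our own naming, not the program's actual scaling algorithm.

```python
import math

def poisson_cdf_naive(lam, n):
    """Textbook summation: exp(-lam) underflows to zero once lam is large,
    silently returning 0 even though the true cdf may be positive."""
    return math.exp(-lam) * sum(lam ** i / math.factorial(i)
                                for i in range(n + 1))

def poisson_cdf_stable(lam, n):
    """Each term exp(-lam + i*ln(lam) - ln(i!)) is assembled in log space,
    so neither the power nor the factorial is ever formed directly."""
    return sum(math.exp(-lam + i * math.log(lam) - math.lgamma(i + 1))
               for i in range(n + 1))
```

For lam = 750 the naive version collapses to exactly 0.0 because exp(-750) underflows, while the log-space version still returns a representable positive value; for moderate lam both agree with tabulated values.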

15. Temporal and spatial relations between age specific mortality and ambient air quality in the United States: regression results for counties, 1960–97

PubMed Central

Lipfert, F; Morris, S

2002-01-01

Objective: To investigate longitudinal and spatial relations between air pollution and age specific mortality for United States counties (except Alaska) from 1960 to the end of 1997. Methods: Cross sectional regressions for five specific periods using published data on mortality, air quality, demography, climate, socioeconomic status, lifestyle, and diet. Outcome measures are statistical relations between air quality and county mortalities by age group for all causes of death, other than AIDS and trauma. Results: A specific regression model was developed for each period and age group, using variables that were significant (p<0.05), not substantially collinear (variance inflation factor <2), and had the expected algebraic sign. Models were initially developed without the air pollution variables, which varied in spatial coverage. Residuals were then regressed in turn against current and previous air quality, and dose-response plots were constructed. The validity of this two stage procedure was shown by comparing a subset of results with those obtained with single stage models that included air quality (correlation=0.88). On the basis of attributable risks computed for overall mean concentrations, the strongest associations were found in the earlier periods, with attributable risks usually less than 5%. Stronger relations were found when mortality and air quality were measured in the same period and when the locations considered were limited to those of previous cohort studies (for PM2.5 and SO42-). Thresholds were suggested at 100–130 µg/m3 for mean total suspended particulate (TSP), 7–10 µg/m3 for mean sulfate, 10–15 ppm for peak (95th percentile) CO, 20–40 ppb for mean SO2. Contrary to expectations, associations were often stronger for the younger age groups (<65 y). Responses to PM, CO, and SO2 declined over time; responses in elderly people to peak O3 increased over time as did responses to NO2 for the younger age groups. These results generally agreed

16. A regression method including chronological and bone age for predicting final height in Turner's syndrome, with a comparison of existing methods.

PubMed

van Teunenbroek, A; Stijnen, T; Otten, B; de Muinck Keizer-Schrama, S; Naeraa, R W; Rongen-Westerlaken, C; Drop, S

1996-04-01

A total of 235 measurement points of 57 Dutch women with Turner's syndrome (TS), including women with spontaneous menarche and oestrogen treatment, served to develop a new Turner-specific final height (FH) prediction method (PTS). Analogous to the Tanner and Whitehouse mark 2 method (TW) for normal children, smoothed regression coefficients are tabulated for PTS for height (H), chronological age (CA) and bone age (BA), both TW RUS and Greulich and Pyle (GP). Comparison between all methods on 40 measurement points of 21 Danish TS women showed small mean prediction errors (predicted minus observed FH) and corresponding standard deviation (ESD) of both PTSRUS and PTSGP, in particular at the "younger" ages. Comparison between existing methods on the Dutch data indicated a tendency to overpredict FH. Before the CA of 9 years the mean prediction errors of the Bayley and Pinneau and TW methods were markedly higher compared with the other methods. Overall, the simplest methods--projected height (PAH) and its modification (mPAH)--were remarkably good at most ages. Although the validity of PTSRUS and PTSGP remains to be tested below the age of 6 years, both gave small mean prediction errors and a high accuracy. FH prediction in TS is important in the consideration of growth-promoting therapy or in the evaluation of its effects.

17. Phase space reduction and Poisson structure

1999-07-01

Let (P,π,B,G) be a G-principal fiber bundle. The action of G on the cotangent bundle T*P is free and Hamiltonian. By Liberman and Marle [Symplectic Geometry and Analytical Mechanics (Reidel, Dordrecht, 1987)] and Marsden and Ratiu [Lett. Math. Phys. 11, 161 (1981)] the quotient space T*P/G is a Poisson manifold. We will determine the Poisson bracket on the reduced Poisson manifold T*P/G, and its symplectic leaves.

18. Nonlinear Poisson equation for heterogeneous media.

PubMed

Hu, Langhua; Wei, Guo-Wei

2012-08-22

The Poisson equation is a widely accepted model for electrostatic analysis. However, the Poisson equation is derived based on electric polarizations in a linear, isotropic, and homogeneous dielectric medium. This article introduces a nonlinear Poisson equation to take into consideration hyperpolarization effects due to intensive charges and possible nonlinear, anisotropic, and heterogeneous media. The variational principle is utilized to derive the nonlinear Poisson model from an electrostatic energy functional. To apply the proposed nonlinear Poisson equation to the solvation analysis, we also construct a nonpolar solvation energy functional based on the nonlinear Poisson equation by using the geometric measure theory. At a fixed temperature, the proposed nonlinear Poisson theory is extensively validated by the electrostatic analysis of the Kirkwood model and a set of 20 proteins, and the solvation analysis of a set of 17 small molecules whose experimental measurements are also available for comparison. Moreover, the nonlinear Poisson equation is further applied to the solvation analysis of 21 compounds at different temperatures. Numerical results are compared to theoretical prediction, experimental measurements, and those obtained from other theoretical methods in the literature. A good agreement between our results and experimental data as well as theoretical results suggests that the proposed nonlinear Poisson model is a potentially useful model for electrostatic analysis involving hyperpolarization effects.
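The linear special case of the governing equation can be illustrated with a one-dimensional finite-difference solve of -u'' = f with homogeneous Dirichlet boundaries on the unit interval. This sketch (hypothetical function `solve_poisson_1d`, tridiagonal Thomas elimination) is far simpler than the paper's variational, nonlinear treatment and is offered only to fix ideas.

```python
import math

def solve_poisson_1d(f, n):
    """Solve -u'' = f on (0, 1) with u(0) = u(1) = 0 using n interior
    grid points and the standard second-order central difference."""
    h = 1.0 / (n + 1)
    x = [(i + 1) * h for i in range(n)]
    b = [f(xi) * h * h for xi in x]     # right-hand side, scaled by h^2
    # Tridiagonal system: -u_{i-1} + 2 u_i - u_{i+1} = h^2 f_i
    diag = [2.0] * n
    # Forward elimination (sub- and super-diagonals are all -1).
    for i in range(1, n):
        m = -1.0 / diag[i - 1]
        diag[i] -= m * -1.0
        b[i] -= m * b[i - 1]
    # Back substitution.
    u = [0.0] * n
    u[-1] = b[-1] / diag[-1]
    for i in range(n - 2, -1, -1):
        u[i] = (b[i] + u[i + 1]) / diag[i]
    return x, u
```

With f(x) = pi^2 sin(pi x), the exact solution is u(x) = sin(pi x), and the discrete solution matches it to O(h^2).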

19. Nonlinear Poisson Equation for Heterogeneous Media

PubMed Central

Hu, Langhua; Wei, Guo-Wei

2012-01-01

The Poisson equation is a widely accepted model for electrostatic analysis. However, the Poisson equation is derived based on electric polarizations in a linear, isotropic, and homogeneous dielectric medium. This article introduces a nonlinear Poisson equation to take into consideration hyperpolarization effects due to intensive charges and possible nonlinear, anisotropic, and heterogeneous media. The variational principle is utilized to derive the nonlinear Poisson model from an electrostatic energy functional. To apply the proposed nonlinear Poisson equation to the solvation analysis, we also construct a nonpolar solvation energy functional based on the nonlinear Poisson equation by using the geometric measure theory. At a fixed temperature, the proposed nonlinear Poisson theory is extensively validated by the electrostatic analysis of the Kirkwood model and a set of 20 proteins, and the solvation analysis of a set of 17 small molecules whose experimental measurements are also available for comparison. Moreover, the nonlinear Poisson equation is further applied to the solvation analysis of 21 compounds at different temperatures. Numerical results are compared to theoretical prediction, experimental measurements, and those obtained from other theoretical methods in the literature. A good agreement between our results and experimental data as well as theoretical results suggests that the proposed nonlinear Poisson model is a potentially useful model for electrostatic analysis involving hyperpolarization effects. PMID:22947937

20. Logistic Regression

Grégoire, G.

2014-12-01

Logistic regression is originally intended to explain the relationship between the probability of an event and a set of covariates. The model's coefficients can be interpreted via the odds and the odds ratio, which are presented in the introduction of the chapter. When the observations are obtained individually, we speak of binary logistic regression; when they are grouped, the regression is said to be binomial. In our presentation we mainly focus on the binary case. For statistical inference the main tool is maximum likelihood methodology: we present the Wald, Rao, and likelihood ratio results and their use to compare nested models. The problems we intend to deal with are essentially the same as in multiple linear regression: testing a global effect, testing individual effects, selecting variables to build a model, measuring the fit of the model, and predicting new values. The methods are demonstrated on data sets using R. Finally we briefly consider the binomial case and the situation where we are interested in several events, that is, polytomous (multinomial) logistic regression and the particular case of ordinal logistic regression.
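A minimal illustration of the maximum-likelihood machinery for the binary case: one covariate plus an intercept, fitted by Newton-Raphson on the log-likelihood. The chapter itself demonstrates these methods in R; this Python sketch and its names are our own.

```python
import math

def fit_logistic(xs, ys, iters=25):
    """Binary logistic regression p = sigmoid(b0 + b1*x), fitted by
    Newton-Raphson (equivalently, iteratively reweighted least squares)."""
    b0, b1 = 0.0, 0.0
    for _ in range(iters):
        g0 = g1 = h00 = h01 = h11 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            w = p * (1.0 - p)
            g0 += y - p                 # score vector components
            g1 += (y - p) * x
            h00 += w                    # observed information matrix
            h01 += w * x
            h11 += w * x * x
        det = h00 * h11 - h01 * h01
        # Newton step: beta += H^{-1} g, with H the 2x2 information matrix.
        b0 += (h11 * g0 - h01 * g1) / det
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1
```

At the maximum-likelihood solution the score equations hold, so the fitted probabilities sum to the observed number of events; exp(b1) is the fitted odds ratio per unit increase in x.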

1. Sub-Poisson-binomial light

Lee, Changhyoup; Ferrari, Simone; Pernice, Wolfram H. P.; Rockstuhl, Carsten

2016-11-01

We introduce a general parameter QPB that provides an experimentally accessible nonclassicality measure for light. The parameter is quantified by the click statistics obtained from on-off detectors in a general multiplexing detection setup. Sub-Poisson-binomial statistics, observed when QPB < 0, indicate that a given state of light is nonclassical. Our parameter replaces the binomial parameter QB for more general cases, where any unbalance among the multiplexed modes is allowed, thus enabling the use of arbitrary multiplexing schemes. The significance of the parameter QPB is theoretically examined in a measurement setup that only consists of a ring resonator and a single on-off detector. The proposed setup exploits minimal experimental resources and is geared towards a fully integrated quantum nanophotonic circuit. The results show that nonclassical features remain noticeable even in the presence of significant losses, rendering our nonclassicality test more practical and sufficiently flexible to be used in various nanophotonic platforms.

2. CUMPOIS- CUMULATIVE POISSON DISTRIBUTION PROGRAM

NASA Technical Reports Server (NTRS)

Bowerman, P. N.

1994-01-01

The cumulative Poisson distribution program, CUMPOIS, is one of two programs which make calculations involving cumulative Poisson distributions. Both programs, CUMPOIS (NPO-17714) and NEWTPOIS (NPO-17715), can be used independently of one another. CUMPOIS determines the approximate cumulative binomial distribution, evaluates the cumulative distribution function (cdf) for gamma distributions with integer shape parameters, and evaluates the cdf for chi-square distributions with even degrees of freedom. It can be used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. CUMPOIS calculates the probability that n or fewer events (i.e., cumulative) will occur within any unit when the expected number of events is given as lambda. Normally, this probability is calculated by a direct summation, from i=0 to n, of terms involving the exponential function, lambda, and inverse factorials. This approach, however, eventually fails due to underflow for sufficiently large values of n. Additionally, when the exponential term is moved outside of the summation for simplification purposes, there is a risk that the terms remaining within the summation, and the summation itself, will overflow for certain values of i and lambda. CUMPOIS eliminates these possibilities by multiplying an additional exponential factor into the summation terms and the partial sum whenever overflow/underflow situations threaten. The reciprocal of this term is then multiplied into the completed sum giving the cumulative probability. The CUMPOIS program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly on most C compilers. The program format is interactive, accepting lambda and n as inputs. It has been implemented under DOS 3.2 and has a memory requirement of 26K. CUMPOIS was

3. Human immunophenotyping via low-variance, low-bias, interpretive regression modeling of small, wide data sets: Application to aging and immune response to influenza vaccination.

PubMed

Holmes, Tyson H; He, Xiao-Song

2016-10-01

Small, wide data sets are commonplace in human immunophenotyping research. As defined here, a small, wide data set is constructed by sampling a small to modest quantity n […] regression modeling of small, wide data sets. These prescriptions are distinctive in their especially heavy emphasis on minimizing the use of out-of-sample information for conducting statistical inference. This allows the working immunologist to proceed without being encumbered by imposed and often untestable statistical assumptions. Problems of unmeasured confounders, confidence-interval coverage, feature selection, and shrinkage/denoising are defined clearly and treated in detail. We propose an extension of an existing nonparametric technique for improved small-sample confidence-interval tail coverage from the univariate case (single immune feature) to the multivariate (many, possibly correlated immune features). An important role for derived features in the immunological interpretation of regression analyses is stressed. Areas of further research are discussed. Presented principles and methods are illustrated through application to a small, wide data set of adults spanning a wide range in ages and multiple immunophenotypes that were assayed before and after immunization with inactivated influenza vaccine (IIV). Our regression modeling prescriptions identify some potentially important topics for future immunological research. 1) Immunologists may wish to distinguish age-related differences in immune features from changes in immune features caused by aging. 2) A form of the bootstrap that employs linear extrapolation may prove to be an invaluable analytic tool because it allows the working immunologist to obtain accurate estimates of the stability of immune parameter estimates with a

4. On Quantization of Quadratic Poisson Structures

Manchon, D.; Masmoudi, M.; Roux, A.

Any classical r-matrix on the Lie algebra of linear operators on a real vector space V gives rise to a quadratic Poisson structure on V which admits a deformation quantization stemming from the construction of V. Drinfel'd [Dr], [Gr]. We exhibit in this article an example of quadratic Poisson structure which does not arise this way.

5. Alternative Derivations for the Poisson Integral Formula

ERIC Educational Resources Information Center

Chen, J. T.; Wu, C. S.

2006-01-01

Poisson integral formula is revisited. The kernel in the Poisson integral formula can be derived in a series form through the direct BEM free of the concept of image point by using the null-field integral equation in conjunction with the degenerate kernels. The degenerate kernels for the closed-form Green's function and the series form of Poisson…

6. Analysis of Blood Transfusion Data Using Bivariate Zero-Inflated Poisson Model: A Bayesian Approach

PubMed Central

2016-01-01

Recognizing the factors affecting the number of blood donations and blood deferrals has a major impact on blood transfusion. There is a positive correlation between the variables “number of blood donation” and “number of blood deferral”: as the number of returns for donation increases, so does the number of blood deferrals. On the other hand, because many donors never return to donate, there is an excess zero frequency for both of the above-mentioned variables. In this study, in order to account for the correlation and to explain the excess zero frequency, the bivariate zero-inflated Poisson regression model was used for joint modeling of the number of blood donations and the number of blood deferrals. The data were analyzed using the Bayesian approach, applying noninformative priors in the presence and absence of covariates. Estimating the parameters of the model, that is, the correlation, the zero-inflation parameter, and the regression coefficients, was done through MCMC simulation. Finally, the double-Poisson model, the bivariate Poisson model, and the bivariate zero-inflated Poisson model were fitted to the data and compared using the deviance information criterion (DIC). The results showed that the bivariate zero-inflated Poisson regression model fitted the data better than the other models. PMID:27703493

7. Analysis of Blood Transfusion Data Using Bivariate Zero-Inflated Poisson Model: A Bayesian Approach.

PubMed

Mohammadi, Tayeb; Kheiri, Soleiman; Sedehi, Morteza

2016-01-01

Recognizing the factors affecting the number of blood donations and blood deferrals has a major impact on blood transfusion. There is a positive correlation between the variables "number of blood donation" and "number of blood deferral": as the number of returns for donation increases, so does the number of blood deferrals. On the other hand, because many donors never return to donate, there is an excess zero frequency for both of the above-mentioned variables. In this study, in order to account for the correlation and to explain the excess zero frequency, the bivariate zero-inflated Poisson regression model was used for joint modeling of the number of blood donations and the number of blood deferrals. The data were analyzed using the Bayesian approach, applying noninformative priors in the presence and absence of covariates. Estimating the parameters of the model, that is, the correlation, the zero-inflation parameter, and the regression coefficients, was done through MCMC simulation. Finally, the double-Poisson model, the bivariate Poisson model, and the bivariate zero-inflated Poisson model were fitted to the data and compared using the deviance information criterion (DIC). The results showed that the bivariate zero-inflated Poisson regression model fitted the data better than the other models.
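The zero-inflated Poisson mixture at the heart of the model can be written down directly: with probability pi the count is a structural zero, otherwise it is an ordinary Poisson draw. The sketch below gives only the marginal (univariate) pmf with a hypothetical name, whereas the study fits a bivariate version by MCMC.

```python
import math

def zip_pmf(k, lam, pi):
    """Zero-inflated Poisson pmf: P(0) = pi + (1-pi)e^{-lam};
    P(k) = (1-pi) * Poisson(k; lam) for k >= 1."""
    poisson = math.exp(-lam + k * math.log(lam) - math.lgamma(k + 1))
    if k == 0:
        return pi + (1.0 - pi) * poisson
    return (1.0 - pi) * poisson
```

The inflated zero mass is visible immediately: for lam = 2 and pi = 0.3, P(0) is roughly 0.39, far above the plain Poisson value of about 0.14.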

8. Poisson's ratio of individual metal nanowires.

PubMed

McCarthy, Eoin K; Bellew, Allen T; Sader, John E; Boland, John J

2014-07-07

The measurement of Poisson's ratio of nanomaterials is extremely challenging. Here we report a lateral atomic force microscope experimental method to electromechanically measure the Poisson's ratio and gauge factor of individual nanowires. Under elastic loading conditions we monitor the four-point resistance of individual metallic nanowires as a function of strain and different levels of electrical stress. We determine the gauge factor of individual wires and directly measure the Poisson's ratio using a model that is independently validated for macroscopic wires. For macroscopic wires and nickel nanowires we find Poisson's ratios that closely correspond to bulk values, whereas for silver nanowires significant deviations from the bulk silver value are observed. Moreover, repeated measurements on individual silver nanowires at different levels of mechanical and electrical stress yield a small spread in Poisson ratio, with a range of mean values for different wires, all of which are distinct from the bulk value.

9. The impact of minimum legal drinking age laws on alcohol consumption, smoking, and marijuana use: evidence from a regression discontinuity design using exact date of birth.

PubMed

Yörük, Barış K; Yörük, Ceren Ertan

2011-07-01

This paper uses a regression discontinuity design to estimate the impact of the minimum legal drinking age laws on alcohol consumption, smoking, and marijuana use among young adults. Using data from the National Longitudinal Survey of Youth (1997 Cohort), we find that granting legal access to alcohol at age 21 leads to an increase in several measures of alcohol consumption, including up to a 13 percentage point increase in the probability of drinking. Furthermore, this effect is robust under several different parametric and non-parametric models. We also find some evidence that the discrete jump in alcohol consumption at age 21 has negative spillover effects on marijuana use but does not affect the smoking habits of young adults. Our results indicate that although the change in alcohol consumption habits of young adults following their 21st birthday is less severe than previously known, policies that are designed to reduce drinking among young adults may have desirable impacts and can create public health benefits.

10. Almost efficient estimation of relative risk regression

PubMed Central

Fitzmaurice, Garrett M.; Lipsitz, Stuart R.; Arriaga, Alex; Sinha, Debajyoti; Greenberg, Caprice; Gawande, Atul A.

2014-01-01

Relative risks (RRs) are often considered the preferred measures of association in prospective studies, especially when the binary outcome of interest is common. In particular, many researchers regard RRs to be more intuitively interpretable than odds ratios. Although RR regression is a special case of generalized linear models, specifically with a log link function for the binomial (or Bernoulli) outcome, the resulting log-binomial regression does not respect the natural parameter constraints. Because log-binomial regression does not ensure that predicted probabilities are mapped to the [0,1] range, maximum likelihood (ML) estimation is often subject to numerical instability that leads to convergence problems. To circumvent these problems, a number of alternative approaches for estimating RR regression parameters have been proposed. One approach that has been widely studied is the use of Poisson regression estimating equations. The estimating equations for Poisson regression yield consistent, albeit inefficient, estimators of the RR regression parameters. We consider the relative efficiency of the Poisson regression estimator and develop an alternative, almost efficient estimator for the RR regression parameters. The proposed method uses near-optimal weights based on a Maclaurin series (Taylor series expanded around zero) approximation to the true Bernoulli or binomial weight function. This yields an almost efficient estimator while avoiding convergence problems. We examine the asymptotic relative efficiency of the proposed estimator for an increase in the number of terms in the series. Using simulations, we demonstrate the potential for convergence problems with standard ML estimation of the log-binomial regression model and illustrate how this is overcome using the proposed estimator. We apply the proposed estimator to a study of predictors of pre-operative use of beta blockers among patients undergoing colorectal surgery after diagnosis of colon cancer.
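The Poisson estimating-equations idea for relative risk can be illustrated with a single binary exposure: fit a log-link Poisson model to the binary outcome, and exp(b1) consistently estimates the RR. This sketch (all names ours, plain Newton-Raphson) shows the simple, inefficient Poisson estimator the authors start from, not their proposed almost-efficient Maclaurin-weighted estimator.

```python
import math

def fit_poisson_log(xs, ys, iters=50):
    """Log-link Poisson estimating equations mu = exp(b0 + b1*x), solved
    by Newton-Raphson. With binary y, exp(b1) estimates the relative risk."""
    b0, b1 = 0.0, 0.0
    for _ in range(iters):
        g0 = g1 = h00 = h01 = h11 = 0.0
        for x, y in zip(xs, ys):
            mu = math.exp(b0 + b1 * x)
            g0 += y - mu                # estimating-equation components
            g1 += (y - mu) * x
            h00 += mu                   # expected information matrix
            h01 += mu * x
            h11 += mu * x * x
        det = h00 * h11 - h01 * h01
        b0 += (h11 * g0 - h01 * g1) / det
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1
```

With a binary exposure the solution is just the ratio of group risks: 6/10 events among the exposed versus 3/10 among the unexposed gives a fitted RR of exactly 2.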

11. Supervised Gamma Process Poisson Factorization

SciTech Connect

Anderson, Dylan Zachary

2015-05-01

This thesis develops the supervised gamma process Poisson factorization (S-GPPF) framework, a novel supervised topic model for joint modeling of count matrices and document labels. S-GPPF is fully generative and nonparametric: document labels and count matrices are modeled under a unified probabilistic framework and the number of latent topics is controlled automatically via a gamma process prior. The framework provides for multi-class classification of documents using a generative max-margin classifier. Several recent data augmentation techniques are leveraged to provide for exact inference using a Gibbs sampling scheme. The first portion of this thesis reviews supervised topic modeling and several key mathematical devices used in the formulation of S-GPPF. The thesis then introduces the S-GPPF generative model and derives the conditional posterior distributions of the latent variables for posterior inference via Gibbs sampling. The S-GPPF is shown to exhibit state-of-the-art performance for joint topic modeling and document classification on a dataset of conference abstracts, beating out competing supervised topic models. The unique properties of S-GPPF along with its competitive performance make it a novel contribution to supervised topic modeling.

12. Negative Poisson's ratio in rippled graphene.

PubMed

Qin, Huasong; Sun, Yu; Liu, Jefferson Zhe; Li, Mengjie; Liu, Yilun

2017-03-10

In this work, we perform molecular dynamics (MD) simulations to study the effect of rippling on the Poisson's ratio of graphene. Due to the atomic scale thickness of graphene, out-of-plane ripples are generated in free standing graphene with topological defects (e.g. heptagons and pentagons) to release the in-plane deformation energy. Through MD simulations, we have found that the Poisson's ratio of rippled graphene decreases upon increasing its aspect ratio η (amplitude over wavelength). For the rippled graphene sheet η = 0.188, a negative Poisson's ratio of -0.38 is observed for a tensile strain up to 8%, while the Poisson's ratio for η = 0.066 is almost zero. During uniaxial tension, the ripples gradually become flat, thus the Poisson's ratio of rippled graphene is determined by the competing factors of the intrinsic positive Poisson's ratio of graphene and the negative Poisson's ratio due to the de-wrinkling effect. In addition, the rippled graphene exhibits excellent fracture strength and toughness. With the combination of its auxetic and excellent mechanical properties, rippled graphene may possess potential for application in nano-devices and nanomaterials.
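Poisson's ratio itself is simply the negative ratio of transverse to axial engineering strain, so a value of -0.38 means the sheet widens laterally while being stretched. A definitional sketch (the function names and the sample dimensions are hypothetical, chosen only so the numbers reproduce the -0.38 reported above):

```python
def engineering_strain(l0, l):
    """Engineering strain: relative change in length."""
    return (l - l0) / l0

def poisson_ratio(axial_l0, axial_l, trans_l0, trans_l):
    """nu = -eps_transverse / eps_axial; a negative value signals an
    auxetic response (lateral expansion under axial tension)."""
    return (-engineering_strain(trans_l0, trans_l)
            / engineering_strain(axial_l0, axial_l))
```

Stretching a hypothetical 100-unit sample to 108 units (8% axial strain) while its 50-unit width grows to 51.52 units (3.04% transverse strain) gives nu = -0.38.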

13. Negative Poisson's ratio materials via isotropic interactions.

PubMed

Rechtsman, Mikael C; Stillinger, Frank H; Torquato, Salvatore

2008-08-22

We show that under tension a classical many-body system with only isotropic pair interactions in a crystalline state can, counterintuitively, have a negative Poisson's ratio, or auxetic behavior. We derive the conditions under which the triangular lattice in two dimensions and lattices with cubic symmetry in three dimensions exhibit a negative Poisson's ratio. In the former case, the simple Lennard-Jones potential can give rise to auxetic behavior. In the latter case, a negative Poisson's ratio can be exhibited even when the material is constrained to be elastically isotropic.

14. Poisson-Based Inference for Perturbation Models in Adaptive Spelling Training

ERIC Educational Resources Information Center

Baschera, Gian-Marco; Gross, Markus

2010-01-01

We present an inference algorithm for perturbation models based on Poisson regression. The algorithm is designed to handle unclassified input with multiple errors described by independent mal-rules. This knowledge representation provides an intelligent tutoring system with local and global information about a student, such as error classification…

15. Time since discharge of 9mm cartridges by headspace analysis, part 2: Ageing study and estimation of the time since discharge using multivariate regression.

PubMed

Gallidabino, M; Romolo, F S; Weyermann, C

2017-03-01

Estimating the time since discharge of spent cartridges can be a valuable tool in the forensic investigation of firearm-related crimes. To reach this aim, it was previously proposed that the decrease of volatile organic compounds released during discharge is monitored over time using non-destructive headspace extraction techniques. While promising results were obtained for large-calibre cartridges (e.g., shotgun shells), handgun calibres yielded unsatisfying results. In addition to the natural complexity of the specimen itself, these can also be attributed to some selective choices in the method development. Thus, the present series of papers aimed to systematically evaluate the potential of headspace analysis to estimate the time since discharge of cartridges through the use of more comprehensive analytical and interpretative techniques. Following the comprehensive optimisation and validation of an exhaustive headspace sorptive extraction (HSSE) method in the first part of this work, the present paper addresses the application of chemometric tools in order to systematically evaluate the potential of applying headspace analysis to estimate the time since discharge of 9mm Geco cartridges. Several multivariate regression and pre-treatment methods were tested and compared to univariate models based on non-linear regression. Random forests (RF) and partial least squares (PLS) preceded by pairwise log-ratio normalisation (PLR) showed the best results, and allowed estimation of the time since discharge up to 48 h of ageing and differentiation of recently fired from older cartridges (e.g., less than 5 h compared with more than 1-2 days). The proposed multivariate approaches showed significant improvement compared to univariate models. The effects of storage conditions were also tested, and results demonstrated that temperature, humidity and cartridge position should be taken into account when estimating the time since discharge.

16. Statistical Tests of the PTHA Poisson Assumption for Submarine Landslides

Geist, E. L.; Chaytor, J. D.; Parsons, T.; Ten Brink, U. S.

2012-12-01

We demonstrate that a sequence of dated mass transport deposits (MTDs) can provide information to statistically test whether or not submarine landslides associated with these deposits conform to a Poisson model of occurrence. Probabilistic tsunami hazard analysis (PTHA) most often assumes Poissonian occurrence for all sources, with an exponential distribution of return times. Using dates that define the bounds of individual MTDs, we first describe likelihood and Monte Carlo methods of parameter estimation for a suite of candidate occurrence models (Poisson, lognormal, gamma, Brownian Passage Time). In addition to age-dating uncertainty, both methods incorporate uncertainty caused by the open time intervals: i.e., before the first event and from the last event to the present. Accounting for these open intervals is critical when there are a small number of observed events. The optimal occurrence model is selected according to both the Akaike Information Criterion (AIC) and Akaike's Bayesian Information Criterion (ABIC). In addition, the likelihood ratio test can be performed on occurrence models from the same family: e.g., the gamma model relative to the exponential model of return time distribution. Parameter estimation, model selection, and hypothesis testing are performed on data from two IODP holes in the northern Gulf of Mexico that penetrated a total of 14 MTDs, some of which are correlated between the two holes. Each of these events has been assigned an age based on microfossil zonations and magnetostratigraphic datums. Results from these sites indicate that the Poisson assumption is likely valid. However, parameter estimation results using the likelihood method for one of the sites suggest that the events may have occurred quasi-periodically. Methods developed in this study provide tools with which one can determine both the rate of occurrence and the statistical validity of the Poisson assumption when submarine landslides are included in PTHA.
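The model-comparison workflow described above (an exponential return-time model tested against a richer renewal model via AIC and a likelihood ratio test) can be sketched on synthetic inter-event times. The data, sample size, and time scale below are invented for illustration, not the Gulf of Mexico MTD record:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical inter-event times between dated deposits (units arbitrary).
intervals = rng.exponential(scale=35.0, size=14)

# Exponential (Poisson-process) model: 1 parameter, MLE scale = sample mean.
ll_exp = np.sum(stats.expon.logpdf(intervals, scale=intervals.mean()))
aic_exp = 2 * 1 - 2 * ll_exp

# Gamma renewal model: 2 parameters; the exponential is its shape = 1 special case.
shape, _, scale = stats.gamma.fit(intervals, floc=0)
ll_gam = np.sum(stats.gamma.logpdf(intervals, shape, loc=0, scale=scale))
aic_gam = 2 * 2 - 2 * ll_gam

# Likelihood ratio test of gamma against the nested exponential (1 df):
# a small p-value would argue against the Poisson assumption.
lr = 2 * (ll_gam - ll_exp)
p_value = stats.chi2.sf(lr, df=1)
```

With so few events the test has limited power, which is why the abstract stresses handling the open intervals and age-dating uncertainty; this sketch omits both.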

17. Negative Poisson's Ratio in Modern Functional Materials.

PubMed

Huang, Chuanwei; Chen, Lang

2016-10-01

Materials with negative Poisson's ratio attract considerable attention due to their underlying intriguing physical properties and numerous promising applications, particularly in stringent environments such as aerospace and defense areas, because of their unconventional mechanical enhancements. Recent progress in materials with a negative Poisson's ratio is reviewed here, with the current state of research regarding both theory and experiment. The inter-relationship between the underlying structure and a negative Poisson's ratio is discussed in functional materials, including macroscopic bulk, low-dimensional nanoscale particles, films, sheets, or tubes. The coexistence and correlations with other negative indexes (such as negative compressibility and negative thermal expansion) are also addressed. Finally, open questions and future research opportunities are proposed for functional materials with negative Poisson's ratios.

18. Poisson's ratio of arterial wall - Inconsistency of constitutive models with experimental data.

PubMed

Skacel, Pavel; Bursa, Jiri

2016-02-01

Poisson's ratio of fibrous soft tissues is analyzed in this paper on the basis of constitutive models and experimental data. Three different up-to-date constitutive models accounting for the dispersion of fibre orientations are analyzed. Their predictions of the anisotropic Poisson's ratios are investigated under finite strain conditions, together with the effects of specific orientation distribution functions and of other parameters. The applied constitutive models predict a tendency towards lower (or even negative) out-of-plane Poisson's ratios. New experimental data for a porcine arterial layer under uniaxial tension in orthogonal directions are also presented and compared with the theoretical predictions and other literature data. The results point out the typical features of recent constitutive models with fibres concentrated in the circumferential-axial plane of arterial layers and their potential inconsistency with some experimental data. The volumetric (in)compressibility of arterial tissues is also discussed as a possible significant factor influencing this inconsistency.

19. Evaluating the double Poisson generalized linear model.

PubMed

Zou, Yaotian; Geedipally, Srinivas Reddy; Lord, Dominique

2013-10-01

The objectives of this study are to: (1) examine the applicability of the double Poisson (DP) generalized linear model (GLM) for analyzing motor vehicle crash data characterized by over- and under-dispersion and (2) compare the performance of the DP GLM with the Conway-Maxwell-Poisson (COM-Poisson) GLM in terms of goodness-of-fit and theoretical soundness. The DP distribution has seldom been investigated and applied since its first introduction two decades ago. The hurdle for applying the DP is related to its normalizing constant (or multiplicative constant) which is not available in closed form. This study proposed a new method to approximate the normalizing constant of the DP with high accuracy and reliability. The DP GLM and COM-Poisson GLM were developed using two observed over-dispersed datasets and one observed under-dispersed dataset. The modeling results indicate that the DP GLM with its normalizing constant approximated by the new method can handle crash data characterized by over- and under-dispersion. Its performance is comparable to the COM-Poisson GLM in terms of goodness-of-fit (GOF), although COM-Poisson GLM provides a slightly better fit. For the over-dispersed data, the DP GLM performs similar to the NB GLM. Considering the fact that the DP GLM can be easily estimated with inexpensive computation and that it is simpler to interpret coefficients, it offers a flexible and efficient alternative for researchers to model count data.
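The key obstacle the abstract names, that the double Poisson normalizing constant has no closed form, can be illustrated numerically. The sketch below is not the authors' new approximation method: it simply sums Efron's (1986) unnormalized DP mass by brute force and compares the result with Efron's classical closed-form approximation. The values of μ and θ are arbitrary:

```python
import math

def dp_unnormalized(y, mu, theta):
    """Unnormalized double Poisson mass f*(y), Efron (1986) parameterization."""
    if y == 0:
        return math.sqrt(theta) * math.exp(-theta * mu)
    log_f = (0.5 * math.log(theta) - theta * mu
             - y + y * math.log(y) - math.lgamma(y + 1)        # e^-y y^y / y!
             + theta * y * (1.0 + math.log(mu) - math.log(y)))  # (e mu / y)^(theta y)
    return math.exp(log_f)

mu, theta = 4.0, 0.6   # theta < 1 gives over-dispersion (variance ~ mu/theta)

# "Exact" normalizing constant by direct summation of the series...
total = sum(dp_unnormalized(y, mu, theta) for y in range(250))
c_exact = 1.0 / total

# ...versus Efron's approximation 1/c ~ 1 + (1-theta)/(12 mu theta) * (1 + 1/(mu theta)).
c_approx = 1.0 / (1.0 + (1.0 - theta) / (12.0 * mu * theta)
                  * (1.0 + 1.0 / (mu * theta)))
```

Direct summation is feasible for a single (μ, θ) pair; the difficulty in a GLM is that the constant must be re-evaluated at every observation and every optimizer step, which is what motivates fast, accurate approximations.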

20. Hip fracture risk assessment: artificial neural network outperforms conditional logistic regression in an age- and sex-matched case control study

PubMed Central

2013-01-01

Background Osteoporotic hip fractures with a significant morbidity and excess mortality among the elderly have imposed huge health and economic burdens on societies worldwide. In this age- and sex-matched case control study, we examined the risk factors of hip fractures and assessed the fracture risk by conditional logistic regression (CLR) and ensemble artificial neural network (ANN). The performances of these two classifiers were compared. Methods The study population consisted of 217 pairs (149 women and 68 men) of fractures and controls with an age older than 60 years. All the participants were interviewed with the same standardized questionnaire including questions on 66 risk factors in 12 categories. Univariate CLR analysis was initially conducted to examine the unadjusted odds ratio of all potential risk factors. The significant risk factors were then tested by multivariate analyses. For fracture risk assessment, the participants were randomly divided into modeling and testing datasets for 10-fold cross validation analyses. The predicting models built by CLR and ANN in modeling datasets were applied to testing datasets for generalization study. The performances, including discrimination and calibration, were compared with non-parametric Wilcoxon tests. Results In univariate CLR analyses, 16 variables achieved significant level, and six of them remained significant in multivariate analyses, including low T score, low BMI, low MMSE score, milk intake, walking difficulty, and significant fall at home. For discrimination, ANN outperformed CLR in both 16- and 6-variable analyses in modeling and testing datasets (p?

1. Prediction of forest fires occurrences with area-level Poisson mixed models.

PubMed

Boubeta, Miguel; Lombardía, María José; Marey-Pérez, Manuel Francisco; Morales, Domingo

2015-05-01

The number of fires in forest areas of Galicia (north-west of Spain) during the summer period is quite high. Local authorities are interested in analyzing the factors that explain this phenomenon. Poisson regression models are good tools for describing and predicting the number of fires per forest area. This work employs area-level Poisson mixed models for treating real data about fires in forest areas. A parametric bootstrap method is applied for estimating the mean squared errors of the fire predictors. The developed methodology and software are applied to a real data set of fires in forest areas of Galicia.
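A minimal sketch of the basic building block, a log-link Poisson regression fitted by iteratively reweighted least squares, on synthetic data. The covariate and coefficients are invented; the paper's area-level mixed model adds random area effects and a parametric bootstrap on top of this:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # intercept + one covariate
beta_true = np.array([0.5, 0.3])
y = rng.poisson(np.exp(X @ beta_true))

# Fit the log-link Poisson regression by iteratively reweighted least squares.
beta = np.zeros(2)
for _ in range(25):
    mu = np.exp(X @ beta)           # current fitted mean
    z = X @ beta + (y - mu) / mu    # working response
    # Weighted least squares step with Poisson working weights W = mu.
    beta = np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (mu * z))

# A parametric bootstrap (as in the paper) would now repeatedly resample
# y* ~ Poisson(exp(X beta)) and refit, to estimate the predictors' MSEs.
```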

2. Generalized Poisson distribution: the property of mixture of Poisson and comparison with negative binomial distribution.

PubMed

Joe, Harry; Zhu, Rong

2005-04-01

We prove that the generalized Poisson distribution GP(θ, η) (η ≥ 0) is a mixture of Poisson distributions; this is a new property for a distribution which is the topic of the book by Consul (1989). Because we find that the fits to count data of the generalized Poisson and negative binomial distributions are often similar, to understand their differences, we compare the probability mass functions and skewnesses of the generalized Poisson and negative binomial distributions with the first two moments fixed. They have slight differences in many situations, but their zero-inflated distributions, with masses at zero, means and variances fixed, can differ more. These probabilistic comparisons are helpful in selecting a better-fitting distribution for modelling count data with long right tails. Through a real example of count data with a large zero fraction, we illustrate how the generalized Poisson and negative binomial distributions, as well as their zero-inflated distributions, can be discriminated.
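The comparison described above, GP versus negative binomial with the first two moments fixed, can be reproduced numerically. Everything below is a generic sketch with an arbitrary matched mean and variance, using the standard GP(θ, η) mass function with mean θ/(1−η) and variance θ/(1−η)³:

```python
import math
from scipy import stats

m, v = 4.0, 8.0   # matched mean and variance (over-dispersed: v > m)

# Generalized Poisson GP(theta, eta): mean = theta/(1-eta), var = theta/(1-eta)^3.
eta = 1.0 - math.sqrt(m / v)
theta = m * (1.0 - eta)

def gp_pmf(y, theta, eta):
    return (theta * (theta + eta * y) ** (y - 1)
            * math.exp(-theta - eta * y) / math.factorial(y))

# Negative binomial with the same two moments: v = m + m^2/r.
r = m * m / (v - m)
p = r / (r + m)

gp = [gp_pmf(y, theta, eta) for y in range(80)]
nb = [stats.nbinom.pmf(y, r, p) for y in range(80)]
# Largest pointwise difference between the two matched-moment pmfs:
max_diff = max(abs(a - b) for a, b in zip(gp, nb))
```

As the abstract notes, the matched-moment pmfs differ only slightly; the zero-inflated versions (not sketched here) can separate more clearly.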

3. Introduction to the use of regression models in epidemiology.

PubMed

Bender, Ralf

2009-01-01

Regression modeling is one of the most important statistical techniques used in analytical epidemiology. By means of regression models the effect of one or several explanatory variables (e.g., exposures, subject characteristics, risk factors) on a response variable such as mortality or cancer can be investigated. From multiple regression models, adjusted effect estimates can be obtained that take the effect of potential confounders into account. Regression methods can be applied in all epidemiologic study designs so that they represent a universal tool for data analysis in epidemiology. Different kinds of regression models have been developed in dependence on the measurement scale of the response variable and the study design. The most important methods are linear regression for continuous outcomes, logistic regression for binary outcomes, Cox regression for time-to-event data, and Poisson regression for frequencies and rates. This chapter provides a nontechnical introduction to these regression models with illustrating examples from cancer research.

4. Loop coproducts, Gaudin models and Poisson coalgebras

Musso, F.

2010-10-01

In this paper we show that if A is a Poisson algebra equipped with a set of maps Δλ(i): A → A⊗N satisfying suitable conditions, then the images of the Casimir functions of A under the maps Δλ(i) (that we call 'loop coproducts') are in involution. Rational, trigonometric and elliptic Gaudin models can be recovered as particular cases of this construction, and we show that the same happens for the integrable (or partially integrable) models that can be obtained through the so-called coproduct method. On the other hand, we show that the loop coproduct approach provides a natural generalization of the Gaudin algebras from the Lie-Poisson to the generic Poisson algebra context and, hopefully, can lead to the definition of new integrable models.

5. Magnetostrictive contribution to Poisson ratio of galfenol

Paes, V. Z. C.; Mosca, D. H.

2013-09-01

In this work we present a detailed study of the magnetostrictive contribution to the Poisson ratio for samples under applied mechanical stress. Magnetic contributions to the strain and Poisson ratio of cubic materials were derived by accounting for elastic and magneto-elastic anisotropy contributions. We apply our theoretical results to a material of interest in magnetomechanics, namely galfenol (Fe1-xGax). Our results show that there is a non-negligible magnetic contribution in the linear portion of the stress-strain curve. The rotation of the magnetization towards the [110] crystallographic direction upon application of mechanical stress leads to auxetic behavior, i.e., a Poisson ratio with negative values. This magnetic contribution to auxetic behavior provides novel insight for theoretical and experimental development of materials that display unusual mechanical properties.

6. The BRST complex of homological Poisson reduction

Müller-Lennert, Martin

2017-02-01

BRST complexes are differential graded Poisson algebras. They are associated with a coisotropic ideal J of a Poisson algebra P and provide a description of the Poisson algebra (P/J)^J as their cohomology in degree zero. Using the notion of stable equivalence introduced in Felder and Kazhdan (Contemporary Mathematics 610, Perspectives in representation theory, 2014), we prove that any two BRST complexes associated with the same coisotropic ideal are quasi-isomorphic in the case P = R[V] where V is a finite-dimensional symplectic vector space and the bracket on P is induced by the symplectic structure on V. As a corollary, the cohomology of the BRST complexes is canonically associated with the coisotropic ideal J in the symplectic case. We do not require any regularity assumptions on the constraints generating the ideal J. We finally quantize the BRST complex rigorously in the presence of infinitely many ghost variables and discuss the uniqueness of the quantization procedure.

7. Bivariate Poisson models with varying offsets: an application to the paired mitochondrial DNA dataset.

PubMed

Su, Pei-Fang; Mau, Yu-Lin; Guo, Yan; Li, Chung-I; Liu, Qi; Boice, John D; Shyr, Yu

2017-03-01

To assess the effect of chemotherapy on mitochondrial genome mutations in cancer survivors and their offspring, a study sequenced the full mitochondrial genome and determined the mitochondrial DNA (mtDNA) heteroplasmic mutation rate. To build a model for counts of heteroplasmic mutations in mothers and their offspring, bivariate Poisson regression was used to examine the relationship between mutation count and clinical information while accounting for the paired correlation. However, if the sequencing depth is not adequate, a limited fraction of the mtDNA will be available for variant calling. The classical bivariate Poisson regression model treats the offset term as equal within pairs; thus, it cannot be applied directly. In this research, we propose an extended bivariate Poisson regression model that has a more general offset term to adjust the length of the accessible genome for each observation. We evaluate the performance of the proposed method with comprehensive simulations, and the results show that the regression model provides unbiased parameter estimations. The use of the model is also demonstrated using the paired mtDNA dataset.
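The role of the offset term can be shown with a deliberately simplified univariate analogue (not the authors' bivariate model): in an intercept-only log-linear model with offset log L_i, the MLE of the per-base mutation rate reduces to total counts over total accessible length. All numbers below are invented:

```python
import numpy as np

rng = np.random.default_rng(2)
# Accessible genome length differs per observation (hypothetical values).
L = rng.integers(12_000, 16_600, size=100).astype(float)
rate_true = 3e-4                      # mutations per accessible base
y = rng.poisson(rate_true * L)

# Intercept-only Poisson model with offset log(L):
# log E[y_i] = beta0 + log(L_i), so the MLE of the rate exp(beta0) is
rate_hat = y.sum() / L.sum()
```

Forcing a single offset within a pair (as in the classical bivariate model) would bias this rate whenever the two members' accessible lengths differ, which is the problem the extended model addresses.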

8. Extensions of Rasch's Multiplicative Poisson Model.

ERIC Educational Resources Information Center

Jansen, Margo G. H.; van Duijn, Marijtje A. J.

1992-01-01

A model developed by G. Rasch that assumes scores on some attainment tests can be realizations of a Poisson process is explained and expanded by assuming a prior distribution, with fixed but unknown parameters, for the subject parameters. How additional between-subject and within-subject factors can be incorporated is discussed. (SLD)

9. Natural Poisson structures of nonlinear plasma dynamics

SciTech Connect

Kaufman, A.N.

1982-06-01

Hamiltonian field theories, for models of nonlinear plasma dynamics, require a Poisson bracket structure for functionals of the field variables. These are presented, applied, and derived for several sets of field variables: coherent waves, incoherent waves, particle distributions, and multifluid electrodynamics. Parametric coupling of waves and plasma yields concise expressions for ponderomotive effects (in kinetic and fluid models) and for induced scattering.

10. Measuring Poisson Ratios at Low Temperatures

NASA Technical Reports Server (NTRS)

Boozon, R. S.; Shepic, J. A.

1987-01-01

Simple extensometer ring measures bulges of specimens in compression. New method of measuring Poisson's ratio used on brittle ceramic materials at cryogenic temperatures. Extensometer ring encircles cylindrical specimen. Four strain gauges connected in fully active, self-temperature-compensating Wheatstone bridge. Used at temperatures as low as liquid helium.

11. Evolutionary inference via the Poisson Indel Process.

PubMed

Bouchard-Côté, Alexandre; Jordan, Michael I

2013-01-22

We address the problem of the joint statistical inference of phylogenetic trees and multiple sequence alignments from unaligned molecular sequences. This problem is generally formulated in terms of string-valued evolutionary processes along the branches of a phylogenetic tree. The classic evolutionary process, the TKF91 model [Thorne JL, Kishino H, Felsenstein J (1991) J Mol Evol 33(2):114-124] is a continuous-time Markov chain model composed of insertion, deletion, and substitution events. Unfortunately, this model gives rise to an intractable computational problem: The computation of the marginal likelihood under the TKF91 model is exponential in the number of taxa. In this work, we present a stochastic process, the Poisson Indel Process (PIP), in which the complexity of this computation is reduced to linear. The Poisson Indel Process is closely related to the TKF91 model, differing only in its treatment of insertions, but it has a global characterization as a Poisson process on the phylogeny. Standard results for Poisson processes allow key computations to be decoupled, which yields the favorable computational profile of inference under the PIP model. We present illustrative experiments in which Bayesian inference under the PIP model is compared with separate inference of phylogenies and alignments.

12. Easy Demonstration of the Poisson Spot

ERIC Educational Resources Information Center

Gluck, Paul

2010-01-01

Many physics teachers have a set of slides of single, double and multiple slits to show their students the phenomena of interference and diffraction. Thomas Young's historic experiments with double slits were indeed a milestone in proving the wave nature of light. But another experiment, namely the Poisson spot, was also important historically and…

13. A new bivariate negative binomial regression model

Faroughi, Pouya; Ismail, Noriszura

2014-12-01

This paper introduces a new form of bivariate negative binomial (BNB-1) regression which can be fitted to bivariate and correlated count data with covariates. The BNB regression discussed in this study can be fitted to bivariate and overdispersed count data with positive, zero or negative correlations. The joint p.m.f. of the BNB-1 distribution is derived from the product of two negative binomial marginals with a multiplicative factor parameter. Several testing methods were used to check the overdispersion and goodness-of-fit of the model. Application of BNB-1 regression is illustrated on a Malaysian motor insurance dataset. The results indicate that BNB-1 regression provides a better fit than the bivariate Poisson and BNB-2 models with regard to the Akaike information criterion.

14. Understanding the changes in ductility and Poisson's ratio of metallic glasses during annealing from microscopic dynamics

Wang, Z.; Ngai, K. L.; Wang, W. H.

2015-07-01

In the paper by K. L. Ngai et al. [J. Chem. Phys. 140, 044511 (2014)], the empirical correlation of ductility with the Poisson's ratio, νPoisson, found in metallic glasses was theoretically explained by microscopic dynamic processes which link ductility on the one hand and the Poisson's ratio on the other. Specifically, the dynamic processes are the primitive relaxation in the Coupling Model, which is the precursor of the Johari-Goldstein β-relaxation, and the caged-atom dynamics characterized by the effective Debye-Waller factor f0 or, equivalently, the nearly constant loss (NCL) in susceptibility. All these processes and the parameters characterizing them are accessible experimentally except f0 or the NCL of caged atoms; thus, so far, the experimental verification of the explanation of the correlation between ductility and Poisson's ratio is incomplete. In the experimental part of this paper, we report dynamic mechanical measurements of the NCL of the metallic glass La60Ni15Al25 as-cast, and the changes induced by annealing at temperatures below Tg. The observed monotonic decrease of the NCL with aging time, reflecting the corresponding increase of f0, correlates with the decrease of νPoisson. This is an important observation because such measurements, not made before, provide the missing link in confirming by experiment the explanation of the correlation of ductility with νPoisson. On aging the metallic glass, also observed in the isochronal loss spectra is the shift of the β-relaxation to higher temperatures and a reduction of the relaxation strength. These concomitant changes of the β-relaxation and NCL are the root cause of embrittlement by aging the metallic glass. The NCL of caged atoms is terminated by the onset of the primitive relaxation in the Coupling Model, which is generally supported by experiments. From this relation, the monotonic decrease of the NCL with aging time is caused by the slowing down of the primitive relaxation and β-relaxation on annealing, and

15. The Zero-truncated Poisson with Right Censoring: an Application to Translational Breast Cancer Research.

PubMed

Yeh, Hung-Wen; Gajewski, Byron; Mukhopadhyay, Purna; Behbod, Fariba

2012-08-30

We propose to analyze positive count data with right censoring from Behbod et al. (2009) using the censored zero-truncated Poisson model (CZTP). The comparison in truncated means across subgroups in each cell line is carried out through a log-linear model that links the un-truncated Poisson parameter and regression covariates. We also perform simulation to evaluate the performance of the CZTP model in finite and large sample sizes. In general, the CZTP model provides accurate and precise estimates. However, for data with small means and small sample sizes, it may be more proper to make inference based on the mean counts rather than on the regression coefficients. For small sample sizes and moderate means, the likelihood ratio test is more reliable than the Wald test. We also demonstrate how power analysis can be used to justify and/or guide the choice of censoring thresholds in study design. A SAS macro is provided in Appendix for readers' reference.
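The zero-truncated Poisson mean relation E[Y | Y > 0] = λ/(1 − e^(−λ)) underlies estimation in this setting. A minimal sketch without the censoring component, recovering λ from truncated counts by fixed-point iteration on that relation (the data are simulated, not from Behbod et al.):

```python
import math
import numpy as np

rng = np.random.default_rng(3)
lam_true = 2.5
y = rng.poisson(lam_true, size=2000)
y = y[y > 0]                  # zero-truncated sample: zeros are unobservable

# Truncated mean: E[Y | Y > 0] = lam / (1 - exp(-lam)).
# Rearranged, lam = ybar * (1 - exp(-lam)) suggests a fixed-point iteration.
ybar = y.mean()
lam = ybar                    # starting value; the iteration is contractive here
for _ in range(100):
    lam = ybar * (1.0 - math.exp(-lam))
```

The CZTP model in the paper additionally right-censors large counts and links λ to covariates through a log-linear model; this sketch shows only the truncation correction.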

16. Simulation on Poisson and negative binomial models of count road accident modeling

Sapuan, M. S.; Razali, A. M.; Zamzuri, Z. H.; Ibrahim, K.

2016-11-01

Accident count data have often been shown to be overdispersed, and they may also contain excess zeros. A simulation study was conducted to create scenarios in which accidents occur at a T-junction, with the dependent variable of the generated data following a given distribution, namely the Poisson or the negative binomial distribution, for sample sizes ranging from n=30 to n=500. The study objective was accomplished by fitting Poisson regression, negative binomial regression and hurdle negative binomial models to the simulated data. Model validity was compared, and the simulation results show that, for each sample size, not every model fits the data well even when the data are generated from that model's own distribution, especially when the sample size is large. Furthermore, larger sample sizes produce more zero accident counts in the dataset.
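A stripped-down version of such a simulation: generate Poisson and negative binomial counts and check the variance-to-mean (dispersion) ratio that motivates the choice between the models. The parameters below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 500
mu = 2.0

# Poisson counts: variance equals the mean.
y_pois = rng.poisson(mu, size=n)

# Negative binomial counts with dispersion r: variance = mu + mu^2/r.
r = 1.0
p = r / (r + mu)
y_nb = rng.negative_binomial(r, p, size=n)

# Variance-to-mean ratio: ~1 for Poisson, ~1 + mu/r = 3 here for the NB.
disp_pois = y_pois.var() / y_pois.mean()
disp_nb = y_nb.var() / y_nb.mean()
```

A full version of the study would refit all three regression models to each replicate and compare goodness-of-fit across sample sizes.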

17. A generalized Poisson and Poisson-Boltzmann solver for electrostatic environments.

PubMed

Fisicaro, G; Genovese, L; Andreussi, O; Marzari, N; Goedecker, S

2016-01-07

The computational study of chemical reactions in complex, wet environments is critical for applications in many fields. It is often essential to study chemical reactions in the presence of applied electrochemical potentials, taking into account the non-trivial electrostatic screening coming from the solvent and the electrolytes. As a consequence, the electrostatic potential has to be found by solving the generalized Poisson and the Poisson-Boltzmann equations for neutral and ionic solutions, respectively. In the present work, solvers for both problems have been developed. A preconditioned conjugate gradient method has been implemented for the solution of the generalized Poisson equation and the linear regime of the Poisson-Boltzmann equation, allowing the minimization problem to be solved iteratively within about ten iterations of the ordinary Poisson equation solver. In addition, a self-consistent procedure enables us to solve the non-linear Poisson-Boltzmann problem. Both solvers exhibit very high accuracy and parallel efficiency and allow for the treatment of periodic, free, and slab boundary conditions. The solver has been integrated into the BigDFT and Quantum-ESPRESSO electronic-structure packages and will be released as an independent program, suitable for integration in other codes.
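The core numerical ingredient, a conjugate-gradient solve of a discretized Poisson problem, can be sketched in one dimension. This is a matrix-free, unpreconditioned toy version on a manufactured solution; the paper's solver is three-dimensional, preconditioned, and supports periodic, free, and slab boundary conditions:

```python
import numpy as np

# Model problem: -u''(x) = f(x) on (0,1), u(0) = u(1) = 0.
# With f = pi^2 sin(pi x) the exact solution is u = sin(pi x).
n = 63
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)
f = np.pi**2 * np.sin(np.pi * x)

def apply_A(v):
    """Matrix-free second-difference operator (-1, 2, -1)/h^2."""
    av = 2.0 * v
    av[:-1] -= v[1:]
    av[1:] -= v[:-1]
    return av / h**2

# Conjugate gradient on the SPD system A u = f
# (a preconditioner, as in the paper, would rescale the residual here).
u = np.zeros(n)
res = f - apply_A(u)
p = res.copy()
rs = res @ res
for _ in range(500):
    Ap = apply_A(p)
    alpha = rs / (p @ Ap)
    u += alpha * p
    res -= alpha * Ap
    rs_new = res @ res
    if np.sqrt(rs_new) < 1e-8:
        break
    p = res + (rs_new / rs) * p
    rs = rs_new
```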

18. A generalized Poisson and Poisson-Boltzmann solver for electrostatic environments

SciTech Connect

Fisicaro, G. Goedecker, S.; Genovese, L.; Andreussi, O.; Marzari, N.

2016-01-07

The computational study of chemical reactions in complex, wet environments is critical for applications in many fields. It is often essential to study chemical reactions in the presence of applied electrochemical potentials, taking into account the non-trivial electrostatic screening coming from the solvent and the electrolytes. As a consequence, the electrostatic potential has to be found by solving the generalized Poisson and the Poisson-Boltzmann equations for neutral and ionic solutions, respectively. In the present work, solvers for both problems have been developed. A preconditioned conjugate gradient method has been implemented for the solution of the generalized Poisson equation and the linear regime of the Poisson-Boltzmann equation, allowing the minimization problem to be solved iteratively within about ten iterations of the ordinary Poisson equation solver. In addition, a self-consistent procedure enables us to solve the non-linear Poisson-Boltzmann problem. Both solvers exhibit very high accuracy and parallel efficiency and allow for the treatment of periodic, free, and slab boundary conditions. The solver has been integrated into the BigDFT and Quantum-ESPRESSO electronic-structure packages and will be released as an independent program, suitable for integration in other codes.

19. The solution of large multi-dimensional Poisson problems

NASA Technical Reports Server (NTRS)

Stone, H. S.

1974-01-01

The Buneman algorithm for solving Poisson problems can be adapted to solve large Poisson problems on computers with a rotating drum memory so that the computation is done with very little time lost due to rotational latency of the drum.

20. Brain, music, and non-Poisson renewal processes

Bianco, Simone; Ignaccolo, Massimiliano; Rider, Mark S.; Ross, Mary J.; Winsor, Phil; Grigolini, Paolo

2007-06-01

In this paper we show that both music composition and brain function, as revealed by the electroencephalogram (EEG) analysis, are renewal non-Poisson processes living in the nonergodic dominion. To reach this important conclusion we process the data with the minimum spanning tree method, so as to detect significant events, thereby building a sequence of times, which is the time series to analyze. Then we show that in both cases, EEG and music composition, these significant events are the signature of a non-Poisson renewal process. This conclusion is reached using a technique of statistical analysis recently developed by our group, the aging experiment (AE). First, we find that in both cases the distances between two consecutive events are described by nonexponential histograms, thereby proving the non-Poisson nature of these processes. The corresponding survival probabilities Ψ(t) are well fitted by stretched exponentials [Ψ(t) ∝ exp(-(γt)^α), with 0.5 < α < 1]. The second step rests on the adoption of AE, which shows that these are renewal processes. We show that the stretched exponential, due to its renewal character, is the emerging tip of an iceberg, whose underwater part has slow tails with an inverse power law structure with power index μ = 1 + α. Adopting the AE procedure we find that both EEG and music composition yield μ < 2. On the basis of the recently discovered complexity matching effect, according to which a complex system S with μS < 2 responds only to a complex driving signal P with μP ≤ μS, we conclude that the results of our analysis may explain the influence of music on the human brain.
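The stretched-exponential survival analysis can be illustrated on synthetic waiting times: Weibull-distributed intervals have exactly Ψ(t) = exp(-(γt)^α), so regressing log(-log Ψ) on log t recovers α, and α < 1 signals a non-Poisson process. The sample below is simulated, not EEG or music data:

```python
import numpy as np

rng = np.random.default_rng(5)
alpha_true, gamma = 0.7, 1.0

# Weibull waiting times have survival Psi(t) = exp(-(gamma t)^alpha) exactly.
t = rng.weibull(alpha_true, size=5000) / gamma

# Empirical survival function at the sorted waiting times.
ts = np.sort(t)
S = 1.0 - np.arange(1, ts.size + 1) / ts.size

# log(-log Psi(t)) = alpha * log(gamma t): the slope recovers alpha.
keep = (S > 0.01) & (S < 0.99)
slope, _ = np.polyfit(np.log(ts[keep]), np.log(-np.log(S[keep])), 1)
```

A slope close to 1 would indicate an ordinary exponential (Poisson) survival; the renewal property itself requires the separate aging-experiment analysis described in the abstract.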

1. Brain, music, and non-Poisson renewal processes.

PubMed

Bianco, Simone; Ignaccolo, Massimiliano; Rider, Mark S; Ross, Mary J; Winsor, Phil; Grigolini, Paolo

2007-06-01

In this paper we show that both music composition and brain function, as revealed by the electroencephalogram (EEG) analysis, are renewal non-Poisson processes living in the nonergodic dominion. To reach this important conclusion we process the data with the minimum spanning tree method, so as to detect significant events, thereby building a sequence of times, which is the time series to analyze. Then we show that in both cases, EEG and music composition, these significant events are the signature of a non-Poisson renewal process. This conclusion is reached using a technique of statistical analysis recently developed by our group, the aging experiment (AE). First, we find that in both cases the distances between two consecutive events are described by nonexponential histograms, thereby proving the non-Poisson nature of these processes. The corresponding survival probabilities Ψ(t) are well fitted by stretched exponentials [Ψ(t) ∝ exp(-(γt)^α), with 0.5 < α < 1].

2. Zero-inflated Poisson model based likelihood ratio test for drug safety signal detection.

PubMed

Huang, Lan; Zheng, Dan; Zalkikar, Jyoti; Tiwari, Ram

2017-02-01

In recent decades, numerous methods have been developed for data mining of large drug safety databases, such as Food and Drug Administration's (FDA's) Adverse Event Reporting System, where data matrices are formed with drugs as columns and adverse events as rows. Often, a large number of cells in these data matrices have zero cell counts and some of them are "true zeros" indicating that the drug-adverse event pairs cannot occur, and these zero counts are distinguished from the other zero counts that are modeled zero counts and simply indicate that the drug-adverse event pairs have not occurred yet or have not been reported yet. In this paper, a zero-inflated Poisson model based likelihood ratio test method is proposed to identify drug-adverse event pairs that have disproportionately high reporting rates, which are also called signals. The maximum likelihood estimates of the model parameters of zero-inflated Poisson model based likelihood ratio test are obtained using the expectation and maximization algorithm. The zero-inflated Poisson model based likelihood ratio test is also modified to handle the stratified analyses for binary and categorical covariates (e.g. gender and age) in the data. The proposed zero-inflated Poisson model based likelihood ratio test method is shown to asymptotically control the type I error and false discovery rate, and its finite sample performance for signal detection is evaluated through a simulation study. The simulation results show that the zero-inflated Poisson model based likelihood ratio test method performs similarly to the Poisson model based likelihood ratio test method when the estimated percentage of true zeros in the database is small. Both the zero-inflated Poisson model based likelihood ratio test and likelihood ratio test methods are applied to six selected drugs, from the 2006 to 2011 Adverse Event Reporting System database, with varying percentages of observed zero-count cells.
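The expectation-maximization fit of the zero-inflated Poisson mentioned above can be sketched for the simplest case, with no covariates or stratification: the E-step attributes each observed zero to the structural-zero component, and the M-step updates the mixing probability and the Poisson mean. The data below are simulated:

```python
import numpy as np

rng = np.random.default_rng(6)
n, pi_true, lam_true = 5000, 0.3, 2.0

# Zero-inflated Poisson sample: structural zero with prob pi, else Poisson.
structural = rng.random(n) < pi_true
y = np.where(structural, 0, rng.poisson(lam_true, size=n))

pi_hat, lam = 0.5, 1.0   # starting values
for _ in range(200):
    # E-step: posterior probability that an observed zero is structural.
    z = np.where(y == 0,
                 pi_hat / (pi_hat + (1.0 - pi_hat) * np.exp(-lam)),
                 0.0)
    # M-step: update mixing probability and Poisson mean.
    pi_hat = z.mean()
    lam = y.sum() / (n - z.sum())
```

The signal-detection method in the abstract embeds this kind of ZIP fit inside a likelihood ratio test over drug-adverse event pairs; this sketch shows only the parameter estimation step.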

3. A Duflo Star Product for Poisson Groups

2016-09-01

Let G be a finite-dimensional Poisson algebraic, Lie or formal group. We show that the center of the quantization of G provided by an Etingof-Kazhdan functor is isomorphic as an algebra to the Poisson center of the algebra of functions on G. This recovers and generalizes Duflo's theorem, which gives an isomorphism between the center of the enveloping algebra of a finite-dimensional Lie algebra a and the subalgebra of ad-invariants in the symmetric algebra of a. As our proof relies on the Etingof-Kazhdan construction, it ultimately depends on the existence of Drinfeld associators, but otherwise it is a fairly simple application of graphical calculus. This sheds some light on the Alekseev-Torossian proof of the Kashiwara-Vergne conjecture, and on the relation observed by Bar-Natan, Le, and Thurston between the Duflo isomorphism and the Kontsevich integral of the unknot.

4. A New Echeloned Poisson Series Processor (EPSP)

Ivanova, Tamara

2001-07-01

A specialized Echeloned Poisson Series Processor (EPSP) is proposed. It is software for implementing the analytical algorithms of Celestial Mechanics, designed for manipulating long polynomial-trigonometric series with literal divisors. The coefficients of these echeloned series are rational or floating-point numbers. A Keplerian processor and an analytical generator of special celestial mechanics functions based on the EPSP are also developed.

5. Poisson filtering of laser ranging data

NASA Technical Reports Server (NTRS)

Ricklefs, Randall L.; Shelus, Peter J.

1993-01-01

The filtering of data in a high noise, low signal strength environment is a situation encountered routinely in lunar laser ranging (LLR) and, to a lesser extent, in artificial satellite laser ranging (SLR). The use of Poisson statistics as one of the tools for filtering LLR data is described first in a historical context. The more recent application of this statistical technique to noisy SLR data is also described.
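
A toy illustration of this kind of Poisson-statistics filtering (not the authors' actual LLR/SLR procedure; the function name, rates, and threshold are ours) is to flag range bins whose photon counts are improbable under a pure-noise Poisson model:

```python
import numpy as np
from scipy.stats import poisson

def flag_signal_bins(counts, background_rate, alpha=1e-3):
    """Flag histogram bins whose counts are improbable under pure
    Poisson noise with mean `background_rate`.
    Returns a boolean mask: True where P(X >= k) < alpha."""
    counts = np.asarray(counts)
    # sf(k - 1) = P(X >= k) for a Poisson variate X
    p_tail = poisson.sf(counts - 1, background_rate)
    return p_tail < alpha

counts = np.full(100, 2)          # uniform background of ~2 counts/bin
counts[50] = 20                   # one bin with a strong return
mask = flag_signal_bins(counts, background_rate=2.0)
```

Only the bin with 20 counts survives the cut: P(X >= 20) under a mean of 2 is vanishingly small, while P(X >= 2) is about 0.59 and far above the threshold.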

6. Path Selection in a Poisson field

Cohen, Yossi; Rothman, Daniel H.

2016-11-01

A criterion for path selection for channels growing in a Poisson field is presented. We invoke a generalization of the principle of local symmetry. We then use this criterion to grow channels in a confined geometry. The channel trajectories reveal a self-similar shape as they reach steady state. Analyzing their paths, we identify a cause for branching that may result in a ramified structure in which the golden ratio appears.

7. Computation of solar perturbations with Poisson series

NASA Technical Reports Server (NTRS)

Broucke, R.

1974-01-01

Description of a project for computing first-order perturbations of natural or artificial satellites by integrating the equations of motion on a computer with automatic Poisson series expansions. A basic feature of the method of solution is that the classical variation-of-parameters formulation is used rather than rectangular coordinates. However, the variation-of-parameters formulation uses the three rectangular components of the disturbing force rather than the classical disturbing function, so that there is no problem in expanding the disturbing function in series. Another characteristic of the variation-of-parameters formulation employed is that six rather unusual variables are used in order to avoid singularities at the zero eccentricity and zero (or 90 deg) inclination. The integration process starts by assuming that all the orbit elements present on the right-hand sides of the equations of motion are constants. These right-hand sides are then simple Poisson series which can be obtained with the use of the Bessel expansions of the two-body problem in conjunction with certain iteration methods. These Poisson series can then be integrated term by term, and a first-order solution is obtained.

8. Modelling of nonlinear filtering Poisson time series

Bochkarev, Vladimir V.; Belashova, Inna A.

2016-08-01

In this article, algorithms for non-linear filtering of Poisson time series are tested using statistical modelling. The objective is to find a representation of a time series as a wavelet series with a small number of non-zero coefficients, which allows statistically significant details to be distinguished. There are well-known efficient algorithms of non-linear wavelet filtering for the case when the values of a time series have a normal distribution. However, if the distribution is not normal, good results can be expected from maximum likelihood estimation. Filtering under the maximum likelihood criterion is studied here using Poisson time series as an example. For direct optimisation of the likelihood function, various stochastic (genetic algorithms, simulated annealing) and deterministic optimisation algorithms are used. Testing of the algorithm on both simulated series and empirical data (series of rare-word frequencies from the Google Books Ngram data) showed that filtering based on the maximum likelihood criterion has a substantial advantage over the well-known algorithms in the case of Poisson series. The most promising optimisation methods for this problem were also identified.

9. First- and second-order Poisson spots

Kelly, William R.; Shirley, Eric L.; Migdall, Alan L.; Polyakov, Sergey V.; Hendrix, Kurt

2009-08-01

Although Thomas Young is generally given credit for being the first to provide evidence against Newton's corpuscular theory of light, it was Augustin Fresnel who first stated the modern theory of diffraction. We review the history surrounding Fresnel's 1818 paper and the role of the Poisson spot in the associated controversy. We next discuss the boundary-diffraction-wave approach to calculating diffraction effects and show how it can reduce the complexity of calculating diffraction patterns. We briefly discuss a generalization of this approach that reduces the dimensionality of integrals needed to calculate the complete diffraction pattern of any order diffraction effect. We repeat earlier demonstrations of the conventional Poisson spot and discuss an experimental setup for demonstrating an analogous phenomenon that we call a "second-order Poisson spot." Several features of the diffraction pattern can be explained simply by considering the path lengths of singly and doubly bent paths and distinguishing between first- and second-order diffraction effects related to such paths, respectively.

10. Poisson's ratio over two centuries: challenging hypotheses

PubMed Central

Greaves, G. Neville

2013-01-01

This article explores Poisson's ratio, starting with the controversy concerning its magnitude and uniqueness in the context of the molecular and continuum hypotheses competing in the development of elasticity theory in the nineteenth century, moving on to its place in the development of materials science and engineering in the twentieth century, and concluding with its recent re-emergence as a universal metric for the mechanical performance of materials on any length scale. During these episodes France lost its scientific pre-eminence as paradigms switched from mathematical to observational, and accurate experiments became the prerequisite for scientific advance. The emergence of the engineering of metals followed, and subsequently the invention of composites—both somewhat separated from the discovery of quantum mechanics and crystallography, and illustrating the bifurcation of technology and science. Nowadays disciplines are reconnecting in the face of new scientific demands. During the past two centuries, though, the shape versus volume concept embedded in Poisson's ratio has remained invariant, but its application has exploded from its origins in describing the elastic response of solids and liquids, into areas such as materials with negative Poisson's ratio, brittleness, glass formation, and a re-evaluation of traditional materials. Moreover, the two contentious hypotheses have been reconciled in their complementarity within the hierarchical structure of materials and through computational modelling. PMID:24687094

11. On the Singularity of the Vlasov-Poisson System

SciTech Connect

Zheng, Jian; Qin, Hong

2013-04-26

The Vlasov-Poisson system can be viewed as the collisionless limit of the corresponding Fokker-Planck-Poisson system. It is reasonable to expect that the result of Landau damping can also be obtained from the Fokker-Planck-Poisson system when the collision frequency ν approaches zero. However, we show that the collisionless Vlasov-Poisson system is a singular limit of the collisional Fokker-Planck-Poisson system, and Landau's result can be recovered only as ν approaches zero from the positive side.

12. On the singularity of the Vlasov-Poisson system

SciTech Connect

Zheng, Jian; Qin, Hong

2013-09-15

The Vlasov-Poisson system can be viewed as the collisionless limit of the corresponding Fokker-Planck-Poisson system. It is reasonable to expect that the result of Landau damping can also be obtained from the Fokker-Planck-Poisson system when the collision frequency ν approaches zero. However, we show that the collisionless Vlasov-Poisson system is a singular limit of the collisional Fokker-Planck-Poisson system, and Landau's result can be recovered only as the ν approaches zero from the positive side.

13. Nonlocal Poisson-Fermi model for ionic solvent.

PubMed

Xie, Dexuan; Liu, Jinn-Liang; Eisenberg, Bob

2016-07-01

We propose a nonlocal Poisson-Fermi model for ionic solvent that includes ion size effects and polarization correlations among water molecules in the calculation of electrostatic potential. It includes the previous Poisson-Fermi models as special cases, and its solution is the convolution of a solution of the corresponding nonlocal Poisson dielectric model with a Yukawa-like kernel function. The Fermi distribution is shown to be a set of optimal ionic concentration functions in the sense of minimizing an electrostatic potential free energy. Numerical results are reported to show the difference between a Poisson-Fermi solution and a corresponding Poisson solution.

14. Nonlocal Poisson-Fermi model for ionic solvent

Xie, Dexuan; Liu, Jinn-Liang; Eisenberg, Bob

2016-07-01

We propose a nonlocal Poisson-Fermi model for ionic solvent that includes ion size effects and polarization correlations among water molecules in the calculation of electrostatic potential. It includes the previous Poisson-Fermi models as special cases, and its solution is the convolution of a solution of the corresponding nonlocal Poisson dielectric model with a Yukawa-like kernel function. The Fermi distribution is shown to be a set of optimal ionic concentration functions in the sense of minimizing an electrostatic potential free energy. Numerical results are reported to show the difference between a Poisson-Fermi solution and a corresponding Poisson solution.

15. Moho Depth and Poisson's Ratio beneath Eastern-Central China and Its Tectonic Implications

Wei, Z.; Chen, L.; Li, Z.; Ling, Y.; Li, J.

2015-12-01

Eastern-central China comprises a complex amalgamation of geotectonic blocks of different ages whose lithosphere underwent significant modification during Meso-Cenozoic time. To better characterize its deep structure, we estimated the Moho depth and average Poisson's ratio of eastern-central China by H-κ stacking of receiver functions, using teleseismic data collected from 1196 broadband stations. A coexistence of modified and preserved crust was revealed in eastern-central China, which was generally in Airy-type isostatic equilibrium. The crust is markedly thicker to the west of the North-South Gravity Lineament but exhibits complex variations in Poisson's ratio, with an overall felsic to intermediate bulk crustal composition. Moho depth and Poisson's ratio show striking differences from the surrounding areas in the rifts and tectonic boundary zones, where earthquakes usually occur. Similarities and differences in Moho depth and average Poisson's ratio were observed among Northeast China, the North China Craton, South China, and the Qinling-Dabie Orogen, as well as among different areas within these blocks, which may result from their different evolutionary histories and strong tectono-magmatic events since the Mesozoic. In addition, we observed a change in Moho depth of ~6 km and in Poisson's ratio of ~0.03, as well as a striking E-W difference, beneath and across the Xuefeng Mountains, suggesting that the Xuefeng Mountains may be a deep tectonic boundary between the eastern Yangtze Craton and the western Cathaysia Block.

16. The transverse Poisson's ratio of composites.

NASA Technical Reports Server (NTRS)

Foye, R. L.

1972-01-01

An expression is developed that makes possible the prediction of Poisson's ratio for unidirectional composites with reference to any pair of orthogonal axes that are normal to the direction of the reinforcing fibers. This prediction appears to be a reasonable one in that it follows the trends of the finite element analysis and the bounding estimates, and has the correct limiting value for zero fiber content. It can only be expected to apply to composites containing stiff, circular, isotropic fibers bonded to a soft matrix material.

17. Testing the ratio of two Poisson rates.

PubMed

Gu, Kangxia; Ng, Hon Keung Tony; Tang, Man Lai; Schucany, William R

2008-04-01

In this paper we compare the properties of four different general approaches for testing the ratio of two Poisson rates. Asymptotically normal tests, tests based on approximate p -values, exact conditional tests, and a likelihood ratio test are considered. The properties and power performance of these tests are studied by a Monte Carlo simulation experiment. Sample size calculation formulae are given for each of the test procedures and their validities are studied. Some recommendations favoring the likelihood ratio and certain asymptotic tests are based on these simulation results. Finally, all of the test procedures are illustrated with two real life medical examples.
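
The exact conditional approach compared in this record has a compact form: conditional on the total count, one Poisson count is binomial, so the rate-ratio test reduces to an exact binomial test. A minimal sketch (our own function name and illustrative numbers):

```python
from scipy.stats import binomtest

def poisson_ratio_test(x1, t1, x2, t2, ratio0=1.0):
    """Exact conditional test of H0: lambda1 / lambda2 = ratio0 for
    counts x1 ~ Poisson(lambda1 * t1) and x2 ~ Poisson(lambda2 * t2).
    Conditional on the total n = x1 + x2, x1 is Binomial(n, p0) with
    p0 = ratio0 * t1 / (ratio0 * t1 + t2), so an exact binomial test
    applies."""
    p0 = ratio0 * t1 / (ratio0 * t1 + t2)
    return binomtest(x1, x1 + x2, p0).pvalue

# e.g. 30 events over 1000 person-years vs 10 events over 1000 person-years
pval = poisson_ratio_test(30, 1000.0, 10, 1000.0)
```

With equal exposures, observing 30 events against 10 gives a small two-sided p-value, rejecting equal rates at conventional levels.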

18. Ridge Regression: A Panacea?

ERIC Educational Resources Information Center

Walton, Joseph M.; And Others

1978-01-01

Ridge regression is an approach to the problem of large standard errors of regression estimates of intercorrelated regressors. The effect of ridge regression on the estimated squared multiple correlation coefficient is discussed and illustrated. (JKS)

19. A Poisson model for random multigraphs

PubMed Central

Ranola, John M. O.; Ahn, Sangtae; Sehl, Mary; Smith, Desmond J.; Lange, Kenneth

2010-01-01

Motivation: Biological networks are often modeled by random graphs. A better modeling vehicle is a multigraph where each pair of nodes is connected by a Poisson number of edges. In the current model, the mean number of edges equals the product of two propensities, one for each node. In this context it is possible to construct a simple and effective algorithm for rapid maximum likelihood estimation of all propensities. Given estimated propensities, it is then possible to test statistically for functionally connected nodes that show an excess of observed edges over expected edges. The model extends readily to directed multigraphs. Here, propensities are replaced by outgoing and incoming propensities. Results: The theory is applied to real data on neuronal connections, interacting genes in radiation hybrids, interacting proteins in a literature curated database, and letter and word pairs in seven Shakespearean plays. Availability: All data used are fully available online from their respective sites. Source code and software are available from http://code.google.com/p/poisson-multigraph/ Contact: klange@ucla.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20554690
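
The propensity model described above has a simple maximum likelihood structure: with X[i, j] ~ Poisson(p[i] * p[j]), setting the score to zero gives a fixed-point equation in the propensities. The sketch below is our own illustration of that idea (a damped fixed-point iteration, not necessarily the authors' algorithm; `fit_propensities` is a hypothetical name):

```python
import numpy as np

def fit_propensities(X, n_iter=500):
    """Maximum likelihood propensities for a Poisson multigraph in
    which the edge count X[i, j] ~ Poisson(p[i] * p[j]) for i != j.
    Setting the score to zero gives p[i] = d[i] / (S - p[i]) with
    d[i] the weighted degree and S = sum(p); we iterate this update
    with damping for stability."""
    X = np.asarray(X, dtype=float)
    d = X.sum(axis=1)
    p = d / np.sqrt(d.sum() + 1e-12)      # starting point, since d ~ p * S
    for _ in range(n_iter):
        S = p.sum()
        p = 0.5 * p + 0.5 * d / (S - p)
    return p

rng = np.random.default_rng(1)
true_p = rng.uniform(0.5, 2.0, 60)
lam = np.outer(true_p, true_p)
X = np.triu(rng.poisson(lam), 1)
X = X + X.T                               # symmetric counts, zero diagonal
p_hat = fit_propensities(X)
```

On simulated multigraphs of this size the fitted propensities track the generating values closely; excess-edge tests then compare X[i, j] against the fitted mean p_hat[i] * p_hat[j].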

20. Non-Poisson Processes: Regression to Equilibrium Versus Equilibrium Correlation Functions

DTIC Science & Technology

2007-11-02

… dichotomous fluctuations that generate super-diffusion. We adopt the Liouville perspective and with it a quantum-like approach based on splitting the density

1. A Method of Poisson's Ratio Imaging Within a Material Part

NASA Technical Reports Server (NTRS)

Roth, Don J. (Inventor)

1994-01-01

The present invention is directed to a method of displaying the Poisson's ratio image of a material part. In the present invention, longitudinal data is produced using a longitudinal wave transducer and shear wave data is produced using a shear wave transducer. The respective data is then used to calculate the Poisson's ratio for the entire material part. The Poisson's ratio approximations are then used to display the data.
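
The per-point computation behind such an imaging method reduces, for an isotropic solid, to the standard relation between Poisson's ratio and the longitudinal and shear wave speeds, nu = (vL^2 - 2 vS^2) / (2 (vL^2 - vS^2)). A minimal sketch (the function name and speeds are illustrative, not from the patent):

```python
def poissons_ratio(v_long, v_shear):
    """Poisson's ratio of an isotropic solid from its longitudinal and
    shear wave speeds: nu = (vL^2 - 2 vS^2) / (2 (vL^2 - vS^2))."""
    r2 = (v_long / v_shear) ** 2
    return (r2 - 2.0) / (2.0 * (r2 - 1.0))

# aluminium-like speeds in m/s (illustrative): vL ~ 6400, vS ~ 3100
nu = poissons_ratio(6400.0, 3100.0)
```

Applying this pixel by pixel to co-registered longitudinal and shear data yields the Poisson's ratio image; for the speeds above the result is about 0.35, typical of a light metal.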

2. Method of Poisson's ratio imaging within a material part

NASA Technical Reports Server (NTRS)

Roth, Don J. (Inventor)

1996-01-01

The present invention is directed to a method of displaying the Poisson's ratio image of a material part. In the present invention, longitudinal data is produced using a longitudinal wave transducer and shear wave data is produced using a shear wave transducer. The respective data is then used to calculate the Poisson's ratio for the entire material part. The Poisson's ratio approximations are then used to display the image.

3. Study of non-Hodgkin's lymphoma mortality associated with industrial pollution in Spain, using Poisson models

PubMed Central

Ramis, Rebeca; Vidal, Enrique; García-Pérez, Javier; Lope, Virginia; Aragonés, Nuria; Pérez-Gómez, Beatriz; Pollán, Marina; López-Abente, Gonzalo

2009-01-01

Background Non-Hodgkin's lymphomas (NHLs) have been linked to proximity to industrial areas, but evidence regarding the health risk posed by residence near pollutant industries is very limited. The European Pollutant Emission Register (EPER) is a public register that furnishes valuable information on industries that release pollutants to air and water, along with their geographical location. This study sought to explore the relationship between NHL mortality in small areas in Spain and environmental exposure to pollutant emissions from EPER-registered industries, using three Poisson-regression-based mathematical models. Methods Observed cases were drawn from mortality registries in Spain for the period 1994–2003. Industries were grouped into the following sectors: energy; metal; mineral; organic chemicals; waste; paper; food; and use of solvents. Populations having an industry within a radius of 1, 1.5, or 2 kilometres from the municipal centroid were deemed to be exposed. Municipalities outside those radii were considered as reference populations. The relative risks (RRs) associated with proximity to pollutant industries were estimated using the following methods: Poisson Regression; mixed Poisson model with random provincial effect; and spatial autoregressive modelling (BYM model). Results Only proximity of paper industries to population centres (>2 km) could be associated with a greater risk of NHL mortality (mixed model: RR:1.24, 95% CI:1.09–1.42; BYM model: RR:1.21, 95% CI:1.01–1.45; Poisson model: RR:1.16, 95% CI:1.06–1.27). Spatial models yielded higher estimates. Conclusion The reported association between exposure to air pollution from the paper, pulp and board industry and NHL mortality is independent of the model used. Inclusion of spatial random effects terms in the risk estimate improves the study of associations between environmental exposures and mortality. The EPER could be of great utility when studying the effects of industrial pollution

4. Poisson-Boltzmann versus Size-Modified Poisson-Boltzmann Electrostatics Applied to Lipid Bilayers.

PubMed

Wang, Nuo; Zhou, Shenggao; Kekenes-Huskey, Peter M; Li, Bo; McCammon, J Andrew

2014-12-26

Mean-field methods, such as the Poisson-Boltzmann equation (PBE), are often used to calculate the electrostatic properties of molecular systems. In the past two decades, an enhancement of the PBE, the size-modified Poisson-Boltzmann equation (SMPBE), has been reported. Here, the PBE and the SMPBE are reevaluated for realistic molecular systems, namely, lipid bilayers, under eight different sets of input parameters. The SMPBE appears to reproduce the molecular dynamics simulation results better than the PBE only under specific parameter sets, but in general, it performs no better than the Stern layer correction of the PBE. These results emphasize the need for careful discussions of the accuracy of mean-field calculations on realistic systems with respect to the choice of parameters and call for reconsideration of the cost-efficiency and the significance of the current SMPBE formulation.

5. A Cartesian grid embedded boundary method for Poisson's equation on irregular domains

SciTech Connect

Johansen, H.; Colella, P.

1997-01-31

The authors present a numerical method for solving Poisson's equation, with variable coefficients and Dirichlet boundary conditions, on two-dimensional regions. The approach uses a finite-volume discretization, which embeds the domain in a regular Cartesian grid. They treat the solution as a cell-centered quantity, even when those centers are outside the domain. Cells that contain a portion of the domain boundary use conservative differencing of second-order-accurate fluxes on each cell volume. The calculation of the boundary flux ensures that the conditioning of the matrix is relatively unaffected by small cell volumes, which allows the use of multigrid iterations with a simple point relaxation strategy. They have combined this with an adaptive mesh refinement (AMR) procedure. They provide evidence that the algorithm is second-order accurate on various exact solutions, and compare adaptive and non-adaptive calculations.
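
For readers unfamiliar with grid-based Poisson solvers, the core discretization can be shown in a much simpler setting than the paper's: a plain 5-point Jacobi relaxation on a regular square grid with Dirichlet data (no cut cells, no multigrid, no AMR). This sketch verifies second-order behavior against a manufactured solution:

```python
import numpy as np

# Solve laplacian(u) = f on the unit square, u = 0 on the boundary,
# with the manufactured solution u = sin(pi x) sin(pi y).
n = 41
h = 1.0 / (n - 1)
x = np.linspace(0.0, 1.0, n)
X, Y = np.meshgrid(x, x, indexing="ij")
exact = np.sin(np.pi * X) * np.sin(np.pi * Y)
f = -2.0 * np.pi**2 * exact                    # laplacian(exact) = f

u = np.zeros((n, n))                           # boundary stays at zero
for _ in range(8000):                          # Jacobi relaxation sweeps
    u[1:-1, 1:-1] = 0.25 * (u[2:, 1:-1] + u[:-2, 1:-1]
                            + u[1:-1, 2:] + u[1:-1, :-2]
                            - h**2 * f[1:-1, 1:-1])

err = np.abs(u - exact).max()                  # dominated by O(h^2) truncation
```

After enough sweeps the iteration error is negligible and the remaining error is the O(h^2) truncation error of the 5-point stencil; the paper's contribution is extending this kind of scheme to irregular domains while keeping second-order accuracy and good matrix conditioning.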

6. Understanding the changes in ductility and Poisson's ratio of metallic glasses during annealing from microscopic dynamics

SciTech Connect

Wang, Z.; Ngai, K. L.; Wang, W. H.

2015-07-21

In the paper by K. L. Ngai et al. [J. Chem. Phys. 140, 044511 (2014)], the empirical correlation of ductility with the Poisson's ratio, ν_Poisson, found in metallic glasses was theoretically explained by microscopic dynamic processes which link, on the one hand, ductility, and on the other hand, the Poisson's ratio. Specifically, the dynamic processes are the primitive relaxation in the Coupling Model, which is the precursor of the Johari-Goldstein β-relaxation, and the caged-atom dynamics characterized by the effective Debye-Waller factor f_0 or, equivalently, the nearly constant loss (NCL) in susceptibility. All these processes and the parameters characterizing them are accessible experimentally, except f_0 or the NCL of caged atoms; thus, so far, the experimental verification of the explanation of the correlation between ductility and Poisson's ratio is incomplete. In the experimental part of this paper, we report dynamic mechanical measurements of the NCL of the metallic glass La_60Ni_15Al_25 as-cast, and the changes produced by annealing at temperatures below T_g. The observed monotonic decrease of the NCL with aging time, reflecting the corresponding increase of f_0, correlates with the decrease of ν_Poisson. This is an important observation because such measurements, not made before, provide the missing link in confirming by experiment the explanation of the correlation of ductility with ν_Poisson. On aging the metallic glass, a shift of the β-relaxation to higher temperatures and a reduction of its relaxation strength are also observed in the isochronal loss spectra. These concomitant changes of the β-relaxation and the NCL are the root cause of embrittlement by aging of the metallic glass. The NCL of caged atoms is terminated by the onset of the primitive relaxation in the Coupling Model, which is generally supported by experiments. From this relation, the monotonic decrease of the NCL with aging time is caused by the slowing down

7. Poisson cohomology of scalar multidimensional Dubrovin-Novikov brackets

Carlet, Guido; Casati, Matteo; Shadrin, Sergey

2017-04-01

We compute the Poisson cohomology of a scalar Poisson bracket of Dubrovin-Novikov type with D independent variables. We find that the second and third cohomology groups are generically non-vanishing in D > 1. Hence, in contrast with the D = 1 case, the deformation theory in the multivariable case is non-trivial.

8. LETTER TO THE EDITOR: New generalized Poisson structures

de Azcárraga, J. A.; Perelomov, A. M.; Pérez Bueno, J. C.

1996-04-01

New generalized Poisson structures are introduced by using suitable skew-symmetric contravariant tensors of even order. The corresponding `Jacobi identities' are provided by conditions on these tensors, which may be understood as cocycle conditions. As an example, we provide the linear generalized Poisson structures which can be constructed on the dual spaces of simple Lie algebras.

9. The Schouten - Nijenhuis bracket, cohomology and generalized Poisson structures

de Azcárraga, J. A.; Perelomov, A. M.; Pérez Bueno, J. C.

1996-12-01

Newly introduced generalized Poisson structures based on suitable skew-symmetric contravariant tensors of even order are discussed in terms of the Schouten - Nijenhuis bracket. The associated `Jacobi identities' are expressed as conditions on these tensors, the cohomological contents of which is given. In particular, we determine the linear generalized Poisson structures which can be constructed on the dual spaces of simple Lie algebras.

10. Low porosity metallic periodic structures with negative Poisson's ratio.

PubMed

Taylor, Michael; Francesconi, Luca; Gerendás, Miklós; Shanian, Ali; Carson, Carl; Bertoldi, Katia

2014-04-16

Auxetic behavior in low porosity metallic structures is demonstrated via a simple system of orthogonal elliptical voids. In this minimal 2D system, the Poisson's ratio can be effectively controlled by changing the aspect ratio of the voids. In this way, large negative values of Poisson's ratio can be achieved, indicating an effective strategy for designing auxetic structures with desired porosity.

11. Extreme values of the Poisson's ratio of cubic crystals

Epishin, A. I.; Lisovenko, D. S.

2016-10-01

The problem of determining the extrema of Poisson's ratio for cubic crystals is considered, and analytical expressions are derived to calculate its extreme values. It follows from the obtained solution that, apart from the extreme values at standard orientations, extreme values of Poisson's ratio can also occur at special orientations deviating from the standard ones. The derived analytical expressions are used to calculate the extreme values of Poisson's ratio for a large number of known cubic crystals. Extremely high values of Poisson's ratio are shown to be characteristic of metastable crystals, such as crystals with the shape memory effect caused by martensitic transformation; these crystals are mainly metallic alloys. For some crystals, the absolute extrema of Poisson's ratio can exceed the standard values, which are −1 for the standard minimum and +2 for the standard maximum.
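
The standard-orientation values the record refers to are classical results of cubic elasticity; for example, for loading along [100], Poisson's ratio reduces to c12 / (c11 + c12). A minimal sketch of that textbook computation (our own code, not the paper's derivation; the copper-like constants are illustrative):

```python
import numpy as np

def nu_100(c11, c12, c44):
    """Poisson's ratio of a cubic crystal loaded along [100]: build the
    6x6 stiffness matrix in Voigt notation, invert it to the compliance
    matrix S, and return -S12/S11, which equals c12 / (c11 + c12)."""
    stiff = np.zeros((6, 6))
    stiff[:3, :3] = c12
    stiff[:3, :3] += np.eye(3) * (c11 - c12)
    stiff[3:, 3:] = np.eye(3) * c44
    S = np.linalg.inv(stiff)
    return -S[0, 1] / S[0, 0]

# copper-like elastic constants in GPa (illustrative values)
nu_cu = nu_100(168.4, 121.4, 75.4)
```

For an isotropic choice of constants (c44 = (c11 - c12) / 2) the formula reproduces the isotropic Poisson's ratio; the extreme values discussed in the record arise at non-standard orientations, where the full orientation-dependent expressions are needed.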

12. Deformation mechanisms in negative Poisson's ratio materials - Structural aspects

NASA Technical Reports Server (NTRS)

Lakes, R.

1991-01-01

Poisson's ratio in materials is governed by the following aspects of the microstructure: the presence of rotational degrees of freedom, non-affine deformation kinematics, or anisotropic structure. Several structural models are examined. The non-affine kinematics are seen to be essential for the production of negative Poisson's ratios for isotropic materials containing central force linkages of positive stiffness. Non-central forces combined with pre-load can also give rise to a negative Poisson's ratio in isotropic materials. A chiral microstructure with non-central force interaction or non-affine deformation can also exhibit a negative Poisson's ratio. Toughness and damage resistance in these materials may be affected by the Poisson's ratio itself, as well as by generalized continuum aspects associated with the microstructure.

13. Regressive systemic sclerosis.

PubMed Central

Black, C; Dieppe, P; Huskisson, T; Hart, F D

1986-01-01

Systemic sclerosis is a disease which usually progresses or reaches a plateau with persistence of symptoms and signs. Regression is extremely unusual. Four cases of established scleroderma are described in which regression is well documented. The significance of this observation and possible mechanisms of disease regression are discussed. PMID:3718012

14. NCCS Regression Test Harness

SciTech Connect

Tharrington, Arnold N.

2015-09-09

The NCCS Regression Test Harness is a software package that provides a framework for performing regression and acceptance testing on NCCS High Performance Computers. The package is written in Python, and its only dependency is a Subversion repository used to store the regression tests.

15. Unitary Response Regression Models

ERIC Educational Resources Information Center

Lipovetsky, S.

2007-01-01

The dependent variable in a regular linear regression is a numerical variable, and in a logistic regression it is a binary or categorical variable. In these models the dependent variable has varying values. However, there are problems yielding an identity output of a constant value which can also be modelled in a linear or logistic regression with…

16. Poisson-Boltzmann-Nernst-Planck model

SciTech Connect

Zheng Qiong; Wei Guowei

2011-05-21

The Poisson-Nernst-Planck (PNP) model is based on a mean-field approximation of ion interactions and continuum descriptions of concentration and electrostatic potential. It provides qualitative explanation and increasingly quantitative predictions of experimental measurements for the ion transport problems in many areas such as semiconductor devices, nanofluidic systems, and biological systems, despite many limitations. While the PNP model gives a good prediction of the ion transport phenomenon for chemical, physical, and biological systems, the number of equations to be solved and the number of diffusion coefficient profiles to be determined for the calculation directly depend on the number of ion species in the system, since each ion species corresponds to one Nernst-Planck equation and one position-dependent diffusion coefficient profile. In a complex system with multiple ion species, the PNP can be computationally expensive and parameter demanding, as experimental measurements of diffusion coefficient profiles are generally quite limited for most confined regions such as ion channels, nanostructures and nanopores. We propose an alternative model to reduce the number of Nernst-Planck equations to be solved in complex chemical and biological systems with multiple ion species by substituting Nernst-Planck equations with Boltzmann distributions of ion concentrations. As such, we solve the coupled Poisson-Boltzmann and Nernst-Planck (PBNP) equations, instead of the PNP equations. The proposed PBNP equations are derived from a total energy functional by using the variational principle. We design a number of computational techniques, including the Dirichlet to Neumann mapping, the matched interface and boundary, and relaxation-based iterative procedure, to ensure efficient solution of the proposed PBNP equations. Two protein molecules, cytochrome c551 and Gramicidin A, are employed to validate the proposed model under a wide range of bulk ion concentrations and external
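
The Poisson-Boltzmann component of such a model can be illustrated in one dimension. The sketch below is a simplified stand-in, not the PBNP solver itself (symmetric 1:1 electrolyte, dimensionless units, unit plate potentials): it solves phi'' = sinh(phi) between two plates by Newton's method on a finite-difference grid.

```python
import numpy as np

# Dimensionless 1-D Poisson-Boltzmann problem phi'' = sinh(phi) between
# two plates held at phi = 1, solved by Newton's method.
n = 101
h = 1.0 / (n - 1)
A = (np.diag(-2.0 * np.ones(n - 2))
     + np.diag(np.ones(n - 3), 1)
     + np.diag(np.ones(n - 3), -1)) / h**2     # discrete second derivative

phi = np.ones(n)                               # boundary values phi = 1
inner = phi[1:-1]                              # view into the interior points

for _ in range(30):                            # Newton iteration
    F = A @ inner - np.sinh(inner)             # nonlinear residual
    F[0] += phi[0] / h**2                      # left boundary contribution
    F[-1] += phi[-1] / h**2                    # right boundary contribution
    J = A - np.diag(np.cosh(inner))            # Jacobian of the residual
    inner -= np.linalg.solve(J, F)

res = A @ inner - np.sinh(inner)
res[0] += phi[0] / h**2
res[-1] += phi[-1] / h**2
res_max = np.abs(res).max()
```

The potential decays from the plates toward the midplane (classical screening); in the PBNP model this Boltzmann closure replaces one Nernst-Planck equation per ion species.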

17. Poisson-Boltzmann-Nernst-Planck model

Zheng, Qiong; Wei, Guo-Wei

2011-05-01

The Poisson-Nernst-Planck (PNP) model is based on a mean-field approximation of ion interactions and continuum descriptions of concentration and electrostatic potential. It provides qualitative explanation and increasingly quantitative predictions of experimental measurements for the ion transport problems in many areas such as semiconductor devices, nanofluidic systems, and biological systems, despite many limitations. While the PNP model gives a good prediction of the ion transport phenomenon for chemical, physical, and biological systems, the number of equations to be solved and the number of diffusion coefficient profiles to be determined for the calculation directly depend on the number of ion species in the system, since each ion species corresponds to one Nernst-Planck equation and one position-dependent diffusion coefficient profile. In a complex system with multiple ion species, the PNP can be computationally expensive and parameter demanding, as experimental measurements of diffusion coefficient profiles are generally quite limited for most confined regions such as ion channels, nanostructures and nanopores. We propose an alternative model to reduce the number of Nernst-Planck equations to be solved in complex chemical and biological systems with multiple ion species by substituting Nernst-Planck equations with Boltzmann distributions of ion concentrations. As such, we solve the coupled Poisson-Boltzmann and Nernst-Planck (PBNP) equations, instead of the PNP equations. The proposed PBNP equations are derived from a total energy functional by using the variational principle. We design a number of computational techniques, including the Dirichlet to Neumann mapping, the matched interface and boundary, and relaxation-based iterative procedure, to ensure efficient solution of the proposed PBNP equations. Two protein molecules, cytochrome c551 and Gramicidin A, are employed to validate the proposed model under a wide range of bulk ion concentrations and external

18. Fully Regressive Melanoma

PubMed Central

Ehrsam, Eric; Kallini, Joseph R.; Lebas, Damien; Modiano, Philippe; Cotten, Hervé

2016-01-01

Fully regressive melanoma is a phenomenon in which the primary cutaneous melanoma becomes completely replaced by fibrotic components as a result of the host immune response. Although 10 to 35 percent of cases of cutaneous melanoma may partially regress, fully regressive melanoma is very rare; only 47 cases have been reported in the literature to date. All of the cases of fully regressive melanoma reported in the literature were diagnosed in conjunction with metastasis in the patient. The authors describe a case of fully regressive melanoma without any metastases at the time of its diagnosis. Characteristic findings on dermoscopy, as well as the absence of melanoma on final biopsy, confirmed the diagnosis. PMID:27672418

19. The Poisson-Helmholtz-Boltzmann model.

PubMed

Bohinc, K; Shrestha, A; May, S

2011-10-01

We present a mean-field model of a one-component electrolyte solution where the mobile ions interact not only via Coulomb interactions but also through a repulsive non-electrostatic Yukawa potential. Our choice of the Yukawa potential represents a simple model for solvent-mediated interactions between ions. We employ a local formulation of the mean-field free energy through the use of two auxiliary potentials, an electrostatic and a non-electrostatic potential. Functional minimization of the mean-field free energy leads to two coupled local differential equations, the Poisson-Boltzmann equation and the Helmholtz-Boltzmann equation. Their boundary conditions account for the sources of both the electrostatic and non-electrostatic interactions on the surface of all macroions that reside in the solution. We analyze a specific example, two like-charged planar surfaces with their mobile counterions forming the electrolyte solution. For this system we calculate the pressure between the two surfaces, and we analyze its dependence on the strength of the Yukawa potential and on the non-electrostatic interactions of the mobile ions with the planar macroion surfaces. In addition, we demonstrate that our mean-field model is consistent with the contact theorem, and we outline its generalization to arbitrary interaction potentials through the use of a Laplace transformation.
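
Schematically, and in our notation rather than the authors', minimizing the mean-field free energy yields one equation per auxiliary potential, coupled through the Boltzmann-weighted ion density $n(\mathbf r)$:

```latex
\epsilon_0\epsilon\,\nabla^{2}\Psi = -e\,n(\mathbf r), \qquad
\bigl(\nabla^{2} - \lambda^{-2}\bigr)\Phi = -\chi\,n(\mathbf r), \qquad
n(\mathbf r) = n_0\,\exp\!\Bigl(-\frac{e\Psi + \chi\Phi}{k_B T}\Bigr),
```

where $\Psi$ is the electrostatic potential, $\Phi$ the non-electrostatic potential, $\lambda$ the Yukawa range, and $\chi$ the Yukawa coupling strength.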

20. Generalized HPC method for the Poisson equation

Bardazzi, A.; Lugni, C.; Antuono, M.; Graziani, G.; Faltinsen, O. M.

2015-10-01

An efficient and innovative numerical algorithm based on the use of Harmonic Polynomials on each Cell of the computational domain (HPC method) was recently proposed by Shao and Faltinsen (2014) [1] to solve boundary value problems governed by the Laplace equation. Here, we extend the HPC method to the solution of non-homogeneous elliptic boundary value problems. The homogeneous solution, i.e. of the Laplace equation, is represented through a polynomial function with harmonic polynomials, while the particular solution of the Poisson equation is provided by a bi-quadratic function. This scheme is called the generalized HPC method. The present algorithm, accurate up to 4th order, proved efficient, i.e. easy to implement and of low computational effort, for the solution of two-dimensional elliptic boundary value problems. Furthermore, it provides an analytical representation of the solution within each computational stencil, which allows its coupling with existing numerical algorithms within an efficient domain-decomposition strategy or within an adaptive mesh refinement algorithm.
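
Within each cell, the generalized HPC ansatz described above splits the local representation of $\nabla^{2}u = f$ into a harmonic part and a bi-quadratic particular part (our notation):

```latex
u(x,y) \;\approx\; \sum_{j} b_j\,P_j(x,y) \;+\; u_p(x,y),
\qquad \nabla^{2}P_j = 0, \qquad \nabla^{2}u_p = f,
```

with the $P_j$ harmonic polynomials on the cell and the coefficients $b_j$ fixed by matching at the cell's stencil nodes.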

1. Integer lattice dynamics for Vlasov-Poisson

Mocz, Philip; Succi, Sauro

2017-03-01

We revisit the integer lattice (IL) method to numerically solve the Vlasov-Poisson equations, and show that a slight variant of the method is a very easy, viable, and efficient numerical approach to study the dynamics of self-gravitating, collisionless systems. The distribution function lives in a discretized lattice phase-space, and each time-step in the simulation corresponds to a simple permutation of the lattice sites. Hence, the method is Lagrangian, conservative, and fully time-reversible. IL complements other existing methods, such as N-body/particle mesh (computationally efficient, but affected by Monte Carlo sampling noise and two-body relaxation) and finite volume (FV) direct integration schemes (expensive, accurate but diffusive). We also present improvements to the FV scheme, using a moving-mesh approach inspired by IL, to reduce numerical diffusion and the time-step criterion. Being a direct integration scheme like FV, IL is memory limited (the memory requirement for a full 3D problem scales as N^6, where N is the resolution per linear phase-space dimension). However, we describe a new technique for achieving N^4 scaling. The method offers promise for investigating the full 6D phase-space of collisionless systems of stars and dark matter.
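
The permutation character of the scheme can be illustrated with a minimal 1D1V streaming (drift) step. This sketch is ours and omits the Poisson "kick" half of the update; it only shows that streaming is a pure, exactly reversible permutation of lattice sites:

```python
# f[v][x] is the occupancy of the lattice phase-space cell at velocity index v
# and position x; index v encodes the signed integer velocity v - V//2.
# One drift step shifts each velocity row by its own velocity: a pure
# permutation of lattice sites, hence exactly conservative and reversible.

def drift(f, direction=1):
    V, N = len(f), len(f[0])
    return [
        [row[(x - direction * (v - V // 2)) % N] for x in range(N)]
        for v, row in enumerate(f)
    ]

f0 = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 1]]  # V=3 velocities, N=4 cells
f1 = drift(f0)                    # one streaming step
restored = drift(f1, direction=-1)  # drifting backwards undoes it exactly
```

Because each step only relabels sites, mass is conserved to machine exactness and the evolution is time-reversible, as the abstract states.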

2. Causal Poisson bracket via deformation quantization

Berra-Montiel, Jasel; Molgado, Alberto; Palacios-García, César D.

2016-06-01

Starting with the well-defined product of quantum fields at two spacetime points, we explore an associated Poisson structure for classical field theories within the deformation quantization formalism. We realize that the induced star-product is naturally related to the standard Moyal product through an appropriate causal Green’s functions connecting points in the space of classical solutions to the equations of motion. Our results resemble the Peierls-DeWitt bracket that has been analyzed in the multisymplectic context. Once our star-product is defined, we are able to apply the Wigner-Weyl map in order to introduce a generalized version of Wick’s theorem. Finally, we include some examples to explicitly test our method: the real scalar field, the bosonic string and a physically motivated nonlinear particle model. For the field theoretic models, we have encountered causal generalizations of the creation/annihilation relations, and also a causal generalization of the Virasoro algebra for the bosonic string. For the nonlinear particle case, we use the approximate solution in terms of the Green’s function, in order to construct a well-behaved causal bracket.

3. Sign-tunable Poisson's ratio in semi-fluorinated graphene.

PubMed

Qin, Rui; Zheng, Jiaxin; Zhu, Wenjun

2017-01-07

Poisson's ratio is a fundamental property of a material which reflects the transverse strain response to an applied axial strain. A negative Poisson's ratio is allowed theoretically, but is rare in nature. Besides the discovery and tailoring of bulk auxetic materials, recent studies have also found a negative Poisson's ratio in nanomaterials, although its explanation has mainly relied on the same conventional rigid mechanical models used for bulk auxetic materials. In this work, we report the existence of an in-plane negative Poisson's ratio in a two-dimensional convex structure of newly synthesized semi-fluorinated graphene by using first-principles calculations. In addition, the sign of the Poisson's ratio can be tuned by the applied strain. Interestingly, we find that this unconventional negative Poisson's ratio cannot be explained by conventional rigid mechanical models but originates from the enhanced bond angle strain over the bond strain due to chemical functionalization. This new mechanism of auxetics extends the scope of auxetic nanomaterials and can serve as a design principle for the future discovery and design of new auxetic materials.
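
For reference, with an applied axial strain $\varepsilon_{\mathrm{axial}}$ and resulting transverse strain $\varepsilon_{\mathrm{trans}}$, the Poisson's ratio is

```latex
\nu = -\,\frac{\varepsilon_{\mathrm{trans}}}{\varepsilon_{\mathrm{axial}}},
```

so $\nu < 0$ (auxetic behavior) means the material expands laterally when stretched.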

4. Model building in nonproportional hazard regression.

PubMed

2013-12-30

Recent developments of statistical methods allow for very flexible modeling of covariates affecting survival times via the hazard rate, including the inspection of possible time-dependent associations. Despite their immediate appeal in terms of flexibility, these models typically introduce additional difficulties when a subset of covariates and the corresponding modeling alternatives have to be chosen, that is, for building the most suitable model for given data. This is particularly true when potentially time-varying associations are considered. We propose to use a piecewise exponential representation of the original survival data to link hazard regression with estimation schemes based on the Poisson likelihood, making recent advances in model building for exponential family regression accessible also in the nonproportional hazard regression context. A two-stage stepwise selection approach, an approach based on doubly penalized likelihood, and a componentwise functional gradient descent approach are adapted to the piecewise exponential regression problem. These three techniques were compared via an intensive simulation study. An application to prognosis after discharge for patients who suffered a myocardial infarction supplements the simulation to demonstrate the pros and cons of the approaches in real data analyses.

5. Aging

PubMed Central

Park, Dong Choon

2013-01-01

Aging is initiated by genetic and environmental factors that operate from the time of birth of an organism. Aging induces physiological phenomena such as reduction of cell counts, deterioration of tissue proteins, tissue atrophy, a decrease in metabolic rate, reduction of body fluids, and calcium metabolism abnormalities, with final progression to pathological aging. Despite the efforts of many researchers, the progression and mechanisms of aging are not yet clearly understood. Therefore, the authors introduce several theories that have gained attention among those published to date: genetic program theory, wear-and-tear theory, telomere theory, endocrine theory, the DNA damage hypothesis, error catastrophe theory, the rate-of-living theory, mitochondrial theory, and free radical theory. Although many studies have tried to prevent aging and prolong life, here we introduce a few approaches with at least partial support: food, exercise, and diet restriction. PMID:24653904

6. Minimum risk wavelet shrinkage operator for Poisson image denoising.

PubMed

Cheng, Wu; Hirakawa, Keigo

2015-05-01

The pixel values of images taken by an image sensor are corrupted by Poisson noise. To date, multiscale Poisson image denoising techniques have processed Haar frame and wavelet coefficients, with the modeling of coefficients enabled by Skellam distribution analysis. We extend these results by solving for Skellam shrinkage operators that minimize the risk functional in the multiscale Poisson image denoising setting. The resulting minimum risk shrinkage operator effectively produces denoised wavelet coefficients with the minimum attainable L2 error.

7. Poisson's Ratios and Volume Changes for Plastically Orthotropic Material

NASA Technical Reports Server (NTRS)

Stowell, Elbridge Z; Pride, Richard A

1956-01-01

Measurements of Poisson's ratios have been made in three orthogonal directions on aluminum alloy blocks in compression and on stainless-steel sheet in both tension and compression. These measurements, as well as those obtained by density determinations, show that there is no permanent plastic change in volume within the accuracy of observation. A method is suggested whereby a correlation may be effected between the measured individual values of the Poisson's ratios and the stress-strain curves for the material. Allowance must be made for the difference between the stress-strain curves in tension and compression; this difference, wherever it appears, is accompanied by significant changes in the Poisson's ratios.

8. Future-singularity-free accelerating expansion with modified Poisson brackets

SciTech Connect

Kim, Wontae; Son, Edwin J.

2007-01-15

We show that the second accelerating expansion of the universe appears smoothly from the decelerating phase, which follows the initial inflation, in the two-dimensional soluble semiclassical dilaton gravity along with the modified Poisson brackets with noncommutativity between the relevant fields. This is in contrast to the fact that the ordinary solution of the equations of motion following from the conventional Poisson algebra describes a permanent accelerating universe without any phase change. In this modified model, it turns out that the noncommutative Poisson algebra is responsible for the remarkable phase transition to the second accelerating expansion.

9. Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.

PubMed

Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg

2009-11-01

G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.

10. A Poisson common factor model for projecting mortality and life expectancy jointly for females and males.

PubMed

Li, Jackie

2013-01-01

We examine the application of a Poisson common factor model for the projection of mortality jointly for females and males. The model structure is an extension of the classical Lee-Carter method in which there is a common factor for the aggregate population, while a number of additional sex-specific factors can also be incorporated. The Poisson distribution is a natural choice for modelling the number of deaths, and its use provides a formal statistical framework for model selection, parameter estimation, and data analysis. Our results for Australian data show that this model leads to projected life expectancy values similar to those produced by the separate projection of mortality for females and males, but possesses the additional advantage of ensuring that the projected male-to-female ratio for death rates at each age converges to a constant. Moreover, the randomness of the corresponding residuals indicates that the model fit is satisfactory.
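
Schematically, and in our notation (with $i$ indexing sex), the structure described above extends the Lee-Carter model as:

```latex
D_{x,t}^{(i)} \sim \mathrm{Poisson}\bigl(E_{x,t}^{(i)}\, m_{x,t}^{(i)}\bigr),
\qquad
\ln m_{x,t}^{(i)} = a_x^{(i)} + B_x\,K_t + \sum_{j} b_x^{(i,j)}\, k_t^{(i,j)},
```

where $D$ and $E$ are deaths and exposures, $B_x K_t$ is the common factor shared by females and males, and the $k_t^{(i,j)}$ are the additional sex-specific period factors; the shared trend is what drives the projected male-to-female death-rate ratio at each age toward a constant.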

11. Continental crust composition constrained by measurements of crustal Poisson's ratio

Zandt, George; Ammon, Charles J.

1995-03-01

Deciphering the geological evolution of the Earth's continental crust requires knowledge of its bulk composition and global variability. The main uncertainties are associated with the composition of the lower crust. Seismic measurements probe the elastic properties of the crust at depth, from which composition can be inferred. Of particular note is Poisson's ratio, σ; this elastic parameter can be determined uniquely from the ratio of P- to S-wave seismic velocity, and provides a better diagnostic of crustal composition than either P- or S-wave velocity alone [1]. Previous attempts to measure σ have been limited by difficulties in obtaining coincident P- and S-wave data sampling the entire crust [2]. Here we report 76 new estimates of crustal σ spanning all of the continents except Antarctica. We find that, on average, σ increases with the age of the crust. Our results strongly support the presence of a mafic lower crust beneath cratons, and suggest either a uniformitarian craton formation process involving delamination of the lower crust during continental collisions, followed by magmatic underplating, or a model in which crust formation processes have changed since the Precambrian era.

12. Poisson noise obscures hypometabolic lesions in PET.

PubMed

Kerr, Wesley T; Lau, Edward P

2012-12-01

The technology of fluoro-deoxyglucose positron emission tomography (PET) has drastically increased our ability to visualize the metabolic process of numerous neurological diseases. The relationship between the methodological noise sources inherent to PET technology and the resulting noise in the reconstructed image is complex. In this study, we use Monte Carlo simulations to examine the effect of Poisson noise in the PET signal on the noise in reconstructed space for two pervasive reconstruction algorithms: the historical filtered back-projection (FBP) and the more modern expectation maximization (EM). We confirm previous observations that the image reconstructed with the FBP biases all intensity values toward the mean, likely due to spatial spreading of high intensity voxels. However, we demonstrate that in both algorithms the variance from high intensity voxels spreads to low intensity voxels and obliterates their signal to noise ratio. This finding has profound impacts on the clinical interpretation of hypometabolic lesions. Our results suggest that PET is relatively insensitive when it comes to detecting and quantifying changes in hypometabolic tissue. Further, the images reconstructed with EM visually match the original images more closely, but more detailed analysis reveals as much as a 40 percent decrease in the signal to noise ratio for high intensity voxels relative to the FBP. This suggests that even though the apparent spatial resolution of EM outperforms FBP, the signal to noise ratio of the intensity of each voxel may be higher in the FBP. Therefore, EM may be most appropriate for manual visualization of pathology, but FBP should be used when analyzing quantitative markers of the PET signal. This suggestion that different reconstruction algorithms should be used for quantification versus visualization represents a major paradigm shift in the analysis and interpretation of PET images.
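
The intensity dependence underlying this result can be illustrated directly: for Poisson counts the signal-to-noise ratio is mean/standard deviation = √λ, so low-intensity (hypometabolic) voxels start with intrinsically poorer SNR even before reconstruction spreads variance into them. A stdlib-only sketch (the sampler, rates, and sample sizes are our own choices):

```python
import math
import random
import statistics

def sample_poisson(lam, rng):
    """Knuth's product-of-uniforms Poisson sampler; large rates are split
    into chunks so exp(-lam) does not underflow."""
    if lam > 30:
        k, rem = int(lam // 30), lam % 30
        return sum(sample_poisson(30.0, rng) for _ in range(k)) + sample_poisson(rem, rng)
    if lam <= 0:
        return 0
    threshold = math.exp(-lam)
    n, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return n
        n += 1

rng = random.Random(42)
snr = {}
for lam in (5.0, 50.0, 500.0):  # hypo-, mid-, and hyper-metabolic voxel rates
    draws = [sample_poisson(lam, rng) for _ in range(5000)]
    snr[lam] = statistics.fmean(draws) / statistics.stdev(draws)
# SNR grows like sqrt(lam): bright voxels are relatively far less noisy,
# and reconstruction can only redistribute, not remove, the variance.
```

This is the raw-count picture only; the reconstruction-dependent spreading of variance discussed in the abstract comes on top of it.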

13. Reconstructing Early School Trauma through Age Regression.

ERIC Educational Resources Information Center

Rousell, Michael A.; Gillis, David

1994-01-01

Normal fluctuations in consciousness and spontaneous trance states may produce inadvertent hypnotic influence in the classroom. Two case studies illustrate how students may be thus influenced by explicit or implicit suggestions, resulting in subsequent self-defeating behaviors. These cases were successfully treated by reconstructing earlier…

14. Effect of a national requirement to introduce named accountable general practitioners for patients aged 75 or older in England: regression discontinuity analysis of general practice utilisation and continuity of care

PubMed Central

Barker, Isaac; Lloyd, Therese; Steventon, Adam

2016-01-01

Objective To assess the effect of introducing named accountable general practitioners (GPs) for patients aged 75 years or over on patterns of general practice utilisation, including continuity of care. Design Regression discontinuity design applied to data from the Clinical Practice Research Datalink to estimate the treatment effect for compliers aged 75. Setting 200 general practices in England. Participants 255 469 patients aged between 65 and 85, after excluding those aged 75. Intervention From April 2014, general practices in England were required to offer patients aged 75 or over a named accountable GP. This study compared having a named accountable GP for patients aged just over 75 with the usual care provided for patients just under 75. Outcomes Number of contacts (face-to-face or telephone) with GPs, longitudinal continuity of care (usual provider of care, or UPC, index), number of referrals to specialist care and numbers of common diagnostic tests. Outcomes were measured over 9 months following assignment to a named accountable GP and for a comparable period for those unassigned. Results The proportion of patients with a named accountable GP increased from 3.5% to 79.8% at age 75. No statistically significant effects were detected for continuity of care (estimated treatment effect 0.00, 95% CI −0.01 to 0.02) or the number of GP contacts per person (estimated treatment effect −0.11, 95% CI −0.31 to 0.09) over 9 months. No significant change was seen in the number of referrals, blood pressure or HbA1c diagnostic tests per person. A statistically significant treatment effect of −0.05 cholesterol tests per person (95% CI −0.07 to −0.02) was estimated; however, sensitivity analysis indicated that this effect predated the introduction of named accountable GPs. Conclusions Continuity of care is valued by patients, but the named accountable GP initiative did not improve continuity of care or change patterns of GP utilisation in the first 9 months of the

15. Negative Poisson's ratios for extreme states of matter

PubMed

Baughman; Dantas; Stafstrom; Zakhidov; Mitchell; Dubin

2000-06-16

Negative Poisson's ratios are predicted for body-centered-cubic phases that likely exist in white dwarf cores and neutron star outer crusts, as well as those found for vacuumlike ion crystals, plasma dust crystals, and colloidal crystals (including certain virus crystals). The existence of this counterintuitive property, which means that a material laterally expands when stretched, is experimentally demonstrated for very low density crystals of trapped ions. At very high densities, the large predicted negative and positive Poisson's ratios might be important for understanding the asteroseismology of neutron stars and white dwarfs and the effect of stellar stresses on nuclear reaction rates. Giant Poisson's ratios are both predicted and observed for highly strained coulombic photonic crystals, suggesting possible applications of large, tunable Poisson's ratios for photonic crystal devices.

16. Information transmission using non-poisson regular firing.

PubMed

Koyama, Shinsuke; Omi, Takahiro; Kass, Robert E; Shinomoto, Shigeru

2013-04-01

In many cortical areas, neural spike trains do not follow a Poisson process. In this study, we investigate a possible benefit of non-Poisson spiking for information transmission by studying the minimal rate fluctuation that can be detected by a Bayesian estimator. The idea is that an inhomogeneous Poisson process may make it difficult for downstream decoders to resolve subtle changes in rate fluctuation, but by using a more regular non-Poisson process, the nervous system can make rate fluctuations easier to detect. We evaluate the degree to which regular firing reduces the rate fluctuation detection threshold. We find that the threshold for detection is reduced in proportion to the coefficient of variation of interspike intervals.

17. Modeling laser velocimeter signals as triply stochastic Poisson processes

NASA Technical Reports Server (NTRS)

Mayo, W. T., Jr.

1976-01-01

Previous models of laser Doppler velocimeter (LDV) systems have not adequately described dual-scatter signals in a manner useful for analysis and simulation of low-level photon-limited signals. At low photon rates, an LDV signal at the output of a photomultiplier tube is a compound nonhomogeneous filtered Poisson process, whose intensity function is another (slower) Poisson process with the nonstationary rate and frequency parameters controlled by a random flow (slowest) process. In the present paper, generalized Poisson shot noise models are developed for low-level LDV signals. Theoretical results useful in detection error analysis and simulation are presented, along with measurements of burst amplitude statistics. Computer generated simulations illustrate the difference between Gaussian and Poisson models of low-level signals.

18. Improved Denoising via Poisson Mixture Modeling of Image Sensor Noise.

PubMed

Zhang, Jiachao; Hirakawa, Keigo

2017-04-01

This paper describes a study aimed at comparing the real image sensor noise distribution to the models of noise often assumed in image denoising designs. A quantile analysis in the pixel, wavelet transform, and variance stabilization domains reveals that the tails of the Poisson, signal-dependent Gaussian, and Poisson-Gaussian models are too short to capture real sensor noise behavior. A new Poisson mixture noise model is proposed to correct the mismatch of tail behavior. Based on the fact that noise model mismatch results in image denoising that undersmoothes real sensor data, we propose a Poisson mixture denoising method to remove the denoising artifacts without affecting image details, such as edges and textures. Experiments with real sensor data verify that denoising for real image sensor data is indeed improved by this new technique.

19. Negative poisson's ratio in single-layer black phosphorus.

PubMed

Jiang, Jin-Wu; Park, Harold S

2014-08-18

The Poisson's ratio is a fundamental mechanical property that relates the resulting lateral strain to applied axial strain. Although this value can theoretically be negative, it is positive for nearly all materials, though negative values have been observed in so-called auxetic structures. However, nearly all auxetic materials are bulk materials whose microstructure has been specifically engineered to generate a negative Poisson's ratio. Here we report using first-principles calculations the existence of a negative Poisson's ratio in a single-layer, two-dimensional material, black phosphorus. In contrast to engineered bulk auxetics, this behaviour is intrinsic for single-layer black phosphorus, and originates from its puckered structure, where the pucker can be regarded as a re-entrant structure that is comprised of two coupled orthogonal hinges. As a result of this atomic structure, a negative Poisson's ratio is observed in the out-of-plane direction under uniaxial deformation in the direction parallel to the pucker.

20. Tuning the Poisson's Ratio of Biomaterials for Investigating Cellular Response

PubMed Central

Meggs, Kyle; Qu, Xin; Chen, Shaochen

2013-01-01

Cells sense and respond to mechanical forces, regardless of whether the source is from a normal tissue matrix, an adjacent cell or a synthetic substrate. In recent years, cell response to surface rigidity has been extensively studied by modulating the elastic modulus of poly(ethylene glycol) (PEG)-based hydrogels. In the context of biomaterials, Poisson's ratio, another fundamental material property parameter has not been explored, primarily because of challenges involved in tuning the Poisson's ratio in biological scaffolds. Two-photon polymerization is used to fabricate suspended web structures that exhibit positive and negative Poisson's ratio (NPR), based on analytical models. NPR webs demonstrate biaxial expansion/compression behavior, as one or multiple cells apply local forces and move the structures. Unusual cell division on NPR structures is also demonstrated. This methodology can be used to tune the Poisson's ratio of several photocurable biomaterials and could have potential implications in the field of mechanobiology. PMID:24076754

1. Two-part zero-inflated negative binomial regression model for quantitative trait loci mapping with count trait.

PubMed

Moghimbeigi, Abbas

2015-05-07

Poisson regression models provide a standard framework for quantitative trait locus (QTL) mapping of count traits. In practice, however, count traits are often over-dispersed relative to the Poisson distribution. In these situations, zero-inflated Poisson (ZIP), zero-inflated generalized Poisson (ZIGP) and zero-inflated negative binomial (ZINB) regression may be useful for QTL mapping of count traits. Genetic variables added to the negative binomial part of the model may also affect the extra zeros. In this study, to address these challenges, I apply a two-part ZINB model. An EM algorithm with a Newton-Raphson method in the M-step is used to estimate the parameters. An application of the two-part ZINB model to QTL mapping is considered, detecting associations between gallstone formation and marker genotypes.
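
For reference, the ZINB distribution mixes a point mass at zero (probability $\pi$) with a negative binomial count component (mean $\mu$, dispersion $r$); in our notation:

```latex
P(Y=y) =
\begin{cases}
\pi + (1-\pi)\left(\dfrac{r}{r+\mu}\right)^{r}, & y = 0,\\[2ex]
(1-\pi)\,\dfrac{\Gamma(y+r)}{y!\,\Gamma(r)}
\left(\dfrac{r}{r+\mu}\right)^{r}\left(\dfrac{\mu}{r+\mu}\right)^{y}, & y = 1, 2, \ldots
\end{cases}
```

The "two-part" structure lets covariates (here, marker genotypes) enter both the zero-inflation probability $\pi$ and the count mean $\mu$.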

2. Quantile Regression Analysis of the Distributional Effects of Air Pollution on Blood Pressure, Heart Rate Variability, Blood Lipids, and Biomarkers of Inflammation in Elderly American Men: The Normative Aging Study

PubMed Central

Bind, Marie-Abele; Peters, Annette; Koutrakis, Petros; Coull, Brent; Vokonas, Pantel; Schwartz, Joel

2016-01-01

Background: Previous studies have observed associations between air pollution and heart disease. Susceptibility to air pollution effects has been examined mostly with a test of effect modification, but little evidence is available whether air pollution distorts cardiovascular risk factor distribution. Objectives: This paper aims to examine distributional and heterogeneous effects of air pollution on known cardiovascular biomarkers. Methods: A total of 1,112 men from the Normative Aging Study and residents of the greater Boston, Massachusetts, area with mean age of 69 years at baseline were included in this study during the period 1995–2013. We used quantile regression and random slope models to investigate distributional effects and heterogeneity in the traffic-related responses on blood pressure, heart rate variability, repolarization, lipids, and inflammation. We considered 28-day averaged exposure to particle number, PM2.5 black carbon, and PM2.5 mass concentrations (measured at a single monitor near the site of the study visits). Results: We observed some evidence suggesting distributional effects of traffic-related pollutants on systolic blood pressure, heart rate variability, corrected QT interval, low density lipoprotein (LDL) cholesterol, triglyceride, and intercellular adhesion molecule-1 (ICAM-1). For example, among participants with LDL cholesterol below 80 mg/dL, an interquartile range increase in PM2.5 black carbon exposure was associated with a 7-mg/dL (95% CI: 5, 10) increase in LDL cholesterol, while among subjects with LDL cholesterol levels close to 160 mg/dL, the same exposure was related to a 16-mg/dL (95% CI: 13, 20) increase in LDL cholesterol. We observed similar heterogeneous associations across low versus high percentiles of the LDL distribution for PM2.5 mass and particle number. Conclusions: These results suggest that air pollution distorts the distribution of cardiovascular risk factors, and that, for several outcomes, effects may be
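
As background for the method used here, quantile regression estimates the $\tau$-th conditional quantile by minimizing the check (pinball) loss rather than squared error:

```latex
\hat\beta(\tau) = \arg\min_{\beta} \sum_{i} \rho_\tau\bigl(y_i - x_i^{\top}\beta\bigr),
\qquad \rho_\tau(u) = u\,\bigl(\tau - \mathbf{1}\{u < 0\}\bigr),
```

which is what allows an exposure effect to differ across, e.g., low versus high percentiles of the LDL distribution.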

3. Measurement of Poisson's ratio of dental composite restorative materials.

PubMed

Chung, Sew Meng; Yap, Adrian U Jin; Koh, Wee Kiat; Tsai, Kuo Tsing; Lim, Chwee Teck

2004-06-01

The aim of this study was to determine the Poisson ratio of resin-based dental composites using a static tensile test method. Materials used in this investigation were from the same manufacturer (3M ESPE) and included microfill (A110), minifill (Z100 and Filtek Z250), polyacid-modified (F2000), and flowable (Filtek Flowable [FF]) composites. The Poisson ratio of the materials were determined after 1 week conditioning in water at 37 degrees C. The tensile test was performed with using a uniaxial testing system at crosshead speed of 0.5 mm/min. Data was analysed using one-way ANOVA/post-hoc Scheffe's test and Pearson's correlation test at significance level of 0.05. Mean Poisson's ratio (n=8) ranged from 0.302 to 0.393. The Poisson ratio of FF was significantly higher than all other composites evaluated, and the Poisson ratio of A110 was higher than Z100, Z250 and F2000. The Poisson ratio is higher for materials with lower filler volume fraction.

4. Marginal effect of increasing ageing drivers on injury crashes.

PubMed

Tay, Richard

2008-11-01

The safety effects of the ageing driving population have been a topic of research interest in health and transportation economics in recent years due to the ageing of the baby boomers. This study adds to the current knowledge by examining the marginal effects of changing the driver mix on injury crashes using data from the Canadian Province of Alberta between 1990 and 2004. Results from a Poisson regression model reveal that increasing the number of young and ageing drivers will result in an increase in the number of injury crashes whereas increasing the number of middle-aged drivers will result in a reduction. These results are in contrast to those obtained in a previous study on the marginal effects of changing the driver mix on fatal crashes in the Australian State of Queensland and some possible explanations for the differing results are provided.
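
As a concrete illustration of the kind of model used in this record, Poisson regression fits a log-linear rate by maximizing the Poisson likelihood; with the canonical log link and a single predictor, Newton-Raphson has a closed 2×2 form. The data and variable names below are invented for illustration, not taken from the study:

```python
import math

# Toy data: x = index of a driver-mix variable, y = injury-crash counts
xs = [0, 1, 2, 3, 4, 5, 6, 7]
ys = [2, 2, 3, 4, 6, 8, 11, 15]

# Start from the intercept-only fit, then take Newton-Raphson steps on the
# Poisson log-likelihood: gradient X'(y - mu), Hessian X' diag(mu) X.
b0, b1 = math.log(sum(ys) / len(ys)), 0.0
for _ in range(25):
    mus = [math.exp(b0 + b1 * x) for x in xs]
    g0 = sum(y - m for y, m in zip(ys, mus))
    g1 = sum((y - m) * x for x, y, m in zip(xs, ys, mus))
    h00 = sum(mus)
    h01 = sum(m * x for x, m in zip(xs, mus))
    h11 = sum(m * x * x for x, m in zip(xs, mus))
    det = h00 * h11 - h01 * h01
    b0 += (h11 * g0 - h01 * g1) / det   # [b0, b1] += H^{-1} g
    b1 += (h00 * g1 - h01 * g0) / det

rate_ratio = math.exp(b1)  # multiplicative change in expected counts per unit x
```

A "marginal effect" in this setting is multiplicative: each unit increase in x scales the expected crash count by exp(b1).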

5. Morse–Smale Regression

SciTech Connect

Gerber, Samuel; Rubel, Oliver; Bremer, Peer -Timo; Pascucci, Valerio; Whitaker, Ross T.

2012-01-19

This paper introduces a novel partition-based regression approach that incorporates topological information. Partition-based regression typically introduces a quality-of-fit-driven decomposition of the domain. The emphasis in this work is on a topologically meaningful segmentation. Thus, the proposed regression approach is based on a segmentation induced by a discrete approximation of the Morse–Smale complex. This yields a segmentation with partitions corresponding to regions of the function with a single minimum and maximum that are often well approximated by a linear model. This approach yields regression models that are amenable to interpretation and have good predictive capacity. Typically, regression estimates are quantified by their geometrical accuracy. For the proposed regression, an important aspect is the quality of the segmentation itself. Thus, this article introduces a new criterion that measures the topological accuracy of the estimate. The topological accuracy provides a complementary measure to the classical geometrical error measures and is very sensitive to overfitting. The Morse–Smale regression is compared to state-of-the-art approaches in terms of geometry and topology and yields comparable or improved fits in many cases. Finally, a detailed study on climate-simulation data demonstrates the application of the Morse–Smale regression. Supplementary Materials are available online and contain an implementation of the proposed approach in the R package msr, an analysis and simulations on the stability of the Morse–Smale complex approximation, and additional tables for the climate-simulation study.
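As a toy illustration of partition-based regression (not the actual Morse-Smale construction, which works in higher dimensions via the msr package), one can split a 1D unimodal sample at its estimated maximum into two monotone cells and fit a linear model in each; all data below are simulated.

```python
import numpy as np

rng = np.random.default_rng(1)

# Unimodal target: one interior maximum, so the 1D Morse-Smale-style
# partition is two monotone segments, each nearly linear here.
x = np.linspace(0.0, 1.0, 200)
y = 1.0 - np.abs(x - 0.4) + rng.normal(0, 0.02, x.size)

split = x[np.argmax(y)]                     # crude estimate of the maximum
pieces = [x <= split, x > split]

def fit_line(xs, ys):
    # least-squares line a + b*x on one partition cell
    A = np.column_stack([np.ones(xs.size), xs])
    coef, *_ = np.linalg.lstsq(A, ys, rcond=None)
    return coef

models = [fit_line(x[m], y[m]) for m in pieces]
pred = np.concatenate([np.column_stack([np.ones(x[m].size), x[m]]) @ c
                       for m, c in zip(pieces, models)])
rmse = np.sqrt(np.mean((pred - y) ** 2))
```

The per-cell models are directly interpretable (rising slope before the maximum, falling slope after), which is the sense in which the paper calls such fits amenable to interpretation.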

6. Improved Regression Calibration

ERIC Educational Resources Information Center

Skrondal, Anders; Kuha, Jouni

2012-01-01

The likelihood for generalized linear models with covariate measurement error cannot in general be expressed in closed form, which makes maximum likelihood estimation taxing. A popular alternative is regression calibration which is computationally efficient at the cost of inconsistent estimation. We propose an improved regression calibration…

7. Morse-Smale Regression

PubMed Central

Gerber, Samuel; Rübel, Oliver; Bremer, Peer-Timo; Pascucci, Valerio; Whitaker, Ross T.

2012-01-01

This paper introduces a novel partition-based regression approach that incorporates topological information. Partition-based regression typically introduces a quality-of-fit-driven decomposition of the domain. The emphasis in this work is on a topologically meaningful segmentation. Thus, the proposed regression approach is based on a segmentation induced by a discrete approximation of the Morse-Smale complex. This yields a segmentation with partitions corresponding to regions of the function with a single minimum and maximum that are often well approximated by a linear model. This approach yields regression models that are amenable to interpretation and have good predictive capacity. Typically, regression estimates are quantified by their geometrical accuracy. For the proposed regression, an important aspect is the quality of the segmentation itself. Thus, this paper introduces a new criterion that measures the topological accuracy of the estimate. The topological accuracy provides a complementary measure to the classical geometrical error measures and is very sensitive to over-fitting. The Morse-Smale regression is compared to state-of-the-art approaches in terms of geometry and topology and yields comparable or improved fits in many cases. Finally, a detailed study on climate-simulation data demonstrates the application of the Morse-Smale regression. Supplementary materials are available online and contain an implementation of the proposed approach in the R package msr, an analysis and simulations on the stability of the Morse-Smale complex approximation and additional tables for the climate-simulation study. PMID:23687424

8. Boosted Beta Regression

PubMed Central

Schmid, Matthias; Wickler, Florian; Maloney, Kelly O.; Mitchell, Richard; Fenske, Nora; Mayr, Andreas

2013-01-01

Regression analysis with a bounded outcome is a common problem in applied statistics. Typical examples include regression models for percentage outcomes and the analysis of ratings that are measured on a bounded scale. In this paper, we consider beta regression, which is a generalization of logit models to situations where the response is continuous on the interval (0,1). Consequently, beta regression is a convenient tool for analyzing percentage responses. The classical approach to fit a beta regression model is to use maximum likelihood estimation with subsequent AIC-based variable selection. As an alternative to this established, yet unstable, approach, we propose a new estimation technique called boosted beta regression. With boosted beta regression, estimation and variable selection can be carried out simultaneously in a highly efficient way. Additionally, both the mean and the variance of a percentage response can be modeled using flexible nonlinear covariate effects. As a consequence, the new method accounts for common problems such as overdispersion and non-binomial variance structures. PMID:23626706

9. Semiclassical Limits of Ore Extensions and a Poisson Generalized Weyl Algebra

Cho, Eun-Hee; Oh, Sei-Qwon

2016-07-01

We observe [Launois and Lecoutre, Trans. Am. Math. Soc. 368:755-785, 2016, Proposition 4.1] that Poisson polynomial extensions appear as semiclassical limits of a class of Ore extensions. As an application, a Poisson generalized Weyl algebra A1, considered as a Poisson version of the quantum generalized Weyl algebra, is constructed and its Poisson structures are studied. In particular, a necessary and sufficient condition for A1 to be Poisson simple is obtained, and it is established that the Poisson endomorphisms of A1 are Poisson analogues of the endomorphisms of the quantum generalized Weyl algebra.

10. Comparing regression methods for the two-stage clonal expansion model of carcinogenesis.

PubMed

Kaiser, J C; Heidenreich, W F

2004-11-15

In the statistical analysis of cohort data with risk estimation models, both Poisson and individual likelihood regressions are widely used methods of parameter estimation. In this paper, their performance has been tested with the biologically motivated two-stage clonal expansion (TSCE) model of carcinogenesis. To exclude inevitable uncertainties of existing data, cohorts with simple individual exposure history have been created by Monte Carlo simulation. To reproduce some properties of cohorts such as the atomic bomb survivors and radon-exposed mine workers, both acute and protracted exposure patterns have been generated. The two regression methods have then been compared on their capacity to retrieve a priori known model parameters from the simulated cohort data. For simple models with smooth hazard functions, the parameter estimates from both methods come close to their true values. However, for models with strongly discontinuous functions which are generated by the cell mutation process of transformation, the Poisson regression method fails to produce reliable estimates. This behaviour is explained by the construction of class averages during data stratification, whereby some indispensable information on the individual exposure history is destroyed. It could not be repaired by countermeasures such as the refinement of Poisson classes or a more adequate choice of Poisson groups. Although such a choice might still exist, we were unable to discover it. In contrast to this, the individual likelihood regression technique was found to work reliably for all considered versions of the TSCE model.

11. George: Gaussian Process regression

Foreman-Mackey, Daniel

2015-11-01

George is a fast and flexible library, implemented in C++ with Python bindings, for Gaussian Process regression useful for accounting for correlated noise in astronomical datasets, including those for transiting exoplanet discovery and characterization and stellar population modeling.
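For readers unfamiliar with what such a library computes, here is a minimal numpy sketch of the underlying Gaussian Process regression math (squared-exponential kernel, noisy observations, posterior mean and variance). This is a from-scratch illustration on simulated data, not george's own API.

```python
import numpy as np

rng = np.random.default_rng(2)

def sq_exp_kernel(a, b, amp=1.0, scale=0.3):
    # squared-exponential covariance k(a,b) = amp^2 exp(-(a-b)^2 / (2 scale^2))
    d = a[:, None] - b[None, :]
    return amp**2 * np.exp(-0.5 * (d / scale) ** 2)

# Noisy training data from a smooth function
x = np.sort(rng.uniform(-2, 2, 40))
yerr = 0.1
y = np.sin(x) + rng.normal(0, yerr, x.size)

# GP posterior mean/variance at test points, via a Cholesky factorization
K = sq_exp_kernel(x, x) + yerr**2 * np.eye(x.size)
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))

x_star = np.linspace(-1.8, 1.8, 50)
K_star = sq_exp_kernel(x_star, x)
mean = K_star @ alpha                       # posterior mean
v = np.linalg.solve(L, K_star.T)
var = np.diag(sq_exp_kernel(x_star, x_star)) - np.sum(v**2, axis=0)
```

The correlated-noise case mentioned in the abstract amounts to replacing the diagonal noise term with a full noise covariance.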

12. Assessment of Poisson, probit and linear models for genetic analysis of presence and number of black spots in Corriedale sheep.

PubMed

Peñagaricano, F; Urioste, J I; Naya, H; de los Campos, G; Gianola, D

2011-04-01

Black skin spots are associated with pigmented fibres in wool, an important quality fault. Our objective was to assess alternative models for genetic analysis of presence (BINBS) and number (NUMBS) of black spots in Corriedale sheep. During 2002-08, 5624 records from 2839 animals in two flocks, aged 1 through 6 years, were taken at shearing. Four models were considered: linear and probit for BINBS and linear and Poisson for NUMBS. All models included flock-year and age as fixed effects and animal and permanent environmental effects as random effects. Models were fitted to the whole data set and were also compared based on their predictive ability in cross-validation. Estimates of heritability ranged from 0.154 to 0.230 for BINBS and 0.269 to 0.474 for NUMBS. For BINBS, the probit model fitted slightly better to the data than the linear model. Predictions of random effects from these models were highly correlated, and both models exhibited similar predictive ability. For NUMBS, the Poisson model, with a residual term to account for overdispersion, performed better than the linear model in goodness of fit and predictive ability. Predictions of random effects from the Poisson model were more strongly correlated with those from BINBS models than those from the linear model. Overall, the use of probit or linear models for BINBS and of a Poisson model with a residual for NUMBS seems a reasonable choice for genetic selection purposes in Corriedale sheep.

13. Poisson image reconstruction with Hessian Schatten-norm regularization.

PubMed

Lefkimmiatis, Stamatios; Unser, Michael

2013-11-01

Poisson inverse problems arise in many modern imaging applications, including biomedical and astronomical ones. The main challenge is to obtain an estimate of the underlying image from a set of measurements degraded by a linear operator and further corrupted by Poisson noise. In this paper, we propose an efficient framework for Poisson image reconstruction, under a regularization approach, which depends on matrix-valued regularization operators. In particular, the employed regularizers involve the Hessian as the regularization operator and Schatten matrix norms as the potential functions. For the solution of the problem, we propose two optimization algorithms that are specifically tailored to the Poisson nature of the noise. These algorithms are based on an augmented-Lagrangian formulation of the problem and correspond to two variants of the alternating direction method of multipliers. Further, we derive a link that relates the proximal map of an l_p norm with the proximal map of a Schatten matrix norm of order p. This link plays a key role in the development of one of the proposed algorithms. Finally, we provide experimental results on natural and biological images for the task of Poisson image deblurring and demonstrate the practical relevance and effectiveness of the proposed framework.

14. Markov modulated Poisson process models incorporating covariates for rainfall intensity.

PubMed

Thayakaran, R; Ramesh, N I

2013-01-01

Time series of rainfall bucket tip times at the Beaufort Park station, Bracknell, in the UK are modelled by a class of Markov modulated Poisson processes (MMPP) which may be thought of as a generalization of the Poisson process. Our main focus in this paper is to investigate the effects of including covariate information into the MMPP model framework on statistical properties. In particular, we look at three types of time-varying covariates namely temperature, sea level pressure, and relative humidity that are thought to be affecting the rainfall arrival process. Maximum likelihood estimation is used to obtain the parameter estimates, and likelihood ratio tests are employed in model comparison. Simulated data from the fitted model are used to make statistical inferences about the accumulated rainfall in the discrete time interval. Variability of the daily Poisson arrival rates is studied.
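A minimal discrete-time sketch of the Markov-modulation idea: a hidden two-state chain switches the Poisson rate, which inflates the variance of the counts relative to a plain Poisson process. All rates and transition probabilities below are made-up illustration values, not the Bracknell fit.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two hidden states with different arrival rates (hypothetical values)
rates = np.array([0.2, 3.0])     # tips per interval in "dry" vs "wet" state
stay = 0.95                      # probability of remaining in the same state

# Simulate the modulating Markov chain and the Poisson counts it drives
T = 20000
state = np.zeros(T, dtype=int)
for t in range(1, T):
    state[t] = state[t - 1] if rng.random() < stay else 1 - state[t - 1]
counts = rng.poisson(rates[state])

# For a plain Poisson process the index of dispersion is 1;
# Markov modulation of the rate inflates it.
dispersion = counts.var() / counts.mean()
```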

15. POISs3: A 3D Poisson smoother of structured grids

Lehtimaeki, R.

Flow solvers based on solving Navier-Stokes or Euler equations generally need a computational grid to represent the domain of the flow. A structured computational grid can be efficiently produced by algebraic methods like transfinite interpolation. Unfortunately, algebraic methods propagate all kinds of unsmoothness of the boundary into the field. Unsmoothness of the grid, in turn, can result in inaccuracy in the flow solver. In the present work a 3D elliptic grid smoother was developed. The smoother is based on solving three Poisson equations, one for each curvilinear direction. The Poisson equations formed in the physical region are first transformed to the computational (rectilinear) region. The resulting equations form a system of three coupled elliptic quasi-linear partial differential equations in the computational domain. A short review of the Poisson method is presented. The regularity of a grid cell is studied and a skewness value is developed.
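The elliptic equations in such smoothers reduce to repeated Poisson solves. As a minimal illustration (a textbook 2D Dirichlet problem solved by Jacobi iteration, not the paper's coupled curvilinear system), with a manufactured solution to check the result:

```python
import numpy as np

# Solve u_xx + u_yy = f on the unit square with u = 0 on the boundary,
# using Jacobi iteration on the standard 5-point stencil.
n = 33
h = 1.0 / (n - 1)
x = np.linspace(0, 1, n)
X, Y = np.meshgrid(x, x, indexing="ij")

# Manufactured solution u = sin(pi x) sin(pi y), so f = -2 pi^2 u
u_exact = np.sin(np.pi * X) * np.sin(np.pi * Y)
f = -2 * np.pi**2 * u_exact

u = np.zeros((n, n))
for _ in range(5000):
    # Jacobi sweep: each interior node becomes the average of its
    # neighbours minus the scaled source term
    u[1:-1, 1:-1] = 0.25 * (u[2:, 1:-1] + u[:-2, 1:-1] +
                            u[1:-1, 2:] + u[1:-1, :-2]
                            - h**2 * f[1:-1, 1:-1])

err = np.max(np.abs(u - u_exact))   # limited by O(h^2) discretization error
```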

16. A spectral Poisson solver for kinetic plasma simulation

Szeremley, Daniel; Obberath, Jens; Brinkmann, Ralf

2011-10-01

Plasma resonance spectroscopy is a well established plasma diagnostic method, realized in several designs. One of these designs is the multipole resonance probe (MRP). In its idealized - geometrically simplified - version it consists of two dielectrically shielded, hemispherical electrodes to which an RF signal is applied. A numerical tool is under development which is capable of simulating the dynamics of the plasma surrounding the MRP in electrostatic approximation. In this contribution we concentrate on the specialized Poisson solver for that tool. The plasma is represented by an ensemble of point charges. By expanding both the charge density and the potential into spherical harmonics, a largely analytical solution of the Poisson problem can be employed. For a practical implementation, the expansion must be appropriately truncated. With this spectral solver we are able to efficiently solve the Poisson equation in a kinetic plasma simulation without the need of introducing a spatial discretization.
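The solver described above expands in spherical harmonics around the probe; as a simpler demonstration of the same spectral idea, here is a periodic-box Poisson solve via FFT (divide by -|k|^2 in transform space). This is a generic textbook method shown for illustration, not the MRP tool itself.

```python
import numpy as np

# Spectral solution of laplacian(u) = f on a periodic box: transform,
# divide by -|k|^2, transform back (plane-wave analogue of the
# spherical-harmonic expansion).
n = 64
L = 2 * np.pi
x = np.arange(n) * (L / n)
X, Y = np.meshgrid(x, x, indexing="ij")

u_exact = np.sin(X) * np.cos(2 * Y)
f = -5.0 * u_exact                      # laplacian of sin(x)cos(2y)

k = np.fft.fftfreq(n, d=L / n) * 2 * np.pi
KX, KY = np.meshgrid(k, k, indexing="ij")
k2 = KX**2 + KY**2
k2[0, 0] = 1.0                          # avoid 0/0 for the mean mode

f_hat = np.fft.fft2(f)
u_hat = f_hat / (-k2)
u_hat[0, 0] = 0.0                       # fix the arbitrary additive constant
u = np.real(np.fft.ifft2(u_hat))

err = np.max(np.abs(u - u_exact))       # spectrally accurate
```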

17. Blocked Shape Memory Effect in Negative Poisson's Ratio Polymer Metamaterials.

PubMed

Boba, Katarzyna; Bianchi, Matteo; McCombe, Greg; Gatt, Ruben; Griffin, Anselm C; Richardson, Robert M; Scarpa, Fabrizio; Hamerton, Ian; Grima, Joseph N

2016-08-10

We describe a new class of negative Poisson's ratio (NPR) open cell PU-PE foams produced by blocking the shape memory effect in the polymer. Contrary to classical NPR open cell thermoset and thermoplastic foams, which revert to the conventional phase after reheating (and therefore limit their use in technological applications), this new class of cellular solids has a permanent negative Poisson's ratio behavior, generated through multiple shape memory (mSM) treatments that lead to a fixity of the topology of the cell foam. The mSM-NPR foams have Poisson's ratio values similar to the auxetic foams prior to their return to the conventional phase, but compressive stress-strain curves similar to those of conventional foams. The results show that by manipulating the shape memory effect in polymer microstructures it is possible to obtain new classes of materials with unusual deformation mechanisms.

18. Effect of Poisson noise on adiabatic quantum control

Kiely, A.; Muga, J. G.; Ruschhaupt, A.

2017-01-01

We present a detailed derivation of the master equation describing a general time-dependent quantum system with classical Poisson white noise and outline its various properties. We discuss the limiting cases of Poisson white noise and provide approximations for the different noise strength regimes. We show that using the eigenstates of the noise superoperator as a basis can be a useful way of expressing the master equation. Using this, we simulate various settings to illustrate different effects of Poisson noise. In particular, we show a dip in the fidelity as a function of noise strength where high fidelity can occur in the strong-noise regime for some cases. We also investigate recent claims [J. Jing et al., Phys. Rev. A 89, 032110 (2014), 10.1103/PhysRevA.89.032110] that this type of noise may improve rather than destroy adiabaticity.

19. Poisson distribution to analyze near-threshold motor evoked potentials.

PubMed

Kaelin-Lang, Alain; Conforto, Adriana B; Z'Graggen, Werner; Hess, Christian W

2010-11-01

Motor unit action potentials (MUAPs) evoked by repetitive, low-intensity transcranial magnetic stimulation can be modeled as a Poisson process. A mathematical consequence of such a model is that the ratio of the variance to the mean of the amplitudes of motor evoked potentials (MEPs) should provide an estimate of the mean size of the individual MUAPs that summate to generate each MEP. We found that this is, in fact, the case. Our finding thus supports the use of the Poisson distribution to model MEP generation and indicates that this model enables characterization of the motor unit population that contributes to near-threshold MEPs.
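The variance-to-mean argument is easy to check by simulation: if each MEP is the sum of a Poisson number of (near-identical) motor unit potentials of size q, then Var(MEP)/Mean(MEP) = q. The amplitudes below are hypothetical illustration values.

```python
import numpy as np

rng = np.random.default_rng(4)

# Each MEP is the sum of a Poisson number of motor unit potentials.
# With identical unit sizes q, Var(MEP)/Mean(MEP) recovers q.
q = 50.0          # hypothetical MUAP amplitude (microvolts)
lam = 2.0         # mean number of units recruited per stimulus
n_units = rng.poisson(lam, size=100000)
meps = q * n_units

ratio = meps.var() / meps.mean()   # estimates the mean unit size q
```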

20. Composite laminates with negative through-the-thickness Poisson's ratios

NASA Technical Reports Server (NTRS)

Herakovich, C. T.

1984-01-01

A simple analysis using two dimensional lamination theory combined with the appropriate three dimensional anisotropic constitutive equation is presented to show some rather surprising results for the range of values of the through-the-thickness effective Poisson's ratio ν_xz for angle-ply laminates. Results for graphite-epoxy show that the through-the-thickness effective Poisson's ratio can range from a high of 0.49 for a [90] laminate to a low of -0.21 for a [±25]s laminate. It is shown that negative values of ν_xz are also possible for other laminates.

2. Validation of the Poisson Stochastic Radiative Transfer Model

NASA Technical Reports Server (NTRS)

Zhuravleva, Tatiana; Marshak, Alexander

2004-01-01

A new approach to validation of the Poisson stochastic radiative transfer method is proposed. In contrast to other validations of stochastic models, the main parameter of the Poisson model responsible for cloud geometrical structure - cloud aspect ratio - is determined entirely by matching measurements and calculations of the direct solar radiation. If measurements of the direct solar radiation are unavailable, it was shown that there is a range of aspect ratios that allows the stochastic model to accurately approximate the average measurements of surface downward and cloud top upward fluxes. Realizations of the fractionally integrated cascade model are taken as a prototype of real measurements.

3. A Study of Poisson's Ratio in the Yield Region

NASA Technical Reports Server (NTRS)

Gerard, George; Wildhorn, Sorrel

1952-01-01

In the yield region of the stress-strain curve the variation in Poisson's ratio from the elastic to the plastic value is most pronounced. This variation was studied experimentally by a systematic series of tests on several aluminum alloys. The tests were conducted under simple tensile and compressive loading along three orthogonal axes. A theoretical variation of Poisson's ratio for an orthotropic solid was obtained from dilatational considerations. The assumptions used in deriving the theory were examined by use of the test data and were found to be in reasonable agreement with experimental evidence.

4. [Understanding logistic regression].

PubMed

El Sanharawi, M; Naudet, F

2013-10-01

Logistic regression is one of the most common multivariate analysis models utilized in epidemiology. It allows the measurement of the association between the occurrence of an event (qualitative dependent variable) and factors susceptible to influence it (explicative variables). The choice of explicative variables that should be included in the logistic regression model is based on prior knowledge of the disease pathophysiology and the statistical association between the variable and the event, as measured by the odds ratio. The main steps for the procedure, the conditions of application, and the essential tools for its interpretation are discussed concisely. We also discuss the importance of the choice of variables that must be included and retained in the regression model in order to avoid the omission of important confounding factors. Finally, by way of illustration, we provide an example from the literature, which should help the reader test his or her knowledge.
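A minimal sketch of the estimation behind such analyses: Newton-Raphson for the logistic log-likelihood, with the exponentiated coefficient giving the odds ratio mentioned above. The exposure variable and effect sizes are simulated, purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical binary exposure and binary outcome
n = 2000
exposure = rng.binomial(1, 0.4, n)
X = np.column_stack([np.ones(n), exposure])
beta_true = np.array([-1.5, 0.9])
p = 1 / (1 + np.exp(-(X @ beta_true)))
y = rng.binomial(1, p)

# Newton-Raphson for the logistic log-likelihood
beta = np.zeros(2)
for _ in range(50):
    p_hat = 1 / (1 + np.exp(-(X @ beta)))
    grad = X.T @ (y - p_hat)                # score vector
    W = p_hat * (1 - p_hat)
    hess = X.T @ (W[:, None] * X)           # observed information
    step = np.linalg.solve(hess, grad)
    beta = beta + step
    if np.max(np.abs(step)) < 1e-10:
        break

# The exponentiated coefficient is the odds ratio for the exposure
odds_ratio = np.exp(beta[1])
```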

5. The mechanical influences of the graded distribution in the cross-sectional shape, the stiffness and Poisson's ratio of palm branches.

PubMed

Liu, Wangyu; Wang, Ningling; Jiang, Xiaoyong; Peng, Yujian

2016-07-01

The branching system plays an important role in maintaining the survival of palm trees. Due to the nature of monocots, no additional vascular bundles can be added in the palm tree tissue as it ages. Therefore, the changing of the cross-sectional area in the palm branch creates a graded distribution in the mechanical properties of the tissue. In the present work, this graded distribution in the tissue mechanical properties from sheath to petiole were studied with a multi-scale modeling approach. Then, the entire palm branch was reconstructed and analyzed using finite element methods. The variation of the elastic modulus can lower the level of mechanical stress in the sheath and also allow the branch to have smaller values of pressure on the other branches. Under impact loading, the enhanced frictional dissipation at the surfaces of adjacent branches benefits from the large Poisson's ratio of the sheath tissue. These findings can help to link the wind resistance ability of palm trees to their graded materials distribution in the branching system.

6. Practical Session: Logistic Regression

Clausel, M.; Grégoire, G.

2014-12-01

An exercise is proposed to illustrate the logistic regression. One investigates the different risk factors in the apparition of coronary heart disease. It has been proposed in Chapter 5 of the book by D.G. Kleinbaum and M. Klein, "Logistic Regression", Statistics for Biology and Health, Springer Science Business Media, LLC (2010) and also by D. Chessel and A.B. Dufour in Lyon 1 (see Sect. 6 of http://pbil.univ-lyon1.fr/R/pdf/tdr341.pdf). This example is based on data given in the file evans.txt coming from http://www.sph.emory.edu/dkleinb/logreg3.htm#data.

7. Ridge Regression: A Regression Procedure for Analyzing Correlated Independent Variables

ERIC Educational Resources Information Center

Rakow, Ernest A.

1978-01-01

Ridge regression is a technique used to ameliorate the problem of highly correlated independent variables in multiple regression analysis. This paper explains the fundamentals of ridge regression and illustrates its use. (JKS)
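The core of ridge regression is a one-line change to ordinary least squares: add a penalty lam * I before inverting the normal equations, which stabilizes the coefficients of highly correlated predictors. A small simulated sketch (hypothetical data):

```python
import numpy as np

rng = np.random.default_rng(6)

# Two nearly identical predictors: OLS splits their joint effect
# erratically, while the ridge penalty shrinks the split toward equality.
n = 100
z = rng.normal(size=n)
X = np.column_stack([z, z + 0.01 * rng.normal(size=n)])
y = X @ np.array([1.0, 1.0]) + rng.normal(0, 0.1, n)

def ridge(X, y, lam):
    # closed-form ridge estimator (X'X + lam I)^{-1} X'y
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

beta_ols = ridge(X, y, 0.0)     # lam = 0 recovers ordinary least squares
beta_ridge = ridge(X, y, 1.0)
```

Both fits recover the well-identified sum of the coefficients, but only the ridge fit keeps the two individual coefficients close together.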

8. Modern Regression Discontinuity Analysis

ERIC Educational Resources Information Center

Bloom, Howard S.

2012-01-01

This article provides a detailed discussion of the theory and practice of modern regression discontinuity (RD) analysis for estimating the effects of interventions or treatments. Part 1 briefly chronicles the history of RD analysis and summarizes its past applications. Part 2 explains how in theory an RD analysis can identify an average effect of…

9. Multiple linear regression analysis

NASA Technical Reports Server (NTRS)

Edwards, T. R.

1980-01-01

Program rapidly selects best-suited set of coefficients. User supplies only vectors of independent and dependent data and specifies confidence level required. Program uses stepwise statistical procedure for relating minimal set of variables to set of observations; final regression contains only most statistically significant coefficients. Program is written in FORTRAN IV for batch execution and has been implemented on NOVA 1200.

10. Explorations in Statistics: Regression

ERIC Educational Resources Information Center

Curran-Everett, Douglas

2011-01-01

Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This seventh installment of "Explorations in Statistics" explores regression, a technique that estimates the nature of the relationship between two things for which we may only surmise a mechanistic or predictive…

11. Zero-Inflated Poisson Modeling of Fall Risk Factors in Community-Dwelling Older Adults.

PubMed

Jung, Dukyoo; Kang, Younhee; Kim, Mi Young; Ma, Rye-Won; Bhandari, Pratibha

2016-02-01

The aim of this study was to identify risk factors for falls among community-dwelling older adults. The study used a cross-sectional descriptive design. Self-report questionnaires were used to collect data from 658 community-dwelling older adults and were analyzed using logistic and zero-inflated Poisson (ZIP) regression. Perceived health status was a significant factor in the count model, and fall efficacy emerged as a significant predictor in the logistic models. The findings suggest that fall efficacy is important for predicting not only faller and nonfaller status but also fall counts in older adults who may or may not have experienced a previous fall. The fall predictors identified in this study--perceived health status and fall efficacy--indicate the need for fall-prevention programs tailored to address both the physical and psychological issues unique to older adults.
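The ZIP structure above is a two-part mixture: with some probability an individual is a structural zero (a "nonfaller"), otherwise counts follow a Poisson distribution. A simulated sketch, with made-up parameter values, showing why a plain Poisson underpredicts the zero fraction:

```python
import numpy as np

rng = np.random.default_rng(7)

# Zero-inflated Poisson: with prob pi an individual is a structural zero;
# otherwise the fall count is Poisson(lam).
pi, lam = 0.4, 1.5
n = 100000
structural_zero = rng.random(n) < pi
falls = np.where(structural_zero, 0, rng.poisson(lam, n))

zero_frac = np.mean(falls == 0)             # observed zero fraction
zip_zero = pi + (1 - pi) * np.exp(-lam)     # ZIP-implied P(Y = 0)
poisson_zero = np.exp(-falls.mean())        # plain-Poisson prediction
```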

12. On covariant Poisson brackets in classical field theory

Forger, Michael; Salles, Mário O.

2015-10-01

How to give a natural geometric definition of a covariant Poisson bracket in classical field theory has for a long time been an open problem—as testified by the extensive literature on "multisymplectic Poisson brackets," together with the fact that all these proposals suffer from serious defects. On the other hand, the functional approach does provide a good candidate which has come to be known as the Peierls-De Witt bracket and whose construction in a geometrical setting is now well understood. Here, we show how the basic "multisymplectic Poisson bracket" already proposed in the 1970s can be derived from the Peierls-De Witt bracket, applied to a special class of functionals. This relation makes it possible to trace back most (if not all) of the problems encountered in the past to ambiguities (the relation between differential forms on multiphase space and the functionals they define is not one-to-one) and also to the fact that this class of functionals does not form a Poisson subalgebra.
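For orientation, the standard equal-time canonical functional Poisson bracket that these covariant constructions aim to generalize can be written (textbook form, not a formula from the paper) as:

```latex
\{F, G\} \;=\; \int \mathrm{d}^3x \,\left(
  \frac{\delta F}{\delta \varphi(\vec{x})}\,\frac{\delta G}{\delta \pi(\vec{x})}
  \;-\;
  \frac{\delta F}{\delta \pi(\vec{x})}\,\frac{\delta G}{\delta \varphi(\vec{x})}
\right)
```

The covariant question is how to define such a bracket without singling out a time slice, which is what the Peierls-De Witt construction achieves.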

13. The Poisson-Lognormal Model for Bibliometric/Scientometric Distributions.

ERIC Educational Resources Information Center

Stewart, John A.

1994-01-01

Illustrates that the Poisson-lognormal model provides good fits to a diverse set of distributions commonly studied in bibliometrics and scientometrics. Topics discussed include applications to the empirical data sets related to the laws of Lotka, Bradford, and Zipf; causal processes that could generate lognormal distributions; and implications for…
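The Poisson-lognormal model is a mixture: each source's Poisson rate is itself lognormally distributed, which produces the heavy upper tails typical of bibliometric counts. A simulated sketch with illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(8)

# Poisson-lognormal: the Poisson rate varies lognormally across sources
# (e.g. authors), producing overdispersed, heavy-tailed counts.
mu, sigma = 0.0, 1.0
n = 100000
rates = rng.lognormal(mu, sigma, n)
counts = rng.poisson(rates)

# A plain Poisson sample has index of dispersion 1; mixing inflates it.
dispersion = counts.var() / counts.mean()
```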

14. Wide-area traffic: The failure of Poisson modeling

SciTech Connect

Paxson, V.; Floyd, S.

1994-08-01

Network arrivals are often modeled as Poisson processes for analytic simplicity, even though a number of traffic studies have shown that packet interarrivals are not exponentially distributed. The authors evaluate 21 wide-area traces, investigating a number of wide-area TCP arrival processes (session and connection arrivals, FTPDATA connection arrivals within FTP sessions, and TELNET packet arrivals) to determine the error introduced by modeling them using Poisson processes. The authors find that user-initiated TCP session arrivals, such as remote-login and file-transfer, are well-modeled as Poisson processes with fixed hourly rates, but that other connection arrivals deviate considerably from Poisson; that modeling TELNET packet interarrivals as exponential grievously underestimates the burstiness of TELNET traffic, but using the empirical Tcplib[DJCME92] interarrivals preserves burstiness over many time scales; and that FTPDATA connection arrivals within FTP sessions come bunched into "connection bursts", the largest of which are so large that they completely dominate FTPDATA traffic. Finally, they offer some preliminary results regarding how the findings relate to the possible self-similarity of wide-area traffic.
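The underestimation of burstiness is easy to reproduce: compare counts per window for a Poisson process (exponential interarrivals) against a renewal process with heavy-tailed Pareto interarrivals of the same mean. The parameters below are illustrative, not fitted to the paper's traces.

```python
import numpy as np

rng = np.random.default_rng(9)

n_arrivals = 200000

def counts_per_window(interarrivals, width=10.0):
    # bin arrival times into fixed windows; drop the final partial window
    t = np.cumsum(interarrivals)
    return np.bincount((t // width).astype(int))[:-1]

# Poisson process: exponential interarrivals with unit mean
exp_gaps = rng.exponential(1.0, n_arrivals)

# Heavy-tailed renewal process: Pareto interarrivals scaled to unit mean
a = 1.2                                      # tail index (infinite variance)
pareto_gaps = (rng.pareto(a, n_arrivals) + 1) * (a - 1) / a

disp_exp = counts_per_window(exp_gaps).var() / counts_per_window(exp_gaps).mean()
disp_pareto = (counts_per_window(pareto_gaps).var()
               / counts_per_window(pareto_gaps).mean())
```

The Poisson counts have index of dispersion near 1; the heavy-tailed process, with the same mean rate, is far burstier.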

15. Vectorized multigrid Poisson solver for the CDC CYBER 205

NASA Technical Reports Server (NTRS)

Barkai, D.; Brandt, M. A.

1984-01-01

The full multigrid (FMG) method is applied to the two dimensional Poisson equation with Dirichlet boundary conditions. This has been chosen as a relatively simple test case for examining the efficiency of fully vectorizing the multigrid method. Data structure and programming considerations and techniques are discussed, accompanied by performance details.

16. Wavelet-based Poisson rate estimation using the Skellam distribution

Hirakawa, Keigo; Baqai, Farhan; Wolfe, Patrick J.

2009-02-01

Owing to the stochastic nature of discrete processes such as photon counts in imaging, real-world data measurements often exhibit heteroscedastic behavior. In particular, time series components and other measurements may frequently be assumed to be non-iid Poisson random variables, whose rate parameter is proportional to the underlying signal of interest; witness the literature in digital communications, signal processing, astronomy, and magnetic resonance imaging applications. In this work, we show that certain wavelet and filterbank transform coefficients corresponding to vector-valued measurements of this type are distributed as sums and differences of independent Poisson counts, taking the so-called Skellam distribution. While exact estimates rarely admit analytical forms, we present Skellam mean estimators under both frequentist and Bayes models, as well as computationally efficient approximations and shrinkage rules, that may be interpreted as Poisson rate estimation methods performed in certain wavelet/filterbank transform domains. This indicates a promising potential approach for denoising of Poisson counts in the above-mentioned applications.
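The basic Skellam fact used above is that the difference of two independent Poisson counts has mean lam1 - lam2 and variance lam1 + lam2, which a quick simulation confirms (illustrative rates only):

```python
import numpy as np

rng = np.random.default_rng(10)

# A difference of independent Poisson counts (the elementary case of a
# Haar-type transform coefficient) follows a Skellam distribution.
lam1, lam2 = 7.0, 4.0
n = 200000
d = rng.poisson(lam1, n) - rng.poisson(lam2, n)

# Skellam moments: mean = lam1 - lam2, variance = lam1 + lam2
mean_err = abs(d.mean() - (lam1 - lam2))
var_err = abs(d.var() - (lam1 + lam2))
```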

17. Indentability of conventional and negative Poisson's ratio foams

NASA Technical Reports Server (NTRS)

Lakes, R. S.; Elms, K.

1992-01-01

The indentation resistance of foams, both of conventional structure and of reentrant structure giving rise to negative Poisson's ratio, is studied using holographic interferometry. In holographic indentation tests, reentrant foams had higher yield strength and lower stiffness than conventional foams of the same original relative density. Calculated energy absorption for dynamic impact is considerably higher for reentrant foam than conventional foam.

18. A note on robust inference from a conditional Poisson model.

PubMed

Solís-Trápala, Ivonne L; Farewell, Vernon T

2006-02-01

A randomised controlled trial to evaluate a training programme for physician-patient communication required the analysis of paired count data. The impact of departures from the Poisson assumption when paired count data are analysed through use of a conditional likelihood is illustrated. A simple approach to providing robust inference is outlined and illustrated.
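The conditional-likelihood device for paired Poisson counts rests on a standard fact: given the pair total n = y1 + y2, y1 is Binomial(n, p) with p = lam1/(lam1 + lam2), so the rate ratio can be inferred free of per-pair nuisance effects. A simulated sketch with illustrative rates (not the trial's data):

```python
import numpy as np

rng = np.random.default_rng(11)

# Paired counts, e.g. before/after communication training for each subject
lam1, lam2 = 3.0, 2.0
n_pairs = 50000
y1 = rng.poisson(lam1, n_pairs)
y2 = rng.poisson(lam2, n_pairs)

# Conditional on totals, y1 ~ Binomial(total, p), p = lam1/(lam1+lam2)
totals = y1 + y2
keep = totals > 0
p_hat = y1[keep].sum() / totals[keep].sum()   # conditional MLE of p
rate_ratio = p_hat / (1 - p_hat)              # estimates lam1/lam2
```

Departures from the Poisson assumption (overdispersion) distort the binomial conditional variance, which is the robustness issue the note addresses.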

19. Application of Poisson random effect models for highway network screening.

PubMed

Jiang, Ximiao; Abdel-Aty, Mohamed; Alamili, Samer

2014-02-01

In recent years, Bayesian random effect models that account for the temporal and spatial correlations of crash data became popular in traffic safety research. This study employs random effect Poisson Log-Normal models for crash risk hotspot identification. Both the temporal and spatial correlations of crash data were considered. Potential for Safety Improvement (PSI) was adopted as a measure of the crash risk. Using the fatal and injury crashes that occurred on urban 4-lane divided arterials from 2006 to 2009 in the Central Florida area, the random effect approaches were compared to the traditional Empirical Bayesian (EB) method and the conventional Bayesian Poisson Log-Normal model. A series of method examination tests were conducted to evaluate the performance of different approaches. These tests include the previously developed site consistency test, method consistency test, total rank difference test, and the modified total score test, as well as the newly proposed total safety performance measure difference test. Results show that the Bayesian Poisson model accounting for both temporal and spatial random effects (PTSRE) outperforms the model with only a temporal random effect, and both are superior to the conventional Poisson Log-Normal model (PLN) and the EB model in the fitting of crash data. Additionally, the method evaluation tests indicate that the PTSRE model is significantly superior to the PLN model and the EB model in consistently identifying hotspots during successive time periods. The results suggest that the PTSRE model is a superior alternative for road site crash risk hotspot identification.
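The Empirical Bayesian (EB) baseline used in this comparison shrinks each site's observed crash count toward a model-predicted mean; under the usual gamma-Poisson (negative binomial) formulation the shrinkage weight has a closed form. A minimal sketch, with parameter names of my choosing:

```python
def eb_estimate(observed, predicted, k):
    """Empirical Bayes expected crash frequency for one site.

    Assumes a negative binomial safety performance function:
    true rate ~ Gamma(shape=k, mean=predicted), count ~ Poisson(rate).
    The posterior mean is a weighted average of model and data.
    """
    w = k / (k + predicted)               # weight on the model prediction
    return w * predicted + (1.0 - w) * observed

# A site predicted to have 4 crashes per period that recorded 10:
print(eb_estimate(observed=10, predicted=4.0, k=2.0))  # 8.0
```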

20. 3D soft metamaterials with negative Poisson's ratio.

PubMed

Babaee, Sahab; Shim, Jongmin; Weaver, James C; Chen, Elizabeth R; Patel, Nikita; Bertoldi, Katia

2013-09-25

Buckling is exploited to design a new class of three-dimensional metamaterials with negative Poisson's ratio. A library of auxetic building blocks is identified and procedures are defined to guide their selection and assembly. The auxetic properties of these materials are demonstrated both through experiments and finite element simulations and exhibit excellent qualitative and quantitative agreement.

1. Tailoring graphene to achieve negative Poisson's ratio properties.

PubMed

Grima, Joseph N; Winczewski, Szymon; Mizzi, Luke; Grech, Michael C; Cauchi, Reuben; Gatt, Ruben; Attard, Daphne; Wojciechowski, Krzysztof W; Rybicki, Jarosław

2015-02-25

Graphene can be made auxetic through the introduction of vacancy defects. This results in the thinnest negative Poisson's ratio material at ambient conditions known so far, an effect achieved via a nanoscale de-wrinkling mechanism that mimics the behavior at the macroscale exhibited by a crumpled sheet of paper when stretched.

2. Translated Poisson Mixture Model for Stratification Learning (PREPRINT)

DTIC Science & Technology

2007-09-01

This preprint by Gloria Haro addresses stratification learning in high dimensional data analysis in general, and in computer vision and image analysis in particular.

3. Subsonic Flow for the Multidimensional Euler-Poisson System

Bae, Myoungjean; Duan, Ben; Xie, Chunjing

2016-04-01

We establish the existence and stability of subsonic potential flow for the steady Euler-Poisson system in a multidimensional nozzle of a finite length when prescribing the electric potential difference on a non-insulated boundary from a fixed point at the exit, and prescribing the pressure at the exit of the nozzle. The Euler-Poisson system for subsonic potential flow can be reduced to a nonlinear elliptic system of second order. In this paper, we develop a technique to achieve a priori C^{1,α} estimates of solutions to a quasi-linear second order elliptic system with mixed boundary conditions in a multidimensional domain enclosed by a Lipschitz continuous boundary. In particular, we discovered a special structure of the Euler-Poisson system which enables us to obtain C^{1,α} estimates of the velocity potential and the electric potential functions, and this leads us to establish structural stability of subsonic flows for the Euler-Poisson system under perturbations of various data.

4. Negative Poisson's Ratio in Single-Layer Graphene Ribbons.

PubMed

Jiang, Jin-Wu; Park, Harold S

2016-04-13

The Poisson's ratio characterizes the resultant strain in the lateral direction for a material under longitudinal deformation. Though negative Poisson's ratios (NPR) are theoretically possible within continuum elasticity, they are most frequently observed in engineered materials and structures, as they are not intrinsic to many materials. In this work, we report NPR in single-layer graphene ribbons, which results from the compressive edge stress induced warping of the edges. The effect is robust, as the NPR is observed for graphene ribbons with widths smaller than about 10 nm, and for tensile strains smaller than about 0.5% with NPR values reaching as large as -1.51. The NPR is explained analytically using an inclined plate model, which is able to predict the Poisson's ratio for graphene sheets of arbitrary size. The inclined plate model demonstrates that the NPR is governed by the interplay between the width (a bulk property), and the warping amplitude of the edge (an edge property), which eventually yields a phase diagram determining the sign of the Poisson's ratio as a function of the graphene geometry.

5. On removal of charge singularity in Poisson-Boltzmann equation.

PubMed

Cai, Qin; Wang, Jun; Zhao, Hong-Kai; Luo, Ray

2009-04-14

The Poisson-Boltzmann theory has become widely accepted in modeling electrostatic solvation interactions in biomolecular calculations. However the standard practice of atomic point charges in molecular mechanics force fields introduces singularity into the Poisson-Boltzmann equation. The finite-difference/finite-volume discretization approach to the Poisson-Boltzmann equation alleviates the numerical difficulty associated with the charge singularity but introduces discretization error into the electrostatic potential. Decomposition of the electrostatic potential has been explored to remove the charge singularity explicitly to achieve higher numerical accuracy in the solution of the electrostatic potential. In this study, we propose an efficient method to overcome the charge singularity problem. In our framework, two separate equations for two different potentials in two different regions are solved simultaneously, i.e., the reaction field potential in the solute region and the total potential in the solvent region. The proposed method can be readily implemented with typical finite-difference Poisson-Boltzmann solvers and return the singularity-free reaction field potential with a single run. Test runs on 42 small molecules and 4 large proteins show a very high agreement between the reaction field energies computed by the proposed method and those by the classical finite-difference Poisson-Boltzmann method. It is also interesting to note that the proposed method converges faster than the classical method, though additional time is needed to compute Coulombic potential on the dielectric boundary. The higher precision, accuracy, and efficiency of the proposed method will allow for more robust electrostatic calculations in molecular mechanics simulations of complex biomolecular systems.

6. Climate variations and salmonellosis transmission in Adelaide, South Australia: a comparison between regression models

Zhang, Ying; Bi, Peng; Hiller, Janet

2008-01-01

This is the first study to identify appropriate regression models for the association between climate variation and salmonellosis transmission. A comparison between different regression models was conducted using surveillance data in Adelaide, South Australia. By using notified salmonellosis cases and climatic variables from the Adelaide metropolitan area over the period 1990-2003, four regression methods were examined: standard Poisson regression, autoregressive adjusted Poisson regression, multiple linear regression, and a seasonal autoregressive integrated moving average (SARIMA) model. Notified salmonellosis cases in 2004 were used to test the forecasting ability of the four models. Parameter estimation, goodness-of-fit and forecasting ability of the four regression models were compared. Temperatures occurring 2 weeks prior to cases were positively associated with cases of salmonellosis. Rainfall was also inversely related to the number of cases. The comparison of the goodness-of-fit and forecasting ability suggest that the SARIMA model is better than the other three regression models. Temperature and rainfall may be used as climatic predictors of salmonellosis cases in regions with climatic characteristics similar to those of Adelaide. The SARIMA model could, thus, be adopted to quantify the relationship between climate variations and salmonellosis transmission.
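A standard Poisson regression of the kind compared here models the log of the expected count as linear in the predictors. As a minimal illustration (not the study's code), a Newton-Raphson fit with a single covariate:

```python
import math

def fit_poisson(xs, ys, iters=30):
    """Fit log mu_i = b0 + b1*x_i by Newton-Raphson on the Poisson log-likelihood."""
    b0 = b1 = 0.0
    for _ in range(iters):
        g0 = g1 = h00 = h01 = h11 = 0.0
        for x, y in zip(xs, ys):
            mu = math.exp(b0 + b1 * x)
            g0 += y - mu            # score for the intercept
            g1 += x * (y - mu)      # score for the slope
            h00 += mu               # Fisher information entries
            h01 += x * mu
            h11 += x * x * mu
        det = h00 * h11 - h01 * h01
        b0 += (h11 * g0 - h01 * g1) / det
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1

# Counts that double with each unit of x: the slope should recover log 2.
b0, b1 = fit_poisson([0, 1, 2, 3], [1, 2, 4, 8])
print(round(b1, 3))  # ~0.693
```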

7. Incidence of Type 1 Diabetes in Sweden Among Individuals Aged 0–34 Years, 1983–2007

PubMed Central

Dahlquist, Gisela G.; Nyström, Lennarth; Patterson, Christopher C.

2011-01-01

OBJECTIVE To clarify whether the increase in childhood type 1 diabetes is mirrored by a decrease in older age-groups, resulting in younger age at diagnosis. RESEARCH DESIGN AND METHODS We used data from two prospective research registers, the Swedish Childhood Diabetes Register, which included case subjects aged 0–14.9 years at diagnosis, and the Diabetes in Sweden Study, which included case subjects aged 15–34.9 years at diagnosis, covering birth cohorts between 1948 and 2007. The total database included 20,249 individuals with diabetes diagnosed between 1983 and 2007. Incidence rates over time were analyzed using Poisson regression models. RESULTS The overall yearly incidence rose to a peak of 42.3 per 100,000 person-years in male subjects aged 10–14 years and to a peak of 37.1 per 100,000 person-years in female subjects aged 5–9 years and decreased thereafter. There was a significant increase by calendar year in both sexes in the three age-groups <15 years; however, there were significant decreases in the older age-groups (25- to 29-years and 30- to 34-years age-groups). Poisson regression analyses showed that a cohort effect seemed to dominate over a time-period effect. CONCLUSIONS Twenty-five years of prospective nationwide incidence registration demonstrates a clear shift to younger age at onset rather than a uniform increase in incidence rates across all age-groups. The dominance of cohort effects over period effects suggests that exposures affecting young children may be responsible for the increasing incidence in the younger age-groups. PMID:21680725

8. Poisson-type inequalities for growth properties of positive superharmonic functions.

PubMed

Luan, Kuan; Vieira, John

2017-01-01

In this paper, we present new Poisson-type inequalities for Poisson integrals with continuous data on the boundary. The obtained inequalities are used to obtain growth properties at infinity of positive superharmonic functions in a smooth cone.

9. Calculating a Stepwise Ridge Regression.

ERIC Educational Resources Information Center

Morris, John D.

1986-01-01

Although methods for using ordinary least squares regression computer programs to calculate a ridge regression are available, the calculation of a stepwise ridge regression requires a special purpose algorithm and computer program. The correct stepwise ridge regression procedure is given, and a parallel FORTRAN computer program is described.…

10. Orthogonal Regression: A Teaching Perspective

ERIC Educational Resources Information Center

Carr, James R.

2012-01-01

A well-known approach to linear least squares regression is that which involves minimizing the sum of squared orthogonal projections of data points onto the best fit line. This form of regression is known as orthogonal regression, and the linear model that it yields is known as the major axis. A similar method, reduced major axis regression, is…

11. Steganalysis using logistic regression

Lubenko, Ivans; Ker, Andrew D.

2011-02-01

We advocate Logistic Regression (LR) as an alternative to the Support Vector Machine (SVM) classifiers commonly used in steganalysis. LR offers more information than traditional SVM methods - it estimates class probabilities as well as providing a simple classification - and can be adapted more easily and efficiently for multiclass problems. Like SVM, LR can be kernelised for nonlinear classification, and it shows comparable classification accuracy to SVM methods. This work is a case study, comparing accuracy and speed of SVM and LR classifiers in detection of LSB Matching and other related spatial-domain image steganography, through the state-of-art 686-dimensional SPAM feature set, in three image sets.

12. Correlation between supercooled liquid relaxation and glass Poisson's ratio

Sun, Qijing; Hu, Lina; Zhou, Chao; Zheng, Haijiao; Yue, Yuanzheng

2015-10-01

We report on a correlation between the supercooled liquid (SL) relaxation and glass Poisson's ratio (v) by comparing the activation energy ratio (r) of the α and the slow β relaxations and the v values for both metallic and nonmetallic glasses. Poisson's ratio v generally increases with an increase in the ratio r and this relation can be described by the empirical function v = 0.5 - A*exp(-B*r), where A and B are constants. This correlation might imply that glass plasticity is associated with the competition between the α and the slow β relaxations in SLs. The underlying physics of this correlation lies in the heredity of the structural heterogeneity from liquid to glass. This work gives insights into both the microscopic mechanism of glass deformation through the SL dynamics and the complex structural evolution during liquid-glass transition.

13. Mixed Poisson distributions in exact solutions of stochastic autoregulation models.

PubMed

Iyer-Biswas, Srividya; Jayaprakash, C

2014-11-01

In this paper we study the interplay between stochastic gene expression and system design using simple stochastic models of autoactivation and autoinhibition. Using the Poisson representation, a technique whose particular usefulness in the context of nonlinear gene regulation models we elucidate, we find exact results for these feedback models in the steady state. Further, we exploit this representation to analyze the parameter spaces of each model, determine which dimensionless combinations of rates are the shape determinants for each distribution, and thus demarcate where in the parameter space qualitatively different behaviors arise. These behaviors include power-law-tailed distributions, bimodal distributions, and sub-Poisson distributions. We also show how these distribution shapes change when the strength of the feedback is tuned. Using our results, we reexamine how well the autoinhibition and autoactivation models serve their conventionally assumed roles as paradigms for noise suppression and noise exploitation, respectively.
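A mixed Poisson distribution arises by letting the Poisson rate itself be random; for instance, a gamma-distributed rate yields the negative binomial, whose variance exceeds its mean (overdispersion). A quick simulation sketch of this gamma-Poisson mixture (illustrative, unrelated to the paper's specific models):

```python
import math
import random

rng = random.Random(7)

def poisson_draw(lam):
    # Knuth's multiplicative method; fine for the modest rates drawn here.
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

# Gamma-mixed Poisson: rate ~ Gamma(shape=2, scale=2) (mean 4),
# count | rate ~ Poisson(rate). The mixture has mean 4 and
# variance 4 + 4**2 / 2 = 12, i.e. it is overdispersed.
samples = [poisson_draw(rng.gammavariate(2.0, 2.0)) for _ in range(20000)]
m = sum(samples) / len(samples)
v = sum((s - m) ** 2 for s in samples) / len(samples)
print(round(m, 1), round(v, 1))  # near 4 and 12; v > m signals overdispersion
```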

14. A dictionary learning approach for Poisson image deblurring.

PubMed

Ma, Liyan; Moisan, Lionel; Yu, Jian; Zeng, Tieyong

2013-07-01

The restoration of images corrupted by blur and Poisson noise is a key issue in medical and biological image processing. While most existing methods are based on variational models, generally derived from a maximum a posteriori (MAP) formulation, recently sparse representations of images have shown to be efficient approaches for image recovery. Following this idea, we propose in this paper a model containing three terms: a patch-based sparse representation prior over a learned dictionary, the pixel-based total variation regularization term and a data-fidelity term capturing the statistics of Poisson noise. The resulting optimization problem can be solved by an alternating minimization technique combined with variable splitting. Extensive experimental results suggest that in terms of visual quality, peak signal-to-noise ratio value and the method noise, the proposed algorithm outperforms state-of-the-art methods.

15. Reference manual for the POISSON/SUPERFISH Group of Codes

SciTech Connect

Not Available

1987-01-01

The POISSON/SUPERFISH Group codes were set up to solve two separate problems: the design of magnets and the design of rf cavities in a two-dimensional geometry. The first stage of either problem is to describe the layout of the magnet or cavity in a way that can be used as input to solve the generalized Poisson equation for magnets or the Helmholtz equations for cavities. The computer codes require that the problems be discretized by replacing the differentials (dx, dy) by finite differences (Δx, Δy). Instead of defining the function everywhere in a plane, the function is defined only at a finite number of points on a mesh in the plane.
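The discretization step described above turns the continuous equation into a sparse linear system over the mesh points. In one dimension with Dirichlet boundaries that system is tridiagonal and can be solved directly; a toy sketch (not SUPERFISH itself) using the Thomas algorithm:

```python
def solve_poisson_1d(f, n):
    """Solve -u'' = f on (0,1) with u(0)=u(1)=0 on n interior mesh points.

    Central differences give (-u[i-1] + 2*u[i] - u[i+1]) / h**2 = f(x_i),
    a tridiagonal system solved here by the Thomas algorithm.
    """
    h = 1.0 / (n + 1)
    b = [f((i + 1) * h) * h * h for i in range(n)]   # right-hand side
    # Forward sweep: eliminate the sub-diagonal (off-diagonals are all -1).
    c = [0.0] * n                                    # modified super-diagonal
    d = [0.0] * n                                    # modified rhs
    c[0], d[0] = -0.5, b[0] / 2.0
    for i in range(1, n):
        denom = 2.0 + c[i - 1]
        c[i] = -1.0 / denom
        d[i] = (b[i] + d[i - 1]) / denom
    # Back substitution.
    u = [0.0] * n
    u[-1] = d[-1]
    for i in range(n - 2, -1, -1):
        u[i] = d[i] - c[i] * u[i + 1]
    return u

# f = 2 has exact solution u(x) = x*(1 - x); central differences are
# exact for quadratics, so mesh values match to rounding error.
u = solve_poisson_1d(lambda x: 2.0, 9)
print(round(u[4], 6))  # midpoint x = 0.5 -> 0.25
```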

16. Poisson and symplectic structures on Lie algebras. I

Alekseevsky, D. V.; Perelomov, A. M.

1997-06-01

The purpose of this paper is to describe a new class of Poisson and symplectic structures on Lie algebras. This gives a new class of solutions of the classical Yang-Baxter equation. The class of elementary Lie algebras is defined and the Poisson and symplectic structures for them are described. The algorithm is given for description of all closed 2-forms and of symplectic structures on any Lie algebra G which is decomposed into a semidirect sum of elementary subalgebras. Using these results we obtain the description of closed 2-forms and symplectic forms (if they exist) on the Borel subalgebra B(G) of a semisimple Lie algebra G. As a byproduct, we get a description of the second cohomology group H²(B(G)).

17. New method for blowup of the Euler-Poisson system

Kwong, Man Kam; Yuen, Manwai

2016-08-01

In this paper, we provide a new method for establishing the blowup of C² solutions for the pressureless Euler-Poisson system with attractive forces on ℝ^N (N ≥ 2), with ρ(0, x₀) > 0 and Ω₀^{ij}(x₀) = (1/2)[∂_i u_j(0, x₀) − ∂_j u_i(0, x₀)] = 0 at some point x₀ ∈ ℝ^N. By applying the generalized Hubble transformation div u(t, x₀(t)) = N ȧ(t)/a(t) to a reduced Riccati differential inequality derived from the system, we simplify the inequality into the Emden equation ä(t) = −λ/a(t)^{N−1}, a(0) = 1, ȧ(0) = div u(0, x₀)/N. Known results on its blowup set allow us to easily obtain the blowup conditions of the Euler-Poisson system.

18. Finite-size effects and percolation properties of Poisson geometries

Larmier, C.; Dumonteil, E.; Malvagi, F.; Mazzolo, A.; Zoia, A.

2016-07-01

Random tessellations of the space represent a class of prototype models of heterogeneous media, which are central in several applications in physics, engineering, and life sciences. In this work, we investigate the statistical properties of d-dimensional isotropic Poisson geometries by resorting to Monte Carlo simulation, with special emphasis on the case d = 3. We first analyze the behavior of the key features of these stochastic geometries as a function of the dimension d and the linear size L of the domain. Then, we consider the case of Poisson binary mixtures, where the polyhedra are assigned two labels with complementary probabilities. For this latter class of random geometries, we numerically characterize the percolation threshold, the strength of the percolating cluster, and the average cluster size.

19. Intrinsic Negative Poisson's Ratio for Single-Layer Graphene.

PubMed

Jiang, Jin-Wu; Chang, Tienchong; Guo, Xingming; Park, Harold S

2016-08-10

Negative Poisson's ratio (NPR) materials have drawn significant interest because the enhanced toughness, shear resistance, and vibration absorption that typically are seen in auxetic materials may enable a range of novel applications. In this work, we report that single-layer graphene exhibits an intrinsic NPR, which is robust and independent of its size and temperature. The NPR arises due to the interplay between two intrinsic deformation pathways (one with positive Poisson's ratio, the other with NPR), which correspond to the bond stretching and angle bending interactions in graphene. We propose an energy-based deformation pathway criterion, which predicts that the pathway with NPR has lower energy and thus becomes the dominant deformation mode when graphene is stretched by a strain above 6%, resulting in the NPR phenomenon.

20. Non-linear Poisson-Boltzmann theory for swollen clays

Leote de Carvalho, R. J. F.; Trizac, E.; Hansen, J.-P.

1998-08-01

The non-linear Poisson-Boltzmann (PB) equation for a circular, uniformly charged platelet, confined together with co- and counter-ions to a cylindrical cell, is solved semi-analytically by transforming it into an integral equation and solving the latter iteratively. This method proves efficient and robust, and can be readily generalized to other problems based on cell models, treated within non-linear Poisson-like theory. The solution to the PB equation is computed over a wide range of physical conditions, and the resulting osmotic equation of state is shown to be in semi-quantitative agreement with recent experimental data for Laponite clay suspensions, in the concentrated gel phase.

1. Invariants and labels for Lie-Poisson Systems

SciTech Connect

Thiffeault, J.L.; Morrison, P.J.

1998-04-01

Reduction is a process that uses symmetry to lower the order of a Hamiltonian system. The new variables in the reduced picture are often not canonical: there are no clear variables representing positions and momenta, and the Poisson bracket obtained is not of the canonical type. Specifically, we give two examples that give rise to brackets of the noncanonical Lie-Poisson form: the rigid body and the two-dimensional ideal fluid. From these simple cases, we then use the semidirect product extension of algebras to describe more complex physical systems. The Casimir invariants in these systems are examined, and some are shown to be linked to the recovery of information about the configuration of the system. We discuss a case in which the extension is not a semidirect product, namely compressible reduced MHD, and find for this case that the Casimir invariants lend partial information about the configuration of the system.

2. Structural regression trees

SciTech Connect

Kramer, S.

1996-12-31

In many real-world domains the task of machine learning algorithms is to learn a theory for predicting numerical values. In particular several standard test domains used in Inductive Logic Programming (ILP) are concerned with predicting numerical values from examples and relational and mostly non-determinate background knowledge. However, so far no ILP algorithm except one can predict numbers and cope with nondeterminate background knowledge. (The only exception is a covering algorithm called FORS.) In this paper we present Structural Regression Trees (SRT), a new algorithm which can be applied to the above class of problems. SRT integrates the statistical method of regression trees into ILP. It constructs a tree containing a literal (an atomic formula or its negation) or a conjunction of literals in each node, and assigns a numerical value to each leaf. SRT provides more comprehensible results than purely statistical methods, and can be applied to a class of problems most other ILP systems cannot handle. Experiments in several real-world domains demonstrate that the approach is competitive with existing methods, indicating that the advantages are not at the expense of predictive accuracy.

3. Studying Resist Stochastics with the Multivariate Poisson Propagation Model

DOE PAGES

Naulleau, Patrick; Anderson, Christopher; Chao, Weilun; ...

2014-01-01

Progress in the ultimate performance of extreme ultraviolet resists has arguably decelerated in recent years, suggesting an approach to stochastic limits both in photon counts and material parameters. Here we report on the performance of a variety of leading extreme ultraviolet resists, both with and without chemical amplification. The measured performance is compared to stochastic modeling results using the Multivariate Poisson Propagation Model. The results show that the best materials are indeed nearing modeled performance limits.

4. Soft elasticity of RNA gels and negative Poisson ratio.

PubMed

Ahsan, Amir; Rudnick, Joseph; Bruinsma, Robijn

2007-12-01

We propose a model for the elastic properties of RNA gels. The model predicts anomalous elastic properties in the form of a negative Poisson ratio and shape instabilities. The anomalous elasticity is generated by the non-Gaussian force-deformation relation of single-stranded RNA. The effect is greatly magnified by broken rotational symmetry produced by double-stranded sequences and the concomitant soft modes of uniaxial elastomers.

5. Effect of poisson ratio on cellular structure formation.

PubMed

Bischofs, I B; Schwarz, U S

2005-08-05

Mechanically active cells in soft media act as force dipoles. The resulting elastic interactions are long ranged and favor the formation of strings. We show analytically that due to screening, the effective interaction between strings decays exponentially, with a decay length determined only by geometry. Both for disordered and ordered arrangements of cells, we predict novel phase transitions from paraelastic to ferroelastic and antiferroelastic phases as a function of the Poisson ratio.

6. A more general system for Poisson series manipulation.

NASA Technical Reports Server (NTRS)

Cherniack, J. R.

1973-01-01

The design of a working Poisson series processor system is described that is more general than those currently in use. This system is the result of a series of compromises among efficiency, generality, ease of programming, and ease of use. The most general form of coefficients that can be multiplied efficiently is pointed out, and the place of general-purpose algebraic systems in celestial mechanics is discussed.

7. Relaxation in two dimensions and the 'sinh-Poisson' equation

NASA Technical Reports Server (NTRS)

Montgomery, D.; Matthaeus, W. H.; Stribling, W. T.; Martinez, D.; Oughton, S.

1992-01-01

Long-time states of a turbulent, decaying, two-dimensional, Navier-Stokes flow are shown numerically to relax toward maximum-entropy configurations, as defined by the "sinh-Poisson" equation. The large-scale Reynolds number is about 14,000, the spatial resolution is 512², the boundary conditions are spatially periodic, and the evolution takes place over nearly 400 large-scale eddy-turnover times.

8. Binomial and Poisson Mixtures, Maximum Likelihood, and Maple Code

SciTech Connect

Bowman, Kimiko o; Shenton, LR

2006-01-01

The bias, variance, and skewness of maximum likelihood estimators are considered for binomial and Poisson mixture distributions. The moments considered are asymptotic, and they are assessed using the Maple code. Questions of the existence of solutions and Karl Pearson's study are mentioned, along with the problem of a valid sample space. Large samples to reduce variances are not unusual; this also applies to the size of the asymptotic skewness.

9. Events in time: Basic analysis of Poisson data

SciTech Connect

Engelhardt, M.E.

1994-09-01

The report presents basic statistical methods for analyzing Poisson data, such as the number of events in some period of time. It gives point estimates, confidence intervals, and Bayesian intervals for the rate of occurrence per unit of time. It shows how to compare subsets of the data, both graphically and by statistical tests, and how to look for trends in time. It presents a compound model for the case when the rate of occurrence varies randomly. Examples and SAS programs are given.
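The basic quantities described can be reproduced in a few lines: with n events observed over exposure T, the point estimate of the rate is n/T. The sketch below uses a large-sample normal approximation for the interval; for small counts, exact intervals of the kind the report covers are preferable:

```python
import math

def poisson_rate_ci(n_events, exposure, z=1.96):
    """Point estimate and approximate 95% CI for a Poisson rate.

    Uses the normal approximation rate +/- z * sqrt(n) / T; for small
    counts an exact (chi-square-based) interval should be used instead.
    """
    rate = n_events / exposure
    half_width = z * math.sqrt(n_events) / exposure
    return rate, max(0.0, rate - half_width), rate + half_width

# 10 events over 5 units of observation time:
rate, lo, hi = poisson_rate_ci(10, 5.0)
print(rate, round(lo, 3), round(hi, 3))  # 2.0 events per unit, CI near (0.76, 3.24)
```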

10. Comparing Poisson Sigma Model with A-model

Bonechi, F.; Cattaneo, A. S.; Iraso, R.

2016-10-01

We discuss the A-model as a gauge fixing of the Poisson Sigma Model with target a symplectic structure. We complete the discussion in [4], where a gauge fixing defined by a compatible complex structure was introduced, by showing how to recover the A-model hierarchy of observables in terms of the AKSZ observables. Moreover, we discuss the off-shell supersymmetry of the A-model as a residual BV symmetry of the gauge fixed PSM action.

11. Indentability of conventional and negative Poisson's ratio foams

NASA Technical Reports Server (NTRS)

Lakes, R. S.; Elms, K.

1992-01-01

The indentation resistance of foams, both of conventional structure and of re-entrant structure giving rise to negative Poisson's ratio, is studied using holographic interferometry. In holographic indentation tests, re-entrant foams had higher yield strengths σ_y and lower stiffness E than conventional foams of the same original relative density. Calculated energy absorption for dynamic impact is considerably higher for re-entrant foam than conventional foam.

12. Poisson problems for semilinear Brinkman systems on Lipschitz domains in

Kohr, Mirela; Lanza de Cristoforis, Massimo; Wendland, Wolfgang L.

2015-06-01

The purpose of this paper is to combine a layer potential analysis with the Schauder fixed point theorem to show the existence of solutions of the Poisson problem for a semilinear Brinkman system on bounded Lipschitz domains with Dirichlet or Robin boundary conditions and data in L²-based Sobolev spaces. We also obtain an existence and uniqueness result for the Dirichlet problem for a special semilinear elliptic system, called the Darcy-Forchheimer-Brinkman system.

13. A linear regression solution to the spatial autocorrelation problem

Griffith, Daniel A.

The Moran Coefficient spatial autocorrelation index can be decomposed into orthogonal map pattern components. This decomposition relates it directly to standard linear regression, in which corresponding eigenvectors can be used as predictors. This paper reports comparative results between these linear regressions and their auto-Gaussian counterparts for the following georeferenced data sets: Columbus (Ohio) crime, Ottawa-Hull median family income, Toronto population density, southwest Ohio unemployment, Syracuse pediatric lead poisoning, and Glasgow standard mortality rates, as well as a small remotely sensed image of the High Peak district. This methodology is extended to auto-logistic and auto-Poisson situations, with selected data analyses including percentage of urban population across Puerto Rico, and the frequency of SIDS cases across North Carolina. These data analytic results suggest that this approach to georeferenced data analysis offers considerable promise.
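The Moran Coefficient at the heart of this decomposition measures spatial autocorrelation of a centered variable z against a spatial weights matrix W, as I = (n/S0)·(zᵀWz)/(zᵀz). A minimal computation on a toy four-node chain (illustrative only):

```python
def morans_i(values, w):
    """Moran's I: (n / S0) * (z' W z) / (z' z), with z the centered values."""
    n = len(values)
    mean = sum(values) / n
    z = [v - mean for v in values]
    s0 = sum(sum(row) for row in w)   # total weight
    num = sum(w[i][j] * z[i] * z[j] for i in range(n) for j in range(n))
    den = sum(zi * zi for zi in z)
    return (n / s0) * (num / den)

# Four locations in a chain, each linked to its neighbors; values rise
# along the chain, so positive spatial autocorrelation is expected.
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
print(morans_i([1, 2, 3, 4], w))  # 1/3
```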

14. Magnetic axis alignment and the Poisson alignment reference system

Griffith, Lee V.; Schenz, Richard F.; Sommargren, Gary E.

1989-01-01

Three distinct metrological operations are necessary to align a free-electron laser (FEL): the magnetic axis must be located, a straight line reference (SLR) must be generated, and the magnetic axis must be related to the SLR. This paper begins with a review of the motivation for developing an alignment system that will assure better than 100 micrometer accuracy in the alignment of the magnetic axis throughout an FEL. The paper describes techniques for identifying the magnetic axis of solenoids, quadrupoles, and wiggler poles. Propagation of a laser beam is described to the extent of revealing sources of nonlinearity in the beam. Development and use of the Poisson line, a diffraction effect, is described in detail. Spheres in a large-diameter laser beam create Poisson lines and thus provide a necessary mechanism for gauging between the magnetic axis and the SLR. Procedures for installing FEL components and calibrating alignment fiducials to the magnetic axes of the components are also described. An error budget shows that the Poisson alignment reference system will make it possible to meet the alignment tolerances for an FEL.

15. Magnetic alignment and the Poisson alignment reference system

Griffith, L. V.; Schenz, R. F.; Sommargren, G. E.

1990-08-01

Three distinct metrological operations are necessary to align a free-electron laser (FEL): the magnetic axis must be located, a straight line reference (SLR) must be generated, and the magnetic axis must be related to the SLR. This article begins with a review of the motivation for developing an alignment system that will assure better than 100-μm accuracy in the alignment of the magnetic axis throughout an FEL. The 100-μm accuracy is an error circle about an ideal axis for 300 m or more. The article describes techniques for identifying the magnetic axes of solenoids, quadrupoles, and wiggler poles. Propagation of a laser beam is described to the extent of revealing sources of nonlinearity in the beam. Development of a straight-line reference based on the Poisson line, a diffraction effect, is described in detail. Spheres in a large-diameter laser beam create Poisson lines and thus provide a necessary mechanism for gauging between the magnetic axis and the SLR. Procedures for installing FEL components and calibrating alignment fiducials to the magnetic axes of the components are also described. The Poisson alignment reference system should be accurate to 25 μm over 300 m, which is believed to be a factor-of-4 improvement over earlier techniques. An error budget shows that only 25% of the total budgeted tolerance is used for the alignment reference system, so the remaining tolerances should fall within the allowable range for FEL alignment.

16. A generalized Poisson solver for first-principles device simulations

SciTech Connect

Bani-Hashemian, Mohammad Hossein; VandeVondele, Joost; Brück, Sascha; Luisier, Mathieu

2016-01-28

Electronic structure calculations of atomistic systems based on density functional theory involve solving the Poisson equation. In this paper, we present a plane-wave based algorithm for solving the generalized Poisson equation subject to periodic or homogeneous Neumann conditions on the boundaries of the simulation cell and Dirichlet type conditions imposed at arbitrary subdomains. In this way, source, drain, and gate voltages can be imposed across atomistic models of electronic devices. Dirichlet conditions are enforced as constraints in a variational framework giving rise to a saddle point problem. The resulting system of equations is then solved using a stationary iterative method in which the generalized Poisson operator is preconditioned with the standard Laplace operator. The solver can make use of any sufficiently smooth function modelling the dielectric constant, including density dependent dielectric continuum models. For all the boundary conditions, consistent derivatives are available and molecular dynamics simulations can be performed. The convergence behaviour of the scheme is investigated and its capabilities are demonstrated.
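
As a toy illustration of what a generalized Poisson solve involves, the sketch below is a one-dimensional finite-difference analogue with a spatially varying dielectric and Dirichlet values at the ends — not the plane-wave, saddle-point scheme of the paper; the function and variable names are illustrative:

```python
import numpy as np

def solve_generalized_poisson_1d(eps, rho, phi_left, phi_right, h):
    """Solve d/dx( eps(x) dphi/dx ) = -rho(x) on the interior nodes of a
    1D grid with Dirichlet values phi_left / phi_right at the two ends.

    eps holds face-centered permittivities (length n+1), rho the
    node-centered source (length n); flux-conservative stencil.
    """
    n = len(rho)
    A = np.zeros((n, n))
    b = -np.asarray(rho, dtype=float) * h**2
    for i in range(n):
        e_l, e_r = eps[i], eps[i + 1]
        A[i, i] = -(e_l + e_r)
        if i > 0:
            A[i, i - 1] = e_l
        else:
            b[i] -= e_l * phi_left   # fold boundary value into the RHS
        if i < n - 1:
            A[i, i + 1] = e_r
        else:
            b[i] -= e_r * phi_right
    return np.linalg.solve(A, b)
```

With constant eps this reduces to the standard Poisson stencil; the paper's contribution is imposing Dirichlet patches at arbitrary subdomains inside the cell (contacts), which turns the problem into a constrained saddle-point system.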

17. Poisson Group Testing: A Probabilistic Model for Boolean Compressed Sensing

2015-08-01

We introduce a novel probabilistic group testing framework, termed Poisson group testing, in which the number of defectives follows a right-truncated Poisson distribution. The Poisson model has a number of new applications, including dynamic testing with diminishing relative rates of defectives. We consider both nonadaptive and semi-adaptive identification methods. For nonadaptive methods, we derive a lower bound on the number of tests required to identify the defectives with a probability of error that asymptotically converges to zero; in addition, we propose test matrix constructions for which the number of tests closely matches the lower bound. For semi-adaptive methods, we describe a lower bound on the expected number of tests required to identify the defectives with zero error probability. In addition, we propose a stage-wise reconstruction algorithm for which the expected number of tests is only a constant factor away from the lower bound. The methods rely only on an estimate of the average number of defectives, rather than on the individual probabilities of subjects being defective.

18. Lattice Metamaterials with Mechanically Tunable Poisson's Ratio for Vibration Control

Chen, Yanyu; Li, Tiantian; Scarpa, Fabrizio; Wang, Lifeng

2017-02-01

Metamaterials with artificially designed architectures are increasingly considered as new paradigmatic material systems with unusual physical properties. Here, we report a class of architected lattice metamaterials with mechanically tunable negative Poisson's ratios and vibration-mitigation capability. The proposed lattice metamaterials are built by replacing regular straight beams with sinusoidally shaped ones, which are highly stretchable under uniaxial tension. Our experimental and numerical results indicate that the proposed lattices exhibit extreme Poisson's-ratio variations between -0.7 and 0.5 over large tensile deformations up to 50%. This large variation of Poisson's-ratio values is attributed to the deformation pattern switching from bending to stretching within the sinusoidally shaped beams. The interplay between the multiscale (ligament and cell) architecture and wave propagation also enables remarkable broadband vibration-mitigation capability of the lattice metamaterials, which can be dynamically tuned by an external mechanical stimulus. The material design strategy provides insights into the development of classes of architected metamaterials with potential applications including energy absorption, tunable acoustics, vibration control, responsive devices, soft robotics, and stretchable electronics.

19. Poisson Downward Continuation Solution by the Jacobi Method

Kingdon, R.; Vaníček, P.

2011-03-01

Downward continuation is a continuing problem in geodesy and geophysics. Inversion of the discrete form of the Poisson integration process provides a numerical solution to the problem, but because the B matrix that defines the discrete Poisson integration is not always well conditioned, the solution may be noisy in situations where the discretization step is small and in areas containing large heights. We provide two remedies, both in the context of the Jacobi iterative solution to the Poisson downward continuation problem. First, we suggest testing according to the upward continued result from each solution, rather than testing between successive solutions on the geoid, so that the choice of a tolerance for the convergence of the iterative method is more meaningful and intuitive. Second, we show how a tolerance that reflects the conditioning of the B matrix can regularize the solution, and suggest an approximate way of choosing such a tolerance. Using these methods, we are able to calculate a solution that appears regular in an area of Papua New Guinea having heights over 3200 m, over a grid with 1 arc-minute spacing, based on a very poorly conditioned B matrix.
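
The Jacobi iteration at the heart of the method is simple to state. A generic sketch follows (plain NumPy; convergence is tested on the residual of B x = b — the analogue of checking the upward-continued result — rather than on successive iterates):

```python
import numpy as np

def jacobi(B, b, tol=1e-10, max_iter=10_000):
    """Jacobi iteration for B x = b: split B = D + R and repeat
    x <- D^{-1} (b - R x); converges when B is diagonally dominant."""
    D = np.diag(B)
    R = B - np.diag(D)
    x = np.zeros_like(b, dtype=float)
    for _ in range(max_iter):
        x = (b - R @ x) / D
        if np.linalg.norm(B @ x - b) < tol:   # residual-based stopping
            break
    return x
```

For a poorly conditioned B, loosening `tol` acts as the crude regularizer the abstract describes: iteration stops before noise components are fully amplified.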

20. Blind beam-hardening correction from Poisson measurements

Gu, Renliang; Dogandžić, Aleksandar

2016-02-01

We develop a sparse image reconstruction method for Poisson-distributed polychromatic X-ray computed tomography (CT) measurements under the blind scenario where the material of the inspected object and the incident energy spectrum are unknown. We employ our mass-attenuation spectrum parameterization of the noiseless measurements and express the mass-attenuation spectrum as a linear combination of B-spline basis functions of order one. A block coordinate-descent algorithm is developed for constrained minimization of a penalized Poisson negative log-likelihood (NLL) cost function, where constraints and penalty terms ensure nonnegativity of the spline coefficients and nonnegativity and sparsity of the density map image; the image sparsity is imposed using a convex total-variation (TV) norm penalty term. This algorithm alternates between a Nesterov proximal-gradient (NPG) step for estimating the density map image and a limited-memory Broyden-Fletcher-Goldfarb-Shanno with box constraints (L-BFGS-B) step for estimating the incident-spectrum parameters. To accelerate convergence of the density-map NPG steps, we apply function restart and a step-size selection scheme that accounts for varying local Lipschitz constants of the Poisson NLL. Real X-ray CT reconstruction examples demonstrate the performance of the proposed scheme.

1. A Poisson-Boltzmann dynamics method with nonperiodic boundary condition

Lu, Qiang; Luo, Ray

2003-12-01

We have developed a well-behaved and efficient finite difference Poisson-Boltzmann dynamics method with a nonperiodic boundary condition. This is made possible, in part, by a rather fine grid spacing used for the finite difference treatment of the reaction field interaction. The stability is also made possible by a new dielectric model that is smooth both over time and over space, an important issue in the application of implicit solvents. In addition, the electrostatic focusing technique facilitates the use of an accurate yet efficient nonperiodic boundary condition: boundary grid potentials computed by the sum of potentials from individual grid charges. Finally, the particle-particle particle-mesh technique is adopted in the computation of the Coulombic interaction to balance accuracy and efficiency in simulations of large biomolecules. Preliminary testing shows that the nonperiodic Poisson-Boltzmann dynamics method is numerically stable in trajectories at least 4 ns long. The new model is also fairly efficient: its cost is comparable to that of the pairwise generalized Born solvent model, making it a strong candidate for dynamics simulations of biomolecules in dilute aqueous solutions. Note that the current treatment of total electrostatic interactions is with no cutoff, which is important for simulations of biomolecules. Rigorous treatment of the Debye-Hückel screening is also possible within the Poisson-Boltzmann framework: its importance is demonstrated by a simulation of a highly charged protein.

2. Assessment of Linear Finite-Difference Poisson-Boltzmann Solvers

PubMed Central

Wang, Jun; Luo, Ray

2009-01-01

CPU time and memory usage are two vital issues that any numerical solver for the Poisson-Boltzmann equation has to face in biomolecular applications. In this study we systematically analyzed the CPU time and memory usage of five commonly used finite-difference solvers with a large and diversified set of biomolecular structures. Our comparative analysis shows that modified incomplete Cholesky conjugate gradient and geometric multigrid are the most efficient in the diversified test set. For the two efficient solvers, our test shows that their CPU times increase approximately linearly with the number of grid points. Their CPU times also increase almost linearly with the negative logarithm of the convergence criterion, at very similar rates. Our comparison further shows that geometric multigrid performs better in the large set of tested biomolecules. However, modified incomplete Cholesky conjugate gradient is superior to geometric multigrid in molecular dynamics simulations of tested molecules. We also investigated other significant components in numerical solutions of the Poisson-Boltzmann equation. It turns out that the time-limiting step is the free boundary condition setup for the linear systems for the selected proteins if electrostatic focusing is not used. Thus, development of future numerical solvers for the Poisson-Boltzmann equation should balance all aspects of the numerical procedures in realistic biomolecular applications. PMID:20063271

3. On Poisson's ratio and composition of the Earth's lower mantle

Poirier, J. P.

1987-07-01

Poisson's ratio of the lower mantle, calculated from recently published values of seismic wave velocities and extrapolated to atmospheric pressure and room temperature, is found to be in the range 0.23 ⩽ ν ⩽ 0.25. These values are compared with the values of Poisson's ratio calculated for binary mixtures of MgSiO3 perovskite and magnesiowüstite with various iron contents. Current experimental errors on measured elastic moduli offer little hope of discriminating between pyrolite and chondritic lower mantles: both are acceptable if the shear modulus of perovskite is in the upper range of the estimates of Liebermann et al. A similar calculation using the seismic parameter φ confirms the results obtained by considering Poisson's ratio and further constrains the shear modulus of perovskite to lie between 1600 and 1700 kilobars for current mantle models to remain plausible. Chemical stratification of the mantle is, therefore, possible but not required by seismological data.
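
For reference, the isotropic-elasticity relation behind such calculations is ν = (Vp² − 2Vs²) / (2(Vp² − Vs²)); a one-line sketch:

```python
def poisson_ratio_from_velocities(vp, vs):
    """Poisson's ratio from P- and S-wave speeds (isotropic elasticity):
    nu = (vp^2 - 2 vs^2) / (2 (vp^2 - vs^2))."""
    r2 = (vp / vs) ** 2
    return (r2 - 2.0) / (2.0 * (r2 - 1.0))
```

A Vp/Vs ratio of exactly √3 gives ν = 0.25, and ν increases monotonically with Vp/Vs.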

4. Poisson-like spiking in circuits with probabilistic synapses.

PubMed

Moreno-Bote, Rubén

2014-07-01

Neuronal activity in cortex is variable both spontaneously and during stimulation, and it has the remarkable property that it is Poisson-like over broad ranges of firing rates covering from virtually zero to hundreds of spikes per second. The mechanisms underlying cortical-like spiking variability over such a broad continuum of rates are currently unknown. We show that neuronal networks endowed with probabilistic synaptic transmission, a well-documented source of variability in cortex, robustly generate Poisson-like variability over several orders of magnitude in their firing rate without fine-tuning of the network parameters. Other sources of variability, such as random synaptic delays or spike generation jittering, do not lead to Poisson-like variability at high rates because they cannot be sufficiently amplified by recurrent neuronal networks. We also show that probabilistic synapses predict Fano factor constancy of synaptic conductances. Our results suggest that synaptic noise is a robust and sufficient mechanism for the type of variability found in cortex.

5. A Poisson-lognormal conditional-autoregressive model for multivariate spatial analysis of pedestrian crash counts across neighborhoods.

PubMed

Wang, Yiyi; Kockelman, Kara M

2013-11-01

This work examines the relationship between 3-year pedestrian crash counts across Census tracts in Austin, Texas, and various land use, network, and demographic attributes, such as land use balance, residents' access to commercial land uses, sidewalk density, lane-mile densities (by roadway class), and population and employment densities (by type). The model specification allows for region-specific heterogeneity, correlation across response types, and spatial autocorrelation via a Poisson-based multivariate conditional auto-regressive (CAR) framework and is estimated using Bayesian Markov chain Monte Carlo methods. Least-squares regression estimates of walk-miles traveled per zone serve as the exposure measure. Here, the Poisson-lognormal multivariate CAR model outperforms an aspatial Poisson-lognormal multivariate model and a spatial model (without cross-severity correlation), both in terms of fit and inference. Positive spatial autocorrelation emerges across neighborhoods, as expected (due to latent heterogeneity or missing variables that trend in space, resulting in spatial clustering of crash counts). In comparison, the positive aspatial, bivariate cross correlation of severe (fatal or incapacitating) and non-severe crash rates reflects latent covariates that have impacts across severity levels but are more local in nature (such as lighting conditions and local sight obstructions), along with spatially lagged cross correlation. Results also suggest greater mixing of residences and commercial land uses is associated with higher pedestrian crash risk across different severity levels, ceteris paribus, presumably since such access produces more potential conflicts between pedestrian and vehicle movements. Interestingly, network densities show variable effects, and sidewalk provision is associated with lower severe-crash rates.

6. Spontaneous skin regression and predictors of skin regression in Thai scleroderma patients.

PubMed

Foocharoen, Chingching; Mahakkanukrauh, Ajanee; Suwannaroj, Siraphop; Nanagara, Ratanavadee

2011-09-01

Skin tightness is a major clinical manifestation of systemic sclerosis (SSc). Importantly for both clinicians and patients, spontaneous regression of the fibrosis process has been documented. The purpose of this study is to identify the incidence and related clinical characteristics of spontaneous regression among Thai SSc patients. A historical cohort with 4 years of follow-up was performed among SSc patients over 15 years of age diagnosed with SSc between January 1, 2005 and December 31, 2006 in Khon Kaen, Thailand. The start date was the date of the first symptom and the end date was the date of the skin score ≤2. To estimate the respective probability of regression and to assess the associated factors, the Kaplan-Meier method and Cox regression analysis were used. One hundred seventeen cases of SSc were included with a female to male ratio of 1.5:1. Thirteen patients (11.1%) experienced regression. The incidence rate of spontaneous skin regression was 0.31 per 100 person-months and the average duration of SSc at the time of regression was 35.9±15.6 months (range, 15.7-60 months). The factors that negatively correlated with regression were (a) diffuse cutaneous type, (b) Raynaud's phenomenon, (c) esophageal dysmotility, and (d) colchicine treatment at onset, with respective hazard ratios (HR) of 0.19, 0.19, 0.26, and 0.20. By contrast, the factor that positively correlated with regression was active alveolitis with cyclophosphamide therapy at onset, with an HR of 4.23 (95% CI, 1.23-14.10). After regression analysis, only Raynaud's phenomenon at onset and diffuse cutaneous type had a significantly negative correlation with regression. Spontaneous regression of the skin fibrosis process was not uncommon among Thai SSc patients. The factors predicting a poor cutaneous outcome were Raynaud's phenomenon and the diffuse cutaneous type, while early cyclophosphamide therapy might be related to a better skin outcome.

7. Ridge regression processing

NASA Technical Reports Server (NTRS)

Kuhl, Mark R.

1990-01-01

Current navigation requirements depend on a geometric dilution of precision (GDOP) criterion. As long as the GDOP stays below a specific value, navigation requirements are met. The GDOP will exceed the specified value when the measurement geometry becomes too collinear. A new signal processing technique, called Ridge Regression Processing, can reduce the effects of nearly collinear measurement geometry; thereby reducing the inflation of the measurement errors. It is shown that the Ridge signal processor gives a consistently better mean squared error (MSE) in position than the Ordinary Least Mean Squares (OLS) estimator. The applicability of this technique is currently being investigated to improve the following areas: receiver autonomous integrity monitoring (RAIM), coverage requirements, availability requirements, and precision approaches.
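
The estimator itself is compact: ridge regression replaces the OLS solution (XᵀX)⁻¹Xᵀy with (XᵀX + λI)⁻¹Xᵀy, which tames nearly collinear measurement geometry at the cost of a small bias. A generic sketch (illustrative, not the cited processor):

```python
import numpy as np

def ridge(X, y, lam):
    """Ridge estimate (X'X + lam*I)^{-1} X'y; lam = 0 recovers OLS."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
```

As λ grows, the coefficient vector shrinks toward zero; the gain in MSE comes from suppressing the variance inflation caused by near-collinearity.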

8. Lumbar herniated disc: spontaneous regression

PubMed Central

Yüksel, Kasım Zafer

2017-01-01

Background Low back pain is a frequent condition that results in substantial disability and causes admission of patients to neurosurgery clinics. To evaluate and present the therapeutic outcomes in lumbar disc hernia (LDH) patients treated by means of a conservative approach, consisting of bed rest and medical therapy. Methods This retrospective cohort was carried out in the neurosurgery departments of hospitals in Kahramanmaraş city and 23 patients diagnosed with LDH at the levels of L3−L4, L4−L5 or L5−S1 were enrolled. Results The average age was 38.4 ± 8.0 years and the chief complaint was low back pain and sciatica radiating to one or both lower extremities. Conservative treatment was administered. Neurological examination findings, durations of treatment and intervals until symptomatic recovery were recorded. Lasègue tests and neurosensory examination revealed that mild neurological deficits existed in 16 of our patients. Previously, 5 patients had received physiotherapy and 7 patients had been on medical treatment. The numbers of patients with LDH at the levels of L3−L4, L4−L5, and L5−S1 were 1, 13, and 9, respectively. All patients reported that they had benefited from medical treatment and bed rest, and radiologic improvement was observed simultaneously on MRI scans. The average duration until symptomatic recovery and/or regression of LDH symptoms was 13.6 ± 5.4 months (range: 5−22). Conclusions It should be kept in mind that lumbar disc hernias can regress with medical treatment and rest without surgery, and there should be an awareness that these patients can recover radiologically. This condition must be taken into account during decision making for surgical intervention in LDH patients devoid of indications for emergent surgery. PMID:28119770

9. Polarizable atomic multipole solutes in a Poisson-Boltzmann continuum

Schnieders, Michael J.; Baker, Nathan A.; Ren, Pengyu; Ponder, Jay W.

2007-03-01

Modeling the change in the electrostatics of organic molecules upon moving from vacuum into solvent, due to polarization, has long been an interesting problem. In vacuum, experimental values for the dipole moments and polarizabilities of small, rigid molecules are known to high accuracy; however, it has generally been difficult to determine these quantities for a polar molecule in water. A theoretical approach introduced by Onsager [J. Am. Chem. Soc. 58, 1486 (1936)] used vacuum properties of small molecules, including polarizability, dipole moment, and size, to predict experimentally known permittivities of neat liquids via the Poisson equation. Since this important advance in understanding the condensed phase, a large number of computational methods have been developed to study solutes embedded in a continuum via numerical solutions to the Poisson-Boltzmann equation. Only recently have the classical force fields used for studying biomolecules begun to include explicit polarization in their functional forms. Here the authors describe the theory underlying a newly developed polarizable multipole Poisson-Boltzmann (PMPB) continuum electrostatics model, which builds on the atomic multipole optimized energetics for biomolecular applications (AMOEBA) force field. As an application of the PMPB methodology, results are presented for several small folded proteins studied by molecular dynamics in explicit water as well as embedded in the PMPB continuum. The dipole moment of each protein increased on average by a factor of 1.27 in explicit AMOEBA water and 1.26 in continuum solvent. The essentially identical electrostatic response in both models suggests that PMPB electrostatics offers an efficient alternative to sampling explicit solvent molecules for a variety of interesting applications, including binding energies, conformational analysis, and pKa prediction. Introduction of 150 mM salt lowered the electrostatic solvation energy between 2 and 13 kcal/mol, depending on

10. Poisson-event-based analysis of cell proliferation.

PubMed

Summers, Huw D; Wills, John W; Brown, M Rowan; Rees, Paul

2015-05-01

A protocol for the assessment of cell proliferation dynamics is presented. This is based on the measurement of cell division events and their subsequent analysis using Poisson probability statistics. Detailed analysis of proliferation dynamics in heterogeneous populations requires single cell resolution within a time series analysis and so is technically demanding to implement. Here, we show that by focusing on the events during which cells undergo division rather than directly on the cells themselves, a simplified image acquisition and analysis protocol can be followed, which maintains single cell resolution and reports on the key metrics of cell proliferation. The technique is demonstrated using a microscope with 1.3 μm spatial resolution to track mitotic events within A549 and BEAS-2B cell lines, over a period of up to 48 h. Automated image processing of the bright field images using standard algorithms within the ImageJ software toolkit yielded 87% accurate recording of the manually identified, temporal, and spatial positions of the mitotic event series. Analysis of the statistics of the interevent times (i.e., times between observed mitoses in a field of view) showed that cell division conformed to a nonhomogeneous Poisson process in which the rate of occurrence of mitotic events, λ, increased exponentially over time, and provided values of the mean inter-mitotic time of 21.1 ± 1.2 h for the A549 cells and 25.0 ± 1.1 h for the BEAS-2B cells. Comparison of the mitotic event series for the BEAS-2B cell line to that predicted by random Poisson statistics indicated that temporal synchronisation of the cell division process was occurring within 70% of the population and that this could be increased to 85% through serum starvation of the cell culture.
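
A nonhomogeneous Poisson process with an exponentially increasing rate, as fitted above, can be simulated by Lewis-Shedler thinning; a sketch (parameter values are illustrative, not the fitted ones):

```python
import numpy as np

def simulate_nhpp(lam0, tau, t_end, rng):
    """Thinning: propose events from a constant majorizing rate lam_max,
    accept each with probability lam(t)/lam_max, where
    lam(t) = lam0 * exp(t / tau)."""
    lam_max = lam0 * np.exp(t_end / tau)   # rate is maximal at t_end
    t, events = 0.0, []
    while True:
        t += rng.exponential(1.0 / lam_max)
        if t > t_end:
            return np.array(events)
        if rng.random() < lam0 * np.exp(t / tau) / lam_max:
            events.append(t)
```

Because the rate grows over the observation window, later windows contain more events — the signature the interevent-time analysis above detects.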

11. A Generalized QMRA Beta-Poisson Dose-Response Model.

PubMed

Xie, Gang; Roiko, Anne; Stratton, Helen; Lemckert, Charles; Dunn, Peter K; Mengersen, Kerrie

2016-10-01

Quantitative microbial risk assessment (QMRA) is widely accepted for characterizing the microbial risks associated with food, water, and wastewater. Single-hit dose-response models are the most commonly used dose-response models in QMRA. Denoting PI(d) as the probability of infection at a given mean dose d, a three-parameter generalized QMRA beta-Poisson dose-response model, PI(d|α,β,r*), is proposed in which the minimum number of organisms required for causing infection, Kmin, is not fixed but is a random variable following a geometric distribution with parameter 0 < r* ≤ 1. The conventional beta-Poisson model, PI(d|α,β), is a special case of the generalized model with Kmin = 1 (which implies r* = 1). The generalized beta-Poisson model is based on a conceptual model with greater detail in the dose-response mechanism. Since a maximum likelihood solution is not easily available, a likelihood-free approximate Bayesian computation (ABC) algorithm is employed for parameter estimation. By fitting the generalized model to four experimental data sets from the literature, this study reveals that the posterior median r* estimates fall short of meeting the condition r* = 1 required by the single-hit assumption. However, for three of the four data sets the generalized model did not achieve an improvement in goodness of fit. These combined results imply that, at least in some cases, a single-hit assumption for characterizing the dose-response process may not be appropriate, but that the more complex models may be difficult to support, especially if the sample size is small. The three-parameter generalized model provides a possibility to investigate the mechanism of a dose-response process in greater detail than is possible under a single-hit model.
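
In its familiar two-parameter approximate form, the single-hit beta-Poisson model is PI(d) ≈ 1 − (1 + d/β)^(−α); a sketch for orientation (the paper's three-parameter generalized model is not reproduced here):

```python
def beta_poisson(d, alpha, beta):
    """Approximate single-hit beta-Poisson dose-response:
    P(infection | mean dose d) = 1 - (1 + d/beta)^(-alpha)."""
    return 1.0 - (1.0 + d / beta) ** (-alpha)
```

The curve starts at zero for zero dose, increases monotonically with d, and saturates toward 1 at very high doses.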

12. Maslov indices, Poisson brackets, and singular differential forms

Esterlis, I.; Haggard, H. M.; Hedeman, A.; Littlejohn, R. G.

2014-06-01

Maslov indices are integers that appear in semiclassical wave functions and quantization conditions. They are often notoriously difficult to compute. We present methods of computing the Maslov index that rely only on typically elementary Poisson brackets and simple linear algebra. We also present a singular differential form, whose integral along a curve gives the Maslov index of that curve. The form is closed but not exact, and transforms by an exact differential under canonical transformations. We illustrate the method with the 6j-symbol, which is important in angular-momentum theory and in quantum gravity.

13. Poisson-Boltzmann theory for two parallel uniformly charged plates

SciTech Connect

Xing Xiangjun

2011-04-15

We solve the nonlinear Poisson-Boltzmann equation for two parallel and like-charged plates both inside a symmetric electrolyte, and inside a 2:1 asymmetric electrolyte, in terms of Weierstrass elliptic functions. From these solutions we derive the functional relation between the surface charge density, the plate separation, and the pressure between plates. For the one plate problem, we obtain exact expressions for the electrostatic potential and for the renormalized surface charge density, both in symmetric and in asymmetric electrolytes. For the two plate problems, we obtain new exact asymptotic results in various regimes.

14. A Poisson process approximation for generalized K-S confidence regions

NASA Technical Reports Server (NTRS)

Arsham, H.; Miller, D. R.

1982-01-01

One-sided confidence regions for continuous cumulative distribution functions are constructed using empirical cumulative distribution functions and the generalized Kolmogorov-Smirnov distance. The band width of such regions becomes narrower in the right or left tail of the distribution. To avoid tedious computation of confidence levels and critical values, an approximation based on the Poisson process is introduced. This approximation provides a conservative confidence region; moreover, the approximation error decreases monotonically to 0 as sample size increases. Critical values necessary for implementation are given. Applications are made to the areas of risk analysis, investment modeling, reliability assessment, and analysis of fault tolerant systems.
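
For orientation, a one-sided lower confidence band for a CDF can be built from the empirical CDF. The sketch below uses the one-sided Dvoretzky-Kiefer-Wolfowitz bound with a constant band width — a simpler stand-in for the generalized (tail-narrowing) Kolmogorov-Smirnov construction of the paper:

```python
import numpy as np

def one_sided_lower_band(sample, alpha=0.05):
    """Return sorted sample points and a lower confidence band for the
    true CDF: F(x) >= F_n(x) - sqrt(ln(1/alpha) / (2n)) with probability
    at least 1 - alpha (one-sided DKW inequality)."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    ecdf = np.arange(1, n + 1) / n
    eps = np.sqrt(np.log(1.0 / alpha) / (2.0 * n))
    return x, np.clip(ecdf - eps, 0.0, 1.0)
```

The generalized K-S distance lets the band narrow in a chosen tail instead of keeping this constant offset.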

15. An empirical Bayes approach for the Poisson life distribution.

NASA Technical Reports Server (NTRS)

Canavos, G. C.

1973-01-01

A smooth empirical Bayes estimator is derived for the intensity parameter (hazard rate) in the Poisson distribution as used in life testing. The reliability function is also estimated either by using the empirical Bayes estimate of the parameter, or by obtaining the expectation of the reliability function. The behavior of the empirical Bayes procedure is studied through Monte Carlo simulation in which estimates of mean-squared errors of the empirical Bayes estimators are compared with those of conventional estimators such as minimum variance unbiased or maximum likelihood. Results indicate a significant reduction in mean-squared error of the empirical Bayes estimators over the conventional variety.
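
One standard parametric route to such an estimator is a gamma prior fitted by moments to the marginal counts; the sketch below shows the general idea, not necessarily the smooth estimator derived in the paper:

```python
import numpy as np

def eb_poisson_rates(counts):
    """Parametric empirical Bayes for Poisson intensities. Fit a Gamma(a, b)
    prior by moments from the marginal counts (marginal variance m + m^2/a),
    then return the posterior means (x + a) / (1 + b): raw counts shrunk
    toward the pooled mean."""
    x = np.asarray(counts, dtype=float)
    m, v = x.mean(), x.var(ddof=1)
    excess = max(v - m, 1e-12)       # guard against underdispersed samples
    a = m * m / excess               # method-of-moments shape
    b = a / m                        # rate; prior mean a/b equals pooled mean
    return (x + a) / (1.0 + b)
```

The shrinkage pulls extreme counts toward the pooled mean while leaving the overall mean unchanged, which is the source of the mean-squared-error gain over the raw (maximum likelihood) counts.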

16. Poisson approach to clustering analysis of regulatory sequences.

PubMed

Wang, Haiying; Zheng, Huiru; Hu, Jinglu

2008-01-01

The presence of similar patterns in regulatory sequences may aid users in identifying co-regulated genes or inferring regulatory modules. By modelling pattern occurrences in regulatory regions with Poisson statistics, this paper presents a log likelihood ratio statistics-based distance measure to calculate pair-wise similarities between regulatory sequences. We employed it within three clustering algorithms: hierarchical clustering, Self-Organising Map, and a self-adaptive neural network. The results indicate that, in comparison to traditional clustering algorithms, the incorporation of the log likelihood ratio statistics-based distance into the learning process may offer considerable improvements in the process of regulatory sequence-based classification of genes.
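
One concrete form such a measure can take, sketched under the assumption of equal sequence lengths, is twice the log ratio of a separate-rates Poisson fit to a shared-rate fit, summed over patterns; the paper's exact statistic may differ:

```python
import numpy as np

def llr_distance(x, y):
    """Poisson log-likelihood-ratio distance between two pattern-count
    vectors: 2 * [logL(separate rates) - logL(shared rate)], with the
    shared-rate MLE (x_i + y_i)/2 per pattern. Zero when x == y."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    def xlogx(v):
        # v * log(v) with the convention 0 * log(0) = 0
        return np.where(v > 0, v * np.log(np.where(v > 0, v, 1.0)), 0.0)
    t = x + y
    return 2.0 * np.sum(xlogx(x) + xlogx(y) - xlogx(t) + t * np.log(2.0))
```

The resulting dissimilarities can be fed to any of the three clustering algorithms mentioned in the abstract in place of a Euclidean distance.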

17. Fission meter and neutron detection using Poisson distribution comparison

DOEpatents

Rowland, Mark S; Snyderman, Neal J

2014-11-18

A neutron detector system and method for discriminating fissile material from non-fissile material, wherein a digital data acquisition unit collects data at a high rate and, in real time, processes large volumes of data directly into information that a first responder can use to discriminate materials. The system comprises counting neutrons from the unknown source and detecting excess grouped neutrons to identify fission in the unknown source. Comparison of the observed neutron count distribution with a Poisson distribution is performed to distinguish fissile material from non-fissile material.
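
The simplest summary of "excess grouped neutrons" is the variance-to-mean (Fano) ratio of counts in fixed time windows: it equals 1 for a Poisson source and exceeds 1 when neutrons arrive in correlated bursts, as in fission chains. A sketch of this idea (illustrative, not the patented processing chain):

```python
import numpy as np

def fano_factor(window_counts):
    """Variance-to-mean ratio of per-window neutron counts; ~1 for a
    Poisson (non-fissile) source, > 1 for bursty fission-chain counts."""
    c = np.asarray(window_counts, dtype=float)
    return c.var(ddof=1) / c.mean()
```

A full comparison of the observed count distribution against the Poisson reference carries more information than this single ratio, but the Fano factor already separates the two regimes.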

18. Theory of multicolor lattice gas - A cellular automaton Poisson solver

NASA Technical Reports Server (NTRS)

Chen, H.; Matthaeus, W. H.; Klein, L. W.

1990-01-01

This class of cellular automaton models involves a quiescent hydrodynamic lattice gas with multiple-valued passive labels termed 'colors'; lattice collisions change individual particle colors while preserving net color. The rigorous proofs of the multicolor lattice gases' essential features are rendered more tractable by an equivalent subparticle representation in which the color is represented by underlying two-state 'spins'. Schemes for the introduction of Dirichlet and Neumann boundary conditions are described, and two illustrative numerical test cases are used to verify the theory. The lattice gas model is equivalent to a Poisson equation solver.

19. Lie-Poisson bifurcations for the Maxwell-Bloch equations

SciTech Connect

David, D.

1990-01-01

We present a study of the set of Maxwell-Bloch equations on R{sup 3} from the point of view of Hamiltonian dynamics. These equations are shown to be bi-Hamiltonian on the one hand, and to possess several inequivalent Lie-Poisson structures on the other, parametrized by the group SL(2,R). Each structure is characterized by a particular distinguished function. The level sets of this function provide two-dimensional surfaces on which the motion takes place, equipped with various symplectic forms. 4 refs.

20. Poisson's Ratio and the Densification of Glass under High Pressure

SciTech Connect

Rouxel, T.; Ji, H.; Hammouda, T.; Moreac, A.

2008-06-06

Because of a relatively low atomic packing density (C{sub g}), glasses experience significant densification under high hydrostatic pressure. Poisson's ratio ({nu}) is correlated to C{sub g} and typically varies from 0.15 for glasses with low C{sub g}, such as amorphous silica, to 0.38 for close-packed atomic networks such as those in bulk metallic glasses. Pressure experiments were conducted up to 25 GPa at 293 K on silica, soda-lime-silica, chalcogenide, and bulk metallic glasses. We show from these high-pressure data that there is a direct correlation between {nu} and the maximum post-decompression density change.

1. Some Poisson structures and Lax equations associated with the Toeplitz lattice and the Schur lattice

Lemarie, Caroline

2016-01-01

The Toeplitz lattice is a Hamiltonian system whose Poisson structure is known. In this paper, we unveil the origins of this Poisson structure and derive from it the associated Lax equations for this lattice. We first construct a Poisson subvariety H_n of GL_n(C), which we view as a real or complex Poisson-Lie group whose Poisson structure comes from a quadratic R-bracket on gl_n(C) for a fixed R-matrix. The existence of Hamiltonians associated to the Toeplitz lattice for the Poisson structure on H_n, combined with the properties of the quadratic R-bracket, allows us to give explicit formulas for the Lax equation. We then derive from it the integrability in the sense of Liouville of the Toeplitz lattice. When we view the lattice as being defined over R, we can construct a Poisson subvariety H_n^tau of U_n which is itself a Poisson-Dirac subvariety of GL_n^R(C). We then construct a Hamiltonian for the Poisson structure induced on H_n^tau, corresponding to another system derived from the Toeplitz lattice: the modified Schur lattice. Thanks to the properties of Poisson-Dirac subvarieties, we give an explicit Lax equation for the new system and derive from it a Lax equation for the Schur lattice. We also deduce the integrability in the sense of Liouville of the modified Schur lattice.

2. POISSON project. III. Investigating the evolution of the mass accretion rate

Antoniucci, S.; García López, R.; Nisini, B.; Caratti o Garatti, A.; Giannini, T.; Lorenzetti, D.

2014-12-01

Context. As part of the Protostellar Optical-Infrared Spectral Survey On NTT (POISSON) project, we present the results of the analysis of low-resolution near-IR spectroscopic data (0.9-2.4 μm) of two samples of young stellar objects in the Lupus (52 objects) and Serpens (17 objects) star-forming clouds, with masses in the range of 0.1 to 2.0 M⊙ and ages spanning from 10^5 to a few 10^7 yr. Aims: After determining the accretion parameters of the targets by analysing their H i near-IR emission features, we added the results from the Lupus and Serpens clouds to those from previous regions (investigated in POISSON with the same methodology) to obtain a final catalogue (143 objects) of mass accretion rate values (Ṁ_acc) derived in a homogeneous and consistent fashion. Our final goal is to analyse how Ṁ_acc correlates with the stellar mass (M_*) and how it evolves in time in the whole POISSON sample. Methods: We derived the accretion luminosity (L_acc) and Ṁ_acc for Lupus and Serpens objects from the Brγ (Paβ in a few cases) line by using relevant empirical relationships available in the literature that connect the H i line luminosity and L_acc. To minimise the biases that arise from adopting literature data that are based on different evolutionary models, and also for self-consistency, we re-derived mass and age for each source of the POISSON samples using the same set of evolutionary tracks. Results: We observe a correlation Ṁ_acc ∝ M_*^2.2 between mass accretion rate and stellar mass, similar to what has previously been observed in several star-forming regions. We find that the time variation of Ṁ_acc is roughly consistent with the expected evolution of the accretion rate in viscous disks, with an asymptotic decay that behaves as t^-1.6. However, Ṁ_acc values are characterised by a large scatter at similar ages and are on average higher than the predictions of viscous models. Conclusions: Although part of the scattering may be related to systematics due to the

3. Evaluating differential effects using regression interactions and regression mixture models

PubMed Central

Van Horn, M. Lee; Jaki, Thomas; Masyn, Katherine; Howe, George; Feaster, Daniel J.; Lamont, Andrea E.; George, Melissa R. W.; Kim, Minjung

2015-01-01

Research increasingly emphasizes understanding differential effects. This paper focuses on understanding regression mixture models, a relatively new statistical method for assessing differential effects, by comparing results to those obtained using an interaction term in linear regression. The research questions which each model answers, their formulation, and their assumptions are compared using Monte Carlo simulations and real data analysis. The capabilities of regression mixture models are described and specific issues to be addressed when conducting regression mixtures are proposed. The paper aims to clarify the role that regression mixtures can take in the estimation of differential effects and to increase awareness of the benefits and potential pitfalls of this approach. Regression mixture models are shown to be a potentially effective exploratory method for finding differential effects when these effects can be defined by a small number of classes of respondents who share a typical relationship between a predictor and an outcome. It is also shown that the comparison between regression mixture models and interactions becomes substantially more complex as the number of classes increases. It is argued that regression interactions are well suited for direct tests of specific hypotheses about differential effects, while regression mixtures provide a useful approach for exploring effect heterogeneity given adequate samples and study design. PMID:26556903

4. Identification d’une Classe de Processus de Poisson Filtres (Identification of a Class of Filtered Poisson Processes).

DTIC Science & Technology

1983-05-20

[The scanned abstract is too garbled for faithful recovery. The legible fragments describe a filtered Poisson process model in which each event occurring at time T produces, on a sensor at time t, an effect proportional to a vector-valued function G(t - T), with the Poisson process parameter governing event arrivals.]

5. Measurement of Young's Modulus and Poisson's Ratio of Tuna Fish

Ogawa, Yutaka; Hagura, Yoshio

Considering that the gape and heave produced during the freezing of tuna derive from changes in the mechanical properties of the tuna itself during freezing, the Poisson's ratio and Young's modulus of tuna meat were measured under three temperature conditions: i) no freezing, ii) partial freezing, and iii) freezing. The measurements showed that the mechanical properties of tuna meat are temperature dependent, changing suddenly at the temperature at which freezing begins, as summarized below: 1) the mechanical properties of tuna meat were anisotropic, reflecting the tissue and structure of the fish body, but varied greatly with test temperature; 2) the Young's modulus of non-frozen tuna meat was approximately 50 kPa but reached an extremely large value (approximately 4 GPa) after freezing; and 3) the Poisson's ratio decreased as the frozen water percentage increased, but displayed approximate values of one or more.

6. PSH3D fast Poisson solver for petascale DNS

Adams, Darren; Dodd, Michael; Ferrante, Antonino

2016-11-01

Direct numerical simulation (DNS) of high Reynolds number, Re >= O(10^5), turbulent flows requires computational meshes of >= O(10^12) grid points and, thus, the use of petascale supercomputers. DNS often requires the solution of a Helmholtz (or Poisson) equation for pressure, which constitutes the bottleneck of the solver. We have developed a parallel solver of the Helmholtz equation in 3D, PSH3D. The numerical method underlying PSH3D combines a parallel 2D Fast Fourier transform in two spatial directions and a parallel linear solver in the third direction. For computational meshes up to 8192^3 grid points, our numerical results show that PSH3D scales up to at least 262k cores of Cray XT5 (Blue Waters). PSH3D has a peak performance 6x faster than 3D FFT-based methods when used with the 'partial-global' optimization, and for an 8192^3 mesh it solves the Poisson equation in 1 s using 128k cores. Also, we have verified that the use of PSH3D with the 'partial-global' optimization in our DNS solver does not reduce the accuracy of the numerical solution of the incompressible Navier-Stokes equations.
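
PSH3D itself is not reproduced here; the sketch below only illustrates the FFT building block of such solvers, a 2D periodic spectral Poisson solve (the production code combines 2D FFTs with a parallel 1D linear solver).

```python
import numpy as np

def solve_poisson_periodic(f):
    """Solve lap(u) = f on a periodic [0, 2*pi)^2 grid by FFT.
    f must have zero mean; u is returned with zero mean."""
    n = f.shape[0]
    k = np.fft.fftfreq(n, d=1.0 / n)          # integer wavenumbers
    k2 = k[:, None] ** 2 + k[None, :] ** 2
    k2[0, 0] = 1.0                            # avoid 0/0; zero mode fixed below
    u_hat = np.fft.fft2(f) / (-k2)
    u_hat[0, 0] = 0.0                         # pin down the free constant
    return np.real(np.fft.ifft2(u_hat))

n = 64
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
X, Y = np.meshgrid(x, x, indexing="ij")
u_exact = np.sin(X) * np.sin(Y)
f = -2.0 * u_exact                            # lap(sin x sin y) = -2 sin x sin y
u = solve_poisson_periodic(f)
print(np.abs(u - u_exact).max())              # spectrally exact: near machine precision
```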

7. Saint-Venant end effects for materials with negative Poisson's ratios

NASA Technical Reports Server (NTRS)

Lakes, R. S.

1992-01-01

Results are presented from an analysis of Saint-Venant end effects for materials with negative Poisson's ratio. Examples are presented showing that slow decay of end stress occurs in circular cylinders of negative Poisson's ratio, whereas a sandwich panel containing rigid face sheets and a compliant core exhibits no anomalous effects for negative Poisson's ratio (but exhibits slow stress decay for core Poisson's ratios approaching 0.5). In sandwich panels with stiff but not perfectly rigid face sheets, a negative Poisson's ratio results in end stress decay that is faster than it would be otherwise. It is suggested that the slow decay previously predicted for sandwich strips in plane deformation as a result of the geometry can be mitigated by the use of a negative Poisson's ratio material for the core.

8. Evaluating Differential Effects Using Regression Interactions and Regression Mixture Models

ERIC Educational Resources Information Center

Van Horn, M. Lee; Jaki, Thomas; Masyn, Katherine; Howe, George; Feaster, Daniel J.; Lamont, Andrea E.; George, Melissa R. W.; Kim, Minjung

2015-01-01

Research increasingly emphasizes understanding differential effects. This article focuses on understanding regression mixture models, which are relatively new statistical methods for assessing differential effects by comparing results to using an interactive term in linear regression. The research questions which each model answers, their…

9. Error bounds in cascading regressions

USGS Publications Warehouse

Karlinger, M.R.; Troutman, B.M.

1985-01-01

Cascading regressions is a technique for predicting a value of a dependent variable when no paired measurements exist to perform a standard regression analysis. Biases in coefficients of a cascaded-regression line as well as error variance of points about the line are functions of the correlation coefficient between dependent and independent variables. Although this correlation cannot be computed because of the lack of paired data, bounds can be placed on errors through the required properties of the correlation coefficient. The potential mean-squared error of a cascaded-regression prediction can be large, as illustrated through an example using geomorphologic data. © 1985 Plenum Publishing Corporation.

10. Non-linear properties of metallic cellular materials with a negative Poisson's ratio

NASA Technical Reports Server (NTRS)

Choi, J. B.; Lakes, R. S.

1992-01-01

Negative Poisson's ratio copper foam was prepared and characterized experimentally. The transformation into re-entrant foam was accomplished by applying sequential permanent compressions above the yield point to achieve a triaxial compression. The Poisson's ratio of the re-entrant foam depended on strain and attained a relative minimum at strains near zero. Poisson's ratio as small as -0.8 was achieved. The strain dependence of properties occurred over a narrower range of strain than in the polymer foams studied earlier. Annealing of the foam resulted in a slightly greater magnitude of negative Poisson's ratio and greater toughness at the expense of a decrease in the Young's modulus.

11. Application of the Hyper-Poisson Generalized Linear Model for Analyzing Motor Vehicle Crashes.

PubMed

Khazraee, S Hadi; Sáez-Castillo, Antonio Jose; Geedipally, Srinivas Reddy; Lord, Dominique

2015-05-01

The hyper-Poisson distribution can handle both over- and underdispersion, and its generalized linear model formulation allows the dispersion of the distribution to be observation-specific and dependent on model covariates. This study's objective is to examine the potential applicability of a newly proposed generalized linear model framework for the hyper-Poisson distribution in analyzing motor vehicle crash count data. The hyper-Poisson generalized linear model was first fitted to intersection crash data from Toronto, characterized by overdispersion, and then to crash data from railway-highway crossings in Korea, characterized by underdispersion. The results of this study are promising. When fitted to the Toronto data set, the goodness-of-fit measures indicated that the hyper-Poisson model with a variable dispersion parameter provided a statistical fit as good as the traditional negative binomial model. The hyper-Poisson model was also successful in handling the underdispersed data from Korea; the model performed as well as the gamma probability model and the Conway-Maxwell-Poisson model previously developed for the same data set. The advantages of the hyper-Poisson model studied in this article are noteworthy. Unlike the negative binomial model, which has difficulties in handling underdispersed data, the hyper-Poisson model can handle both over- and underdispersed crash data. Although not a major issue for the Conway-Maxwell-Poisson model, the effect of each variable on the expected mean of crashes is easily interpretable in the case of this new model.
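
For reference, the hyper-Poisson probability mass function in the classical Bardwell-Crow parameterization (assumed here; the paper's GLM additionally links the parameters to covariates) can be evaluated directly:

```python
import numpy as np
from scipy.special import poch, hyp1f1
from scipy.stats import poisson

def hyper_poisson_pmf(k, lam, gamma):
    """P(X = k) = lam**k / (poch(gamma, k) * 1F1(1; gamma; lam)).
    gamma > 1 gives overdispersion, gamma < 1 underdispersion,
    and gamma = 1 recovers the ordinary Poisson distribution."""
    k = np.asarray(k)
    return lam ** k / (poch(gamma, k) * hyp1f1(1.0, gamma, lam))

k = np.arange(60)
print(np.allclose(hyper_poisson_pmf(k, 3.0, 1.0), poisson.pmf(k, 3.0)))  # True
print(hyper_poisson_pmf(k, 3.0, 2.0).sum())  # ~1.0: a proper distribution
```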

12. Logistic Regression: Concept and Application

ERIC Educational Resources Information Center

Cokluk, Omay

2010-01-01

The main focus of logistic regression analysis is classification of individuals in different groups. The aim of the present study is to explain basic concepts and processes of binary logistic regression analysis intended to determine the combination of independent variables which best explain the membership in certain groups called dichotomous…

13. Precision Efficacy Analysis for Regression.

ERIC Educational Resources Information Center

Brooks, Gordon P.

When multiple linear regression is used to develop a prediction model, sample size must be large enough to ensure stable coefficients. If the derivation sample size is inadequate, the model may not predict well for future subjects. The precision efficacy analysis for regression (PEAR) method uses a cross- validity approach to select sample sizes…

14. Numerical calibration of the stable poisson loaded specimen

NASA Technical Reports Server (NTRS)

Ghosn, Louis J.; Calomino, Anthony M.; Brewer, Dave N.

1992-01-01

An analytical calibration of the Stable Poisson Loaded (SPL) specimen is presented. The specimen configuration is similar to the ASTM E-561 compact-tension specimen with displacement controlled wedge loading used for R-Curve determination. The crack mouth opening displacements (CMOD's) are produced by the diametral expansion of an axially compressed cylindrical pin located in the wake of a machined notch. Due to the unusual loading configuration, a three-dimensional finite element analysis was performed with gap elements simulating the contact between the pin and specimen. In this report, stress intensity factors, CMOD's, and crack displacement profiles are reported for different crack lengths and different contacting conditions. It was concluded that the computed stress intensity factor decreases sharply with increasing crack length, thus making the SPL specimen configuration attractive for fracture testing of brittle, high modulus materials.

15. Testing homogeneity of two zero-inflated Poisson populations.

PubMed

Tse, Siu Keung; Chow, Shein Chung; Lu, Qingshu; Cosmatos, Dennis

2009-02-01

The problem of testing treatment difference in the occurrence of a safety parameter in a randomized parallel-group comparative clinical trial, under the assumption that the number of occurrences follows a zero-inflated Poisson (ZIP) distribution, is considered. Likelihood ratio tests (LRT) for homogeneity of two ZIP populations are derived under the hypotheses that (i) there is no difference in inflation parameters, (ii) there is no difference in non-zero means, and (iii) there is no difference in both the inflation parameters and the non-zero means. Approximate formulas for sample size calculation are also obtained for achieving a desired power for detecting a clinically meaningful difference under the corresponding alternative hypotheses. An example concerning the assessment of gastrointestinal (GI) safety, in terms of the number of erosion counts of a newly developed compound for the treatment of osteoarthritis and rheumatoid arthritis, is given for illustration purposes.
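
Hypothesis (iii), no difference in both parameters, can be sketched as a numerical likelihood-ratio test; this is a generic maximum-likelihood implementation, not the paper's derivation or its sample-size formulas.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import chi2

def zip_negloglik(params, x):
    """Negative ZIP log-likelihood; params = (logit pi, log lam)."""
    pi = 1.0 / (1.0 + np.exp(-params[0]))
    lam = np.exp(params[1])
    p0 = pi + (1.0 - pi) * np.exp(-lam)        # P(X = 0)
    ll0 = np.log(p0) * np.sum(x == 0)
    xp = x[x > 0]
    llp = np.sum(np.log(1.0 - pi) - lam + xp * np.log(lam))  # log(x!) terms dropped
    return -(ll0 + llp)

def fit(x):
    res = minimize(zip_negloglik, x0=[0.0, 0.0], args=(x,), method="Nelder-Mead")
    return -res.fun   # maximized log-likelihood (dropped constants cancel in the LRT)

rng = np.random.default_rng(2)
def zip_sample(pi, lam, n):
    return np.where(rng.random(n) < pi, 0, rng.poisson(lam, n))

a = zip_sample(0.5, 2.0, 200)
b = zip_sample(0.1, 6.0, 200)
lrt = 2.0 * (fit(a) + fit(b) - fit(np.concatenate([a, b])))  # hypothesis (iii), 2 df
print(lrt, chi2.sf(lrt, df=2))   # large statistic and tiny p-value here
```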

16. Shape representation and classification using the poisson equation.

PubMed

Gorelick, Lena; Galun, Meirav; Sharon, Eitan; Basri, Ronen; Brandt, Achi

2006-12-01

We present a novel approach that allows us to reliably compute many useful properties of a silhouette. Our approach assigns, for every internal point of the silhouette, a value reflecting the mean time required for a random walk beginning at the point to hit the boundaries. This function can be computed by solving Poisson's equation, with the silhouette contours providing boundary conditions. We show how this function can be used to reliably extract various shape properties including part structure and rough skeleton, local orientation and aspect ratio of different parts, and convex and concave sections of the boundaries. In addition to this, we discuss properties of the solution and show how to efficiently compute this solution using multigrid algorithms. We demonstrate the utility of the extracted properties by using them for shape classification and retrieval.
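
A minimal version of the construction: solve the Poisson equation with a constant source inside a binary silhouette and zero values outside, here by plain Jacobi sweeps rather than the multigrid algorithms the paper recommends.

```python
import numpy as np

def silhouette_poisson(mask, sweeps=2000):
    """Solve lap(U) = -1 inside a binary silhouette (5-point stencil, h = 1),
    with U = 0 on the background, by Jacobi iteration."""
    u = np.zeros(mask.shape)
    for _ in range(sweeps):
        nb = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
              np.roll(u, 1, 1) + np.roll(u, -1, 1))
        u = np.where(mask, (nb + 1.0) / 4.0, 0.0)
    return u

# a 41x41 disk silhouette (illustrative shape)
n = 41
yy, xx = np.mgrid[0:n, 0:n]
mask = (xx - n // 2) ** 2 + (yy - n // 2) ** 2 < 15 ** 2
u = silhouette_poisson(mask)
print(u.max(), u[n // 2, n // 2])   # maximum at the centre, ~a^2/4 for disk radius a
```

Level sets and derivatives of U then yield the part structure, skeleton, and orientation properties discussed above.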

17. Flux theory for Poisson distributed pores with Gaussian permeability.

PubMed

Salinas, Dino G

2016-01-01

The mean of the solute flux through membrane pores depends on the random distribution and permeability of the pores. Mathematical models including such randomness factors make it possible to obtain statistical parameters for pore characterization. Here, assuming that pores follow a Poisson distribution in the lipid phase and that their permeabilities follow a Gaussian distribution, a mathematical model for solute dynamics is obtained by applying a general result from a previous work regarding any number of different kinds of randomly distributed pores. The proposed theory is studied using experimental parameters obtained elsewhere, and a method for estimating the mean single-pore flux rate from liposome flux assays is suggested. This method is useful for characterizing pores without requiring patch-clamp studies in single cells or single-channel recordings. However, it does not apply to ion-selective channels, for which a more complex flux law combining the concentration and electrical gradients is required.
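
The model's two randomness sources are straightforward to simulate. A hedged Monte Carlo sketch (parameter values are arbitrary; the sum of k Gaussian permeabilities is drawn in one shot as Normal(k*mean, k*sd^2), and negative permeabilities are negligible for these values):

```python
import numpy as np

rng = np.random.default_rng(3)
mu_pores = 3.0                   # mean pores per liposome (Poisson distributed)
perm_mean, perm_sd = 1.0, 0.2    # Gaussian single-pore permeability (arbitrary units)

n_pores = rng.poisson(mu_pores, size=200_000)
# total permeability of k i.i.d. Gaussian pores ~ Normal(k*mean, k*sd**2)
total = np.where(n_pores > 0,
                 rng.normal(perm_mean * n_pores,
                            perm_sd * np.sqrt(np.maximum(n_pores, 1))),
                 0.0)
print(total.mean())   # ~ mu_pores * perm_mean = 3.0
```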

18. Application of the sine-Poisson equation in solar magnetostatics

NASA Technical Reports Server (NTRS)

Webb, G. M.; Zank, G. P.

1990-01-01

Solutions of the sine-Poisson equations are used to construct a class of isothermal magnetostatic atmospheres, with one ignorable coordinate corresponding to a uniform gravitational field in a plane geometry. The distributed current in the model (j) is directed along the x-axis, where x is the horizontal ignorable coordinate; (j) varies as the sine of the magnetostatic potential and falls off exponentially with distance vertical to the base with an e-folding distance equal to the gravitational scale height. Solutions for the magnetostatic potential A corresponding to the one-soliton, two-soliton, and breather solutions of the sine-Gordon equation are studied. Depending on the values of the free parameters in the soliton solutions, horizontally periodic magnetostatic structures are obtained possessing either a single X-type neutral point, multiple neutral X-points, or no X-points.

19. Lindley frailty model for a class of compound Poisson processes

2013-10-01

The Lindley distribution has gained importance in survival analysis for its similarity to the exponential distribution and its allowance for different shapes of the hazard function. Frailty models provide an alternative to the proportional hazards model in which misspecified or omitted covariates are described by an unobservable random variable. Although the frailty distribution is generally assumed to be continuous, it is appropriate in some circumstances to consider discrete frailty distributions. In this paper, frailty models with a discrete compound Poisson process for Lindley-distributed failure times are introduced. Survival functions are derived and maximum likelihood estimation procedures for the parameters are studied. The fit of the models to an earthquake data set from Turkey is then examined.
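
The Lindley distribution is a two-component mixture, Exp(theta) with weight theta/(theta+1) and Gamma(2, theta) otherwise, which gives a direct sampler for simulating such failure times (a generic sketch, not the paper's estimation procedure):

```python
import numpy as np

def lindley_sample(theta, size, rng):
    """Draw Lindley(theta) variates via the mixture representation:
    Exp(theta) with weight theta/(theta+1), Gamma(2, scale=1/theta) otherwise."""
    pick_exp = rng.random(size) < theta / (theta + 1.0)
    exp_part = rng.exponential(1.0 / theta, size)
    gam_part = rng.gamma(2.0, 1.0 / theta, size)
    return np.where(pick_exp, exp_part, gam_part)

rng = np.random.default_rng(4)
x = lindley_sample(1.0, 200_000, rng)
print(x.mean())   # theoretical mean (theta + 2) / (theta * (theta + 1)) = 1.5
```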

20. Analytical stress intensity solution for the stable Poisson loaded specimen

Ghosn, Louis J.; Calomino, Anthony M.; Brewer, David N.

1993-04-01

An analytical calibration of the Stable Poisson Loaded (SPL) specimen is presented. The specimen configuration is similar to the ASTM E-561 compact-tension specimen with displacement controlled wedge loading used for R-curve determination. The crack mouth opening displacements (CMOD's) are produced by the diametral expansion of an axially compressed cylindrical pin located in the wake of a machined notch. Due to the unusual loading configuration, a three-dimensional finite element analysis was performed with gap elements simulating the contact between the pin and specimen. In this report, stress intensity factors, CMOD's, and crack displacement profiles, are reported for different crack lengths and different contacting conditions. It was concluded that the computed stress intensity factor decreases sharply with increasing crack length thus making the SPL specimen configuration attractive for fracture testing of brittle, high modulus materials.

1. Numerical calibration of the stable poisson loaded specimen

Ghosn, Louis J.; Calomino, Anthony M.; Brewer, Dave N.

1992-10-01

An analytical calibration of the Stable Poisson Loaded (SPL) specimen is presented. The specimen configuration is similar to the ASTM E-561 compact-tension specimen with displacement controlled wedge loading used for R-Curve determination. The crack mouth opening displacements (CMOD's) are produced by the diametral expansion of an axially compressed cylindrical pin located in the wake of a machined notch. Due to the unusual loading configuration, a three-dimensional finite element analysis was performed with gap elements simulating the contact between the pin and specimen. In this report, stress intensity factors, CMOD's, and crack displacement profiles are reported for different crack lengths and different contacting conditions. It was concluded that the computed stress intensity factor decreases sharply with increasing crack length, thus making the SPL specimen configuration attractive for fracture testing of brittle, high modulus materials.

2. Analytical stress intensity solution for the Stable Poisson Loaded specimen

Ghosn, Louis J.; Calomino, Anthony M.; Brewer, David N.

1993-04-01

An analytical calibration of the Stable Poisson Loaded (SPL) specimen is presented. The specimen configuration is similar to the ASTM E-561 compact-tension specimen with displacement controlled wedge loading used for R-curve determination. The crack mouth opening displacements (CMODs) are produced by the diametral expansion of an axially compressed cylindrical pin located in the wake of a machined notch. Due to the unusual loading configuration, a three-dimensional finite element analysis was performed with gap elements simulating the contact between the pin and specimen. In this report, stress intensity factors, CMODs, and crack displacement profiles, are reported for different crack lengths and different contacting conditions. It was concluded that the computed stress intensity factor decreases sharply with increasing crack length thus making the SPL specimen configuration attractive for fracture testing of brittle, high modulus materials.

3. Note on the Poisson structure of the damped oscillator

SciTech Connect

Hone, A. N. W.; Senthilvelan, M.

2009-10-15

The damped harmonic oscillator is one of the most studied systems with respect to the problem of quantizing dissipative systems. Recently Chandrasekar et al. [J. Math. Phys. 48, 032701 (2007)] applied the Prelle-Singer method to construct conserved quantities and an explicit time-independent Lagrangian and Hamiltonian structure for the damped oscillator. Here we describe the associated Poisson bracket which generates the continuous flow, pointing out that there is a subtle problem of definition on the whole phase space. The action-angle variables for the system are also presented, and we further explain how to extend these considerations to the discrete setting. Some implications for the quantum case are briefly mentioned.

4. Poisson's ratios of auxetic and other technological materials.

PubMed

Ballato, Arthur

2010-01-01

Poisson's ratio, the relation between lateral contraction of a thin, linearly elastic rod when subjected to a longitudinal extension, has a long and interesting history. For isotropic bodies, it can theoretically range from +1/2 to -1; the experimental gamut for anisotropic materials is even larger. The ratio is positive for all combinations of directions in most crystals. But as far back as the 1800s, Voigt and others found that negative values were encountered for some materials, a property now called auxeticity. Here we examine this property from the point of view of crystal stability and compute extrema of the ratio for various interesting and technologically important materials. Potential applications of the auxetic property are mentioned.
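
For isotropic bodies, the quoted bounds follow from positivity of the bulk modulus K and shear modulus G via nu = (3K - 2G)/(2(3K + G)); a few lines make the limits concrete (the moduli values are illustrative):

```python
def poisson_ratio(K, G):
    """Isotropic Poisson's ratio from bulk modulus K and shear modulus G."""
    return (3.0 * K - 2.0 * G) / (2.0 * (3.0 * K + G))

print(poisson_ratio(1e9, 1e3))     # K >> G: approaches the upper limit +1/2
print(poisson_ratio(1e3, 1e9))     # G >> K: approaches the lower limit -1 (auxetic)
print(poisson_ratio(160e9, 80e9))  # steel-like moduli: ~0.29
```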

5. Optical Signal Processing: Poisson Image Restoration and Shearing Interferometry

NASA Technical Reports Server (NTRS)

Hong, Yie-Ming

1973-01-01

Optical signal processing can be performed in either digital or analog systems. Digital computers and coherent optical systems are discussed as they are used in optical signal processing. Topics include: image restoration; phase-object visualization; image contrast reversal; optical computation; image multiplexing; and fabrication of spatial filters. Digital optical data processing deals with restoration of images degraded by signal-dependent noise. When the input data of an image restoration system are the numbers of photoelectrons received from various areas of a photosensitive surface, the data are Poisson distributed with mean values proportional to the illuminance of the incoherently radiating object and background light. Optical signal processing using coherent optical systems is also discussed. Following a brief review of the pertinent details of Ronchi's diffraction grating interferometer, moire effect, carrier-frequency photography, and achromatic holography, two new shearing interferometers based on them are presented. Both interferometers can produce variable shear.

6. Nonstationary elementary-field light randomly triggered by Poisson impulses.

PubMed

Fernández-Pousa, Carlos R

2013-05-01

A stochastic theory of nonstationary light describing the random emission of elementary pulses is presented. The emission is governed by a nonhomogeneous Poisson point process determined by a time-varying emission rate. The model describes, in the appropriate limits, stationary, cyclostationary, locally stationary, and pulsed radiation, and reduces to a Gaussian theory in the limit of dense emission rate. The first- and second-order coherence theories are solved after the computation of second- and fourth-order correlation functions by use of the characteristic function. The ergodicity of second-order correlations under various types of detectors is explored and a number of observables, including optical spectrum, amplitude, and intensity correlations, are analyzed.
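
Event streams governed by a time-varying emission rate, as in this model, are commonly simulated by Lewis-Shedler thinning; a sketch with an illustrative rate function:

```python
import numpy as np

def sample_nhpp(rate, rate_max, t_end, rng):
    """Nonhomogeneous Poisson event times on [0, t_end) by thinning:
    propose homogeneous events at rate_max, keep with prob rate(t)/rate_max."""
    t, times = 0.0, []
    while True:
        t += rng.exponential(1.0 / rate_max)   # homogeneous proposal stream
        if t >= t_end:
            return np.array(times)
        if rng.random() < rate(t) / rate_max:
            times.append(t)

rng = np.random.default_rng(5)
rate = lambda t: 5.0 * (1.0 + np.sin(t))       # time-varying emission rate
counts = [len(sample_nhpp(rate, 10.0, 2.0 * np.pi, rng)) for _ in range(500)]
print(np.mean(counts))   # expected integral of the rate over [0, 2*pi): 10*pi ~ 31.4
```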

7. Analytical stress intensity solution for the Stable Poisson Loaded specimen

NASA Technical Reports Server (NTRS)

Ghosn, Louis J.; Calomino, Anthony M.; Brewer, David N.

1993-01-01

An analytical calibration of the Stable Poisson Loaded (SPL) specimen is presented. The specimen configuration is similar to the ASTM E-561 compact-tension specimen with displacement controlled wedge loading used for R-curve determination. The crack mouth opening displacements (CMODs) are produced by the diametral expansion of an axially compressed cylindrical pin located in the wake of a machined notch. Due to the unusual loading configuration, a three-dimensional finite element analysis was performed with gap elements simulating the contact between the pin and specimen. In this report, stress intensity factors, CMODs, and crack displacement profiles, are reported for different crack lengths and different contacting conditions. It was concluded that the computed stress intensity factor decreases sharply with increasing crack length thus making the SPL specimen configuration attractive for fracture testing of brittle, high modulus materials.

8. Discrete Integrable Systems and Poisson Algebras From Cluster Maps

Fordy, Allan P.; Hone, Andrew

2014-01-01

We consider nonlinear recurrences generated from cluster mutations applied to quivers that have the property of being cluster mutation-periodic with period 1. Such quivers were completely classified by Fordy and Marsh, who characterised them in terms of the skew-symmetric matrix that defines the quiver. The associated nonlinear recurrences are equivalent to birational maps, and we explain how these maps can be endowed with an invariant Poisson bracket and/or presymplectic structure. Upon applying the algebraic entropy test, we are led to a series of conjectures which imply that the entropy of the cluster maps can be determined from their tropical analogues, which leads to a sharp classification result. Only four special families of these maps should have zero entropy. These families are examined in detail, with many explicit examples given, and we show how they lead to discrete dynamics that is integrable in the Liouville-Arnold sense.
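
One recurrence in this family is Somos-4, which arises from a period-1 quiver; the Laurent phenomenon guarantees integer values from unit initial data, which a few lines verify (exact rational arithmetic is used to make the check honest):

```python
from fractions import Fraction

def somos4(n_terms):
    """Somos-4 recurrence x(n+4) x(n) = x(n+3) x(n+1) + x(n+2)**2, a birational
    map coming from a period-1 cluster mutation; exact rationals throughout."""
    x = [Fraction(1)] * 4
    while len(x) < n_terms:
        x.append((x[-1] * x[-3] + x[-2] ** 2) / x[-4])
    return x

seq = somos4(12)
print([int(v) for v in seq])
# [1, 1, 1, 1, 2, 3, 7, 23, 59, 314, 1529, 8209] -- all integers (Laurent phenomenon)
```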

9. Anisotropic norm-oriented mesh adaptation for a Poisson problem

Brèthes, Gautier; Dervieux, Alain

2016-10-01

We present a novel formulation for the mesh adaptation of the approximation of a Partial Differential Equation (PDE). The discussion is restricted to a Poisson problem. The proposed norm-oriented formulation extends the goal-oriented formulation since it is equation-based and uses an adjoint. At the same time, the norm-oriented formulation somewhat supersedes the goal-oriented one since it is basically a solution-convergent method. Indeed, goal-oriented methods rely on the reduction of the error in evaluating a chosen scalar output with the consequence that, as mesh size is increased (more degrees of freedom), only this output is proven to tend to its continuous analog while the solution field itself may not converge. A remarkable quality of goal-oriented metric-based adaptation is the mathematical formulation of the mesh adaptation problem under the form of the optimization, in the well-identified set of metrics, of a well-defined functional. In the new proposed formulation, we amplify this advantage. We search, in the same well-identified set of metrics, the minimum of a norm of the approximation error. The norm is prescribed by the user, and the method allows addressing multi-objective adaptation, such as, in aerodynamics, adapting the mesh for drag, lift, and moment in one shot. In this work, we consider the basic linear finite-element approximation and restrict our study to the L2 norm in order to enjoy second-order convergence. Numerical examples for the Poisson problem are computed.

10. A geometric multigrid Poisson solver for domains containing solid inclusions

Botto, Lorenzo

2013-03-01

A Cartesian grid method for the fast solution of the Poisson equation in three-dimensional domains with embedded solid inclusions is presented and its performance analyzed. The efficiency of the method, which assumes Neumann conditions at the immersed boundaries, is comparable to that of a multigrid method for regular domains. The method is light in terms of memory usage, and easily adaptable to parallel architectures. Tests with random and ordered arrays of solid inclusions, including spheres and ellipsoids, demonstrate smooth convergence of the residual for small separation between the inclusion surfaces. This feature is important, for instance, in simulations of nearly-touching finite-size particles. The implementation of the method, “MG-Inc”, is available online. Catalogue identifier: AEOE_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEOE_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 19068 No. of bytes in distributed program, including test data, etc.: 215118 Distribution format: tar.gz Programming language: C++ (fully tested with GNU GCC compiler). Computer: Any machine supporting a standard C++ compiler. Operating system: Any OS supporting a standard C++ compiler. RAM: About 150 MB for 128³ resolution Classification: 4.3. Nature of problem: Poisson equation in domains containing inclusions; Neumann boundary conditions at immersed boundaries. Solution method: Geometric multigrid with finite-volume discretization. Restrictions: Stair-case representation of the immersed boundaries. Running time: Typically a fraction of a minute for 128³ resolution.
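The geometric multigrid idea can be sketched in one dimension (illustrative only; this has none of the paper's 3D finite-volume machinery, inclusions, or immersed Neumann boundaries): a two-grid cycle combines damped Jacobi smoothing with an exact coarse-grid correction for -u'' = f with homogeneous Dirichlet conditions.

```python
# Two-grid cycle for -u'' = f on (0, 1), u(0) = u(1) = 0, n interior points.

def residual(u, f, h):
    n = len(u)
    r = [0.0] * n
    for i in range(n):
        left = u[i - 1] if i > 0 else 0.0
        right = u[i + 1] if i < n - 1 else 0.0
        r[i] = f[i] - (2.0 * u[i] - left - right) / h**2
    return r

def jacobi(u, f, h, sweeps, w=2.0 / 3.0):
    # damped Jacobi: an effective smoother of high-frequency error
    n = len(u)
    for _ in range(sweeps):
        new = u[:]
        for i in range(n):
            left = u[i - 1] if i > 0 else 0.0
            right = u[i + 1] if i < n - 1 else 0.0
            new[i] = (1 - w) * u[i] + w * 0.5 * (left + right + h**2 * f[i])
        u = new
    return u

def restrict(r):
    # full weighting; coarse node j sits at fine node 2j + 1
    return [0.25 * r[2 * j] + 0.5 * r[2 * j + 1] + 0.25 * r[2 * j + 2]
            for j in range((len(r) - 1) // 2)]

def prolong(e, n_fine):
    # linear interpolation back to the fine grid
    u = [0.0] * n_fine
    for j, v in enumerate(e):
        u[2 * j + 1] += v
        u[2 * j] += 0.5 * v
        u[2 * j + 2] += 0.5 * v
    return u

def solve_coarse(f, h):
    # direct tridiagonal (Thomas) solve on the coarse grid
    n = len(f)
    a, b, c = -1.0 / h**2, 2.0 / h**2, -1.0 / h**2
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c / b, f[0] / b
    for i in range(1, n):
        m = b - a * cp[i - 1]
        cp[i] = c / m
        dp[i] = (f[i] - a * dp[i - 1]) / m
    u = [0.0] * n
    u[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        u[i] = dp[i] - cp[i] * u[i + 1]
    return u

def two_grid(u, f, h):
    u = jacobi(u, f, h, 3)                      # pre-smooth
    ec = solve_coarse(restrict(residual(u, f, h)), 2 * h)
    u = [ui + ei for ui, ei in zip(u, prolong(ec, len(u)))]
    return jacobi(u, f, h, 3)                   # post-smooth
```

A full multigrid solver applies `two_grid` recursively instead of solving the coarse problem directly; the immersed-boundary treatment in MG-Inc is the part this sketch omits.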

11. A Three-dimensional Polymer Scaffolding Material Exhibiting a Zero Poisson's Ratio.

PubMed

Soman, Pranav; Fozdar, David Y; Lee, Jin Woo; Phadke, Ameya; Varghese, Shyni; Chen, Shaochen

2012-05-14

Poisson's ratio describes the degree to which a material contracts (expands) transversally when axially strained. A material with a zero Poisson's ratio does not transversally deform in response to an axial strain (stretching). In tissue engineering applications, scaffolding having a zero Poisson's ratio (ZPR) may be more suitable for emulating the behavior of native tissues and accommodating and transmitting forces to the host tissue site during wound healing (or tissue regrowth). For example, scaffolding with a zero Poisson's ratio may be beneficial in the engineering of cartilage, ligament, corneal, and brain tissues, which are known to possess Poisson's ratios of nearly zero. Here, we report a 3D biomaterial constructed from polyethylene glycol (PEG) exhibiting in-plane Poisson's ratios of zero for large values of axial strain. We use digital micro-mirror device projection printing (DMD-PP) to create single- and double-layer scaffolds composed of semi re-entrant pores whose arrangement and deformation mechanisms contribute to the zero Poisson's ratio. Strain experiments confirm the zero-Poisson's-ratio behavior of the scaffolds and show that the addition of layers does not change the Poisson's ratio. Human mesenchymal stem cells (hMSCs) cultured on biomaterials with zero Poisson's ratio demonstrate the feasibility of utilizing these novel materials for biological applications which require little to no transverse deformations resulting from axial strains. Techniques used in this work allow Poisson's ratio to be both scale-independent and independent of the choice of strut material for strains in the elastic regime, and therefore ZPR behavior can be imparted to a variety of photocurable biomaterials.
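The quantity being engineered here is simple to compute from strain measurements, as the ratio of transverse to axial strain with the sign flipped; a minimal sketch with hypothetical strain readings:

```python
# Poisson's ratio from strain measurements: nu = -eps_transverse / eps_axial.
# The strain values below are hypothetical, for illustration only.

def poissons_ratio(eps_axial, eps_transverse):
    return -eps_transverse / eps_axial

conventional = poissons_ratio(0.010, -0.003)  # contracts laterally: nu > 0
zpr_scaffold = poissons_ratio(0.010, 0.000)   # no transverse response: nu = 0
```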

12. Assessing risk factors for periodontitis using regression

Lobo Pereira, J. A.; Ferreira, Maria Cristina; Oliveira, Teresa

2013-10-01

Multivariate statistical analysis is indispensable to assess the associations and interactions between different factors and the risk of periodontitis. Among others, regression analysis is a statistical technique widely used in healthcare to investigate and model the relationship between variables. In our work we study the impact of socio-demographic, medical and behavioral factors on periodontal health. Using linear and logistic regression models, we assess the relevance of the following independent variables (IVs) as risk factors for periodontitis: Age, Gender, Diabetic Status, Education, Smoking status and Plaque Index. A multiple linear regression model was built to evaluate the influence of the IVs on mean Attachment Loss (AL), yielding the regression coefficients along with the p-values from the corresponding significance tests. The case (individual) classification adopted in the logistic model was the extent of destruction of periodontal tissues, defined as an Attachment Loss greater than or equal to 4 mm in at least 25% (AL≥4mm/≥25%) of sites surveyed. The association measures include the Odds Ratios together with the corresponding 95% confidence intervals.

13. Time series regression model for infectious disease and weather.

PubMed

Imai, Chisato; Armstrong, Ben; Chalabi, Zaid; Mangtani, Punam; Hashizume, Masahiro

2015-10-01

Time series regression has been developed and long used to evaluate the short-term associations of air pollution and weather with mortality or morbidity of non-infectious diseases. The application of the regression approaches from this tradition to infectious diseases, however, is less well explored and raises some new issues. We discuss and present potential solutions for five issues often arising in such analyses: changes in immune population, strong autocorrelations, a wide range of plausible lag structures and association patterns, seasonality adjustments, and large overdispersion. The potential approaches are illustrated with datasets of cholera cases and rainfall from Bangladesh and influenza and temperature in Tokyo. Though this article focuses on the application of the traditional time series regression to infectious diseases and weather factors, we also briefly introduce alternative approaches, including mathematical modeling, wavelet analysis, and autoregressive integrated moving average (ARIMA) models. Modifications proposed to standard time series regression practice include using sums of past cases as proxies for the immune population, and using the logarithm of lagged disease counts to control autocorrelation due to true contagion, both of which are motivated from "susceptible-infectious-recovered" (SIR) models. The complexity of lag structures and association patterns can often be informed by biological mechanisms and explored by using distributed lag non-linear models. For overdispersed models, alternative distribution models such as quasi-Poisson and negative binomial should be considered. Time series regression can be used to investigate dependence of infectious diseases on weather, but may need modifying to allow for features specific to this context.
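One of the issues above, large overdispersion, is commonly diagnosed with the Pearson dispersion statistic before moving to quasi-Poisson or negative binomial models. A minimal sketch with made-up weekly case counts and an intercept-only Poisson fit:

```python
# Pearson dispersion statistic for a fitted Poisson model:
# phi = sum((y_i - mu_i)^2 / mu_i) / (n - p). Values well above 1 indicate
# overdispersion. Counts below are hypothetical, for illustration only.

def pearson_dispersion(y, mu, n_params=1):
    chi2 = sum((yi - mu) ** 2 / mu for yi in y)
    return chi2 / (len(y) - n_params)

counts = [0, 2, 1, 0, 9, 14, 3, 1, 0, 12, 2, 1]
mu_hat = sum(counts) / len(counts)   # intercept-only Poisson MLE: the mean
phi = pearson_dispersion(counts, mu_hat)   # well above 1 for these counts
```

A quasi-Poisson analysis would inflate the standard errors of the regression coefficients by sqrt(phi).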

14. Extension of the application of conway-maxwell-poisson models: analyzing traffic crash data exhibiting underdispersion.

PubMed

Lord, Dominique; Geedipally, Srinivas Reddy; Guikema, Seth D

2010-08-01

The objective of this article is to evaluate the performance of the COM-Poisson GLM for analyzing crash data exhibiting underdispersion (when conditional on the mean). The COM-Poisson distribution, originally developed in 1962, has recently been reintroduced by statisticians for analyzing count data subject to either over- or underdispersion. Over the last year, the COM-Poisson GLM has been evaluated in the context of crash data analysis and it has been shown that the model performs as well as the Poisson-gamma model for crash data exhibiting overdispersion. To accomplish the objective of this study, several COM-Poisson models were estimated using crash data collected at 162 railway-highway crossings in South Korea between 1998 and 2002. This data set has been shown to exhibit underdispersion when models linking crash data to various explanatory variables are estimated. The modeling results were compared to those produced from the Poisson and gamma probability models documented in a previously published study. The results of this research show that the COM-Poisson GLM can handle crash data when the modeling output shows signs of underdispersion. They also show that the model proposed in this study provides better statistical performance than the gamma probability and the traditional Poisson models, at least for this data set.
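The COM-Poisson distribution adds a dispersion parameter ν to the Poisson pmf, with P(Y = j) proportional to λ^j / (j!)^ν; ν = 1 recovers the ordinary Poisson and ν > 1 yields underdispersion (variance below the mean). A numerical sketch of the distribution itself (not the authors' GLM fit):

```python
# Conway-Maxwell-Poisson pmf computed stably in log space via its
# normalizing constant Z(lam, nu) = sum_j lam**j / (j!)**nu.
import math

def com_poisson_pmf(lam, nu, jmax=100):
    logw = [j * math.log(lam) - nu * math.lgamma(j + 1) for j in range(jmax)]
    top = max(logw)
    w = [math.exp(lw - top) for lw in logw]
    z = sum(w)
    return [wi / z for wi in w]

def mean_var(pmf):
    m = sum(j * p for j, p in enumerate(pmf))
    v = sum((j - m) ** 2 * p for j, p in enumerate(pmf))
    return m, v

m_poisson, v_poisson = mean_var(com_poisson_pmf(3.0, 1.0))  # ~ (3, 3)
m_under, v_under = mean_var(com_poisson_pmf(3.0, 2.0))      # variance < mean
```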

15. Three-Dimensional Polymer Constructs Exhibiting a Tunable Negative Poisson's Ratio.

PubMed

Fozdar, David Y; Soman, Pranav; Lee, Jin Woo; Han, Li-Hsin; Chen, Shaochen

2011-07-22

Young's modulus and Poisson's ratio of a porous polymeric construct (scaffold) quantitatively describe how it supports and transmits external stresses to its surroundings. While Young's modulus is always non-negative and highly tunable in magnitude, Poisson's ratio can, indeed, take on negative values despite the fact that it is non-negative for virtually every naturally occurring and artificial material. In some applications, a construct having a tunable negative Poisson's ratio (an auxetic construct) may be more suitable for supporting the external forces imposed upon it by its environment. Here, three-dimensional polyethylene glycol scaffolds with tunable negative Poisson's ratios are fabricated. Digital micromirror device projection printing (DMD-PP) is used to print single-layer constructs composed of cellular structures (pores) with special geometries, arrangements, and deformation mechanisms. The presence of the unit-cellular structures tunes the magnitude and polarity (positive or negative) of Poisson's ratio. Multilayer constructs are fabricated with DMD-PP by stacking the single-layer constructs with alternating layers of vertical connecting posts. The Poisson's ratios of the single- and multilayer constructs are determined from strain experiments, which show (1) that the Poisson's ratios of the constructs are accurately predicted by analytical deformation models and (2) that no slipping occurs between layers in the multilayer constructs and the addition of new layers does not affect Poisson's ratio.

16. Double-Negative Mechanical Metamaterials Displaying Simultaneous Negative Stiffness and Negative Poisson's Ratio Properties.

PubMed

Hewage, Trishan A M; Alderson, Kim L; Alderson, Andrew; Scarpa, Fabrizio

2016-12-01

A scalable mechanical metamaterial simultaneously displaying negative stiffness and negative Poisson's ratio responses is presented. Interlocking hexagonal subunit assemblies containing three alternative embedded negative stiffness (NS) element types display Poisson's ratio values of -1 and NS values spanning over two orders of magnitude (-1.4 N mm⁻¹ to -160 N mm⁻¹), in good agreement with model predictions.

17. Comment on: 'A Poisson resampling method for simulating reduced counts in nuclear medicine images'.

PubMed

de Nijs, Robin

2015-07-21

In order to be able to calculate half-count images from already acquired data, White and Lawson published their method based on Poisson resampling. They verified their method experimentally by measurements with a Co-57 flood source. In this comment their results are reproduced and confirmed by a direct numerical simulation in Matlab. Not only Poisson resampling, but also two direct redrawing methods were investigated. The redrawing methods were based on a Poisson and a Gaussian distribution. Mean, standard deviation, skewness and excess kurtosis half-count/full-count ratios were determined for all methods, and compared to the theoretical values for a Poisson distribution. The statistical parameters showed the same behavior as in the original note and confirmed the superiority of the Poisson resampling method. Rounding off before saving the half-count image had a severe impact on counting statistics for counts below 100. Only Poisson resampling was unaffected by this, while Gaussian redrawing was less affected than Poisson redrawing. Poisson resampling is the method of choice when simulating half-count (or lower-count) images from full-count images. It correctly reproduces the statistical properties, even when the images are rounded off.
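Poisson resampling rests on binomial thinning: keeping each recorded count independently with probability 1/2 converts Poisson(λ) data into exact Poisson(λ/2) data. A small seeded simulation of this idea (not the authors' Matlab code):

```python
# Binomial thinning of Poisson counts: each of a pixel's n counts is kept
# independently with probability p, so Poisson(lam) becomes Poisson(p*lam).
import math
import random

def poisson_sample(lam, rng):
    # Knuth's multiplication method (fine for moderate lam)
    limit = math.exp(-lam)
    k, prod = 0, 1.0
    while True:
        prod *= rng.random()
        if prod <= limit:
            return k
        k += 1

def thin(counts, p, rng):
    return [sum(1 for _ in range(n) if rng.random() < p) for n in counts]

rng = random.Random(42)
full = [poisson_sample(50.0, rng) for _ in range(2000)]   # "full-count" pixels
half = thin(full, 0.5, rng)                               # "half-count" pixels
```

The thinned pixels should have mean and variance near 25, preserving the Poisson property (mean equals variance) that naive halving-and-rounding destroys at low counts.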

18. Rank regression: an alternative regression approach for data with outliers.

PubMed

Chen, Tian; Tang, Wan; Lu, Ying; Tu, Xin

2014-10-01

Linear regression models are widely used in mental health and related health services research. However, the classic linear regression analysis assumes that the data are normally distributed, an assumption that is not met by the data obtained in many studies. One method of dealing with this problem is to use semi-parametric models, which do not require that the data be normally distributed. But semi-parametric models are quite sensitive to outlying observations, so the generated estimates are unreliable when study data includes outliers. In this situation, some researchers trim the extreme values prior to conducting the analysis, but the ad-hoc rules used for data trimming are based on subjective criteria so different methods of adjustment can yield different results. Rank regression provides a more objective approach to dealing with non-normal data that includes outliers. This paper uses simulated and real data to illustrate this useful regression approach for dealing with outliers and compares it to the results generated using classical regression models and semi-parametric regression models.
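The robustness argument can be made concrete with the Theil-Sen estimator (a related rank-based robust fit, used here for illustration rather than the specific rank-score regression the paper discusses): the median of pairwise slopes shrugs off a gross outlier that would badly bias least squares.

```python
# Theil-Sen fit: slope = median of all pairwise slopes, intercept = median
# residual. A single gross outlier leaves the fit essentially unchanged.
import statistics

def theil_sen(x, y):
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i in range(len(x))
              for j in range(i + 1, len(x))
              if x[j] != x[i]]
    b = statistics.median(slopes)
    a = statistics.median([yi - b * xi for xi, yi in zip(x, y)])
    return a, b

x = list(range(10))
y = [2 * xi + 1 for xi in x]   # true line: y = 2x + 1
y[9] = 100                     # one gross outlier
a, b = theil_sen(x, y)         # recovers intercept ~1 and slope ~2 anyway
```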

19. Practical Session: Simple Linear Regression

Clausel, M.; Grégoire, G.

2014-12-01

Two exercises are proposed to illustrate simple linear regression. The first one is based on the famous Galton data set on heredity. We use the lm R command and obtain coefficient estimates, the residual standard error, R2, residuals… In the second example, devoted to data on the vapor tension of mercury, we fit a simple linear regression, predict values, and anticipate multiple linear regression. This practical session is an excerpt from practical exercises proposed by A. Dalalyan at ENPC (see Exercises 1 and 2 of http://certis.enpc.fr/~dalalyan/Download/TP_ENPC_4.pdf).
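The coefficient estimates that lm returns for one predictor follow from closed-form formulas; a sketch on a small made-up data set:

```python
# Closed-form simple linear regression (what R's lm computes for a single
# predictor): slope = Sxy / Sxx, intercept = ybar - slope * xbar.
# The data below are made up, for illustration only.

def simple_lm(x, y):
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return ybar - slope * xbar, slope

x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 7.8, 10.1]   # roughly y = 2x
a, b = simple_lm(x, y)
```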

20. Auxetic Black Phosphorus: A 2D Material with Negative Poisson's Ratio

Du, Yuchen; Maassen, Jesse; Wu, Wangran; Luo, Zhe; Xu, Xianfan; Ye, Peide D.

2016-10-01

The Poisson's ratio of a material characterizes its response to uniaxial strain. Materials normally possess a positive Poisson's ratio - they contract laterally when stretched, and expand laterally when compressed. A negative Poisson's ratio is theoretically permissible but has not, with few exceptions of man-made bulk structures, been experimentally observed in any natural materials. Here, we show that the negative Poisson's ratio exists in the low-dimensional natural material black phosphorus, and that our experimental observations are consistent with first principles simulations. Through application of uniaxial strain along zigzag and armchair directions, we find that both interlayer and intralayer negative Poisson's ratios can be obtained in black phosphorus. The phenomenon originates from the puckered structure of its in-plane lattice, together with coupled hinge-like bonding configurations.

1. Age-related changes in the propensity of dogs to bite.

PubMed

Messam, L L McV; Kass, P H; Chomel, B B; Hart, L A

2013-08-01

This retrospective cohort study was aimed at describing the effects of age at acquisition, age, and duration of ownership of dogs on the risk of (1) bites during play and (2) non-play bites to humans. Data were collected on 110 dogs that had bitten during play with a person, 161 dogs that had bitten outside of play and 951 non-biting dogs from veterinary clients in Kingston (KGN), Jamaica and San Francisco (SF), USA. Modified Poisson regression was employed to model the relationships of both types of bites to each variable separately. Effects of the variables on dog bite risk (1) during and (2) outside of play with the dog differed from each other and by type of bite. Effects varied with the dog's age and age-related associations were strongest in dogs younger than 1 year old. Ages at acquisition of dogs at highest risk for bites during play were substantially lower than those at risk for non-play bites. Ages and durations of ownership of dogs at highest risk for bites during play were also lower than those of dogs at highest risk for non-play bites. The propensity of a dog to bite changes as it ages, and relationships between dog bites occurring during and outside of play and the dog's age at acquisition, current age, and duration of ownership differ from each other.

2. Trends in age-adjusted coronary heart disease mortality rates in Slovakia between 1993 and 2009.

PubMed

Psota, Marek; Pekarciková, Jarmila; O'Mullane, Monica; Rusnák, Martin

2013-06-01

Cardiovascular diseases (CVD) and especially coronary heart disease (CHD) are the main causes of death in the Slovak Republic (SR). The aim of this study is to explore trends in age-adjusted coronary heart disease mortality rates in the whole Slovak population and in the population of working age between the years 1993 and 2009. A related indicator, potential years of life lost (PYLL) due to CHD, was calculated over the same period for males and females. Crude CHD mortality rates were age-adjusted using the European standard population. Joinpoint Poisson regression was performed in order to determine the annual percentage change in trends. The age-adjusted CHD mortality rates decreased in the Slovak population and also in the population of working age. The change was significant only within the working-age sub-group. We found that partial diagnoses (myocardial infarction and chronic ischaemic heart disease) developed in a mirror-like manner. PYLL per 100,000 decreased during the observed period and the decline was more prominent in males. For further research we recommend focusing on several other issues, namely, to examine the validity of cause of death codes, to examine the development of mortality rates in selected age groups, to find out the cause of the differential development of mortality rates in the Slovak Republic in comparison with the Czech Republic and Poland, and to explain the causes of the decrease of the age-adjusted CHD mortality rates in younger age groups in Slovakia.

3. Marginalized zero-inflated negative binomial regression with application to dental caries.

PubMed

Preisser, John S; Das, Kalyan; Long, D Leann; Divaris, Kimon

2016-05-10

The zero-inflated negative binomial regression model (ZINB) is often employed in diverse fields such as dentistry, health care utilization, highway safety, and medicine to examine relationships between exposures of interest and overdispersed count outcomes exhibiting many zeros. The regression coefficients of ZINB have latent class interpretations for a susceptible subpopulation at risk for the disease/condition under study with counts generated from a negative binomial distribution and for a non-susceptible subpopulation that provides only zero counts. The ZINB parameters, however, are not well-suited for estimating overall exposure effects, specifically, in quantifying the effect of an explanatory variable in the overall mixture population. In this paper, a marginalized zero-inflated negative binomial regression (MZINB) model for independent responses is proposed to model the population marginal mean count directly, providing straightforward inference for overall exposure effects based on maximum likelihood estimation. Through simulation studies, the finite sample performance of MZINB is compared with marginalized zero-inflated Poisson, Poisson, and negative binomial regression. The MZINB model is applied in the evaluation of a school-based fluoride mouthrinse program on dental caries in 677 children.
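The marginalization issue can be seen in a few lines: with structural-zero probability π and count-component mean μ, the marginal (overall) mean of a zero-inflated count model is E[Y] = (1 - π)μ, so a covariate acting on both π and μ has a combined effect that neither latent-class coefficient reports alone. A sketch with hypothetical group values:

```python
# Marginal mean of a zero-inflated count model: E[Y] = (1 - pi) * mu.
# The exposed/unexposed values below are hypothetical, for illustration.

def zi_marginal_mean(pi, mu):
    return (1.0 - pi) * mu

unexposed = zi_marginal_mean(pi=0.40, mu=2.0)
exposed = zi_marginal_mean(pi=0.25, mu=2.4)
ratio = exposed / unexposed   # the overall exposure effect on the mean count
```

A marginalized model such as MZINB parameterizes this overall mean directly, so `ratio` corresponds to an exponentiated regression coefficient.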

4. Parental report of the early development of children with regressive autism: the delays-plus-regression phenotype.

PubMed

Ozonoff, Sally; Williams, Brenda J; Landa, Rebecca

2005-12-01

Most children with autism demonstrate developmental abnormalities in their first year, whereas others display regression after mostly normal development. Few studies have examined the early development of the latter group. This study developed a retrospective measure, the Early Development Questionnaire (EDQ), to collect specific, parent-reported information about development in the first 18 months. Based on their EDQ scores, 60 children with autism between the ages of 3 and 9 were divided into three groups: an early onset group (n = 29), a definite regression group (n = 23), and a heterogeneous mixed group (n = 8). Significant differences in early social development were found between the early onset and regression groups. However, over 50 percent of the children who experienced a regression demonstrated some early social deficits during the first year of life, long before regression and the apparent onset of autism. This group, tentatively labeled 'delays-plus-regression', deserves further study.

5. Multiple Regression and Its Discontents

ERIC Educational Resources Information Center

Snell, Joel C.; Marsh, Mitchell

2012-01-01

Multiple regression is part of a larger statistical strategy originated by Gauss. The authors raise questions about the theory and suggest some changes that would make room for Mandelbrot and Serendipity.

6. Wrong Signs in Regression Coefficients

NASA Technical Reports Server (NTRS)

McGee, Holly

1999-01-01

When using parametric cost estimation, it is important to note the possibility of the regression coefficients having the wrong sign. A wrong sign is defined as a sign on the regression coefficient opposite to the researcher's intuition and experience. Some possible causes for the wrong sign discussed in this paper are a small range of x's, leverage points, missing variables, multicollinearity, and computational error. Additionally, techniques for determining the cause of the wrong sign are given.

7. Heterogeneity in drug abuse among juvenile offenders: is mixture regression more informative than standard regression?

PubMed

Montgomery, Katherine L; Vaughn, Michael G; Thompson, Sanna J; Howard, Matthew O

2013-11-01

Research on juvenile offenders has largely treated this population as a homogeneous group. However, recent findings suggest that this at-risk population may be considerably more heterogeneous than previously believed. This study compared mixture regression analyses with standard regression techniques in an effort to explain how known factors such as distress, trauma, and personality are associated with drug abuse among juvenile offenders. Researchers recruited 728 juvenile offenders from Missouri juvenile correctional facilities for participation in this study. Researchers investigated past-year substance use in relation to the following variables: demographic characteristics (gender, ethnicity, age, familial use of public assistance), antisocial behavior, and mental illness symptoms (psychopathic traits, psychiatric distress, and prior trauma). Results indicated that standard and mixed regression approaches identified significant variables related to past-year substance use among this population; however, the mixture regression methods provided greater specificity in results. Mixture regression analytic methods may help policy makers and practitioners better understand and intervene with the substance-related subgroups of juvenile offenders.

8. Analytical solutions of the Poisson-Boltzmann equation: biological applications

Fenley, Andrew; Gordon, John; Onufriev, Alexey

2006-03-01

Electrostatic interactions are a key factor for determining many properties of bio-molecules. The ability to compute the electrostatic potential generated by a molecule is often essential in understanding the mechanism behind its biological function such as catalytic activity, ligand binding, and macromolecular association. We propose an approximate analytical solution to the (linearized) Poisson-Boltzmann (PB) equation that is suitable for computing the electrostatic potential around realistic biomolecules. The approximation is tested against the numerical solutions of the PB equation on a test set of 600 representative structures including proteins, DNA, and macromolecular complexes. The approach allows one to generate, with the power of a desktop PC, electrostatic potential maps of virtually any molecule of interest, from single proteins to large protein complexes such as viral capsids. The new approach is orders of magnitude less computationally intense than its numerical counterpart, yet is almost equal in accuracy. When studying very large molecular systems, our method is a practical and inexpensive way of computing biomolecular potential at atomic resolution. We demonstrate the usefulness of the new approach by exploring the details of electrostatic potentials generated by two such systems: the nucleosome core particle (25,000 atoms) and tobacco ringspot virus (500,000 atoms). Biologically relevant insights are generated.
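For orientation, the simplest closed-form solution of the linearized PB equation is the screened-Coulomb (Debye-Hückel) potential of a point charge in symmetric salt; analytical approximations for biomolecules generalize this idea to realistic shapes. A minimal sketch (units left abstract):

```python
# Screened-Coulomb potential phi(r) = q * exp(-kappa * r) / (4*pi*eps*r):
# the linearized PB solution for a point charge q in a medium of
# permittivity eps with inverse Debye screening length kappa.
import math

def screened_coulomb(q, r, kappa, eps):
    return q * math.exp(-kappa * r) / (4.0 * math.pi * eps * r)
```

Setting kappa to zero (no salt) recovers the bare Coulomb potential; increasing kappa suppresses the potential at long range, which is the physical effect the PB equation captures.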

9. Linear-Nonlinear-Poisson Models of Primate Choice Dynamics

PubMed Central

Corrado, Greg S; Sugrue, Leo P; Sebastian Seung, H; Newsome, William T

2005-01-01

The equilibrium phenomenon of matching behavior traditionally has been studied in stationary environments. Here we attempt to uncover the local mechanism of choice that gives rise to matching by studying behavior in a highly dynamic foraging environment. In our experiments, 2 rhesus monkeys (Macaca mulatta) foraged for juice rewards by making eye movements to one of two colored icons presented on a computer monitor, each rewarded on dynamic variable-interval schedules. Using a generalization of Wiener kernel analysis, we recover a compact mechanistic description of the impact of past reward on future choice in the form of a Linear-Nonlinear-Poisson model. We validate this model through rigorous predictive and generative testing. Compared to our earlier work with this same data set, this model proves to be a better description of choice behavior and is more tightly correlated with putative neural value signals. Refinements over previous models include hyperbolic (as opposed to exponential) temporal discounting of past rewards, and differential (as opposed to fractional) comparisons of option value. Through numerical simulation we find that within this class of strategies, the model parameters employed by animals are very close to those that maximize reward harvesting efficiency. PMID:16596981

10. Fast Poisson noise removal by biorthogonal Haar domain hypothesis testing

Zhang, B.; Fadili, M. J.; Starck, J.-L.; Digel, S. W.

2008-07-01

Methods based on hypothesis tests (HTs) in the Haar domain are widely used to denoise Poisson count data. Facing large datasets or real-time applications, Haar-based denoisers have to use the decimated transform to meet limited-memory or computation-time constraints. Unfortunately, for regular underlying intensities, decimation yields discontinuous estimates and strong “staircase” artifacts. In this paper, we propose to combine the HT framework with the decimated biorthogonal Haar (Bi-Haar) transform instead of the classical Haar. The Bi-Haar filter bank is normalized such that the p-values of the Bi-Haar coefficients provide a good approximation to those of the Haar coefficients for high-intensity settings or large scales; for low-intensity settings and small scales, we show that the Bi-Haar p-values are essentially upper-bounded by the Haar ones. Thus, we may apply the Haar-based HTs to Bi-Haar coefficients to control a prefixed false positive rate. By doing so, we benefit from the regular Bi-Haar filter bank to gain a smooth estimate while always maintaining a low computational complexity. A Fisher-approximation-based threshold implementing the HTs is also established. The efficiency of this method is illustrated on an example of hyperspectral-source-flux estimation.

11. The Euler-Poisson-Darboux equation for relativists

Stewart, John M.

2009-09-01

12. Analysis of Poisson frequency data under a simple crossover trial.

PubMed

Lui, Kung-Jong; Chang, Kuang-Chao

2016-02-01

When the frequency of occurrence for an event of interest follows a Poisson distribution, we develop asymptotic and exact procedures for testing non-equality, non-inferiority and equivalence, as well as asymptotic and exact interval estimators for the ratio of mean frequencies between two treatments under a simple crossover design. Using Monte Carlo simulations, we evaluate the performance of these test procedures and interval estimators in a variety of situations. We note that all asymptotic test procedures developed here can generally perform well with respect to Type I error and can be preferable to the exact test procedure with respect to power if the number of patients per group is moderate or large. We further find that in these cases the asymptotic interval estimator with the logarithmic transformation can be more precise than the exact interval estimator without sacrificing the accuracy with respect to the coverage probability. However, the exact test procedure and exact interval estimator can be of use when the number of patients per group is small. We use a double-blind randomized crossover trial comparing salmeterol with a placebo in exacerbations of asthma to illustrate the practical use of these estimators.
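A standard asymptotic interval of the kind compared in the paper is the Wald interval for the log ratio of two Poisson counts, sketched here for equal observation periods and ignoring the crossover period effect the paper's estimators account for:

```python
# Wald CI for the ratio of two Poisson mean frequencies, built on the log
# scale: log(x1/x2) +/- z * sqrt(1/x1 + 1/x2), then exponentiated.
# The counts 30 vs 50 are hypothetical, for illustration only.
import math

def poisson_ratio_ci(x1, x2, z=1.96):
    log_ratio = math.log(x1 / x2)
    se = math.sqrt(1.0 / x1 + 1.0 / x2)
    return math.exp(log_ratio - z * se), math.exp(log_ratio + z * se)

lo, hi = poisson_ratio_ci(30, 50)   # e.g. 30 vs 50 exacerbations
```

Here the interval excludes 1, so a two-sided test of non-equality at the 5% level would reject equal mean frequencies for these counts.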

13. The Poisson Gamma distribution for wind speed data

Ćakmakyapan, Selen; Özel, Gamze

2016-04-01

Wind energy is one of the most significant alternative clean energy sources and among the most rapidly developing renewable energy sources in the world. For the evaluation of wind energy potential, probability density functions (pdfs) are usually used to model wind speed distributions. Selecting the appropriate pdf reduces the wind power estimation error and captures the wind speed characteristics. In the literature, many different pdfs have been used to model wind speed data for wind energy applications. In this study, we propose a new probability distribution to model wind speed data. First, we define the new probability distribution, named the Poisson-Gamma (PG) distribution, and analyze wind speed data sets, recorded at five pressure levels, for a station. The data sets were obtained from the Turkish State Meteorological Service. We then model the data sets with the Exponential, Weibull, Lomax, three-parameter Burr, Gumbel, Gamma, and Rayleigh distributions, which are commonly used for wind speed data, as well as the PG distribution. Finally, we compare the fitted distributions to select the best model and demonstrate that the PG distribution models the data sets best.

14. Bayesian Inference and Online Learning in Poisson Neuronal Networks.

PubMed

Huang, Yanping; Rao, Rajesh P N

2016-08-01

Motivated by the growing evidence for Bayesian computation in the brain, we show how a two-layer recurrent network of Poisson neurons can perform both approximate Bayesian inference and learning for any hidden Markov model. The lower-layer sensory neurons receive noisy measurements of hidden world states. The higher-layer neurons infer a posterior distribution over world states via Bayesian inference from inputs generated by sensory neurons. We demonstrate how such a neuronal network with synaptic plasticity can implement a form of Bayesian inference similar to Monte Carlo methods such as particle filtering. Each spike in a higher-layer neuron represents a sample of a particular hidden world state. The spiking activity across the neural population approximates the posterior distribution over hidden states. In this model, variability in spiking is regarded not as a nuisance but as an integral feature that provides the variability necessary for sampling during inference. We demonstrate how the network can learn the likelihood model, as well as the transition probabilities underlying the dynamics, using a Hebbian learning rule. We present results illustrating the ability of the network to perform inference and learning for arbitrary hidden Markov models.

15. Universal Poisson Statistics of mRNAs with Complex Decay Pathways.

PubMed

Thattai, Mukund

2016-01-19

Messenger RNA (mRNA) dynamics in single cells are often modeled as a memoryless birth-death process with a constant probability per unit time that an mRNA molecule is synthesized or degraded. This predicts a Poisson steady-state distribution of mRNA number, in close agreement with experiments. This is surprising, since mRNA decay is known to be a complex process. The paradox is resolved by realizing that the Poisson steady state generalizes to arbitrary mRNA lifetime distributions. A mapping between mRNA dynamics and queueing theory highlights an identifiability problem: a measured Poisson steady state is consistent with a large variety of microscopic models. Here, I provide a rigorous and intuitive explanation for the universality of the Poisson steady state. I show that the mRNA birth-death process and its complex decay variants all take the form of the familiar Poisson law of rare events, under a nonlinear rescaling of time. As a corollary, not only steady-states but also transients are Poisson distributed. Deviations from the Poisson form occur only under two conditions, promoter fluctuations leading to transcriptional bursts or nonindependent degradation of mRNA molecules. These results place severe limits on the power of single-cell experiments to probe microscopic mechanisms, and they highlight the need for single-molecule measurements.
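The memoryless birth-death model described above can be simulated exactly with the Gillespie algorithm; a minimal sketch with hypothetical rates, illustrating the Poisson signature (mean ≈ variance) at steady state:

```python
import random

def gillespie_mrna(k_syn=10.0, k_deg=1.0, t_end=30.0, seed=0):
    """Exact stochastic simulation of constant-rate mRNA synthesis and
    first-order degradation; returns the copy number at t_end.
    The steady state is Poisson with mean k_syn / k_deg."""
    rng = random.Random(seed)
    t, n = 0.0, 0
    while True:
        total = k_syn + k_deg * n          # total event rate
        t += rng.expovariate(total)        # time to the next event
        if t >= t_end:
            return n
        if rng.random() < k_syn / total:   # synthesis vs degradation
            n += 1
        else:
            n -= 1

samples = [gillespie_mrna(seed=s) for s in range(1000)]
mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)
# Poisson signature at steady state: mean and variance both ~ k_syn/k_deg = 10
```

Per the abstract, replacing the exponential lifetime with other lifetime distributions (independent degradation, no bursting) leaves this Poisson steady state unchanged, which is exactly why such measurements cannot identify the decay mechanism.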

16. Pluripotent stem cell expansion and neural differentiation in 3-D scaffolds of tunable Poisson's ratio.

PubMed

Yan, Yuanwei; Li, Yan; Song, Liqing; Zeng, Changchun; Li, Yan

2017-02-01

Biophysical properties of scaffolds, such as the elastic modulus, have recently been shown to impact stem cell lineage commitment. By contrast, the contribution of Poisson's ratio, another important biophysical property, to stem cell fate decisions has not been studied. Scaffolds with tunable Poisson's ratio (ν) (termed auxetic scaffolds when the Poisson's ratio is zero or negative) are anticipated to provide a spectrum of unique biophysical 3-D microenvironments to influence stem cell fate. To test this hypothesis, in the present work we fabricated auxetic polyurethane scaffolds (ν=0 to -0.45) and evaluated their effects on neural differentiation of mouse embryonic stem cells (ESCs) and human induced pluripotent stem cells (hiPSCs). Compared to the regular scaffolds (ν=+0.30) before auxetic conversion, the auxetic scaffolds supported smaller aggregate formation and higher expression of β-tubulin III upon neural differentiation. The influences of pore structure, Poisson's ratio, and elastic modulus on neural lineage commitment were further evaluated using a series of auxetic scaffolds. The results indicate that Poisson's ratio may confound the effects of elastic modulus, and auxetic scaffolds with proper pore structure and Poisson's ratio enhance neural differentiation. This study demonstrates that tuning the Poisson's ratio of the scaffolds together with the elastic modulus and microstructure would enhance the capability to generate broader, more diversified ranges of biophysical 3-D microenvironments for the modulation of cellular differentiation.

17. XRA image segmentation using regression

Jin, Jesse S.

1996-04-01

Segmentation is an important step in image analysis, and thresholding is one of the most common approaches. Segmentation presents several difficulties, such as automatic threshold selection, intensity distortion, and noise removal. We have developed an adaptive segmentation scheme by applying the Central Limit Theorem in regression. A Gaussian regression is used to separate the distribution of background from foreground in a single-peak histogram; the separation helps to determine the threshold automatically. A small 3 by 3 window is applied, and the mode of the local histogram is used to overcome noise. Thresholding is based on local weighting, where regression is used again for parameter estimation. A connectivity test is applied to the final results to remove impulse noise. We have applied the algorithm to x-ray angiogram images to extract brain arteries. The algorithm works well for single-peak distributions where there is no valley in the histogram. The regression provides a method to apply knowledge in clustering. Extending regression to multiple-level segmentation needs further investigation.

18. Survival Data and Regression Models

Grégoire, G.

2014-12-01

We start this chapter by introducing some basic elements for the analysis of censored survival data. Then we focus on right-censored data and develop two types of regression models. The first concerns the so-called accelerated failure time (AFT) models, which are parametric models in which a function of a parameter depends linearly on the covariables. The second is a semiparametric model, in which the covariables enter in a multiplicative form in the expression of the hazard rate function. The main statistical tool for analysing these regression models is the maximum likelihood methodology and, although we recall some essential results of ML theory, we refer to the chapter "Logistic Regression" for a more detailed presentation.
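The two model families discussed above are conventionally written as follows (standard formulations, stated as a summary rather than the chapter's own notation):

```latex
% AFT: a function of the lifetime (here its logarithm) is linear
% in the covariables, with error term \varepsilon_i and scale \sigma
\log T_i = x_i^{\top}\beta + \sigma\,\varepsilon_i

% Semiparametric (Cox-type) model: covariables act multiplicatively
% on an unspecified baseline hazard h_0(t)
h(t \mid x_i) = h_0(t)\,\exp\bigl(x_i^{\top}\beta\bigr)
```

In the AFT family, the choice of distribution for ε determines the parametric model (e.g., Weibull, log-logistic); in the semiparametric model, h₀ is left unspecified and β is estimated from the partial likelihood.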

19. Regressive evolution in Astyanax cavefish.

PubMed

Jeffery, William R

2009-01-01

A diverse group of animals, including members of most major phyla, have adapted to life in the perpetual darkness of caves. These animals are united by the convergence of two regressive phenotypes, loss of eyes and pigmentation. The mechanisms of regressive evolution are poorly understood. The teleost Astyanax mexicanus is of special significance in studies of regressive evolution in cave animals. This species includes an ancestral surface-dwelling form and many conspecific cave-dwelling forms, some of which have evolved their regressive phenotypes independently. Recent advances in Astyanax development and genetics have provided new information about how eyes and pigment are lost during cavefish evolution; namely, they have revealed some of the molecular and cellular mechanisms involved in trait modification, the number and identity of the underlying genes and mutations, the molecular basis of parallel evolution, and the evolutionary forces driving adaptation to the cave environment.

20. Holographic study of conventional and negative Poisson's ratio metallic foams - Elasticity, yield and micro-deformation

NASA Technical Reports Server (NTRS)

Chen, C. P.; Lakes, R. S.

1991-01-01

An experimental study by holographic interferometry is reported of the following material properties of conventional and negative Poisson's ratio copper foams: Young's moduli, Poisson's ratios, yield strengths and characteristic lengths associated with inhomogeneous deformation. The Young's modulus and yield strength of the conventional copper foam were comparable to those predicted by microstructural modeling on the basis of cellular rib bending. The reentrant copper foam exhibited a negative Poisson's ratio, as indicated by the elliptical contour fringes on the specimen surface in the bending tests. Inhomogeneous, non-affine deformation was observed holographically in both foam materials.

1. Universal Negative Poisson Ratio of Self-Avoiding Fixed-Connectivity Membranes

SciTech Connect

Bowick, M.; Cacciuto, A.; Thorleifsson, G.; Travesset, A.

2001-10-01

We determine the Poisson ratio of self-avoiding fixed-connectivity membranes, modeled as impenetrable plaquettes, to be σ = -0.37(6), in statistical agreement with the Poisson ratio of phantom fixed-connectivity membranes σ = -0.32(4). Together with the equality of critical exponents, this result implies a unique universality class for fixed-connectivity membranes. Our findings thus establish that physical fixed-connectivity membranes provide a wide class of auxetic (negative Poisson ratio) materials with significant potential applications in materials science.

2. Hamiltonian field description of the one-dimensional Poisson-Vlasov equations

SciTech Connect

Morrison, P.J.

1981-07-01

The one-dimensional Poisson-Vlasov equations are cast into Hamiltonian form. A Poisson Bracket in terms of the phase space density, as sole dynamical variable, is presented. This Poisson bracket is not of the usual form, but possesses the commutator properties of antisymmetry, bilinearity, and nonassociativity by virtue of the Jacobi requirement. Clebsch potentials are seen to yield a conventional (canonical) formulation. This formulation is discretized by expansion in terms of an arbitrary complete set of basis functions. In particular, a wave field representation is obtained.
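A noncanonical bracket of the kind described, with the phase-space density f(x, v, t) as the sole dynamical variable, is commonly written in the following form (reproduced from the later literature on this formulation, stated here as an assumption about the bracket meant in the abstract):

```latex
\{F, G\}[f] = \int f(x,v)\left[
  \frac{\partial}{\partial x}\!\left(\frac{\delta F}{\delta f}\right)
  \frac{\partial}{\partial v}\!\left(\frac{\delta G}{\delta f}\right)
  -
  \frac{\partial}{\partial v}\!\left(\frac{\delta F}{\delta f}\right)
  \frac{\partial}{\partial x}\!\left(\frac{\delta G}{\delta f}\right)
\right] dx\, dv
```

The Vlasov equation is then recovered as f_t = {f, H} for a suitable Hamiltonian functional H[f]; the bracket is not of canonical form, but satisfies antisymmetry, bilinearity, and the Jacobi identity, as the abstract notes.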

3. Poisson-weighted Lindley distribution and its application on insurance claim data

Manesh, Somayeh Nik; Hamzah, Nor Aishah; Zamani, Hossein

2014-07-01

This paper introduces a new two-parameter mixed Poisson distribution, namely the Poisson-weighted Lindley (P-WL), which is obtained by mixing the Poisson with a new class of weighted Lindley distributions. The closed form, the moment generating function and the probability generating function are derived. The parameter estimations methods of moments and the maximum likelihood procedure are provided. Some simulation studies are conducted to investigate the performance of P-WL distribution. In addition, the compound P-WL distribution is derived and some applications to insurance area based on observations of the number of claims and on observations of the total amount of claims incurred will be illustrated.

4. Blow-up conditions for two dimensional modified Euler-Poisson equations

Lee, Yongki

2016-09-01

The multi-dimensional Euler-Poisson system describes the dynamic behavior of many important physical flows, yet as a hyperbolic system its solution can blow-up for some initial configurations. This article strives to advance our understanding on the critical threshold phenomena through the study of a two-dimensional modified Euler-Poisson system with a modified Riesz transform where the singularity at the origin is removed. We identify upper-thresholds for finite time blow-up of solutions for the modified Euler-Poisson equations with attractive/repulsive forcing.

5. Universal negative poisson ratio of self-avoiding fixed-connectivity membranes.

PubMed

Bowick, M; Cacciuto, A; Thorleifsson, G; Travesset, A

2001-10-01

We determine the Poisson ratio of self-avoiding fixed-connectivity membranes, modeled as impenetrable plaquettes, to be sigma = -0.37(6), in statistical agreement with the Poisson ratio of phantom fixed-connectivity membranes sigma = -0.32(4). Together with the equality of critical exponents, this result implies a unique universality class for fixed-connectivity membranes. Our findings thus establish that physical fixed-connectivity membranes provide a wide class of auxetic (negative Poisson ratio) materials with significant potential applications in materials science.

6. A regularization corrected score method for nonlinear regression models with covariate error.

PubMed

Zucker, David M; Gorfine, Malka; Li, Yi; Tadesse, Mahlet G; Spiegelman, Donna

2013-03-01

Many regression analyses involve explanatory variables that are measured with error, and failing to account for this error is well known to lead to biased point and interval estimates of the regression coefficients. We present here a new general method for adjusting for covariate error. Our method consists of an approximate version of the Stefanski-Nakamura corrected score approach, using the method of regularization to obtain an approximate solution of the relevant integral equation. We develop the theory in the setting of classical likelihood models; this setting covers, for example, linear regression, nonlinear regression, logistic regression, and Poisson regression. The method is extremely general in terms of the types of measurement error models covered, and is a functional method in the sense of not involving assumptions on the distribution of the true covariate. We discuss the theoretical properties of the method and present simulation results in the logistic regression setting (univariate and multivariate). For illustration, we apply the method to data from the Harvard Nurses' Health Study concerning the relationship between physical activity and breast cancer mortality in the period following a diagnosis of breast cancer.

7. Generalized master equation via aging continuous-time random walks.

PubMed

Allegrini, Paolo; Aquino, Gerardo; Grigolini, Paolo; Palatella, Luigi; Rosa, Angelo

2003-11-01

We discuss the problem of the equivalence between continuous-time random walk (CTRW) and generalized master equation (GME). The walker, making instantaneous jumps from one site of the lattice to another, resides in each site for extended times. The sojourn times have a distribution density ψ(t) that is assumed to be an inverse power law with power index μ. We assume that the Onsager principle is fulfilled, and we use this assumption to establish a complete equivalence between GME and the Montroll-Weiss CTRW. We prove that this equivalence is confined to the case where ψ(t) is an exponential. We argue that this is so because the Montroll-Weiss CTRW, as recently proved by Barkai [E. Barkai, Phys. Rev. Lett. 90, 104101 (2003)], is nonstationary, thereby implying aging, while the Onsager principle is valid only in the case of fully aged systems. The case of a Poisson distribution of sojourn times is the only one with no aging associated with it, and consequently with no need to establish special initial conditions to fulfill the Onsager principle. We consider the case of a dichotomous fluctuation, and we prove that the Onsager principle is fulfilled for any form of regression to equilibrium provided that the stationary condition holds true. We set the stationary condition on both the CTRW and the GME, thereby creating a condition of total equivalence, regardless of the nature of the waiting-time distribution. As a consequence of this procedure we create a GME that is a bona fide master equation, in spite of being non-Markov. We note that the memory kernel of the GME affords information on the interaction between the system of interest and its bath. The Poisson case yields a bath with infinitely fast fluctuations. We argue that departing from the Poisson form has the effect of creating a condition of infinite memory and that these results might be useful to shed light on the problem of how to unravel non-Markov quantum master equations.

8. 3DGRAPE - THREE DIMENSIONAL GRIDS ABOUT ANYTHING BY POISSON'S EQUATION

NASA Technical Reports Server (NTRS)

Sorenson, R. L.

1994-01-01

The ability to treat arbitrary boundary shapes is one of the most desirable characteristics of a method for generating grids. 3DGRAPE is designed to make computational grids in or about almost any shape. These grids are generated by the solution of Poisson's differential equations in three dimensions. The program automatically finds its own values for inhomogeneous terms which give near-orthogonality and controlled grid cell height at boundaries. Grids generated by 3DGRAPE have been applied to both viscous and inviscid aerodynamic problems, and to problems in other fluid-dynamic areas. 3DGRAPE uses zones to solve the problem of warping one cube into the physical domain in real-world computational fluid dynamics problems. In a zonal approach, a physical domain is divided into regions, each of which maps into its own computational cube. It is believed that even the most complicated physical region can be divided into zones, and since it is possible to warp a cube into each zone, a grid generator which is oriented to zones and allows communication across zonal boundaries (where appropriate) solves the problem of topological complexity. 3DGRAPE expects to read in already-distributed x,y,z coordinates on the bodies of interest, coordinates which will remain fixed during the entire grid-generation process. The 3DGRAPE code makes no attempt to fit given body shapes and redistribute points thereon. Body-fitting is a formidable problem in itself. The user must either be working with some simple analytical body shape, upon which a simple analytical distribution can be easily effected, or must have available some sophisticated stand-alone body-fitting software. 3DGRAPE does not require the user to supply the block-to-block boundaries nor the shapes of the distribution of points. 3DGRAPE will typically supply those block-to-block boundaries simply as surfaces in the elliptic grid. Thus at block-to-block boundaries the following conditions are obtained: (1) grid lines will

9. Geostatistical analysis of disease data: estimation of cancer mortality risk from empirical frequencies using Poisson kriging

PubMed Central

Goovaerts, Pierre

2005-01-01

Background Cancer mortality maps are used by public health officials to identify areas of excess and to guide surveillance and control activities. Quality of decision-making thus relies on an accurate quantification of risks from observed rates which can be very unreliable when computed from sparsely populated geographical units or recorded for minority populations. This paper presents a geostatistical methodology that accounts for spatially varying population sizes and spatial patterns in the processing of cancer mortality data. Simulation studies are conducted to compare the performances of Poisson kriging to a few simple smoothers (i.e. population-weighted estimators and empirical Bayes smoothers) under different scenarios for the disease frequency, the population size, and the spatial pattern of risk. A public-domain executable with example datasets is provided. Results The analysis of age-adjusted mortality rates for breast and cervix cancers illustrated some key features of commonly used smoothing techniques. Because of the small weight assigned to the rate observed over the entity being smoothed (kernel weight), the population-weighted average leads to risk maps that show little variability. Other techniques assign larger and similar kernel weights but they use a different piece of auxiliary information in the prediction: global or local means for global or local empirical Bayes smoothers, and spatial combination of surrounding rates for the geostatistical estimator. Simulation studies indicated that Poisson kriging outperforms other approaches for most scenarios, with a clear benefit when the risk values are spatially correlated. Global empirical Bayes smoothers provide more accurate predictions under the least frequent scenario of spatially random risk. Conclusion The approach presented in this paper enables researchers to incorporate the pattern of spatial dependence of mortality rates into the mapping of risk values and the quantification of the

10. Linear regression analysis of survival data with missing censoring indicators

PubMed Central

Wang, Qihua

2010-01-01

Linear regression analysis has been studied extensively in a random censorship setting, but typically all of the censoring indicators are assumed to be observed. In this paper, we develop synthetic data methods for estimating regression parameters in a linear model when some censoring indicators are missing. We define estimators based on regression calibration, imputation, and inverse probability weighting techniques, and we prove all three estimators are asymptotically normal. The finite-sample performance of each estimator is evaluated via simulation. We illustrate our methods by assessing the effects of sex and age on the time to non-ambulatory progression for patients in a brain cancer clinical trial. PMID:20559722

11. Cactus: An Introduction to Regression

ERIC Educational Resources Information Center

Hyde, Hartley

2008-01-01

When the author first used "VisiCalc," the author thought it a very useful tool when he had the formulas. But how could he design a spreadsheet if there was no known formula for the quantities he was trying to predict? A few months later, the author relates he learned to use multiple linear regression software and suddenly it all clicked into…

12. Multiple Regression: A Leisurely Primer.

ERIC Educational Resources Information Center

Daniel, Larry G.; Onwuegbuzie, Anthony J.

Multiple regression is a useful statistical technique when the researcher is considering situations in which variables of interest are theorized to be multiply caused. It may also be useful in those situations in which the researcher is interested in studies of predictability of phenomena of interest. This paper provides an introduction to…

13. Weighting Regressions by Propensity Scores

ERIC Educational Resources Information Center

Freedman, David A.; Berk, Richard A.

2008-01-01

Regressions can be weighted by propensity scores in order to reduce bias. However, weighting is likely to increase random error in the estimates, and to bias the estimated standard errors downward, even when selection mechanisms are well understood. Moreover, in some cases, weighting will increase the bias in estimated causal parameters. If…

14. Quantile Regression with Censored Data

ERIC Educational Resources Information Center

Lin, Guixian

2009-01-01

The Cox proportional hazards model and the accelerated failure time model are frequently used in survival data analysis. They are powerful, yet have limitation due to their model assumptions. Quantile regression offers a semiparametric approach to model data with possible heterogeneity. It is particularly powerful for censored responses, where the…

15. A Poisson-based adaptive affinity propagation clustering for SAGE data.

PubMed

Tang, DongMing; Zhu, QingXin; Yang, Fan

2010-02-01

Serial analysis of gene expression (SAGE) is a powerful tool to obtain gene expression profiles. Clustering analysis is a valuable technique for analyzing SAGE data. In this paper, we propose an adaptive clustering method for SAGE data analysis, namely, PoissonAPS. The method incorporates a novel clustering algorithm, Affinity Propagation (AP). While the AP algorithm has demonstrated good performance on many different data sets, it also faces several limitations. PoissonAPS overcomes the limitations of AP by using the clustering validation measure as a cost function of merging and splitting, and as a result, it can automatically cluster SAGE data without user-specified parameters. We evaluated PoissonAPS and compared its performance with other methods on several real-life SAGE datasets. The experimental results show that PoissonAPS can produce meaningful and interpretable clusters for SAGE data.

16. Particle trapping: A key requisite of structure formation and stability of Vlasov–Poisson plasmas

SciTech Connect

Schamel, Hans

2015-04-15

Particle trapping is shown to control the existence of undamped coherent structures in Vlasov–Poisson plasmas and thereby affects the onset of plasma instability beyond the realm of linear Landau theory.

17. A Hands-on Activity for Teaching the Poisson Distribution Using the Stock Market

ERIC Educational Resources Information Center

Dunlap, Mickey; Studstill, Sharyn

2014-01-01

The number of increases a particular stock makes over a fixed period follows a Poisson distribution. This article discusses using this easily-found data as an opportunity to let students become involved in the data collection and analysis process.
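A sketch of the classroom analysis the article suggests: estimate the Poisson rate from counts of stock increases and compare observed with expected frequencies. The counts below are invented for illustration:

```python
import math
from collections import Counter

# Hypothetical data: number of price increases per week for one stock
counts = [3, 1, 4, 2, 0, 2, 3, 1, 2, 5, 2, 1, 3, 2, 4, 0, 1, 2, 3, 2]
lam = sum(counts) / len(counts)  # MLE of the Poisson rate

def poisson_pmf(k, lam):
    """P(X = k) for X ~ Poisson(lam)."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

observed = Counter(counts)
for k in range(max(counts) + 1):
    expected = len(counts) * poisson_pmf(k, lam)
    print(f"k={k}: observed {observed.get(k, 0):2d}, expected {expected:5.2f}")
```

Students can then eyeball the observed/expected columns, or follow up with a goodness-of-fit test, to judge whether the Poisson assumption is reasonable for their stock.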

18. Hierarchical Adaptive Regression Kernels for Regression with Functional Predictors

PubMed Central

Woodard, Dawn B.; Crainiceanu, Ciprian; Ruppert, David

2013-01-01

We propose a new method for regression using a parsimonious and scientifically interpretable representation of functional predictors. Our approach is designed for data that exhibit features such as spikes, dips, and plateaus whose frequency, location, size, and shape varies stochastically across subjects. We propose Bayesian inference of the joint functional and exposure models, and give a method for efficient computation. We contrast our approach with existing state-of-the-art methods for regression with functional predictors, and show that our method is more effective and efficient for data that include features occurring at varying locations. We apply our methodology to a large and complex dataset from the Sleep Heart Health Study, to quantify the association between sleep characteristics and health outcomes. Software and technical appendices are provided in online supplemental materials. PMID:24293988

19. SU-E-T-144: Bayesian Inference of Local Relapse Data Using a Poisson-Based Tumour Control Probability Model

SciTech Connect

La Russa, D

2015-06-15

Purpose: The purpose of this project is to develop a robust method of parameter estimation for a Poisson-based TCP model using Bayesian inference. Methods: Bayesian inference was performed using the PyMC3 probabilistic programming framework written in Python. A Poisson-based TCP regression model that accounts for clonogen proliferation was fit to observed rates of local relapse as a function of equivalent dose in 2 Gy fractions for a population of 623 stage-I non-small-cell lung cancer patients. The Slice Markov Chain Monte Carlo sampling algorithm was used to sample the posterior distributions, and was initiated using the maximum of the posterior distributions found by optimization. The calculation of TCP with each sample step required integration over the free parameter α, which was performed using an adaptive 24-point Gauss-Legendre quadrature. Convergence was verified via inspection of the trace plot and posterior distribution for each of the fit parameters, as well as with comparisons of the most probable parameter values with their respective maximum likelihood estimates. Results: Posterior distributions for α, the standard deviation of α (σ), the average tumour cell-doubling time (Td), and the repopulation delay time (Tk), were generated assuming α/β = 10 Gy, and a fixed clonogen density of 10⁷ cm⁻³. Posterior predictive plots generated from samples from these posterior distributions are in excellent agreement with the observed rates of local relapse used in the Bayesian inference. The most probable values of the model parameters also agree well with maximum likelihood estimates. Conclusion: A robust method of performing Bayesian inference of TCP data using a complex TCP model has been established.
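The underlying Poisson TCP form (without the clonogen-proliferation and α-averaging terms the study adds) can be sketched as follows; all parameter values are hypothetical placeholders, not the study's estimates:

```python
import math

def poisson_tcp(dose, alpha=0.30, beta=0.03, dose_per_fraction=2.0, n0=1e7):
    """Poisson TCP with linear-quadratic cell survival and no proliferation:
    TCP = exp(-N0 * SF), where SF = exp(-alpha*D - beta*d*D) for total
    dose D delivered in fractions of size d. Parameter values are invented."""
    sf = math.exp(-alpha * dose - beta * dose_per_fraction * dose)
    return math.exp(-n0 * sf)

# TCP rises sigmoidally with total dose
curve = [(d, poisson_tcp(float(d))) for d in range(0, 101, 10)]
```

The Poisson step is the final line: if the expected number of surviving clonogens is N0·SF, the probability that none survive (i.e., tumour control) is exp(-N0·SF).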

20. Noise parameter estimation for poisson corrupted images using variance stabilization transforms.

PubMed

Jin, Xiaodan; Xu, Zhenyu; Hirakawa, Keigo

2014-03-01

Noise is present in all images captured by real-world image sensors. The Poisson distribution is said to model the stochastic nature of the photon arrival process and agrees with the distribution of measured pixel values. We propose a method for estimating unknown noise parameters from Poisson-corrupted images using properties of variance stabilization. With a significantly lower computational complexity and improved stability, the proposed estimation technique yields noise parameters that are comparable in accuracy to state-of-the-art methods.
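The paper's estimator itself is not reproduced here, but the core idea of variance stabilization can be illustrated with the classical Anscombe transform, which maps Poisson data to approximately unit variance regardless of the underlying intensity:

```python
import math
import random

def anscombe(x):
    """Anscombe variance-stabilizing transform: Poisson(lam) samples become
    approximately Gaussian with variance ~ 1 when lam is not too small."""
    return 2.0 * math.sqrt(x + 3.0 / 8.0)

def poisson_sample(lam, rng):
    # Knuth's multiplicative method (the stdlib has no Poisson sampler);
    # fine for moderate lam
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

rng = random.Random(0)
for lam in (5.0, 20.0, 80.0):
    ys = [anscombe(poisson_sample(lam, rng)) for _ in range(20000)]
    m = sum(ys) / len(ys)
    v = sum((y - m) ** 2 for y in ys) / len(ys)
    print(f"lam={lam:5.1f}: stabilized variance ~ {v:.3f}")
```

After stabilization the noise level no longer depends on the signal, which is what makes it possible to estimate noise parameters with ordinary Gaussian tools.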

1. Massively Parallel Solution of Poisson Equation on Coarse Grain MIMD Architectures

NASA Technical Reports Server (NTRS)

Fijany, A.; Weinberger, D.; Roosta, R.; Gulati, S.

1998-01-01

In this paper a new algorithm, designated as Fast Invariant Imbedding algorithm, for solution of Poisson equation on vector and massively parallel MIMD architectures is presented. This algorithm achieves the same optimal computational efficiency as other Fast Poisson solvers while offering a much better structure for vector and parallel implementation. Our implementation on the Intel Delta and Paragon shows that a speedup of over two orders of magnitude can be achieved even for moderate size problems.
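The Fast Invariant Imbedding algorithm itself is not specified in the abstract; as a minimal illustration of what a discrete Poisson solve involves, here is a one-dimensional finite-difference solver using the O(n) Thomas (tridiagonal) algorithm, verified against a known exact solution:

```python
import math

def solve_poisson_1d(f, n):
    """Solve u'' = f(x) on (0, 1) with u(0) = u(1) = 0 using second-order
    finite differences and the O(n) Thomas algorithm."""
    h = 1.0 / (n + 1)
    x = [(i + 1) * h for i in range(n)]
    # Discrete system: -u[i-1] + 2 u[i] - u[i+1] = -h^2 f(x[i])
    diag = [2.0] * n
    rhs = [-h * h * f(xi) for xi in x]
    for i in range(1, n):                 # forward elimination (sub/super = -1)
        m = -1.0 / diag[i - 1]
        diag[i] += m                      # 2 - m * (-1)
        rhs[i] -= m * rhs[i - 1]
    u = [0.0] * n
    u[-1] = rhs[-1] / diag[-1]
    for i in range(n - 2, -1, -1):        # back substitution
        u[i] = (rhs[i] + u[i + 1]) / diag[i]
    return x, u

# Verify against u(x) = sin(pi x), for which u'' = -pi^2 sin(pi x)
x, u = solve_poisson_1d(lambda t: -math.pi ** 2 * math.sin(math.pi * t), 127)
err = max(abs(ui - math.sin(math.pi * xi)) for xi, ui in zip(x, u))
```

Fast Poisson solvers such as the one in the paper achieve comparable optimal complexity in higher dimensions, typically via FFTs or cyclic reduction, while restructuring the work to vectorize and parallelize well.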

2. Fractional poisson--a simple dose-response model for human norovirus.

PubMed

Messner, Michael J; Berger, Philip; Nappier, Sharon P

2014-10-01

This study utilizes old and new Norovirus (NoV) human challenge data to model the dose-response relationship for human NoV infection. The combined data set is used to update estimates from a previously published beta-Poisson dose-response model that includes parameters for virus aggregation and for a beta-distribution that describes variable susceptibility among hosts. The quality of the beta-Poisson model is examined and a simpler model is proposed. The new model (fractional Poisson) characterizes hosts as either perfectly susceptible or perfectly immune, requiring a single parameter (the fraction of perfectly susceptible hosts) in place of the two-parameter beta-distribution. A second parameter is included to account for virus aggregation in the same fashion as it is added to the beta-Poisson model. Infection probability is simply the product of the probability of nonzero exposure (at least one virus or aggregate is ingested) and the fraction of susceptible hosts. The model is computationally simple and appears to be well suited to the data from the NoV human challenge studies. The model's deviance is similar to that of the beta-Poisson, but with one parameter, rather than two. As a result, the Akaike information criterion favors the fractional Poisson over the beta-Poisson model. At low, environmentally relevant exposure levels (<100), estimation error is small for the fractional Poisson model; however, caution is advised because no subjects were challenged at such a low dose. New low-dose data would be of great value to further clarify the NoV dose-response relationship and to support improved risk assessment for environmentally relevant exposures.
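A sketch of the fractional Poisson dose-response without the aggregation parameter (the susceptible fraction below is a hypothetical value, not the study's fitted estimate):

```python
import math

def fractional_poisson(dose, f_susceptible=0.72):
    """Fractional Poisson dose-response, ignoring virus aggregation:
    P(infection) = (fraction of susceptible hosts)
                   * P(at least one virus ingested)
                 = f * (1 - exp(-mean dose)),
    since the ingested count is Poisson with the given mean dose."""
    return f_susceptible * (1.0 - math.exp(-dose))
```

The single parameter f caps the attack rate: even at very high doses, infection probability plateaus at the susceptible fraction, which is what replaces the beta-distributed susceptibility of the beta-Poisson model.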

3. Finite element solution of torsion and other 2-D Poisson equations

NASA Technical Reports Server (NTRS)

Everstine, G. C.

1982-01-01

The NASTRAN structural analysis computer program may be used, without modification, to solve two dimensional Poisson equations such as arise in the classical Saint Venant torsion problem. The nonhomogeneous term (the right-hand side) in the Poisson equation can be handled conveniently by specifying a gravitational load in a "structural" analysis. The use of an analogy between the equations of elasticity and those of classical mathematical physics is summarized in detail.

4. Hyperbolically Patterned 3D Graphene Metamaterial with Negative Poisson's Ratio and Superelasticity.

PubMed

Zhang, Qiangqiang; Xu, Xiang; Lin, Dong; Chen, Wenli; Xiong, Guoping; Yu, Yikang; Fisher, Timothy S; Li, Hui

2016-03-16

A hyperbolically patterned 3D graphene metamaterial (GM) with negative Poisson's ratio and superelasticity is highlighted. It is synthesized by a modified hydrothermal approach and subsequent oriented freeze-casting strategy. GM presents a tunable Poisson's ratio by adjusting the structural porosity, macroscopic aspect ratio (L/D), and freeze-casting conditions. Such a GM suggests promising applications as soft actuators, sensors, robust shock absorbers, and environmental remediation.

5. Regression Verification Using Impact Summaries

NASA Technical Reports Server (NTRS)

Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana

2013-01-01

Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce analysis cost of the current version. Regression verification addresses the problem of proving equivalence of closely related program

6. Bayesian inference on multiscale models for poisson intensity estimation: applications to photon-limited image denoising.

PubMed

Lefkimmiatis, Stamatios; Maragos, Petros; Papandreou, George

2009-08-01

We present an improved statistical model for analyzing Poisson processes, with applications to photon-limited imaging. We build on previous work, adopting a multiscale representation of the Poisson process in which the ratios of the underlying Poisson intensities (rates) in adjacent scales are modeled as mixtures of conjugate parametric distributions. Our main contributions include: 1) a rigorous and robust regularized expectation-maximization (EM) algorithm for maximum-likelihood estimation of the rate-ratio density parameters directly from the noisy observed Poisson data (counts); 2) extension of the method to work under a multiscale hidden Markov tree model (HMT) which couples the mixture label assignments in consecutive scales, thus modeling interscale coefficient dependencies in the vicinity of image edges; 3) exploration of a 2-D recursive quad-tree image representation, involving Dirichlet-mixture rate-ratio densities, instead of the conventional separable binary-tree image representation involving beta-mixture rate-ratio densities; and 4) a novel multiscale image representation, which we term Poisson-Haar decomposition, that better models the image edge structure, thus yielding improved performance. Experimental results on standard images with artificially simulated Poisson noise and on real photon-limited images demonstrate the effectiveness of the proposed techniques.

7. An intrinsic algorithm for parallel Poisson disk sampling on arbitrary surfaces.

PubMed

Ying, Xiang; Xin, Shi-Qing; Sun, Qian; He, Ying

2013-09-01

Poisson disk sampling has excellent spatial and spectral properties, and plays an important role in a variety of visual computing applications. Although many promising algorithms have been proposed for multidimensional sampling in Euclidean space, very few studies have been reported with regard to the problem of generating Poisson disks on surfaces due to the complicated nature of the surface. This paper presents an intrinsic algorithm for parallel Poisson disk sampling on arbitrary surfaces. In sharp contrast to the conventional parallel approaches, our method neither partitions the given surface into small patches nor uses any spatial data structure to maintain the voids in the sampling domain. Instead, our approach assigns each sample candidate a random and unique priority that is unbiased with regard to the distribution. Hence, multiple threads can process the candidates simultaneously and resolve conflicts by checking the given priority values. Our algorithm guarantees that the generated Poisson disks are uniformly and randomly distributed without bias. It is worth noting that our method is intrinsic and independent of the embedding space. This intrinsic feature allows us to generate Poisson disk patterns on arbitrary surfaces in R^n. To our knowledge, this is the first intrinsic, parallel, and accurate algorithm for surface Poisson disk sampling. Furthermore, by manipulating the spatially varying density function, we can obtain adaptive sampling easily.
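
The priority idea can be illustrated with a serial, planar stand-in (assuming Euclidean distance rather than the paper's intrinsic surface metric): give every candidate a random priority, and accept a candidate only if it does not conflict with an already-accepted, higher-priority sample.

```python
import math
import random

def priority_dart_throwing(candidates, radius, seed=0):
    """Accept candidates in random-priority order, rejecting any candidate
    within `radius` of an already-accepted sample. This is a serial stand-in
    for the parallel, priority-based conflict resolution described above."""
    rng = random.Random(seed)
    # A random, unique priority per candidate, unbiased w.r.t. the distribution.
    prioritized = sorted(candidates, key=lambda _: rng.random())
    accepted = []
    for p in prioritized:
        if all(math.dist(p, q) >= radius for q in accepted):
            accepted.append(p)
    return accepted

rng = random.Random(1)
cands = [(rng.random(), rng.random()) for _ in range(2000)]
samples = priority_dart_throwing(cands, radius=0.05)
```

In the parallel version, candidates are processed concurrently and a candidate only defers to conflicting neighbors with higher priority; the serial loop above produces the same accepted set for a fixed priority assignment.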

8. On the Determination of Poisson Statistics for Haystack Radar Observations of Orbital Debris

NASA Technical Reports Server (NTRS)

Stokely, Christopher L.; Benbrook, James R.; Horstman, Matt

2007-01-01

A convenient and powerful method is used to determine if radar detections of orbital debris are observed according to Poisson statistics. This is done by analyzing the time interval between detection events. For Poisson statistics, the probability distribution of the time interval between events is shown to be an exponential distribution. This distribution is a special case of the Erlang distribution that is used in estimating traffic loads on telecommunication networks. Poisson statistics form the basis of many orbital debris models but the statistical basis of these models has not been clearly demonstrated empirically until now. Interestingly, during the fiscal year 2003 observations with the Haystack radar in a fixed staring mode, there are no statistically significant deviations observed from that expected with Poisson statistics, either independent or dependent of altitude or inclination. One would potentially expect some significant clustering of events in time as a result of satellite breakups, but the presence of Poisson statistics indicates that such debris disperse rapidly with respect to Haystack's very narrow radar beam. An exception to Poisson statistics is observed in the months following the intentional breakup of the Fengyun satellite in January 2007.
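
The diagnostic described here, checking that inter-detection times follow an exponential distribution, can be sketched as follows. This is a generic illustration on a simulated Poisson event stream, not the authors' code: the empirical survival fraction of the gaps is compared against exp(-lambda*t).

```python
import math
import random

def interevent_times(event_times):
    """Gaps between consecutive (sorted) event times."""
    ts = sorted(event_times)
    return [b - a for a, b in zip(ts, ts[1:])]

# Simulate a Poisson process with rate lam via exponential gaps,
# then compare the survival function of the gaps with exp(-lam * t).
random.seed(42)
lam = 2.0
t, events = 0.0, []
for _ in range(200_000):
    t += random.expovariate(lam)
    events.append(t)

gaps = interevent_times(events)
t0 = 0.5
empirical = sum(g > t0 for g in gaps) / len(gaps)
theoretical = math.exp(-lam * t0)
```

A real analysis would apply the same comparison (or a goodness-of-fit test) to the observed detection times rather than simulated ones; clustering from a breakup would show up as an excess of short gaps.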

9. Auxetic Black Phosphorus: A 2D Material with Negative Poisson's Ratio.

PubMed

Du, Yuchen; Maassen, Jesse; Wu, Wangran; Luo, Zhe; Xu, Xianfan; Ye, Peide D

2016-10-12

The Poisson's ratio of a material characterizes its response to uniaxial strain. Materials normally possess a positive Poisson's ratio: they contract laterally when stretched, and expand laterally when compressed. A negative Poisson's ratio is theoretically permissible but has not, with few exceptions of man-made bulk structures, been experimentally observed in any natural material. Here, we show that a negative Poisson's ratio exists in the low-dimensional natural material black phosphorus and that our experimental observations are consistent with first-principles simulations. By applying uniaxial strain along the armchair direction, we have succeeded in demonstrating a cross-plane interlayer negative Poisson's ratio in black phosphorus for the first time. Meanwhile, our results support the existence of a cross-plane intralayer negative Poisson's ratio in the constituent phosphorene layers under uniaxial deformation along the zigzag axis, which is in line with a previous theoretical prediction. The phenomenon originates from the puckered structure of its in-plane lattice, together with coupled hinge-like bonding configurations.
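
As a quick numeric anchor for the definition used throughout these records: the Poisson's ratio is the negative ratio of transverse to axial strain, so lateral expansion under stretch gives a negative value. A minimal sketch:

```python
def poisson_ratio(transverse_strain, axial_strain):
    # nu = -eps_transverse / eps_axial
    # Ordinary materials contract laterally when stretched: nu > 0.
    # Auxetic materials expand laterally when stretched:    nu < 0.
    return -transverse_strain / axial_strain

# Stretched 1% axially, contracts 0.3% laterally -> nu = +0.3 (ordinary).
ordinary = poisson_ratio(-0.003, 0.01)
# Stretched 1% axially, expands 0.2% laterally -> nu = -0.2 (auxetic).
auxetic = poisson_ratio(0.002, 0.01)
```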

10. High order solution of Poisson problems with piecewise constant coefficients and interface jumps

Marques, Alexandre Noll; Nave, Jean-Christophe; Rosales, Rodolfo Ruben

2017-04-01

We present a fast and accurate algorithm to solve Poisson problems in complex geometries, using regular Cartesian grids. We consider a variety of configurations, including Poisson problems with interfaces across which the solution is discontinuous (of the type arising in multi-fluid flows). The algorithm is based on a combination of the Correction Function Method (CFM) and Boundary Integral Methods (BIM). Interface and boundary conditions can be treated in a fast and accurate manner using boundary integral equations, and the associated BIM. Unfortunately, BIM can be costly when the solution is needed everywhere in a grid, e.g. in fluid flow problems. We use the CFM to circumvent this issue. The solution from the BIM is used to rewrite the problem as a series of Poisson problems in rectangular domains, which requires the BIM solution at interfaces/boundaries only. These Poisson problems involve discontinuities at interfaces, of the type that the CFM can handle. Hence we use the CFM to solve them (to high order of accuracy) with finite differences and a Fast Fourier Transform based fast Poisson solver. We present 2-D examples of the algorithm applied to Poisson problems involving complex geometries, including cases in which the solution is discontinuous. We show that the algorithm produces solutions that converge with either 3rd or 4th order of accuracy, depending on the type of boundary condition and solution discontinuity.
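
The "FFT-based fast Poisson solver" ingredient mentioned above can be sketched in its simplest periodic 1-D form (illustrative only; the paper works in 2-D with interface corrections):

```python
import numpy as np

def fft_poisson_periodic(f, length):
    """Solve u'' = f on a periodic 1-D domain (f must have zero mean).

    In Fourier space (i k)^2 u_hat = f_hat, so u_hat = -f_hat / k^2 for
    k != 0; the k = 0 mode fixes the arbitrary additive constant to zero.
    """
    n = f.size
    k = 2 * np.pi * np.fft.fftfreq(n, d=length / n)  # angular wavenumbers
    f_hat = np.fft.fft(f)
    u_hat = np.zeros_like(f_hat)
    nonzero = k != 0
    u_hat[nonzero] = -f_hat[nonzero] / k[nonzero] ** 2
    return np.real(np.fft.ifft(u_hat))

x = np.linspace(0, 2 * np.pi, 128, endpoint=False)
u = fft_poisson_periodic(-np.sin(x), 2 * np.pi)  # exact solution: sin(x)
```

The appeal is cost: the whole solve is O(n log n), which is why the CFM reduction to rectangular-domain Poisson problems pays off.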

11. Hydrodynamic limit of Wigner-Poisson kinetic theory: Revisited

Akbari-Moghanjoughi, M.

2015-02-01

In this paper, we revisit the hydrodynamic limit of the Langmuir wave dispersion relation based on the Wigner-Poisson model in connection with that obtained directly from the original Lindhard dielectric function based on the random-phase approximation. It is observed that the (fourth-order) expansion of the exact Lindhard dielectric constant correctly reduces to the hydrodynamic dispersion relation with an additional term of fourth order, beside that caused by the quantum diffraction effect. It is also revealed that the generalized Lindhard dielectric theory accounts for the recently discovered Shukla-Eliasson attractive potential (SEAP). However, the expansion of the exact Lindhard static dielectric function leads to a k^4 term of different magnitude than that obtained from the linearized quantum hydrodynamics model. It is shown that a correction factor of 1/9 should be included in the term arising from the quantum Bohm potential of the momentum balance equation in the fluid model in order for a correct plasma dielectric response treatment. Finally, it is observed that the long-range oscillatory screening potential (Friedel oscillations) of the type cos(2k_F r)/r^3, which is a consequence of the divergence of the dielectric function at the point k = 2k_F in a quantum plasma, arises due to the finiteness of the Fermi wavenumber and is smeared out in the limit of very high electron number densities, typical of white dwarfs and neutron stars. In the very low electron number-density regime, typical of semiconductors and metals, where the Friedel oscillation wavelength becomes much larger compared to the interparticle distances, the SEAP appears with a much deeper potential valley. It is remarked that the fourth-order approximate Lindhard dielectric constant approaches that of the linearized quantum hydrodynamics in the limit of very high electron number density. By evaluation of the imaginary part of the Lindhard dielectric function, it is shown that the Landau

12. Interaction Models for Functional Regression

PubMed Central

USSET, JOSEPH; STAICU, ANA-MARIA; MAITY, ARNAB

2015-01-01

A functional regression model with a scalar response and multiple functional predictors is proposed that accommodates two-way interactions in addition to their main effects. The proposed estimation procedure models the main effects using penalized regression splines, and the interaction effect by a tensor product basis. Extensions to generalized linear models and data observed on sparse grids or with measurement error are presented. A hypothesis testing procedure for the functional interaction effect is described. The proposed method can be easily implemented through existing software. Numerical studies show that fitting an additive model in the presence of interaction leads to both poor estimation performance and lost prediction power, while fitting an interaction model where there is in fact no interaction leads to negligible losses. The methodology is illustrated on the AneuRisk65 study data. PMID:26744549

13. Astronomical Methods for Nonparametric Regression

2017-01-01

I will discuss commonly used techniques for nonparametric regression in astronomy. We find that several of them, particularly running averages and running medians, are generically biased, asymmetric between dependent and independent variables, and perform poorly in recovering the underlying function, even when errors are present only in one variable. We then examine less-commonly used techniques such as Multivariate Adaptive Regression Splines and Boosted Trees and find them superior in bias, asymmetry, and variance both theoretically and in practice under a wide range of numerical benchmarks. In this context the chief advantage of the common techniques is runtime, which even for large datasets is now measured in microseconds compared with milliseconds for the more statistically robust techniques. This points to a tradeoff between bias, variance, and computational resources which in recent years has shifted heavily in favor of the more advanced methods, primarily driven by Moore's Law. Along these lines, we also propose a new algorithm which has better overall statistical properties than all techniques examined thus far, at the cost of significantly worse runtime, in addition to providing guidance on choosing the nonparametric regression technique most suitable to any specific problem. We then examine the more general problem of errors in both variables and provide a new algorithm which performs well in most cases and lacks the clear asymmetry of existing non-parametric methods, which fail to account for errors in both variables.

14. Quantile Regression Models for Current Status Data.

PubMed

Ou, Fang-Shu; Zeng, Donglin; Cai, Jianwen

2016-11-01

Current status data arise frequently in demography, epidemiology, and econometrics, where the exact failure time cannot be determined but is only known to have occurred before or after a known observation time. We propose a quantile regression model to analyze current status data, because it does not require distributional assumptions and the coefficients can be interpreted as direct regression effects on the distribution of failure time in the original time scale. Our model assumes that the conditional quantile of failure time is a linear function of covariates. We assume conditional independence between the failure time and observation time. An M-estimator, computed using the concave-convex procedure, is developed for parameter estimation, and its confidence intervals are constructed using a subsampling method. Asymptotic properties for the estimator are derived and proven using modern empirical process theory. The small sample performance of the proposed method is demonstrated via simulation studies. Finally, we apply the proposed method to analyze data from the Mayo Clinic Study of Aging.

15. Regression analysis of cytopathological data

SciTech Connect

Whittemore, A.S.; McLarty, J.W.; Fortson, N.; Anderson, K.

1982-12-01

Epithelial cells from the human body are frequently labelled according to one of several ordered levels of abnormality, ranging from normal to malignant. The label of the most abnormal cell in a specimen determines the score for the specimen. This paper presents a model for the regression of specimen scores against continuous and discrete variables, as in host exposure to carcinogens. Application to data and tests for adequacy of model fit are illustrated using sputum specimens obtained from a cohort of former asbestos workers.

16. Censored Median Regression and Profile Empirical Likelihood

PubMed Central

Subramanian, Sundarraman

2007-01-01

We implement profile empirical likelihood based inference for censored median regression models. Inference for any specified sub-vector is carried out by profiling out the nuisance parameters from the “plug-in” empirical likelihood ratio function proposed by Qin and Tsao. To obtain the critical value of the profile empirical likelihood ratio statistic, we first investigate its asymptotic distribution. The limiting distribution is a sum of weighted chi square distributions. Unlike for the full empirical likelihood, however, the derived asymptotic distribution has intractable covariance structure. Therefore, we employ the bootstrap to obtain the critical value, and compare the resulting confidence intervals with the ones obtained through Basawa and Koul’s minimum dispersion statistic. Furthermore, we obtain confidence intervals for the age and treatment effects in a lung cancer data set. PMID:19112527

17. Spinocerebellar ataxia type 2 presenting with cognitive regression in childhood.

PubMed

Ramocki, Melissa B; Chapieski, Lynn; McDonald, Ryan O; Fernandez, Fabio; Malphrus, Amy D

2008-09-01

Spinocerebellar ataxia type 2 typically presents in adulthood with progressive ataxia, dysarthria, tremor, and slow saccadic eye movements. Childhood-onset spinocerebellar ataxia type 2 is rare, and only the infantile-onset form has been well characterized clinically. This article describes a girl who met all developmental milestones until age 3½ years, when she experienced cognitive regression that preceded motor regression by 6 months. A diagnosis of spinocerebellar ataxia type 2 was delayed until she presented to the emergency department at age 7 years. This report documents the results of her neuropsychologic evaluation at both time points. This case broadens the spectrum of spinocerebellar ataxia type 2 presentation in childhood, highlights the importance of considering a spinocerebellar ataxia in a child who presents with cognitive regression only, and extends currently available clinical information to help clinicians discuss the prognosis in childhood spinocerebellar ataxia type 2.

18. Multiatlas segmentation as nonparametric regression.

PubMed

Awate, Suyash P; Whitaker, Ross T

2014-09-01

This paper proposes a novel theoretical framework to model and analyze the statistical characteristics of a wide range of segmentation methods that incorporate a database of label maps or atlases; such methods are termed as label fusion or multiatlas segmentation. We model these multiatlas segmentation problems as nonparametric regression problems in the high-dimensional space of image patches. We analyze the nonparametric estimator's convergence behavior that characterizes expected segmentation error as a function of the size of the multiatlas database. We show that this error has an analytic form involving several parameters that are fundamental to the specific segmentation problem (determined by the chosen anatomical structure, imaging modality, registration algorithm, and label-fusion algorithm). We describe how to estimate these parameters and show that several human anatomical structures exhibit the trends modeled analytically. We use these parameter estimates to optimize the regression estimator. We show that the expected error for large database sizes is well predicted by models learned on small databases. Thus, a few expert segmentations can help predict the database sizes required to keep the expected error below a specified tolerance level. Such cost-benefit analysis is crucial for deploying clinical multiatlas segmentation systems.

19. Multiatlas Segmentation as Nonparametric Regression

PubMed Central

Awate, Suyash P.; Whitaker, Ross T.

2015-01-01

This paper proposes a novel theoretical framework to model and analyze the statistical characteristics of a wide range of segmentation methods that incorporate a database of label maps or atlases; such methods are termed as label fusion or multiatlas segmentation. We model these multiatlas segmentation problems as nonparametric regression problems in the high-dimensional space of image patches. We analyze the nonparametric estimator’s convergence behavior that characterizes expected segmentation error as a function of the size of the multiatlas database. We show that this error has an analytic form involving several parameters that are fundamental to the specific segmentation problem (determined by the chosen anatomical structure, imaging modality, registration algorithm, and label-fusion algorithm). We describe how to estimate these parameters and show that several human anatomical structures exhibit the trends modeled analytically. We use these parameter estimates to optimize the regression estimator. We show that the expected error for large database sizes is well predicted by models learned on small databases. Thus, a few expert segmentations can help predict the database sizes required to keep the expected error below a specified tolerance level. Such cost-benefit analysis is crucial for deploying clinical multiatlas segmentation systems. PMID:24802528

20. Modeling Creep Processes in Aging Polymers

Olali, N. V.; Voitovich, L. V.; Zazimko, N. N.; Malezhik, M. P.

2016-03-01

The photoelastic method is generalized to creep in hereditary aging materials. Optical-creep curves and mechanical-creep or optical-relaxation curves are used to interpret fringe patterns. For materials with constant Poisson's ratio, it is sufficient to use mechanical- or optical-creep curves for this purpose.

1. Species abundance in a forest community in South China: A case of poisson lognormal distribution

USGS Publications Warehouse

Yin, Z.-Y.; Ren, H.; Zhang, Q.-M.; Peng, S.-L.; Guo, Q.-F.; Zhou, G.-Y.

2005-01-01

Case studies on Poisson lognormal distribution of species abundance have been rare, especially in forest communities. We propose a numerical method to fit the Poisson lognormal to the species abundance data at an evergreen mixed forest in the Dinghushan Biosphere Reserve, South China. Plants in the tree, shrub and herb layers in 25 quadrats of 20 m × 20 m, 5 m × 5 m, and 1 m × 1 m were surveyed. Results indicated that: (i) for each layer, the observed species abundance with a similarly small median, mode, and a variance larger than the mean was reverse J-shaped and followed well the zero-truncated Poisson lognormal; (ii) the coefficient of variation, skewness and kurtosis of abundance, and the two Poisson lognormal parameters (μ and σ) for the shrub layer were closer to those for the herb layer than to those for the tree layer; and (iii) from the tree to the shrub to the herb layer, σ and the coefficient of variation decreased, whereas diversity increased. We suggest that: (i) the species abundance distributions in the three layers reflect the overall community characteristics; (ii) the Poisson lognormal can describe the species abundance distribution in diverse communities with a few abundant species but many rare species; and (iii) 1/σ should be an alternative measure of diversity.
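
A Poisson lognormal pmf can be evaluated by direct quadrature over the lognormal mixing distribution. The sketch below is a generic illustration (not the authors' fitting code), writing the two parameters as mu and sigma:

```python
import math

def poisson_lognormal_pmf(k, mu, sigma, nodes=400):
    """P(K = k) = integral over lam of Poisson(k; lam) * LogNormal(lam; mu, sigma),
    computed with a midpoint rule in t = log(lam) over mu +/- 8*sigma."""
    lo, hi = mu - 8.0 * sigma, mu + 8.0 * sigma
    dt = (hi - lo) / nodes
    total = 0.0
    for i in range(nodes):
        t = lo + (i + 0.5) * dt
        lam = math.exp(t)
        # log Poisson(k; lam): k*log(lam) - lam - log(k!)
        log_poisson = k * t - lam - math.lgamma(k + 1)
        # log Normal density of t = log(lam) with mean mu, sd sigma
        log_normal = -0.5 * ((t - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))
        total += math.exp(log_poisson + log_normal) * dt
    return total
```

For the zero-truncated version used in the abstract, one divides each pmf value by 1 − P(K = 0).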

2. An Intrinsic Algorithm for Parallel Poisson Disk Sampling on Arbitrary Surfaces.

PubMed

Ying, Xiang; Xin, Shi-Qing; Sun, Qian; He, Ying

2013-03-08

Poisson disk sampling plays an important role in a variety of visual computing applications, due to its useful statistical properties in distribution and the absence of aliasing artifacts. While many effective techniques have been proposed to generate Poisson disk distributions in Euclidean space, relatively little work has been reported on the surface counterpart. This paper presents an intrinsic algorithm for parallel Poisson disk sampling on arbitrary surfaces. We propose a new technique for parallelizing the dart throwing. Rather than the conventional approaches that explicitly partition the spatial domain to generate the samples in parallel, our approach assigns each sample candidate a random and unique priority that is unbiased with regard to the distribution. Hence, multiple threads can process the candidates simultaneously and resolve conflicts by checking the given priority values. It is worth noting that our algorithm is accurate, as the generated Poisson disks are uniformly and randomly distributed without bias. Our method is intrinsic in that all the computations are based on the intrinsic metric and are independent of the embedding space. This intrinsic feature allows us to generate Poisson disk distributions on arbitrary surfaces. Furthermore, by manipulating the spatially varying density function, we can obtain adaptive sampling easily.

3. Solving the Fluid Pressure Poisson Equation Using Multigrid - Evaluation and Improvements.

PubMed

Dick, Christian; Rogowsky, Marcus; Westermann, Ruediger

2015-12-23

In many numerical simulations of fluids governed by the incompressible Navier-Stokes equations, the pressure Poisson equation needs to be solved to enforce mass conservation. Multigrid solvers show excellent convergence in simple scenarios, yet they can converge slowly in domains where physically separated regions are combined at coarser scales. Moreover, existing multigrid solvers are tailored to specific discretizations of the pressure Poisson equation, and they cannot easily be adapted to other discretizations. In this paper we analyze the convergence properties of existing multigrid solvers for the pressure Poisson equation in different simulation domains, and we show how to further improve the multigrid convergence rate by using a graph-based extension to determine the coarse grid hierarchy. The proposed multigrid solver is generic in that it can be applied to different kinds of discretizations of the pressure Poisson equation, by using solely the specification of the simulation domain and pre-assembled computational stencils. We analyze the proposed solver in combination with finite difference and finite volume discretizations of the pressure Poisson equation. Our evaluations show that, despite the common assumption, multigrid schemes can exploit their potential even in the most complicated simulation scenarios, yet this behavior is obtained at the price of higher memory consumption.
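
The reason a coarse-grid hierarchy helps at all can be seen in the classic smoothing behavior of simple relaxation: damped Jacobi kills oscillatory error quickly but barely touches smooth error, which the coarse grids then handle. A minimal 1-D illustration (generic, not the paper's solver):

```python
import numpy as np

def jacobi_mode_decay(k, n=64, iters=50, w=2.0 / 3.0):
    """Apply damped Jacobi for the 1-D Poisson problem A u = 0 (zero Dirichlet
    boundaries) to an error consisting of the single mode sin(k*pi*x), and
    return the remaining error amplitude. Smooth modes (small k) decay slowly;
    oscillatory modes (k near n/2) decay fast -- hence the coarse-grid hierarchy."""
    x = np.linspace(0.0, 1.0, n + 1)
    e = np.sin(k * np.pi * x)
    e[0] = e[-1] = 0.0
    for _ in range(iters):
        # Damped Jacobi sweep on interior points for the zero right-hand side.
        e[1:-1] = (1 - w) * e[1:-1] + w * 0.5 * (e[:-2] + e[2:])
    return float(np.max(np.abs(e)))

smooth = jacobi_mode_decay(k=1)   # barely reduced after 50 sweeps
rough = jacobi_mode_decay(k=32)   # essentially eliminated
```

The convergence issues the paper addresses (physically separated regions merging at coarser scales) arise precisely because the coarse grids must represent the smooth error accurately.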

4. Solving the Fluid Pressure Poisson Equation Using Multigrid-Evaluation and Improvements.

PubMed

Dick, Christian; Rogowsky, Marcus; Westermann, Rudiger

2016-11-01

In many numerical simulations of fluids governed by the incompressible Navier-Stokes equations, the pressure Poisson equation needs to be solved to enforce mass conservation. Multigrid solvers show excellent convergence in simple scenarios, yet they can converge slowly in domains where physically separated regions are combined at coarser scales. Moreover, existing multigrid solvers are tailored to specific discretizations of the pressure Poisson equation, and they cannot easily be adapted to other discretizations. In this paper we analyze the convergence properties of existing multigrid solvers for the pressure Poisson equation in different simulation domains, and we show how to further improve the multigrid convergence rate by using a graph-based extension to determine the coarse grid hierarchy. The proposed multigrid solver is generic in that it can be applied to different kinds of discretizations of the pressure Poisson equation, by using solely the specification of the simulation domain and pre-assembled computational stencils. We analyze the proposed solver in combination with finite difference and finite volume discretizations of the pressure Poisson equation. Our evaluations show that, despite the common assumption, multigrid schemes can exploit their potential even in the most complicated simulation scenarios, yet this behavior is obtained at the price of higher memory consumption.

5. Ultra-soft 100 nm thick zero Poisson's ratio film with 60% reversible compressibility

Nguyen, Chieu; Szalewski, Steve; Saraf, Ravi

2013-03-01

Squeezing films of most solids, liquids and granular materials causes dilation in the lateral dimension, which is characterized by a positive Poisson's ratio. Auxetic materials, such as special foams, crumpled graphite, zeolites, spectrin/actin membranes, and carbon nanotube laminates, shrink, i.e., their Poisson's ratio is negative. As a result of Poisson's effect, the force to squeeze an amorphous material, such as a viscous thin-film coating adhered to a rigid surface, increases more than a millionfold as the thickness decreases from 10 μm to 100 nm, due to the constraint on lateral deformations and off-plane relaxation. We demonstrate ultra-soft, 100 nm films of polymer/nanoparticle composite adhered to 1.25 cm diameter glass that can be reversibly squeezed to over 60% strain between rigid plates, requiring very low stresses below 100 kPa. Unlike for materials with a nonzero Poisson's ratio, the stiffness decreases with thickness, and the stress distribution is uniform over the film, as mapped electro-optically. The high deformability at very low stresses is explained by considering the reentrant cellular structure found in cork and in the wings of beetles, which have a Poisson's ratio near zero.

6. Classification of four-dimensional real Lie bialgebras of symplectic type and their Poisson-Lie groups

Abedi-Fardad, J.; Rezaei-Aghdam, A.; Haghighatdoost, Gh.

2017-01-01

We classify all four-dimensional real Lie bialgebras of symplectic type and obtain the classical r-matrices for these Lie bialgebras and Poisson structures on all the associated four-dimensional Poisson-Lie groups. We obtain some new integrable models where a Poisson-Lie group plays the role of the phase space and its dual Lie group plays the role of the symmetry group of the system.

7. A flexible count data regression model for risk analysis.

PubMed

Guikema, Seth D; Coffelt, Jeremy P

2008-02-01

In many cases, risk and reliability analyses involve estimating the probabilities of discrete events such as hardware failures and occurrences of disease or death. There is often additional information in the form of explanatory variables that can be used to help estimate the likelihood of different numbers of events in the future through the use of an appropriate regression model, such as a generalized linear model. However, existing generalized linear models (GLM) are limited in their ability to handle the types of variance structures often encountered in using count data in risk and reliability analysis. In particular, standard models cannot handle both underdispersed data (variance less than the mean) and overdispersed data (variance greater than the mean) in a single coherent modeling framework. This article presents a new GLM based on a reformulation of the Conway-Maxwell Poisson (COM) distribution that is useful for both underdispersed and overdispersed count data and demonstrates this model by applying it to the assessment of electric power system reliability. The results show that the proposed COM GLM provides fits to data as good as those of the commonly used existing models for overdispersed data sets, while outperforming these commonly used models for underdispersed data sets.
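
The Conway-Maxwell Poisson distribution at the heart of this GLM has pmf proportional to lam^y / (y!)^nu, with the extra parameter nu controlling dispersion. A log-space sketch of the pmf (a generic illustration, not the authors' code):

```python
import math

def com_poisson_pmf(y, lam, nu, max_terms=500):
    """Conway-Maxwell Poisson pmf: P(Y = y) = lam**y / (y!)**nu / Z(lam, nu).

    nu = 1 recovers the ordinary Poisson; nu > 1 gives underdispersion and
    nu < 1 overdispersion. Computed in log space to avoid overflow in (y!)**nu.
    """
    log_terms = [j * math.log(lam) - nu * math.lgamma(j + 1) for j in range(max_terms)]
    m = max(log_terms)
    log_z = m + math.log(sum(math.exp(t - m) for t in log_terms))  # log of the normalizer
    return math.exp(y * math.log(lam) - nu * math.lgamma(y + 1) - log_z)
```

In the GLM, lam (and possibly nu) is linked to the explanatory variables, which is what lets one model fit both under- and overdispersed counts.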

8. Recognition of caudal regression syndrome.

PubMed

Boulas, Mari M

2009-04-01

Caudal regression syndrome, also referred to as caudal dysplasia and sacral agenesis syndrome, is a rare congenital malformation characterized by varying degrees of developmental failure early in gestation. It involves the lower extremities, the lumbar and coccygeal vertebrae, and corresponding segments of the spinal cord. This is a rare disorder, and true pathogenesis is unclear. The etiology is thought to be related to maternal diabetes, genetic predisposition, and vascular hypoperfusion, but no true causative factor has been determined. Fetal diagnostic tools allow for early recognition of the syndrome, and careful examination of the newborn is essential to determine the extent of the disorder. Associated organ system dysfunction depends on the severity of the disease. Related defects are structural, and systematic problems including respiratory, cardiac, gastrointestinal, urinary, orthopedic, and neurologic can be present in varying degrees of severity and in different combinations. A multidisciplinary approach to management is crucial. Because the primary pathology is irreversible, treatment is only supportive.

9. Practical Session: Multiple Linear Regression

Clausel, M.; Grégoire, G.

2014-12-01

Three exercises are proposed to illustrate simple linear regression. The first investigates the influence of several factors on atmospheric pollution. It was proposed by D. Chessel and A.B. Dufour at Lyon 1 (see Sect. 6 of http://pbil.univ-lyon1.fr/R/pdf/tdr33.pdf) and is based on data from 20 U.S. cities. Exercise 2 is an introduction to model selection, whereas Exercise 3 provides a first example of analysis of variance. Exercises 2 and 3 were proposed by A. Dalalyan at ENPC (see Exercises 2 and 3 of http://certis.enpc.fr/~dalalyan/Download/TP_ENPC_5.pdf).

10. Novel Two-Dimensional Silicon Dioxide with in-Plane Negative Poisson's Ratio.

PubMed

Gao, Zhibin; Dong, Xiao; Li, Nianbei; Ren, Jie

2017-02-08

Silicon dioxide or silica, normally existing in various bulk crystalline and amorphous forms, was recently found to possess a two-dimensional structure. In this work, we use ab initio calculation and an evolutionary algorithm to unveil three new two-dimensional (2D) silica structures whose thermal, dynamical, and mechanical stabilities are compared with those of many typical bulk silicas. In particular, we find that all three of these 2D silica structures have large in-plane negative Poisson's ratios, with the largest being twice that of penta-graphene and three times that of borophene. The negative Poisson's ratio originates from the interplay of lattice symmetry and Si-O tetrahedron symmetry. Slab silica is also an insulating 2D material with the highest electronic band gap (>7 eV) among reported 2D structures. These exotic 2D silica structures, with in-plane negative Poisson's ratios and the widest band gaps, are expected to have great potential applications in nanomechanics and nanoelectronics.

11. Energy conserving discontinuous Galerkin spectral element method for the Vlasov-Poisson system

Madaule, Éric; Restelli, Marco; Sonnendrücker, Eric

2014-12-01

We propose a new, energy conserving, spectral element, discontinuous Galerkin method for the approximation of the Vlasov-Poisson system in arbitrary dimension, using Cartesian grids. The method is derived from the one proposed in [4], with two modifications: energy conservation is obtained by a suitable projection operator acting on the solution of the Poisson problem, rather than by solving multiple Poisson problems, and all the integrals appearing in the finite element formulation are approximated with Gauss-Lobatto quadrature, thereby yielding a spectral element formulation. The resulting method has the following properties: exact energy conservation (up to errors introduced by the time discretization), stability (thanks to the use of upwind numerical fluxes), high order accuracy and high locality. For the time discretization, we consider both Runge-Kutta methods and exponential integrators, and show results for 1D and 2D cases (2D and 4D in phase space, respectively).

12. A special relation between Young's modulus, Rayleigh-wave velocity, and Poisson's ratio.

PubMed

Malischewsky, Peter G; Tuan, Tran Thanh

2009-12-01

Bayon et al. [(2005). J. Acoust. Soc. Am. 117, 3469-3477] described a method for the determination of Young's modulus by measuring the Rayleigh-wave velocity and the ellipticity of Rayleigh waves, and found a peculiar almost linear relation between a non-dimensional quantity connecting Young's modulus, Rayleigh-wave velocity and density, and Poisson's ratio. The analytical reason for this special behavior remained unclear. It is demonstrated here that this behavior is a simple consequence of the mathematical form of the Rayleigh-wave velocity as a function of Poisson's ratio. The consequences for auxetic materials (those materials for which Poisson's ratio is negative) are discussed, as well as the determination of the shear and bulk moduli.
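
The dependence of Rayleigh-wave velocity on Poisson's ratio underlying this result is often summarized by Viktorov's approximation c_R/c_S ≈ (0.87 + 1.12ν)/(1 + ν). The sketch below evaluates that approximation; it is an illustrative stand-in for the exact Rayleigh secular equation discussed in the abstract, and its use outside 0 ≤ ν ≤ 0.5 (auxetic materials) is indicative only.

```python
def rayleigh_ratio(nu):
    """Viktorov's approximation for c_R / c_S as a function of Poisson's ratio nu."""
    return (0.87 + 1.12 * nu) / (1.0 + nu)

# The ratio increases monotonically with nu (derivative is 0.25 / (1 + nu)^2 > 0),
# from auxetic values toward the incompressible limit nu = 0.5
samples = {nu: rayleigh_ratio(nu) for nu in (-0.5, 0.0, 0.25, 0.5)}
```

At ν = 0.25 the approximation gives c_R/c_S = 0.92, close to the exact value ≈ 0.9194 for that classical case.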

13. Heterogeneous PVA hydrogels with micro-cells of both positive and negative Poisson's ratios.

PubMed

Ma, Yanxuan; Zheng, Yudong; Meng, Haoye; Song, Wenhui; Yao, Xuefeng; Lv, Hexiang

2013-07-01

Many models describing the deformation of general foam or auxetic materials are based on the assumption of homogeneity and order within the materials. However, non-uniform heterogeneity is often inherent in many porous materials and composites, yet difficult to measure. In this work, inspired by the structures of auxetic materials, porous PVA hydrogels with internal inby-concave pores (IICP) or interconnected pores (ICP) were designed and processed. The deformation of the PVA hydrogels under compression was tested and their Poisson's ratio was characterized. The results indicated that the size, shape and distribution of the pores in the hydrogel matrix had a strong influence on the local Poisson's ratio, which varied from positive to negative at the micro-scale. The size-dependency of their local Poisson's ratio reflected and quantified the uniformity and heterogeneity of the micro-porous structures in the PVA hydrogels.

14. What's the Risk? A Simple Approach for Estimating Adjusted Risk Measures from Nonlinear Models Including Logistic Regression

PubMed Central

Kleinman, Lawrence C; Norton, Edward C

2009-01-01

Objective To develop and validate a general method (called regression risk analysis) to estimate adjusted risk measures from logistic and other nonlinear multiple regression models. We show how to estimate standard errors for these estimates. These measures could supplant various approximations (e.g., adjusted odds ratio [AOR]) that may diverge, especially when outcomes are common. Study Design Regression risk analysis estimates were compared with internal standards as well as with Mantel–Haenszel estimates, Poisson and log-binomial regressions, and a widely used (but flawed) equation to calculate adjusted risk ratios (ARR) from AOR. Data Collection Data sets produced using Monte Carlo simulations. Principal Findings Regression risk analysis accurately estimates ARR and differences directly from multiple regression models, even when confounders are continuous, distributions are skewed, outcomes are common, and effect size is large. It is statistically sound and intuitive, and has properties favoring it over other methods in many cases. Conclusions Regression risk analysis should be the new standard for presenting findings from multiple regression analysis of dichotomous outcomes for cross-sectional, cohort, and population-based case–control studies, particularly when outcomes are common or effect size is large. PMID:18793213
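
The core of marginal (regression) standardization behind adjusted risk measures can be shown in a few lines. Everything below is hypothetical and illustrative — made-up coefficients and covariate values, not the authors' regression risk analysis or its standard-error machinery: predict each subject's risk with treatment set on, then off, and average.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical fitted model: logit(P(outcome)) = b0 + b1*treated + b2*confounder
b0, b1, b2 = -1.0, 0.8, 0.5
confounders = [0.0, 0.5, 1.0, 1.5, 2.0]   # observed covariate values (made up)

# Marginal standardization: predict everyone's risk as treated, then as untreated
risk_treated = sum(sigmoid(b0 + b1 + b2 * c) for c in confounders) / len(confounders)
risk_control = sum(sigmoid(b0 + b2 * c) for c in confounders) / len(confounders)

arr = risk_treated / risk_control    # adjusted risk ratio
ard = risk_treated - risk_control    # adjusted risk difference
aor = math.exp(b1)                   # adjusted odds ratio; diverges from the ARR when outcomes are common
```

With these numbers the outcome is common, so the AOR overstates the ARR — the divergence the abstract warns about.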

15. Modeling the durability of ZOSTAVAX® vaccine efficacy in people ≥60 years of age.

PubMed

Li, Xiaoming; Zhang, Jane H; Betts, Robert F; Morrison, Vicki A; Xu, Ruifeng; Itzler, Robbin F; Acosta, Camilo J; Dasbach, Erik J; Pellissier, James M; Johnson, Gary R; Chan, Ivan S F

2015-03-17

Since 2006, the vaccine ZOSTAVAX® has been licensed to prevent herpes zoster. Only limited clinical follow-up data are available to evaluate duration of protection, an important consideration when developing HZ vaccination policy recommendations. Four Poisson regression models were developed based on an integrated analysis of data from the Shingles Prevention Study and its Short Term Persistence extension to estimate the effects of years-since-vaccination and chronological-age on vaccine efficacy among people ≥60 years old. The models included the number of HZ cases, parsed into categories by chronological-age and time-since-vaccination, as the dependent variable, with different explanatory variables in each model. In all models, the interaction between vaccine-group and chronological-age was statistically significant, indicating that vaccine efficacy decreases with the expected effects of advancing age; the interaction between vaccine-group and time-since-vaccination was not statistically significant, indicating that much of the reduction in vaccine efficacy over time-since-vaccination can be explained by increasing age.

16. Early age at menarche and wheezing in adolescence. The 1993 Pelotas (Brazil) birth cohort study

PubMed Central

Joseph, Gary; Baptista Menezes, Ana Maria; Wehrmeister, Fernando C.

2015-01-01

Objective To evaluate the effect of menarche before 11 years of age on the incidence of wheezing/asthma in girls 11 to 18 years of age. Methods The study sample comprised 1,350 girls from a birth cohort that started in 1993 in the urban area of the city of Pelotas, southern Brazil; this cohort was followed until 18 years of age. We assessed wheezing by the question, “Have you ever had wheezing in the chest at any time in the past?,” from the International Study of Asthma and Allergies in Childhood (ISAAC) questionnaire. Early menarche was defined as occurring before 11 years of age. We estimated the cumulative incidence of wheezing excluding from the analysis all those participants who reported wheezing before age of 11 years. We performed the chi-square test to assess the association between ever wheezing and independent variables. Poisson regression models with robust variance were used to estimate cumulative incidence ratios. Results The average age at menarche in the cohort girls was 12 years (95% CI: 11.1–12.1). The prevalence of early menarche before 11 years of age was 11% (95% CI: 9.7–12.3). The cumulative incidence of wheezing from 11 to 18 years of age was 33.5% (95% CI: 30.9– 36.0). The crude association between ever wheezing in adolescence and early menarche before age 11 was 1.19 (95% CI: 0.96–1.48). After adjusting for early childhood and contemporaneous variables, no significant association for early menarche before 11 years of age and wheezing during adolescence was found (CIR: 1.18; CI95%: 0.93-1.49). Conclusion Early menarche before 11 years of age is not associated with an increased risk of wheezing during adolescence. PMID:26870751
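
A crude (unadjusted) cumulative incidence ratio with its large-sample Wald interval, the quantity the study's modified Poisson regression adjusts for covariates, can be computed directly from a 2x2 table. The counts below are invented for illustration and are not the cohort's data.

```python
import math

# Hypothetical counts: wheezing incidence by early-menarche status (illustrative only)
a, n1 = 50, 150      # events / total, early menarche
b, n0 = 400, 1200    # events / total, menarche at 11 or later

cir = (a / n1) / (b / n0)                    # cumulative incidence ratio
se = math.sqrt(1/a - 1/n1 + 1/b - 1/n0)      # SE of log(CIR), large-sample formula
lo, hi = (math.exp(math.log(cir) + z * se) for z in (-1.96, 1.96))
```

When the 95% interval (lo, hi) straddles 1, the crude CIR is compatible with no association, matching the abstract's conclusion after adjustment.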

17. A Conway-Maxwell-Poisson (CMP) model to address data dispersion on positron emission tomography.

PubMed

Santarelli, Maria Filomena; Della Latta, Daniele; Scipioni, Michele; Positano, Vincenzo; Landini, Luigi

2016-10-01

Positron emission tomography (PET) in medicine exploits the properties of positron-emitting unstable nuclei. The pairs of γ-rays emitted after annihilation are revealed by coincidence detectors and stored as projections in a sinogram. It is well known that radioactive decay follows a Poisson distribution; however, deviation from Poisson statistics occurs in PET projection data prior to reconstruction due to physical effects, measurement errors, and corrections for dead time, scatter, and random coincidences. A model that describes the statistical behavior of measured and corrected PET data can aid in understanding the statistical nature of the data: it is a prerequisite to develop efficient reconstruction and processing methods and to reduce noise. The deviation from Poisson statistics in PET data could be described by the Conway-Maxwell-Poisson (CMP) distribution model, which is characterized by the centring parameter λ and the dispersion parameter ν, the latter quantifying the deviation from a Poisson distribution model. In particular, the parameter ν allows quantifying over-dispersion (ν<1) or under-dispersion (ν>1) of data. A simple and efficient method for λ and ν parameter estimation is introduced and assessed using Monte Carlo simulation for a wide range of activity values. The application of the method to simulated and experimental PET phantom data demonstrated that the CMP distribution parameters could detect deviation from the Poisson distribution in both raw and corrected PET data. It may be usefully implemented in image reconstruction algorithms and quantitative PET data analysis, especially in low-count emission data, as in dynamic PET data, where the method demonstrated the best accuracy.

18. Simultaneous estimation of Poisson's ratio and Young's modulus using a single indentation: a finite element study

Zheng, Y. P.; Choi, A. P. C.; Ling, H. Y.; Huang, Y. P.

2009-04-01

Indentation is commonly used to determine the mechanical properties of different kinds of biological tissues and engineering materials. With the force-deformation data obtained from an indentation test, Young's modulus of the tissue can be calculated using a linear elastic indentation model with a known Poisson's ratio. A novel method for simultaneous estimation of Young's modulus and Poisson's ratio of the tissue using a single indentation was proposed in this study. Finite element (FE) analysis using 3D models was first used to establish the relationship between Poisson's ratio and the deformation-dependent indentation stiffness for different aspect ratios (indentor radius/tissue original thickness) in the indentation test. From the FE results, it was found that the deformation-dependent indentation stiffness linearly increased with the deformation. Poisson's ratio could be extracted based on the deformation-dependent indentation stiffness obtained from the force-deformation data. Young's modulus was then further calculated with the estimated Poisson's ratio. The feasibility of this method was demonstrated by using indentation models with different material properties in the FE analysis. The numerical results showed that the percentage errors of the estimated Poisson's ratios and the corresponding Young's moduli ranged from -1.7% to -3.2% and 3.0% to 7.2%, respectively, with the aspect ratio (indentor radius/tissue thickness) larger than 1. It is expected that this novel method can be potentially used for quantitative assessment of various kinds of engineering materials and biological tissues, such as articular cartilage.

19. Anisotropic elasticity and abnormal Poisson's ratios in super-hard materials

Huang, Chuanwei; Li, Rongpeng; Chen, Lang

2014-12-01

We theoretically investigated variations in mechanical properties such as Young's modulus, Poisson's ratio and compressibility in super-hard materials. Our tensorial analysis reveals that the mechanical properties of super-hard materials are strongly sensitive to the anisotropy index of the material. In sharp contrast to the traditionally assumed positive constant, the Poisson's ratio of super-hard materials can be unexpectedly negative, zero, or even positive with a value much larger than the isotropic upper limit of 0.5 along certain directions. Our results uncover a correlation between compressibility and hardness, offering insight into the prediction of new super-hard materials.

20. Solution of the nonlinear Poisson-Boltzmann equation: Application to ionic diffusion in cementitious materials

SciTech Connect

Arnold, J.; Kosson, D.S.; Garrabrants, A.; Meeussen, J.C.L.; Sloot, H.A. van der

2013-02-15

A robust numerical solution of the nonlinear Poisson-Boltzmann equation for asymmetric polyelectrolyte solutions in discrete pore geometries is presented. Comparisons to the linearized approximation of the Poisson-Boltzmann equation reveal that the assumptions leading to linearization may not be appropriate for the electrochemical regime in many cementitious materials. Implications of the electric double layer on both partitioning of species and on diffusive release are discussed. The influence of the electric double layer on anion diffusion relative to cation diffusion is examined.
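
The gap between the full and linearized equations is easiest to see in one dimension, where the linearized (Debye-Huckel) form φ'' = κ²φ has the analytic solution φ0·exp(-κx). The finite-difference sketch below is illustrative only — a minimal tridiagonal solve under assumed Dirichlet boundary conditions, not the article's discrete-pore solver.

```python
import math

def solve_linear_pb(phi0, kappa, length, n):
    """Finite-difference solve of phi'' = kappa^2 * phi on (0, length), Dirichlet ends."""
    h = length / (n + 1)
    # Interior equations: phi[i-1] - (2 + kappa^2 h^2) * phi[i] + phi[i+1] = 0
    diag = [-(2.0 + (kappa * h) ** 2)] * n
    rhs = [0.0] * n
    rhs[0] -= phi0                                  # boundary phi(0) = phi0
    rhs[-1] -= phi0 * math.exp(-kappa * length)     # far boundary from the analytic decay
    # Thomas algorithm; sub- and super-diagonal entries are all 1
    for i in range(1, n):
        w = 1.0 / diag[i - 1]
        diag[i] -= w
        rhs[i] -= w * rhs[i - 1]
    phi = [0.0] * n
    phi[-1] = rhs[-1] / diag[-1]
    for i in range(n - 2, -1, -1):
        phi[i] = (rhs[i] - phi[i + 1]) / diag[i]
    return h, phi
```

The numerical solution tracks φ0·exp(-κx) to second order in the grid spacing, which is the kind of check a nonlinear solver can be validated against in the weak-potential limit.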

1. Incorporation of solvation effects into the fragment molecular orbital calculations with the Poisson-Boltzmann equation

Watanabe, Hirofumi; Okiyama, Yoshio; Nakano, Tatsuya; Tanaka, Shigenori

2010-11-01

We developed the FMO-PB method, which incorporates solvation effects into the fragment molecular orbital calculation via the Poisson-Boltzmann equation. This method retains good accuracy in energy calculations with reduced computational time. We calculated the solvation free energies of polyalanines, the Alpha-1 peptide, the tryptophan cage, and the complex of the estrogen receptor with 17β-estradiol to show the applicability of this method to practical systems. From the calculated results, it has been confirmed that the FMO-PB method is useful for large biomolecules in solution. We also discuss the electric charges used in solving the Poisson-Boltzmann equation.

2. On Poisson's ratio for metal matrix composite laminates. [aluminum boron composites

NASA Technical Reports Server (NTRS)

Herakovich, C. T.; Shuart, M. J.

1978-01-01

The definition of Poisson's ratio for nonlinear behavior of metal matrix composite laminates is discussed and experimental results for tensile and compressive loading of five different boron-aluminum laminates are presented. It is shown that there may be considerable difference in the value of Poisson's ratio as defined by a total strain or an incremental strain definition. It is argued that the incremental definition is more appropriate for nonlinear material behavior. Results from a (0) laminate indicate that the incremental definition provides a precursor to failure which is not evident if the total strain definition is used.

3. A Bayesian approach to parameter and reliability estimation in the Poisson distribution.

NASA Technical Reports Server (NTRS)

Canavos, G. C.

1972-01-01

For life testing procedures, a Bayesian analysis is developed with respect to a random intensity parameter in the Poisson distribution. Bayes estimators are derived for the Poisson parameter and the reliability function based on uniform and gamma prior distributions of that parameter. A Monte Carlo procedure is implemented to make possible an empirical mean-squared error comparison between Bayes and existing minimum variance unbiased, as well as maximum likelihood, estimators. As expected, the Bayes estimators have mean-squared errors that are appreciably smaller than those of the other two.
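
The gamma-prior case described here is conjugate, so the posterior is available in closed form. The sketch below (with made-up counts) shows the update and the resulting squared-error Bayes estimator alongside the maximum likelihood estimate:

```python
def poisson_gamma_posterior(alpha, beta, counts):
    """Gamma(alpha, beta) shape-rate prior on lambda is conjugate to Poisson sampling:
    the posterior is Gamma(alpha + sum(counts), beta + n)."""
    return alpha + sum(counts), beta + len(counts)

counts = [2, 3, 1, 4, 2]                 # hypothetical event counts from a life test
a_post, b_post = poisson_gamma_posterior(1.0, 1.0, counts)
bayes_estimate = a_post / b_post          # posterior mean = Bayes estimator under squared-error loss
mle = sum(counts) / len(counts)           # maximum likelihood estimate, for comparison
```

The posterior mean shrinks the sample mean toward the prior mean α/β, which is the source of the smaller mean-squared error reported in the abstract's Monte Carlo comparison.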

4. A Poisson equation formulation for pressure calculations in penalty finite element models for viscous incompressible flows

NASA Technical Reports Server (NTRS)

Sohn, J. L.; Heinrich, J. C.

1990-01-01

The calculation of pressures when the penalty-function approximation is used in finite-element solutions of laminar incompressible flows is addressed. A Poisson equation for the pressure is formulated that involves third derivatives of the velocity field. The second derivatives appearing in the weak formulation of the Poisson equation are calculated from the C0 velocity approximation using a least-squares method. The present scheme is shown to be efficient, free of spurious oscillations, and accurate. Examples of applications are given and compared with results obtained using mixed formulations.

5. The Z2-graded Schouten-Nijenhuis bracket and generalized super-Poisson structures

de Azcárraga, J. A.; Izquierdo, J. M.; Perelomov, A. M.; Pérez-Bueno, J. C.

1997-07-01

The super or Z2-graded Schouten-Nijenhuis bracket is introduced. Using it, new generalized super-Poisson structures are found which are given in terms of certain graded-skew-symmetric contravariant tensors Λ of even order. The corresponding super "Jacobi identities" are expressed by stating that these tensors have a zero super Schouten-Nijenhuis bracket with themselves [Λ,Λ]=0. As a particular case, we provide the linear generalized super-Poisson structures which can be constructed on the dual spaces of simple superalgebras with a non-degenerate Killing metric. The su(3,1) superalgebra is given as a representative example.

6. The noncommutative Poisson bracket and the deformation of the family algebras

SciTech Connect

Wei, Zhaoting

2015-07-15

The family algebras are introduced by Kirillov in 2000. In this paper, we study the noncommutative Poisson bracket P on the classical family algebra C{sub τ}(g). We show that P controls the first-order 1-parameter formal deformation from C{sub τ}(g) to Q{sub τ}(g) where the latter is the quantum family algebra. Moreover, we will prove that the noncommutative Poisson bracket is in fact a Hochschild 2-coboundary, and therefore, the deformation is infinitesimally trivial. In the last part of this paper, we discuss the relation between Mackey’s analogue and the quantization problem of the family algebras.

7. Reentrant Origami-Based Metamaterials with Negative Poisson's Ratio and Bistability

Yasuda, H.; Yang, J.

2015-05-01

We investigate the unique mechanical properties of reentrant 3D origami structures based on the Tachi-Miura polyhedron (TMP). We explore the potential usage as mechanical metamaterials that exhibit tunable negative Poisson's ratio and structural bistability simultaneously. We show analytically and experimentally that the Poisson's ratio changes from positive to negative and vice versa during its folding motion. In addition, we verify the bistable mechanism of the reentrant 3D TMP under rigid origami configurations without relying on the buckling motions of planar origami surfaces. This study forms a foundation for designing and constructing TMP-based metamaterials in the form of bellows-like structures for engineering applications.

8. Newton's and Poisson's Impact Law for the Non-Convex Case of Reentrant Corners

Glocker, Christoph

The paper reviews the frictionless collision problem in rigid body dynamics. Newton's and Poisson's impact laws are stated in inequality form for one collision point and extended by superposition to multicontact configurations. One special case within this framework is impacts with a global dissipation index, for which it is shown that Newton's impact law reduces to Moreau's impact rule and that both of them coincide with Poisson's law when a certain kinematic compatibility condition is met. A geometrical interpretation of this impact law is given for a tangentially regular boundary and then extended to re-entrant corners.

9. iAPBS: a programming interface to Adaptive Poisson-Boltzmann Solver

SciTech Connect

Konecny, Robert; Baker, Nathan A.; McCammon, J. A.

2012-07-26

The Adaptive Poisson-Boltzmann Solver (APBS) is a state-of-the-art suite for performing Poisson-Boltzmann electrostatic calculations on biomolecules. The iAPBS package provides a modular programmatic interface to the APBS library of electrostatic calculation routines. The iAPBS interface library can be linked with a Fortran or C/C++ program, thus making all of the APBS functionality available from within the application. Several application modules for popular molecular dynamics simulation packages (Amber, NAMD, and CHARMM) are distributed with iAPBS, allowing users of these packages to perform implicit solvent electrostatic calculations with APBS.

10. Poisson's ratio from polarization of acoustic zero-group velocity Lamb mode.

PubMed

Baggens, Oskar; Ryden, Nils

2015-07-01

Poisson's ratio of an isotropic and free elastic plate is estimated from the polarization of the first symmetric acoustic zero-group velocity Lamb mode. This polarization is interpreted as the ratio of the absolute amplitudes of the surface normal and surface in-plane components of the acoustic mode. Results from the evaluation of simulated datasets indicate that the presented relation, which links the polarization and Poisson's ratio, can be extended to incorporate plates with material damping. Furthermore, the proposed application of the polarization is demonstrated in a practical field case, where an increased accuracy of estimated nominal thickness is obtained.

11. Birthweight Related Factors in Northwestern Iran: Using Quantile Regression Method

PubMed Central

Fallah, Ramazan; Kazemnejad, Anoshirvan; Zayeri, Farid; Shoghli, Alireza

2016-01-01

Introduction: Birthweight is one of the most important predictors of health status in adulthood. Achieving balanced birthweights is one of the priorities of the health system in most industrialized and developed countries. This indicator is used to assess the growth and health status of infants. The aim of this study was to assess the birthweight of neonates by using quantile regression in Zanjan province. Methods: This descriptive analytical study was carried out using pre-registered (March 2010 - March 2012) data on neonates in urban/rural health centers of Zanjan province using multiple-stage cluster sampling. Data were analyzed using multiple linear regression and the quantile regression method with SAS 9.2 statistical software. Results: Of 8456 newborns, 4146 (49%) were female. The mean age of the mothers was 27.1±5.4 years. The mean birthweight of the neonates was 3104 ± 431 grams. Five hundred and seventy-three (6.8%) of the neonates weighed less than 2500 grams. In all quantiles, gestational age of the neonates (p<0.05) and weight and educational level of the mothers (p<0.05) showed a significant linear relationship with the birthweight of the neonates. However, sex and birth rank of the neonates, mothers' age, place of residence (urban/rural) and career were not significant in all quantiles (p>0.05). Conclusion: This study revealed that the results of multiple linear regression and quantile regression were not identical. We strongly recommend the use of quantile regression when an asymmetric response variable or data with outliers are present. PMID:26925889
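
Why quantile regression behaves differently from least squares on asymmetric or outlier-laden data comes down to the check (pinball) loss: minimizing it over a constant yields the sample τ-quantile rather than the mean. The toy sketch below (invented numbers, not the study's SAS analysis) demonstrates this by grid search.

```python
def pinball_loss(tau, residuals):
    """Check (pinball) loss; its minimizer over a constant is the tau-quantile."""
    return sum(max(tau * r, (tau - 1.0) * r) for r in residuals)

data = [2.0, 3.0, 3.5, 4.0, 10.0]   # skewed toy sample with one outlier
grid = [x / 100.0 for x in range(0, 1200)]
fit_median = min(grid, key=lambda c: pinball_loss(0.5, [y - c for y in data]))
fit_q90 = min(grid, key=lambda c: pinball_loss(0.9, [y - c for y in data]))
```

The τ = 0.5 fit lands on the sample median (3.5), untouched by the outlier that pulls the mean to 4.5, while the τ = 0.9 fit tracks the upper tail.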

12. Genetics Home Reference: caudal regression syndrome

MedlinePlus

... of a genetic condition? Genetic and Rare Diseases Information Center Frequency Caudal regression syndrome is estimated to occur in 1 to ... parts of the skeleton, gastrointestinal system, and genitourinary ... caudal regression syndrome results from the presence of an abnormal ...

13. Semiparametric maximum likelihood for nonlinear regression with measurement errors.

PubMed

Suh, Eun-Young; Schafer, Daniel W

2002-06-01

This article demonstrates semiparametric maximum likelihood estimation of a nonlinear growth model for fish lengths using imprecisely measured ages. Data on the species corvina reina, found in the Gulf of Nicoya, Costa Rica, consist of lengths and imprecise ages for 168 fish and precise ages for a subset of 16 fish. The statistical problem may therefore be classified as nonlinear errors-in-variables regression with internal validation data. Inferential techniques are based on ideas extracted from several previous works on semiparametric maximum likelihood for errors-in-variables problems. The illustration of the example clarifies practical aspects of the associated computational, inferential, and data analytic techniques.

14. Semiparametric regression during 2003–2007*

PubMed Central

Ruppert, David; Wand, M.P.; Carroll, Raymond J.

2010-01-01

Semiparametric regression is a fusion between parametric regression and nonparametric regression that integrates low-rank penalized splines, mixed model and hierarchical Bayesian methodology – thus allowing more streamlined handling of longitudinal and spatial correlation. We review progress in the field over the five-year period between 2003 and 2007. We find semiparametric regression to be a vibrant field with substantial involvement and activity, continual enhancement and widespread application. PMID:20305800

15. Factors Influencing Drug Injection History among Prisoners: A Comparison between Classification and Regression Trees and Logistic Regression Analysis

PubMed Central

Rastegari, Azam; Haghdoost, Ali Akbar; Baneshi, Mohammad Reza

2013-01-01

Background Due to the importance of medical studies, researchers in this field should be familiar with various types of statistical analyses so as to select the most appropriate method based on the characteristics of their data sets. Classification and regression trees (CARTs) can be complementary to regression models. We compared the performance of a logistic regression model and a CART in predicting drug injection among prisoners. Methods Data on 2720 Iranian prisoners were studied to determine the factors influencing drug injection. The collected data were divided into two groups, training and testing. A logistic regression model and a CART were applied to the training data. The performance of the two models was then evaluated on the testing data. Findings The regression model and the CART had 8 and 4 significant variables, respectively. Overall, heroin use, history of imprisonment, age at first drug use, and marital status were important factors in determining the history of drug injection. Subjects without a history of heroin use, and heroin users with short-term imprisonment, were at lower risk of drug injection. Among heroin addicts with long-term imprisonment, individuals with higher age at first drug use and married subjects were at lower risk of drug injection. Although the logistic regression model was more sensitive than the CART, the two models had the same levels of specificity and classification accuracy. Conclusion In this study, both sensitivity and specificity were important. While the logistic regression model had better performance, the graphical presentation of the CART simplifies the interpretation of the results. In general, a combination of different analytical methods is recommended to explore the effects of variables. PMID:24494152

16. Bayesian Unimodal Density Regression for Causal Inference

ERIC Educational Resources Information Center

Karabatsos, George; Walker, Stephen G.

2011-01-01

Karabatsos and Walker (2011) introduced a new Bayesian nonparametric (BNP) regression model. Through analyses of real and simulated data, they showed that the BNP regression model outperforms other parametric and nonparametric regression models of common use, in terms of predictive accuracy of the outcome (dependent) variable. The other,…

17. Developmental Regression in Autism Spectrum Disorders

ERIC Educational Resources Information Center

Rogers, Sally J.

2004-01-01

The occurrence of developmental regression in autism is one of the more puzzling features of this disorder. Although several studies have documented the validity of parental reports of regression using home videos, accumulating data suggest that most children who demonstrate regression also demonstrated previous, subtle, developmental differences.…

18. Standards for Standardized Logistic Regression Coefficients

ERIC Educational Resources Information Center

Menard, Scott

2011-01-01

Standardized coefficients in logistic regression analysis have the same utility as standardized coefficients in linear regression analysis. Although there has been no consensus on the best way to construct standardized logistic regression coefficients, there is now sufficient evidence to suggest a single best approach to the construction of a…

19. Regression Analysis by Example. 5th Edition

ERIC Educational Resources Information Center

2012-01-01

Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. "Regression Analysis by Example, Fifth Edition" has been expanded and thoroughly…

20. Synthesizing Regression Results: A Factored Likelihood Method

ERIC Educational Resources Information Center

Wu, Meng-Jia; Becker, Betsy Jane

2013-01-01

Regression methods are widely used by researchers in many fields, yet methods for synthesizing regression results are scarce. This study proposes using a factored likelihood method, originally developed to handle missing data, to appropriately synthesize regression models involving different predictors. This method uses the correlations reported…

1. Streamflow forecasting using functional regression

Masselot, Pierre; Dabo-Niang, Sophie; Chebana, Fateh; Ouarda, Taha B. M. J.

2016-07-01

Streamflow, as a natural phenomenon, is continuous in time, and so are the meteorological variables which influence its variability. In practice, it can be of interest to forecast the whole flow curve instead of points (daily or hourly). To this end, this paper introduces functional linear models and adapts them to hydrological forecasting. More precisely, functional linear models are regression models based on curves instead of single values. They make it possible to consider the whole process instead of a limited number of time points or features. We apply these models to analyse the flow volume and the whole streamflow curve during a given period by using precipitation curves. The functional model is shown to lead to encouraging results. The potential of functional linear models to detect special features that would have been hard to see otherwise is pointed out. The functional model is also compared to the artificial neural network approach, and the advantages and disadvantages of both models are discussed. Finally, future research directions involving the functional model in hydrology are presented.

2. Survival analysis and Cox regression.

PubMed

Benítez-Parejo, N; Rodríguez del Águila, M M; Pérez-Vicente, S

2011-01-01

The data provided by clinical trials are often expressed in terms of survival. The analysis of survival comprises a series of statistical analytical techniques in which the measurements analysed represent the time elapsed between a given exposure and the outcome of a certain event. Despite the name of these techniques, the outcome in question does not necessarily have to be survival or death; it may be healing versus no healing, relief versus pain, complication versus no complication, relapse versus no relapse, etc. The present article describes the analysis of survival both from a descriptive perspective, based on the Kaplan-Meier estimation method, and in terms of bivariate comparisons using the log-rank statistic. Likewise, a description is provided of the Cox regression models for the study of risk factors or covariates associated with the probability of survival. These models are defined in both simple and multiple forms, and a description is provided of how they are calculated and how the postulates for application are checked, accompanied by illustrative examples using the free statistical software R.
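The Kaplan-Meier estimator mentioned in the abstract is easy to sketch in code. The following is a minimal illustrative implementation in plain Python (not taken from the article, and omitting the log-rank test and Cox model); the example data are hypothetical:

```python
# Minimal Kaplan-Meier sketch. `times` are follow-up times; `events`
# flags whether the event occurred (1) or the observation was censored (0).
def kaplan_meier(times, events):
    """Return (time, survival probability) pairs at each event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        censored = sum(1 for tt, e in data if tt == t and e == 0)
        if deaths > 0:
            survival *= 1.0 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= deaths + censored
        # advance past all observations tied at time t
        while i < len(data) and data[i][0] == t:
            i += 1
    return curve

# Hypothetical example: 6 patients, two censored (event flag 0)
print(kaplan_meier([2, 3, 3, 5, 8, 8], [1, 1, 0, 1, 1, 0]))
```

At each event time the survival estimate is multiplied by (1 - deaths / at-risk), with censored observations leaving the risk set without contributing a step.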

3. Estimating equivalence with quantile regression

USGS Publications Warehouse

2011-01-01

Equivalence testing and corresponding confidence interval estimates are used to provide more enlightened statistical statements about parameter estimates by relating them to intervals of effect sizes deemed to be of scientific or practical importance rather than just to an effect size of zero. Equivalence tests and confidence interval estimates are based on a null hypothesis that a parameter estimate is either outside (inequivalence hypothesis) or inside (equivalence hypothesis) an equivalence region, depending on the question of interest and assignment of risk. The former approach, often referred to as bioequivalence testing, is often used in regulatory settings because it reverses the burden of proof compared to a standard test of significance, following a precautionary principle for environmental protection. Unfortunately, many applications of equivalence testing focus on establishing average equivalence by estimating differences in means of distributions that do not have homogeneous variances. I discuss how to compare equivalence across quantiles of distributions using confidence intervals on quantile regression estimates that detect differences in heterogeneous distributions missed by focusing on means. I used one-tailed confidence intervals based on inequivalence hypotheses in a two-group treatment-control design for estimating bioequivalence of arsenic concentrations in soils at an old ammunition testing site and bioequivalence of vegetation biomass at a reclaimed mining site. Two-tailed confidence intervals based both on inequivalence and equivalence hypotheses were used to examine quantile equivalence for negligible trends over time for a continuous exponential model of amphibian abundance. © 2011 by the Ecological Society of America.

4. Breastfeeding and educational achievement at age 5.

PubMed

Heikkilä, Katriina; Kelly, Yvonne; Renfrew, Mary J; Sacker, Amanda; Quigley, Maria A

2014-01-01

Our aim was to investigate whether the duration of breastfeeding, at all or exclusively, is associated with educational achievement at age 5. We used data from a prospective, population-based UK cohort study, the Millennium Cohort Study (MCS). 5489 children from a White ethnic background born at term in 2000-2001, attending school in England in 2006, were included in our analyses. Educational achievement was measured using the Foundation Stage Profile (FSP), a statutory assessment undertaken by teachers at the end of the child's first school year. Breastfeeding duration was ascertained from interviews with the mother when the child was 9 months old. We used modified Poisson regression to model the association of breastfeeding duration with having reached a good level of achievement overall (≥78 overall points and ≥6 in 'personal, social and emotional development' and 'communication, language and literacy' points) and in specific areas (≥6 points) of development. Children who had been breastfed for up to 2 months were more likely to have reached a good level of overall achievement [adjusted rate ratio (RR): 1.09, 95% confidence interval (CI): 1.01, 1.19] than never breastfed children. This association was more marked in children breastfed for 2-4 months (adjusted RR: 1.17, 95% CI: 1.07, 1.29) and in those breastfed for longer than 4 months (adjusted RR: 1.16, 95% CI: 1.07, 1.26). The associations of exclusive breastfeeding with educational achievement were similar. Our findings suggest that a longer duration of breastfeeding, at all or exclusively, is associated with better educational achievement at age 5.

5. A Negative Binomial Regression Model for Accuracy Tests

ERIC Educational Resources Information Center

Hung, Lai-Fa

2012-01-01

Rasch used a Poisson model to analyze errors and speed in reading tests. An important property of the Poisson distribution is that the mean and variance are equal. However, in social science research, it is very common for the variance to be greater than the mean (i.e., the data are overdispersed). This study embeds the Rasch model within an…

6. The Poisson model limits in NBA basketball: Complexity in team sports

Martín-González, Juan Manuel; de Saá Guerra, Yves; García-Manso, Juan Manuel; Arriaza, Enrique; Valverde-Estévez, Teresa

2016-12-01

Team sports are frequently studied by researchers. There is a presumption that scoring in basketball is a random process that can be described using the Poisson model. Basketball is a collaboration-opposition sport, where the non-linear local interactions among players are reflected in the evolution of the score that ultimately determines the winner. In the NBA, the outcomes of close games are often decided in the last minute, where fouls play a main role. We examined 6130 NBA games in order to analyze the time intervals between baskets and scoring dynamics. Most numbers of baskets (n) over a time interval (ΔT) follow a Poisson distribution, but some (e.g., ΔT = 10 s, n > 3) follow a power law. The Poisson distribution includes most baskets in any game, in most game situations, but in close games in the last minute, the numbers of events are distributed following a power law. The number of events can be adjusted by a mixture of two distributions. In close games, both teams try to maintain their advantage solely in order to reach the last minute: a completely different game. For this reason, we propose to use the Poisson model as a reference. The complex dynamics will emerge from the limits of this model.

7. Poisson Growth Mixture Modeling of Intensive Longitudinal Data: An Application to Smoking Cessation Behavior

ERIC Educational Resources Information Center

Shiyko, Mariya P.; Li, Yuelin; Rindskopf, David

2012-01-01

Intensive longitudinal data (ILD) have become increasingly common in the social and behavioral sciences; count variables, such as the number of daily smoked cigarettes, are frequently used outcomes in many ILD studies. We demonstrate a generalized extension of growth mixture modeling (GMM) to Poisson-distributed ILD for identifying qualitatively…

8. About solvability of some boundary value problems for Poisson equation in a ball

Koshanova, Maira D.; Usmanov, Kairat I.; Turmetov, Batirkhan Kh.

2016-08-01

In the present paper, we study properties of some integro-differential operators of fractional order. As an application of the properties of these operators to the Poisson equation, we examine questions on the solvability of a fractional analogue of the Neumann problem and of analogues of periodic boundary value problems for circular domains. The exact conditions for solvability of these problems are found.

9. The Dependent Poisson Race Model and Modeling Dependence in Conjoint Choice Experiments

ERIC Educational Resources Information Center

Ruan, Shiling; MacEachern, Steven N.; Otter, Thomas; Dean, Angela M.

2008-01-01

Conjoint choice experiments are used widely in marketing to study consumer preferences amongst alternative products. We develop a class of choice models, belonging to the class of Poisson race models, that describe a "random utility" which lends itself to a process-based description of choice. The models incorporate a dependence structure which…

10. Non-Gaussian inference from non-linear and non-Poisson biased distributed data

Ata, Metin; Kitaura, Francisco-Shu; Müller, Volker

2014-05-01

We study the statistical inference of the cosmological dark matter density field from non-Gaussian, non-linear and non-Poisson biased distributed tracers. We have implemented a Bayesian posterior sampling computer-code solving this problem and tested it with mock data based on N-body simulations.

11. The Poisson-Boltzmann theory for the two-plates problem: some exact results.

PubMed

Xing, Xiang-Jun

2011-12-01

The general solution to the nonlinear Poisson-Boltzmann equation for two parallel charged plates, either inside a symmetric electrolyte, or inside a 2q:-q asymmetric electrolyte, is found in terms of Weierstrass elliptic functions. From this we derive some exact asymptotic results for the interaction between charged plates, as well as the exact form of the renormalized surface charge density.

12. A parallel 3D poisson solver for space charge simulation in cylindrical coordinates.

SciTech Connect

Xu, J.; Ostroumov, P. N.; Nolen, J.; Physics

2008-02-01

This paper presents the development of a parallel three-dimensional Poisson solver in a cylindrical coordinate system for the electrostatic potential of a charged particle beam in a circular tube. The Poisson solver uses Fourier expansions in the longitudinal and azimuthal directions, and Spectral Element discretization in the radial direction. A Dirichlet boundary condition is used on the cylinder wall, a natural boundary condition is used on the cylinder axis, and a Dirichlet or periodic boundary condition is used in the longitudinal direction. A parallel 2D domain decomposition was implemented in the (r,{theta}) plane. This solver was incorporated into the parallel code PTRACK for beam dynamics simulations. Detailed benchmark results for the parallel solver and a beam dynamics simulation in a high-intensity proton LINAC are presented. When the transverse beam size is small relative to the aperture of the accelerator line, Poisson solvers in Cartesian and cylindrical coordinate systems produce similar results. When the transverse beam size is large or the beam center is located off-axis, the result from the Poisson solver in the Cartesian coordinate system is not accurate because of the different boundary conditions used. With the new solver, circular boundary conditions can be applied easily and accurately in beam dynamics simulations of accelerator devices.

13. A generalized Poisson equation and short-range self-interaction energies.

PubMed

Varganov, Sergey A; Gilbert, Andrew T B; Gill, Peter M W

2008-06-28

We generalize the Poisson equation to attenuated Newtonian potentials. If the attenuation is at least exponential, the equation provides a local mapping between the density and its potential. We use this to derive several density functionals for the short-range self-interaction energy.

14. Anatomy of the Generalized Inverse Gaussian-Poisson Distribution with Special Applications to Bibliometric Studies.

ERIC Educational Resources Information Center

Sichel, H. S.

1992-01-01

Discusses the use of the generalized inverse Gaussian-Poisson (GIGP) distribution in bibliometric studies. The main types of size-frequency distributions are described; bibliometric distributions in logarithms are examined; parameter estimation is discussed; and goodness-of-fit tests are considered. Examples of applications are included. (17…

15. Poisson ratio and excess low-frequency vibrational states in glasses.

PubMed

Duval, Eugène; Deschamps, Thierry; Saviot, Lucien

2013-08-14

In glass, starting from a dependence of the Angell's fragility on the Poisson ratio [V. N. Novikov and A. P. Sokolov, Nature 431, 961 (2004)], and a dependence of the Poisson ratio on the atomic packing density [G. N. Greaves, A. L. Greer, R. S. Lakes, and T. Rouxel, Nature Mater. 10, 823 (2011)], we propose that the heterogeneities are predominantly density fluctuations in strong glasses (lower Poisson ratio) and shear elasticity fluctuations in fragile glasses (higher Poisson ratio). Because the excess of low-frequency vibration modes in comparison with the Debye regime (boson peak) is strongly connected to these fluctuations, we propose that they are breathing-like (with change of volume) in strong glasses and shear-like (without change of volume) in fragile glasses. As a verification, it is confirmed that the excess modes in the strong silica glass are predominantly breathing-like. Moreover, it is shown that the excess breathing-like modes in a strong polymeric glass are replaced by shear-like modes under hydrostatic pressure as the glass becomes more compact.

16. Testing the equality of two Poisson means using the rate ratio.

PubMed

Ng, Hon Keung Tony; Tang, Man-Lai

2005-03-30

In this article, we investigate procedures for comparing two independent Poisson variates that are observed over unequal sampling frames (i.e. time intervals, populations, areas or any combination thereof). We consider two statistics (with and without the logarithmic transformation) for testing the equality of two Poisson rates. Two methods for implementing these statistics are reviewed. They are (1) the sample-based method, and (2) the constrained maximum likelihood estimation (CMLE) method. We conduct an empirical study to evaluate the performance of different statistics and methods. Generally, we find that the CMLE method works satisfactorily only for the statistic without the logarithmic transformation (denoted as W(2)), while the sample-based method performs better for the statistic using the logarithmic transformation (denoted as W(3)). It is noteworthy that both statistics perform well for moderate to large Poisson rates (e.g. ≥10). For small Poisson rates (e.g. <10), W(2) can be liberal (e.g. actual type I error rate/nominal level ≥1.2) while W(3) can be conservative (e.g. actual type I error rate/nominal level ≤0.8). The corresponding sample size formulae are provided and valid in the sense that the simulated powers associated with the approximate sample size formulae are generally close to the pre-chosen power level. We illustrate our methodologies with a real example from a breast cancer study.
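A common form of the log-transformed statistic, in the spirit of what the abstract denotes W(3), is the Wald z-statistic on the log rate ratio. The sketch below is illustrative only (the exact definitions of W(2) and W(3) are in the article, and the counts are hypothetical):

```python
import math

# Wald test on the log rate ratio for two independent Poisson counts
# x1, x2 observed over sampling frames (e.g. person-time) t1, t2.
def log_rate_ratio_test(x1, t1, x2, t2):
    """Two-sided z-statistic and p-value for H0: rate1 == rate2."""
    log_rr = math.log(x1 / t1) - math.log(x2 / t2)
    se = math.sqrt(1.0 / x1 + 1.0 / x2)   # delta-method standard error
    z = log_rr / se
    # two-sided p-value from the standard normal CDF
    p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return z, p

# Hypothetical data: 30 events in 100 units vs 15 events in 100 units
z, p = log_rate_ratio_test(30, 100.0, 15, 100.0)
print(f"z = {z:.3f}, p = {p:.4f}")
```

With these counts the estimated rate ratio is 2, and the test rejects equality at the usual 5% level.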

17. Birth and Death Process Modeling Leads to the Poisson Distribution: A Journey Worth Taking

ERIC Educational Resources Information Center

Rash, Agnes M.; Winkel, Brian J.

2009-01-01

This paper describes details of development of the general birth and death process from which we can extract the Poisson process as a special case. This general process is appropriate for a number of courses and units in courses and can enrich the study of mathematics for students as it touches and uses a diverse set of mathematical topics, e.g.,…
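The special case the paper extracts can be illustrated numerically: a pure birth process with constant rate is a Poisson process, so event counts in a fixed window follow a Poisson distribution. A small simulation sketch (illustrative only, with an arbitrary rate and window):

```python
import math
import random

# Simulate a constant-rate birth process via exponential waiting times
# and count events in a window of length T; counts should be Poisson(lam*T).
def simulate_counts(lam, T, n_runs, rng):
    counts = []
    for _ in range(n_runs):
        t, n = 0.0, 0
        while True:
            t += rng.expovariate(lam)  # waiting time to the next birth
            if t > T:
                break
            n += 1
        counts.append(n)
    return counts

rng = random.Random(42)
counts = simulate_counts(lam=2.0, T=3.0, n_runs=20000, rng=rng)
mean = sum(counts) / len(counts)
print(f"sample mean {mean:.3f} vs lam*T = 6.0")

# empirical P(N=6) against the Poisson(6) pmf
pmf6 = math.exp(-6.0) * 6.0**6 / math.factorial(6)
emp6 = counts.count(6) / len(counts)
print(f"P(N=6): empirical {emp6:.4f}, Poisson {pmf6:.4f}")
```

The sample mean of the counts should be close to lam*T, and the empirical frequencies should track the Poisson probabilities.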

18. Updating a Classic: "The Poisson Distribution and the Supreme Court" Revisited

ERIC Educational Resources Information Center

Cole, Julio H.

2010-01-01

W. A. Wallis studied vacancies in the US Supreme Court over a 96-year period (1837-1932) and found that the distribution of the number of vacancies per year could be characterized by a Poisson model. This note updates this classic study.
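The flavor of Wallis's fit is easy to reproduce. The yearly vacancy counts below (59 years with 0 vacancies, 27 with 1, 9 with 2, 1 with 3) are the figures commonly quoted for the 1837-1932 data; treat them as illustrative rather than authoritative:

```python
import math

# Compare observed vacancy-per-year frequencies with Poisson expectations.
observed = {0: 59, 1: 27, 2: 9, 3: 1}
years = sum(observed.values())                          # 96 years
mean = sum(k * v for k, v in observed.items()) / years  # vacancies per year
for k, obs in observed.items():
    expected = years * math.exp(-mean) * mean**k / math.factorial(k)
    print(f"{k} vacancies: observed {obs:2d}, Poisson expects {expected:5.1f}")
```

With a mean of 0.5 vacancies per year, the Poisson expectations (about 58.2, 29.1, 7.3, and 1.2 years) sit close to the observed frequencies, which is the fit Wallis noted.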

19. An unbiased risk estimator for image denoising in the presence of mixed poisson-gaussian noise.

PubMed

Le Montagner, Yoann; Angelini, Elsa D; Olivo-Marin, Jean-Christophe

2014-03-01

The behavior and performance of denoising algorithms are governed by one or several parameters, whose optimal settings depend on the content of the processed image and the characteristics of the noise, and are generally designed to minimize the mean squared error (MSE) between the denoised image returned by the algorithm and a virtual ground truth. In this paper, we introduce a new Poisson-Gaussian unbiased risk estimator (PG-URE) of the MSE applicable to a mixed Poisson-Gaussian noise model that unifies the widely used Gaussian and Poisson noise models in fluorescence bioimaging applications. We propose a stochastic methodology to evaluate this estimator in the case when little is known about the internal machinery of the considered denoising algorithm, and we analyze both theoretically and empirically the characteristics of the PG-URE estimator. Finally, we evaluate the PG-URE-driven parametrization for three standard denoising algorithms, with and without variance stabilizing transforms, and different characteristics of the Poisson-Gaussian noise mixture.

20. Comparison of a hydrogel model to the Poisson-Boltzmann cell model

Claudio, Gil C.; Kremer, Kurt; Holm, Christian

2009-09-01

We have investigated a single charged microgel in aqueous solution with a combined simulational model and Poisson-Boltzmann theory. In the simulations we use a coarse-grained charged bead-spring model in a dielectric continuum, with explicit counterions and full electrostatic interactions under periodic and nonperiodic boundary conditions. The Poisson-Boltzmann hydrogel model is that of a single charged colloid confined to a spherical cell where the counterions are allowed to enter the uniformly charged sphere. In order to investigate the origins of the differences between these two models, we performed a variety of simulations of different hydrogel models which were designed to test for the influence of charge correlations, excluded volume interactions, arrangement of charges along the polymer chains, and thermal fluctuations in the chains of the gel. These intermediate models systematically allow us to connect the Poisson-Boltzmann cell model to the bead-spring hydrogel model in a stepwise manner, thereby testing various approximations. Overall, the simulational results of all these hydrogel models are in good agreement, especially for the number of confined counterions within the gel. Our results support the applicability of the Poisson-Boltzmann cell model to study ionic properties of hydrogels under dilute conditions.

1. Poisson-Helmholtz-Boltzmann model of the electric double layer: analysis of monovalent ionic mixtures.

PubMed

Bohinc, Klemen; Shrestha, Ahis; Brumen, Milan; May, Sylvio

2012-03-01

In the classical mean-field description of the electric double layer, known as the Poisson-Boltzmann model, ions interact exclusively through their Coulomb potential. Ion specificity can arise through solvent-mediated, nonelectrostatic interactions between ions. We employ the Yukawa pair potential to model the presence of nonelectrostatic interactions. The combination of Yukawa and Coulomb potential on the mean-field level leads to the Poisson-Helmholtz-Boltzmann model, which employs two auxiliary potentials: one electrostatic and the other nonelectrostatic. In the present work we apply the Poisson-Helmholtz-Boltzmann model to ionic mixtures, consisting of monovalent cations and anions that exhibit different Yukawa interaction strengths. As a specific example we consider a single charged surface in contact with a symmetric monovalent electrolyte. From the minimization of the mean-field free energy we derive the Poisson-Boltzmann and Helmholtz-Boltzmann equations. These nonlinear equations can be solved analytically in the weak perturbation limit. This together with numerical solutions in the nonlinear regime suggests an intricate interplay between electrostatic and nonelectrostatic interactions. The structure and free energy of the electric double layer depends sensitively on the Yukawa interaction strengths between the different ion types and on the nonelectrostatic interactions of the mobile ions with the surface.

2. Enhanced Night Vision Via a Combination of Poisson Interpolation and Machine Learning

DTIC Science & Technology

2006-02-01

perceptual cues. We have developed three new image-processing techniques to address these problems. These include non-linear spatio-temporal denoising … methods to low-light visible, near infrared (NIR), and short-wave infrared (SWIR) images. In this annual report on the first phase of our research … Keywords: Poisson Interpolation, Adaptive Filters, Belief Propagation, SWIR, Image Fusion

3. Analysis of Large Data Logs: An Application of Poisson Sampling on Excite Web Queries.

ERIC Educational Resources Information Center

Ozmutlu, H. Cenk; Spink, Amanda; Ozmutlu, Seda

2002-01-01

Discusses the need for tools that allow effective analysis of search engine queries to provide a greater understanding of Web users' information seeking behavior and describes a study that developed an effective strategy for selecting samples from large-scale data sets. Reports on Poisson sampling with data logs from the Excite search engine.…

4. Effect of storage time and temperature on Poisson ratio of tomato fruit skin

Kuna-Broniowska, I.; Gładyszewska, B.; Ciupak, A.

2012-02-01

The results of studies investigating the effects of storage time and temperature on variations in the Poisson ratio of the skin of two greenhouse tomato varieties, Admiro and Encore, are presented. In the initial period of the study, the Poisson ratio of the skin of tomato fruit cv. Admiro, stored at 13°C, varied between 0.7 and 0.8. Over the subsequent 10 days of the experiment, it decreased to approximately 0.6 and remained stable until the end of the study. By contrast, the skin of tomatoes cv. Encore was characterized by lower values and lower variability of the Poisson ratio, in the range of 0.4 to 0.5, during storage. The examinations involving tomato fruit cv. Admiro stored at 21°C were completed after 12 days due to fruit softening and progressive difficulty with preparing analytical specimens. The value of the Poisson ratio for both varieties stored at room temperature fluctuated around approximately 0.5 throughout the experiment.

5. Parallel FFT-based Poisson Solver for Isolated Three-dimensional Systems

SciTech Connect

Budiardja, Reuben D; Cardall, Christian Y

2011-01-01

We describe an implementation to solve Poisson's equation for an isolated system on a unigrid mesh using FFTs. The method solves the equation globally on mesh blocks distributed across multiple processes on a distributed-memory parallel computer. Test results to demonstrate the convergence and scaling properties of the implementation are presented. The solver is offered to interested users as the library PSPFFT.

6. Poisson's ratio for polycrystalline silicon used in disk-shaped microresonators.

PubMed

Meitzler, Allen H

2006-02-01

Integrated circuit technology has been used to fabricate miniature disk resonators of polycrystalline silicon that operate at frequencies above 100 MHz. The ratios of low-order resonant frequencies in these resonators can be used to determine the value of Poisson's ratio and to confirm assumptions regarding homogeneity and isotropy.

7. Relative and Absolute Error Control in a Finite-Difference Method Solution of Poisson's Equation

ERIC Educational Resources Information Center

Prentice, J. S. C.

2012-01-01

An algorithm for error control (absolute and relative) in the five-point finite-difference method applied to Poisson's equation is described. The algorithm is based on discretization of the domain of the problem by means of three rectilinear grids, each of different resolution. We discuss some hardware limitations associated with the algorithm,…
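The five-point scheme itself (though not the article's three-grid error-control algorithm) can be sketched briefly. The following illustrative Python solves a test problem with a known exact solution using plain Jacobi iteration; the grid size and iteration count are arbitrary choices:

```python
import math

# Five-point finite-difference method for -u_xx - u_yy = f on the unit
# square with u = 0 on the boundary, solved by Jacobi iteration.
def solve_poisson(n, f, iters=2000):
    h = 1.0 / (n + 1)
    u = [[0.0] * (n + 2) for _ in range(n + 2)]
    for _ in range(iters):
        new = [row[:] for row in u]
        for i in range(1, n + 1):
            for j in range(1, n + 1):
                new[i][j] = 0.25 * (u[i-1][j] + u[i+1][j] + u[i][j-1]
                                    + u[i][j+1] + h * h * f(i * h, j * h))
        u = new
    return u, h

# Exact solution u = sin(pi x) sin(pi y), so f = 2 pi^2 sin(pi x) sin(pi y)
f = lambda x, y: 2.0 * math.pi**2 * math.sin(math.pi * x) * math.sin(math.pi * y)
u, h = solve_poisson(15, f)
err = max(abs(u[i][j] - math.sin(math.pi * i * h) * math.sin(math.pi * j * h))
          for i in range(17) for j in range(17))
print(f"max error on a 15x15 interior grid: {err:.2e}")
```

The remaining error is dominated by the O(h²) discretization error of the five-point stencil; halving h should roughly quarter it, which is the behavior an error-control scheme on multiple grids exploits.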

8. Nonlinear analysis of the cold fluid-Poisson plasma by using the characteristic method

Lee, Hee J.

2016-10-01

We show that the Vlasov and Euler equations can be transformed into each other along the same characteristics on the (x, t) plane. Therefore, the Vlasov-Poisson plasma may have common features that are contained in the cold fluid equations: the Euler equation, the continuity equation, and the Poisson equation. Here, the cold fluid equation does not mean the moment equation of the Boltzmann equation. We show that the compensated electron fluid equations can be solved linearly along the characteristics. We address an ion plasma with Boltzmann-distributed electrons as a Cauchy initial-boundary value problem for which initial data are provided by compatible solutions of the Poisson equation. In this plasma, the set of nonlinear cold fluid equations can be approached by arranging them in Riemann invariant equations or via a hodograph transform. The result of this arrangement is linear equations, thus suggesting a way to investigate a cold fluid nonlinear plasma without directly engaging the nonlinearity. The Poisson equation corresponds to the entropy equation in the gas dynamic equations. Analogously, a power law similar to the polytropic gas law in gas dynamics is assumed between the electric potential and the density.

9. A proximal iteration for deconvolving Poisson noisy images using sparse representations.

PubMed

Dupé, François-Xavier; Fadili, Jalal M; Starck, Jean-Luc

2009-02-01

We propose an image deconvolution algorithm when the data is contaminated by Poisson noise. The image to restore is assumed to be sparsely represented in a dictionary of waveforms such as the wavelet or curvelet transforms. Our key contributions are as follows. First, we handle the Poisson noise properly by using the Anscombe variance stabilizing transform leading to a nonlinear degradation equation with additive Gaussian noise. Second, the deconvolution problem is formulated as the minimization of a convex functional with a data-fidelity term reflecting the noise properties, and a nonsmooth sparsity-promoting penalty over the image representation coefficients (e.g., the l1-norm). An additional term is also included in the functional to ensure positivity of the restored image. Third, a fast iterative forward-backward splitting algorithm is proposed to solve the minimization problem. We derive existence and uniqueness conditions of the solution, and establish convergence of the iterative algorithm. Finally, a GCV-based model selection procedure is proposed to objectively select the regularization parameter. Experimental results are carried out to show the striking benefits gained from taking into account the Poisson statistics of the noise. These results also suggest that using sparse-domain regularization may be tractable in many deconvolution applications with Poisson noise such as astronomy and microscopy.

10. The Cauchy Problem for the 3-D Vlasov-Poisson System with Point Charges

Marchioro, Carlo; Miot, Evelyne; Pulvirenti, Mario

2011-07-01

In this paper we establish global existence and uniqueness of the solution to the three-dimensional Vlasov-Poisson system in the presence of point charges with repulsive interaction. The present analysis extends an analogous two-dimensional result (Caprino and Marchioro in Kinet. Relat. Models 3(2):241-254, 2010).

11. C1-continuous Virtual Element Method for Poisson-Kirchhoff plate problem

SciTech Connect

2016-09-20

We present a family of C1-continuous high-order Virtual Element Methods for the Poisson-Kirchhoff plate bending problem. The convergence of the methods is tested on a variety of meshes, including rectangular, quadrilateral, and meshes obtained by edge removal (i.e. highly irregular meshes). The convergence rates are presented for all of these tests.

12. Multi-Parameter Linear Least-Squares Fitting to Poisson Data One Count at a Time

NASA Technical Reports Server (NTRS)

Wheaton, W.; Dunklee, A.; Jacobson, A.; Ling, J.; Mahoney, W.; Radocinski, R.

1993-01-01

A standard problem in gamma-ray astronomy data analysis is the decomposition of a set of observed counts, described by Poisson statistics, according to a given multi-component linear model, with underlying physical count rates or fluxes which are to be estimated from the data.
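One standard approach to this estimation problem, though not necessarily the paper's specific algorithm, is the EM (Richardson-Lucy) multiplicative update for non-negative Poisson component rates. A toy sketch with a hypothetical two-component model:

```python
# EM / Richardson-Lucy update for Poisson counts c with mean A @ f,
# where A is a known non-negative response matrix and f >= 0 is estimated.
def poisson_em(A, c, iters=500):
    m, n = len(A), len(A[0])
    f = [1.0] * n                                        # positive start
    col_sum = [sum(A[i][j] for i in range(m)) for j in range(n)]
    for _ in range(iters):
        mu = [sum(A[i][j] * f[j] for j in range(n)) for i in range(m)]
        f = [f[j] * sum(A[i][j] * c[i] / mu[i] for i in range(m)) / col_sum[j]
             for j in range(n)]
    return f

# Hypothetical two-component model: counts generated from rates f = [3, 5]
A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
c = [3, 5, 8]   # idealized counts equal to their means A @ [3, 5]
print(poisson_em(A, c))
```

Because the Poisson log-likelihood is concave in f for a linear mean model, the multiplicative update converges to the maximum-likelihood rates while keeping every component non-negative.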

13. Developmental regression in autism spectrum disorder.

PubMed

Al Backer, Nouf Backer

2015-01-01

The occurrence of developmental regression in autism spectrum disorder (ASD) is one of the most puzzling phenomena of this disorder. Little is known about the nature and mechanism of developmental regression in ASD. About one-third of young children with ASD lose some skills during the preschool period, usually speech, but sometimes nonverbal communication, social, or play skills are also affected. There is considerable evidence suggesting that most children who demonstrate regression also had previous, subtle developmental differences. It is difficult to predict the prognosis of autistic children with developmental regression. It seems that the earlier development of social, language, and attachment behaviors followed by regression does not predict the later recovery of skills or better developmental outcomes. The underlying mechanisms that lead to regression in autism are unknown. The role of subclinical epilepsy in the developmental regression of children with autism remains unclear.

14. Norming clinical questionnaires with multiple regression: the Pain Cognition List.

PubMed

Van Breukelen, Gerard J P; Vlaeyen, Johan W S

2005-09-01

Questionnaires for measuring patients' feelings or beliefs are commonly used in clinical settings for diagnostic purposes, clinical decision making, or treatment evaluation. Raw scores of a patient can be evaluated by comparing them with norms based on a reference population. Using the Pain Cognition List (PCL-2003) as an example, this article shows how clinical questionnaires can be normed with multiple regression of raw scores on demographic and other patient variables. Compared with traditional norm tables for subgroups based on age or gender, this approach offers 2 advantages. First, multiple regression allows determination of which patient variables are relevant to the norming and which are not (validity). Second, by using information from the entire sample, multiple regression leads to continuous and more stable norms for any subgroup defined in terms of prognostic variables (reliability).

15. Cancer in Women over 50 Years of Age: A Focus on Smoking

PubMed Central

Baccaro, Luiz Francisco; Conde, Délio Marques; Costa-Paiva, Lúcia; Machado, Vanessa de Souza Santos; Pinto-Neto, Aarão Mendes

2015-01-01

The increase in life expectancy worldwide has resulted in a greater prevalence of chronic non-communicable diseases. This study aims to evaluate the prevalence and factors associated with the occurrence of cancer among Brazilian women over the age of 50. A cross-sectional study with 622 women over the age of 50 was performed using a population survey. The outcome variable was the occurrence of a malignant tumor in any location. The independent variables were sociodemographic characteristics, self-perception of health, health-related habits and morbidities. Statistical analysis was carried out using the chi-square test and Poisson regression. The mean age of the women was 64.1 years. The prevalence of cancer was 6.8%. The main sites of occurrence of malignant tumors were the breast (31.9%), colorectal (12.7%) and skin (12.7%). In the final statistical model, the only factor associated with cancer was smoking > 15 cigarettes/day either currently or in the past: PR 2.03 (95% CI 1.06–3.89). The results have improved understanding of the prevalence and factors associated with cancer in Brazilian women aged 50 years or more. They should be encouraged to maintain a healthy lifestyle and pay particular attention to modifiable risk factors such as smoking. PMID:25790469

16. Age-adjusted mortality and its association to variations in urban conditions in Shanghai.

PubMed

Takano, Takehito; Fu, Jia; Nakamura, Keiko; Uji, Kazuyuki; Fukuda, Yoshiharu; Watanabe, Masafumi; Nakajima, Hiroshi

2002-09-01

The objective of this study was to explore the association between health and urbanization in a megacity, Shanghai, by calculating the age-adjusted mortality ratio for each ward-unit of Shanghai and by examining relationships between mortalities and urban indicators. Crude mortality rates and age-adjusted mortality ratios by ward-unit were calculated. Demographic, residential environment, healthcare, and socioeconomic indicators were formulated for each of the ward-units between 1995 and 1998. Correlation and Poisson regression analyses were performed to examine the association between urban indicators and mortalities. The crude mortality rate by ward-unit in 1997 varied from 6.3 to 9.4 deaths per 1000 population. The age-adjusted mortality ratio by ward-unit in 1997, with the average mortality of urban China as reference, varied from 57.8 to 113.3 within Shanghai. Age-adjusted mortalities were inversely related to larger floor space of dwellings per population; a larger proportion of parks, gardens, and green areas to total land area; a greater number of health professionals per population; and a greater number of employees in retail business per population. Spacious living showed an independent association with a higher standard of community health in Shanghai (P < 0.05). Consequences for health policy and the development of urban infrastructural resources from the viewpoint of the Healthy Cities concept are discussed.

17. Age-Based Methods to Explore Time-Related Variables in Occupational Epidemiology Studies

SciTech Connect

Janice P. Watkins, Edward L. Frome, Donna L. Cragle

2005-08-31

Although age is recognized as the strongest predictor of mortality in chronic disease epidemiology, a calendar-based approach is often employed when evaluating time-related variables. An age-based analysis file, created by determining the value of each time-dependent variable for each age that a cohort member is followed, provides a clear definition of age at exposure and allows development of diverse analytic models. To demonstrate methods, the relationship between cancer mortality and external radiation was analyzed with Poisson regression for 14,095 Oak Ridge National Laboratory workers. Based on previous analysis of this cohort, a model with ten-year lagged cumulative radiation doses partitioned by receipt before (dose-young) or after (dose-old) age 45 was examined. Dose-response estimates were similar to calendar-year-based results with elevated risk for dose-old, but not when film badge readings were weekly before 1957. Complementary results showed increasing risk with older hire ages and earlier birth cohorts, since workers hired after age 45 were born before 1915, and dose-young and dose-old were distributed differently by birth cohorts. Risks were generally higher for smoking-related than non-smoking-related cancers. It was difficult to single out specific variables associated with elevated cancer mortality because of: (1) birth cohort differences in hire age and mortality experience completeness, and (2) time-period differences in working conditions, dose potential, and exposure assessment. This research demonstrated the utility and versatility of the age-based approach.

18. Regional variation in Moho depth and Poisson's ratio beneath eastern China and its tectonic implications

Wei, Zigen; Chen, Ling; Li, Zhiwei; Ling, Yuan; Li, Jing

2016-01-01

Eastern China comprises a complex amalgamation of geotectonic blocks of different ages and has undergone significant modification of the lithosphere during Meso-Cenozoic time. To better characterize its deep structure, we conducted H-κ stacking of receiver functions using teleseismic data collected from 1143 broadband stations and produced a unified and detailed map of Moho depth and average Poisson's ratio (σ) for eastern China. A coexistence of modified and preserved crust, generally in Airy-type isostatic equilibrium, was revealed in eastern China, which correlates well with regional geological and tectonic features. The crust is markedly thicker to the west of the North-South Gravity Lineament but exhibits complex variations in σ with an overall felsic to intermediate bulk crustal composition. Moho depth and σ values show striking differences compared with the surrounding areas in the rifts and tectonic boundary zones, where earthquakes usually occur. Systematic comparison of Moho depth and σ values demonstrated both similarities and differences in crustal structure among Northeast China, the North China Craton, South China, and the Qinling-Dabie Orogen, as well as among different areas within these blocks, which may result from their different evolutionary histories and strong tectonic-magmatic events since the Mesozoic. Using new data from dense temporary arrays, we observed a change of Moho depth by ∼3 km and of σ by ∼0.04 beneath the Tanlu Fault Zone and a change of Moho depth by ∼5 km and of σ by ∼0.05 beneath the Xuefeng Mountains. In addition, a striking E-W difference in crustal structure occurs across the Xuefeng Mountains: to the east, the Moho depth is overall <35 km and σ has values of <0.26; to the west, the Moho depth is generally >40 km and σ shows complex and large-range variation with values between 0.22 and 0.32. These, together with waveform inversion of receiver functions and SKS shear-wave splitting measurements

19. Regression analysis for solving diagnosis problem of children's health

Cherkashina, Yu A.; Gerget, O. M.

2016-04-01

The paper includes results of scientific researches. These researches are devoted to the application of statistical techniques, namely, regression analysis, to assess the health status of children in the neonatal period based on medical data (hemostatic parameters, parameters of blood tests, the gestational age, vascular-endothelial growth factor) measured at 3-5 days of children's life. In this paper a detailed description of the studied medical data is given. A binary logistic regression procedure is discussed in the paper. Basic results of the research are presented. A classification table of predicted values and factual observed values is shown, the overall percentage of correct recognition is determined. Regression equation coefficients are calculated, the general regression equation is written based on them. Based on the results of logistic regression, ROC analysis was performed, sensitivity and specificity of the model are calculated and ROC curves are constructed. These mathematical techniques allow carrying out diagnostics of health of children providing a high quality of recognition. The results make a significant contribution to the development of evidence-based medicine and have a high practical importance in the professional activity of the author.

20. A Multiple Regression Approach to Normalization of Spatiotemporal Gait Features.

PubMed

Wahid, Ferdous; Begg, Rezaul; Lythgo, Noel; Hass, Chris J; Halgamuge, Saman; Ackland, David C

2016-04-01

Normalization of gait data is performed to reduce the effects of intersubject variations due to physical characteristics. This study reports a multiple regression normalization approach for spatiotemporal gait data that takes into account intersubject variations in self-selected walking speed and physical properties including age, height, body mass, and sex. Spatiotemporal gait data including stride length, cadence, stance time, double support time, and stride time were obtained from healthy subjects including 782 children, 71 adults, 29 elderly subjects, and 28 elderly Parkinson's disease (PD) patients. Data were normalized using standard dimensionless equations, a detrending method, and a multiple regression approach. After normalization using dimensionless equations and the detrending method, weak to moderate correlations between walking speed, physical properties, and spatiotemporal gait features were observed (0.01 < |r| < 0.88), whereas normalization using the multiple regression method reduced these correlations to weak values (|r| <0.29). Data normalization using dimensionless equations and detrending resulted in significant differences in stride length and double support time of PD patients; however the multiple regression approach revealed significant differences in these features as well as in cadence, stance time, and stride time. The proposed multiple regression normalization may be useful in machine learning, gait classification, and clinical evaluation of pathological gait patterns.
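
A minimal sketch, on simulated data, of the multiple-regression normalization idea: each gait feature is divided by the value predicted from walking speed and physical properties, which weakens its residual correlation with those variables. All variable names and coefficients below are illustrative, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical gait data: stride length depends on walking speed and height.
n = 300
speed = rng.uniform(0.8, 1.8, n)    # m/s
height = rng.uniform(1.4, 1.9, n)   # m
stride = 0.5 * speed + 0.4 * height + rng.normal(0, 0.05, n)

# Multiple-regression normalization: divide each raw feature by the value
# predicted from the subject's speed and physical properties.
X = np.column_stack([np.ones(n), speed, height])
beta, *_ = np.linalg.lstsq(X, stride, rcond=None)
stride_norm = stride / (X @ beta)

r_raw = np.corrcoef(speed, stride)[0, 1]
r_norm = np.corrcoef(speed, stride_norm)[0, 1]
print(round(abs(r_raw), 2), round(abs(r_norm), 2))  # correlation shrinks
```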

1. A Fast Poisson Solver with Periodic Boundary Conditions for GPU Clusters in Various Configurations

Rattermann, Dale Nicholas

Fast Poisson solvers using the Fast Fourier Transform on uniform grids are especially suited for parallel implementation, making them appropriate for portability on graphical processing unit (GPU) devices. The goal of the following work was to implement, test, and evaluate a fast Poisson solver for periodic boundary conditions for use on a variety of GPU configurations. The solver used in this research was FLASH, an immersed-boundary-based method, which is well suited for complex, time-dependent geometries, has robust adaptive mesh refinement/de-refinement capabilities to capture evolving flow structures, and has been successfully implemented on conventional, parallel supercomputers. However, these solvers are still computationally costly to employ, and the total solver time is dominated by the solution of the pressure Poisson equation using state-of-the-art multigrid methods. FLASH improves the performance of its multigrid solvers by integrating a parallel FFT solver on a uniform grid during a coarse level. This hybrid solver could then be theoretically improved by replacing the highly-parallelizable FFT solver with one that utilizes GPUs, and, thus, was the motivation for my research. In the present work, the CPU-utilizing parallel FFT solver (PFFT) used in the base version of FLASH for solving the Poisson equation on uniform grids has been modified to enable parallel execution on CUDA-enabled GPU devices. New algorithms have been implemented to replace the Poisson solver that decompose the computational domain and send each new block to a GPU for parallel computation. One-dimensional (1-D) decomposition of the computational domain minimizes the amount of network traffic involved in this bandwidth-intensive computation by limiting the amount of all-to-all communication required between processes. Advanced techniques have been incorporated and implemented in a GPU-centric code design, while allowing end users the flexibility of parameter control at runtime in

2. Effects of Lowering the Minimum Alcohol Purchasing Age on Weekend Assaults Resulting in Hospitalization in New Zealand

PubMed Central

Davie, Gabrielle; McElduff, Patrick; Connor, Jennie; Langley, John

2014-01-01

Objectives. We estimated the effects on assault rates of lowering the minimum alcohol purchasing age in New Zealand from 20 to 18 years. We hypothesized that the law change would increase assaults among young people aged 18 to 19 years (the target group) and those aged 15 to 17 years via illegal sales or alcohol supplied by older friends or family members. Methods. Using Poisson regression, we examined weekend assaults resulting in hospitalization from 1995 to 2011. Outcomes were assessed separately by gender among young people aged 15 to 17 years and those aged 18 to 19 years, with those aged 20 and 21 years included as a control group. Results. Relative to young men aged 20 to 21 years, assaults increased significantly among young men aged 18 to 19 years between 1995 and 1999 (the period before the law change), as well as the postchange periods 2003 to 2007 (incidence rate ratio [IRR] = 1.21; 95% confidence interval [CI] = 1.05, 1.39) and 2008 to 2011 (IRR = 1.20; 95% CI = 1.05, 1.37). Among boys aged 15 to 17 years, assaults increased during the postchange periods 1999 to 2003 (IRR = 1.28; 95% CI = 1.10, 1.49) and 2004 to 2007 (IRR = 1.25; 95% CI = 1.08, 1.45). There were no statistically significant effects among girls and young women. Conclusions. Lowering the minimum alcohol purchasing age increased weekend assaults resulting in hospitalization among young males 15 to 19 years of age. PMID:24922142

3. Process modeling with the regression network.

PubMed

van der Walt, T; Barnard, E; van Deventer, J

1995-01-01

A new connectionist network topology called the regression network is proposed. The structural and underlying mathematical features of the regression network are investigated. Emphasis is placed on the intricacies of the optimization process for the regression network and some measures to alleviate these difficulties of optimization are proposed and investigated. The ability of the regression network algorithm to perform either nonparametric or parametric optimization, as well as a combination of both, is also highlighted. It is further shown how the regression network can be used to model systems which are poorly understood on the basis of sparse data. A semi-empirical regression network model is developed for a metallurgical processing operation (a hydrocyclone classifier) by building mechanistic knowledge into the connectionist structure of the regression network model. Poorly understood aspects of the process are provided for by use of nonparametric regions within the structure of the semi-empirical connectionist model. The performance of the regression network model is compared to the corresponding generalization performance results obtained by some other nonparametric regression techniques.

4. Quantile regression applied to spectral distance decay

USGS Publications Warehouse

2008-01-01

Remotely sensed imagery has long been recognized as a powerful support for characterizing and estimating biodiversity. Spectral distance among sites has proven to be a powerful approach for detecting species composition variability. Regression analysis of species similarity versus spectral distance allows us to quantitatively estimate the amount of turnover in species composition with respect to spectral and ecological variability. In classical regression analysis, the residual sum of squares is minimized for the mean of the dependent variable distribution. However, many ecological data sets are characterized by a high number of zeroes that add noise to the regression model. Quantile regressions can be used to evaluate trend in the upper quantiles rather than a mean trend across the whole distribution of the dependent variable. In this letter, we used ordinary least squares (OLS) and quantile regressions to estimate the decay of species similarity versus spectral distance. The achieved decay rates were statistically nonzero (p < 0.01), considering both OLS and quantile regressions. Nonetheless, the OLS regression estimate of the mean decay rate was only half the decay rate indicated by the upper quantiles. Moreover, the intercept value, representing the similarity reached when the spectral distance approaches zero, was very low compared with the intercepts of the upper quantiles, which detected high species similarity when habitats are more similar. In this letter, we demonstrated the power of using quantile regressions applied to spectral distance decay to reveal species diversity patterns otherwise lost or underestimated by OLS regression. © 2008 IEEE.
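
The contrast between the OLS and upper-quantile decay rates can be illustrated on synthetic zero-inflated data. Minimizing the pinball (check) loss, as below, is a generic way to fit a conditional quantile, not the authors' exact procedure; the data-generating model is entirely hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)

# Hypothetical species-similarity vs. spectral-distance data with many
# near-zero similarities, mimicking the zero-inflation described above.
n = 400
dist = rng.uniform(0, 1, n)
ceiling = 0.9 - 0.6 * dist              # upper-envelope decay
sim = ceiling * rng.beta(0.5, 1.5, n)   # most values fall well below it

def pinball(params, tau):
    """Pinball (check) loss for the tau-th conditional quantile line."""
    a, b = params
    r = sim - (a + b * dist)
    return np.sum(np.where(r >= 0, tau * r, (tau - 1) * r))

ols_slope = np.polyfit(dist, sim, 1)[0]
q90_slope = minimize(pinball, x0=[0.5, -0.5], args=(0.9,),
                     method="Nelder-Mead").x[1]
print(q90_slope < ols_slope < 0)  # upper quantile decays faster than the mean
```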

5. [From clinical judgment to linear regression model].

PubMed

Palacios-Cruz, Lino; Pérez, Marcela; Rivas-Ruiz, Rodolfo; Talavera, Juan O

2013-01-01

When we think about mathematical models, such as the linear regression model, we think that these terms are only used by those engaged in research, a notion that is far from the truth. Legendre described the first mathematical model in 1805, and Galton introduced the formal term in 1886. Linear regression is one of the most commonly used regression models in clinical practice. It is useful to predict or show the relationship between two or more variables as long as the dependent variable is quantitative and has a normal distribution. Stated another way, regression is used to predict a measure based on the knowledge of at least one other variable. Linear regression has as its first objective to determine the slope or inclination of the regression line: Y = a + bx, where "a" is the intercept or regression constant and is equivalent to the "Y" value when "X" equals 0, and "b" (also called the slope) indicates the increase or decrease that occurs when the variable "x" increases or decreases by one unit. In the regression line, "b" is called the regression coefficient. The coefficient of determination (R²) indicates the importance of the independent variables in the outcome.
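
A short worked example of Y = a + bx and the coefficient of determination, using hypothetical blood-pressure data:

```python
import numpy as np

# Predict systolic blood pressure (Y) from age (x); data are illustrative.
age = np.array([25, 35, 45, 55, 65], dtype=float)
bp = np.array([118, 121, 127, 133, 138], dtype=float)

b, a = np.polyfit(age, bp, 1)    # slope "b" and intercept "a"
pred = a + b * age
ss_res = np.sum((bp - pred) ** 2)
ss_tot = np.sum((bp - bp.mean()) ** 2)
r2 = 1 - ss_res / ss_tot         # coefficient of determination

print(round(b, 2), round(a, 1), round(r2, 2))  # prints: 0.52 104.0 0.99
```

Here blood pressure rises about 0.52 units per year of age, and age alone explains about 99% of the variance in this tiny illustrative sample.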

6. Geodesic least squares regression on information manifolds

SciTech Connect

Verdoolaege, Geert

2014-12-05

We present a novel regression method targeted at situations with significant uncertainty on both the dependent and independent variables or with non-Gaussian distribution models. Unlike the classic regression model, the conditional distribution of the response variable suggested by the data need not be the same as the modeled distribution. Instead they are matched by minimizing the Rao geodesic distance between them. This yields a more flexible regression method that is less constrained by the assumptions imposed through the regression model. As an example, we demonstrate the improved resistance of our method against some flawed model assumptions and we apply this to scaling laws in magnetic confinement fusion.

7. Testing the gonadal regression-cytoprotection hypothesis.

PubMed

Crawford, B A; Spaliviero, J A; Simpson, J M; Handelsman, D J

1998-11-15

Germinal damage is an almost universal accompaniment of cancer treatment as the result of bystander damage to the testis from cytotoxic drugs and/or irradiation. Cancer treatment for the most common cancers of the reproductive age group in men has improved such that most are now treated with curative intent, and many others are treated with likelihood of prolonged survival, so that the preservation of fertility is an important component of posttreatment quality of life. This has led to the consideration of developing adjuvant treatments that may reduce the gonadal toxicity of cancer therapy. One dominant hypothesis has been based on the supposition that the immature testis was resistant to cytotoxin damage. Hence, if hormonal treatment were able to cause spermatogenic regression to an immature state via an effective withdrawal of gonadotrophin secretion, the testis might be maintained temporarily in a protected state during cytotoxin exposure. However, clinical studies have been disappointing but have also been unable to test the hypothesis definitively thus far, due to the inability to completely suppress gonadotrophin secretion. Similarly, experimental models have also given conflicting results and, at best, a modest cytoprotection. To definitively test this hypothesis experimentally, we used the fact that the functionally hpg mouse has complete gonadotrophin deficiency but can undergo the induction of full spermatogenesis by testosterone. Thus, if complete gonadotrophin deficiency were an advantage during cytotoxin exposure, then the hpg mouse should exhibit some degree of germinal protection against cytotoxin-induced damage. We therefore administered three different cytotoxins (200 mg/kg procarbazine, 9 mg/kg doxorubicin, 8 Gy of X irradiation) to produce a range of severity in testicular damage and mechanism of action to either phenotypically normal or hpg mice. Testis weight and homogenization-resistant spermatid numbers were measured to evaluate the

8. HR+/Her2- breast cancer in pre-menopausal women: The impact of younger age on clinical characteristics at diagnosis, disease management and survival.

PubMed

De Camargo Cancela, Marianna; Comber, Harry; Sharp, Linda

2016-12-01

Young women (20-39 years old) with breast cancer are diagnosed with more aggressive tumours and consequently have poorer survival. However, there is an evidence gap as to whether age has an independent effect on survival of pre-menopausal women diagnosed with HR+/Her2- tumours. The aim of this population-based study was to compare characteristics at diagnosis, determinants of treatment and survival in women aged 20-39 and 40-49 years diagnosed with HR+/Her2- tumours. From the National Cancer Registry Ireland, we identified women aged 20-49 diagnosed with a first invasive HR+/Her2- breast cancer during 2002-2008. Women aged 20-39 were compared to those aged 40-49 years. Poisson regression with robust error variance was used to explore the impact of age on treatment receipt. Associations between age and survival from all causes were investigated using Cox models. In multivariate models, women aged 20-39 were significantly more likely to have no cancer-directed surgery (IRR=1.49, 95%CI 1.07, 2.08). In those having surgery, younger age was associated with a significantly higher likelihood of receiving chemotherapy; age was not associated with receipt of adjuvant radiotherapy or endocrine therapy. Women aged 20-39 undergoing surgery were significantly more likely to die than women aged 40-49 (HR=1.84, 95%CI: 1.31, 2.59). Age is an independent prognostic factor in younger women diagnosed with HR+/Her2- breast cancer, supporting the hypothesis that breast cancer in women under 40 has more aggressive behaviour, even within HR+/Her2- tumours. Future research should explore the reasons for poorer survival in order to inform strategies to improve outcomes in this age group.

9. Efficient Levenberg-Marquardt minimization of the maximum likelihood estimator for Poisson deviates

SciTech Connect

Laurence, T; Chromy, B

2009-11-10

Histograms of counted events are Poisson distributed, but are typically fitted without justification using nonlinear least squares fitting. The more appropriate maximum likelihood estimator (MLE) for Poisson distributed data is seldom used. We extend the use of the Levenberg-Marquardt algorithm commonly used for nonlinear least squares minimization for use with the MLE for Poisson distributed data. In so doing, we remove any excuse for not using this more appropriate MLE. We demonstrate the use of the algorithm and the superior performance of the MLE using simulations and experiments in the context of fluorescence lifetime imaging. Scientists commonly form histograms of counted events from their data, and extract parameters by fitting to a specified model. Assuming that the probability of occurrence for each bin is small, event counts in the histogram bins will be distributed according to the Poisson distribution. We develop here an efficient algorithm for fitting event counting histograms using the maximum likelihood estimator (MLE) for Poisson distributed data, rather than the non-linear least squares measure. This algorithm is a simple extension of the common Levenberg-Marquardt (L-M) algorithm, is simple to implement, quick and robust. Fitting using a least squares measure is most common, but it is the maximum likelihood estimator only for Gaussian-distributed data. Non-linear least squares methods may be applied to event counting histograms in cases where the number of events is very large, so that the Poisson distribution is well approximated by a Gaussian. However, it is not easy to satisfy this criterion in practice - which requires a large number of events. It has been well-known for years that least squares procedures lead to biased results when applied to Poisson-distributed data; a recent paper providing extensive characterization of these biases in exponential fitting is given. The more appropriate measure based on the maximum likelihood estimator (MLE
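
A simple sketch of Poisson MLE fitting for an event-counting histogram. This uses a generic scipy minimizer rather than the authors' Levenberg-Marquardt extension; the exponential-decay model and all parameter values are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)

# Hypothetical fluorescence-decay histogram: bin counts are Poisson
# distributed around A * exp(-t / tau).
t = np.linspace(0, 10, 50)
counts = rng.poisson(100.0 * np.exp(-t / 2.0))   # true A = 100, tau = 2

def neg_log_like(log_params):
    """Negative Poisson log-likelihood, constant log(k!) term dropped.
    Log-parametrization keeps A and tau strictly positive."""
    A, tau = np.exp(log_params)
    mu = np.clip(A * np.exp(-t / tau), 1e-12, None)
    return np.sum(mu - counts * np.log(mu))

fit = minimize(neg_log_like, x0=np.log([50.0, 1.0]), method="Nelder-Mead")
A_hat, tau_hat = np.exp(fit.x)
print(round(tau_hat, 1))   # close to the true lifetime of 2.0
```

Unlike a least-squares fit, this estimator stays unbiased even when many bins hold only a few counts.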

10. Recent lung cancer patterns in younger age-cohorts in Ireland

PubMed Central

Kabir, Zubair; Connolly, Gregory N; Clancy, Luke

2007-01-01

Background Smoking causes 85% of all lung cancers in males and 70% in females. Therefore, birth cohort analysis and annual-percent-changes (APC) in age-specific lung cancer mortality rates, particularly in the youngest age-cohorts, can explain the beneficial impacts of both past and recent anti-smoking interventions. Methods A long-term time-trend analysis (1958-2002) of lung cancer mortality rates in Ireland was carried out, focusing in particular on the youngest age-cohorts (30-49 years of age). The rates were standardised to the World Standard Population. Lung cancer mortality data were downloaded from the WHO Cancer Mortality Database to estimate APCs in death rates, using the Joinpoint regression (version 3.0) program. Simple age-cohort modelling (log-linear Poisson model) was also performed, using SAS software. Results The youngest birth cohorts (born after 1965) have almost one-fourth lower lung cancer risk relative to those born around the First World War. A more than 50% relative decline in death rates among those between 35 and 39 years of age was observed in both sexes in recent years. The youngest age-cohorts (30-39 years of age) in males also showed a significant decrease in death rates in 1998-2002 by more than 3% every five years from 1958-1962 onwards. However, death rate declines in females are slower. Conclusions The youngest birth cohorts had the lowest lung cancer risk and also showed a significant decreasing lung cancer death rate in the most recent years. Such temporal patterns indicate the beneficial impacts of both recent and past tobacco control efforts in Ireland. However, the decline in younger female cohorts is slower. A comprehensive national tobacco control program based on evidence-based policies from elsewhere can further accelerate a decline in death rates, especially among the younger generations. PMID:17476821

11. Age Disparity in Palliative Radiation Therapy Among Patients With Advanced Cancer

SciTech Connect

Wong, Jonathan; Xu, Beibei; Yeung, Heidi N.; Roeland, Eric J.; Martinez, Maria Elena; Le, Quynh-Thu; Mell, Loren K.; Murphy, James D.

2014-09-01

12. Slits, plates, and Poisson-Boltzmann theory in a local formulation of nonlocal electrostatics.

PubMed

Paillusson, Fabien; Blossey, Ralf

2010-11-01

Polar liquids like water carry a characteristic nanometric length scale, the correlation length of orientation polarizations. Continuum theories that can capture this feature commonly run under the name of "nonlocal" electrostatics since their dielectric response is characterized by a scale-dependent dielectric function ε(q), where q is the wave vector; the Poisson(-Boltzmann) equation then turns into an integro-differential equation. Recently, "local" formulations have been put forward for these theories and applied to water, solvated ions, and proteins. We review the local formalism and show how it can be applied to a structured liquid in slit and plate geometries, and solve the Poisson-Boltzmann theory for a charged plate in a structured solvent with counterions. Our results establish a coherent picture of the local version of nonlocal electrostatics and show its ease of use when compared to the original formulation.

13. Beyond Poisson-Boltzmann: fluctuations and fluid structure in a self-consistent theory

2016-09-01

Poisson-Boltzmann (PB) theory is the classic approach to soft matter electrostatics and has been applied to numerous physical chemistry and biophysics problems. Its essential limitations are in its neglect of correlation effects and fluid structure. Recently, several theoretical insights have allowed the formulation of approaches that go beyond PB theory in a systematic way. In this topical review, we provide an update on the developments achieved in the self-consistent formulations of correlation-corrected Poisson-Boltzmann theory. We introduce a corresponding system of coupled non-linear equations for both continuum electrostatics with a uniform dielectric constant, and a structured solvent—a dipolar Coulomb fluid—including non-local effects. While the approach is only approximate and also limited to corrections in the so-called weak fluctuation regime, it allows us to include physically relevant effects, as we show for a range of applications of these equations.

14. An empirical Bayesian and Buhlmann approach with non-homogenous Poisson process

Noviyanti, Lienda

2015-12-01

All general insurance companies in Indonesia have to adjust their current premium rates according to the maximum and minimum limit rates in the new regulation established by the Financial Services Authority (Otoritas Jasa Keuangan / OJK). In this research, we estimated premium rates by means of the Bayesian and the Buhlmann approaches using historical claim frequency and claim severity across five risk groups. We assumed Poisson distributed claim frequency and Normally distributed claim severity. In particular, we used a non-homogeneous Poisson process for estimating the parameters of claim frequency. We found that the estimated premium rates are higher than the actual current rate. With regard to the OJK upper and lower limit rates, the estimates among the five risk groups vary; some are inside the interval and some are outside it.

15. Erratum: Poisson's ratio in layered two-dimensional crystals [Phys. Rev. B 93, 075420 (2016)]

Woo, Sungjong; Park, Hee Chul; Son, Young-Woo

2016-12-01

We present first-principles calculations of the elastic properties of multilayered two-dimensional crystals such as graphene, h-BN, and 2H-MoS2, which show that their Poisson's ratios along the out-of-plane direction are negative, near zero, and positive, respectively, spanning all possibilities for the sign of the ratio. While the in-plane Poisson's ratios are all positive regardless of their disparate electronic and structural properties, the characteristic interlayer interactions as well as layer stacking structures are shown to determine the sign of their out-of-plane ratios. A thorough investigation of elastic properties as a function of the number of layers for each system is also provided, highlighting the intertwined nature of elastic and electronic properties.

16. PB-AM: An open-source, fully analytical linear poisson-boltzmann solver.

PubMed

Felberg, Lisa E; Brookes, David H; Yap, Eng-Hui; Jurrus, Elizabeth; Baker, Nathan A; Head-Gordon, Teresa

2016-11-02

We present the open source distributed software package Poisson-Boltzmann Analytical Method (PB-AM), a fully analytical solution to the linearized PB equation, for molecules represented as non-overlapping spherical cavities. The PB-AM software package includes the generation of output files appropriate for visualization using Visual Molecular Dynamics (VMD), a Brownian dynamics scheme that uses periodic boundary conditions to simulate dynamics, the ability to specify docking criteria, and two different kinetics schemes to evaluate biomolecular association rate constants. Given that PB-AM defines mutual polarization completely and accurately, it can be refactored as a many-body expansion to explore 2- and 3-body polarization. Additionally, the software has been integrated into the Adaptive Poisson-Boltzmann Solver (APBS) software package to make it more accessible to a larger group of scientists, educators, and students that are more familiar with the APBS framework. © 2016 Wiley Periodicals, Inc.

17. Statistical shape analysis using 3D Poisson equation--A quantitatively validated approach.

PubMed

Gao, Yi; Bouix, Sylvain

2016-05-01

Statistical shape analysis has been an important area of research with applications in biology, anatomy, neuroscience, agriculture, paleontology, etc. Unfortunately, the proposed methods are rarely quantitatively evaluated, and as shown in recent studies, when they are evaluated, significant discrepancies exist in their outputs. In this work, we concentrate on the problem of finding the consistent location of deformation between two populations of shapes. We propose a new shape analysis algorithm along with a framework to perform a quantitative evaluation of its performance. Specifically, the algorithm constructs a Signed Poisson Map (SPoM) by solving two Poisson equations on the volumetric shapes of arbitrary topology, and statistical analysis is then carried out on the SPoMs. The method is quantitatively evaluated on synthetic shapes and applied on real shape data sets in brain structures.

18. Multitasking domain decomposition fast Poisson solvers on the Cray Y-MP

NASA Technical Reports Server (NTRS)

Chan, Tony F.; Fatoohi, Rod A.

1990-01-01

The results of multitasking implementation of a domain decomposition fast Poisson solver on eight processors of the Cray Y-MP are presented. The object of this research is to study the performance of domain decomposition methods on a Cray supercomputer and to analyze the performance of different multitasking techniques using highly parallel algorithms. Two implementations of multitasking are considered: macrotasking (parallelism at the subroutine level) and microtasking (parallelism at the do-loop level). A conventional FFT-based fast Poisson solver is also multitasked. The results of different implementations are compared and analyzed. A speedup of over 7.4 on the Cray Y-MP running in a dedicated environment is achieved for all cases.
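Fast Poisson solvers of the kind multitasked here diagonalize the discrete Laplacian with a sine (Fourier) transform. A minimal 1D stdlib-Python sketch, with a naive O(n^2) transform standing in for the FFT (this is an illustration of the algorithm class, not the Cray code):

```python
import math

def fast_poisson_1d(f):
    """Solve u'' = f on (0,1) with u(0) = u(1) = 0, given f sampled at
    the n interior points of a uniform grid, by sine-transform
    diagonalization of the standard three-point Laplacian."""
    n = len(f)
    h = 1.0 / (n + 1)
    # Forward sine transform (naive O(n^2); an FFT does this in O(n log n))
    fhat = [sum(f[j] * math.sin((j + 1) * (k + 1) * math.pi / (n + 1))
                for j in range(n)) for k in range(n)]
    # Divide by the eigenvalues of the discrete second-derivative operator
    lam = [-(4.0 / h**2) * math.sin((k + 1) * math.pi / (2 * (n + 1)))**2
           for k in range(n)]
    chat = [2.0 / (n + 1) * fhat[k] / lam[k] for k in range(n)]
    # Inverse sine transform
    return [sum(chat[k] * math.sin((j + 1) * (k + 1) * math.pi / (n + 1))
                for k in range(n)) for j in range(n)]

n = 63
xs = [(j + 1) / (n + 1) for j in range(n)]
f = [-math.pi**2 * math.sin(math.pi * x) for x in xs]  # exact u = sin(pi*x)
u = fast_poisson_1d(f)
```

Because each transform coefficient is independent, both the transforms and the eigenvalue division parallelize naturally, which is what makes this family of solvers attractive for multitasking.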

19. Poisson-like height distribution of Ag nanoislands on Si(111) 7 ×7

Chen, Yiyao; Gramlich, M. W.; Hayden, S. T.; Miceli, P. F.

2017-01-01

The height distribution of Ag(111) islands grown on Si(111) 7 ×7 was studied using in situ x-ray reflectivity. This noble metal-on-semiconductor system is of particular interest because the islands exhibit an unusual minimum height that is imposed by the quantum confinement of the conduction electrons. For different coverages and temperatures as well as annealing, it was found that the island heights exhibit a variance that is less than the mean by a constant amount. We argue that this behavior is related to Poisson-like statistics with the imposition of the minimum island height. A modified Poisson height distribution model is presented and shown to provide a good description of the experimentally measured island height distributions. The results, which contribute to a better understanding of the nanoscale growth behavior for an important noble metal, are discussed in terms of mobility that leads to taller islands.
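The paper's key observation, a variance smaller than the mean by a constant, is exactly what a Poisson distribution shifted by a minimum height produces. A small illustrative sketch (the parameter values are made up, not fitted to the Ag data):

```python
import math

def shifted_poisson_pmf(h, h_min, mu):
    """P(height = h) for a Poisson(mu) variable shifted up by a
    minimum island height h_min (heights below h_min never occur)."""
    if h < h_min:
        return 0.0
    k = h - h_min
    return math.exp(-mu) * mu**k / math.factorial(k)

h_min, mu = 3, 4.0  # hypothetical minimum height and mean excess
heights = range(0, 60)
mean = sum(h * shifted_poisson_pmf(h, h_min, mu) for h in heights)
var = sum((h - mean)**2 * shifted_poisson_pmf(h, h_min, mu) for h in heights)
# mean = h_min + mu while variance = mu, so the variance sits below
# the mean by the constant h_min, as observed for the islands
```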

20. Dynamics of a prey-predator system under Poisson white noise excitation

Pan, Shan-Shan; Zhu, Wei-Qiu

2014-10-01

The classical Lotka-Volterra (LV) model is a well-known mathematical model for prey-predator ecosystems. In the present paper, the pulse-type version of the stochastic LV model, in which the effect of a random natural environment is modeled as Poisson white noise, is investigated using the stochastic averaging method. The averaged generalized Itô stochastic differential equation and the Fokker-Planck-Kolmogorov (FPK) equation are derived for the prey-predator ecosystem driven by Poisson white noise. An approximate stationary solution of the averaged generalized FPK equation is obtained using the perturbation method. The effect of the prey self-competition parameter ɛ2s on ecosystem behavior is evaluated. The analytical result is confirmed by a corresponding Monte Carlo (MC) simulation.
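A minimal Monte Carlo sketch of a pulse-driven LV system, with Euler stepping for the drift and multiplicative jumps at Poisson arrival times (all rates and jump sizes are illustrative choices, not the paper's averaged equations):

```python
import math
import random

def simulate_lv_poisson(a=1.0, b=1.0, c=1.0, d=1.0,
                        rate=2.0, dt=1e-3, T=10.0, seed=1):
    """One sample path of a Lotka-Volterra prey-predator pair whose
    prey population is kicked by Poisson white noise, modeled here as
    multiplicative jumps exp(xi) at Poisson-distributed arrival times."""
    rng = random.Random(seed)
    x, y = 1.0, 1.0                     # prey, predator
    t, t_jump = 0.0, rng.expovariate(rate)
    path = []
    while t < T:
        x += x * (a - b * y) * dt       # classical LV drift terms
        y += y * (d * x - c) * dt
        if t >= t_jump:                 # Poisson-timed environmental pulse
            x *= math.exp(rng.uniform(-0.1, 0.1))
            t_jump += rng.expovariate(rate)
        t += dt
        path.append((x, y))
    return path

path = simulate_lv_poisson()
```

The multiplicative jump form keeps both populations positive, which is one reason pulse noise is often applied to the log-population in such models.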

1. Poisson equation for the three-loop ladder diagram in string theory at genus one

Basu, Anirban

2016-11-01

The three-loop ladder diagram is a graph with six links and four cubic vertices that contributes to the D^12 ℛ^4 amplitude at genus one in type II string theory. The vertices represent the insertion points of vertex operators on the toroidal worldsheet and the links represent scalar Green functions connecting them. By using the properties of the Green function and manipulating the various expressions, we obtain a modular invariant Poisson equation satisfied by this diagram, with source terms involving one-, two- and three-loop diagrams. Unlike the source terms in the Poisson equations for diagrams at lower orders in the momentum expansion or the Mercedes diagram, a particular source term involves a five-point function containing a holomorphic and an antiholomorphic worldsheet derivative acting on different Green functions. We also obtain simple equalities between topologically distinct diagrams, and consider some elementary examples.

2. Criticality in a Vlasov-Poisson system: a fermioniclike universality class.

PubMed

Ivanov, A V; Vladimirov, S V; Robinson, P A

2005-05-01

A model Vlasov-Poisson system is simulated close to the point of marginal stability, thus assuming only the wave-particle resonant interactions are responsible for saturation, and shown to obey the power-law scaling of a second-order phase transition. The set of critical exponents analogous to those of the Ising universality class is calculated and shown to obey the Widom and Rushbrooke scaling and Josephson's hyperscaling relations at the formal dimensionality d=5 below the critical point at nonzero order parameter. However, the two-point correlation function does not correspond to the propagator of Euclidean quantum field theory, which is the Gaussian model for the Ising universality class. Instead, it corresponds to the propagator for the fermionic vector field and to the upper critical dimensionality d(c) = 2. This suggests criticality of collisionless Vlasov-Poisson systems corresponds to a universality class analogous to that of critical phenomena of a fermionic quantum field description.

3. Dependent Neyman type A processes based on common shock Poisson approach

2016-04-01

The Neyman type A process is used to describe clustered data, since the Poisson process is insufficient for clustered events. In a multivariate setting, there may be dependencies between multivariate Neyman type A processes. In this study, a dependent form of the Neyman type A process is considered under the common shock approach, and the joint probability function is derived for the dependent Neyman type A Poisson processes. An application based on forest fires in Turkey is then given. The results show that the joint probability function of the dependent Neyman type A processes obtained in this study can be a good tool for probabilistic modeling of the total number of burned trees in Turkey.
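The Neyman type A distribution is a Poisson number of clusters, each contributing a Poisson number of events, so its variance λφ(1+φ) exceeds its mean λφ; that overdispersion is what makes it suitable for clustered counts. A stdlib sketch with illustrative parameter values (Knuth's Poisson sampler is used because the standard library has none):

```python
import math
import random

def poisson_sample(rng, lam):
    """Knuth's multiplication method; fine for moderate means."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def neyman_type_a(rng, lam, phi):
    """Poisson(lam) clusters, each of Poisson(phi) events."""
    n_clusters = poisson_sample(rng, lam)
    return sum(poisson_sample(rng, phi) for _ in range(n_clusters))

rng = random.Random(42)
lam, phi, n = 2.0, 3.0, 20000
draws = [neyman_type_a(rng, lam, phi) for _ in range(n)]
mean = sum(draws) / n
var = sum((x - mean) ** 2 for x in draws) / n
# theory: mean = lam*phi = 6, variance = lam*phi*(1+phi) = 24 > mean
```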

4. Deterministic conservative solver for the inhomogeneous Fokker-Planck-Landau equation coupled with Poisson equation

Zhang, Chenglong; Gamba, Irene M.

2016-11-01

We propose a deterministic conservative solver for the inhomogeneous Fokker-Planck-Landau equation coupled with the Poisson equation. Through a time-splitting scheme, a Vlasov-Poisson (collisionless) problem and a homogeneous Landau (collisional) problem are obtained, and these two subproblems can be treated separately. We use operator splitting, adopting the Runge-Kutta Discontinuous Galerkin (RK-DG) method for the transport dynamics and a conservative spectral method for the homogeneous collisional dynamics. Since two different numerical schemes are applied separately, we have designed a new conservation correction process such that, after projecting the conservative spectral solution onto the DG mesh, there is no loss of moment conservation. Parallelization is readily implemented. To verify our solver, numerical experiments on linear and nonlinear Landau damping are provided.

5. Stationary and Nonstationary Response Probability Density Function of a Beam under Poisson White Noise

Vasta, M.; Di Paola, M.

In this paper, an approximate explicit probability density function is proposed for the analysis of external oscillations of a linear and geometrically nonlinear simply supported beam driven by random pulses. The adopted impulsive loading model is Poisson white noise, that is, a process having Dirac-delta occurrences with random intensity distributed in time according to Poisson's law. The response probability density function can be obtained by solving the related Kolmogorov-Feller (KF) integro-differential equation. An approximate solution, using the path integral method, is derived by transforming the KF equation into a first-order partial differential equation. The method of characteristics is then applied to obtain an explicit solution. Different levels of approximation, depending on the physical assumptions on the transition probability density function, are found, and the solution for the response density is obtained as a series expansion using convolution integrals.

6. Wavelet-based Poisson solver for use in particle-in-cell simulations.

PubMed

Terzić, Balsa; Pogorelov, Ilya V

2005-06-01

We report on a successful implementation of a wavelet-based Poisson solver for use in three-dimensional particle-in-cell simulations. Our method harnesses advantages afforded by the wavelet formulation, such as sparsity of operators and data sets, the existence of effective preconditioners, and the ability to simultaneously remove numerical noise and further compress relevant data sets. We present and discuss preliminary results relating to the application of the new solver to test problems in accelerator physics and astrophysics.

7. The accurate solution of Poisson's equation by expansion in Chebyshev polynomials

NASA Technical Reports Server (NTRS)

Haidvogel, D. B.; Zang, T.

1979-01-01

A Chebyshev expansion technique is applied to Poisson's equation on a square with homogeneous Dirichlet boundary conditions. The spectral equations are solved in two ways - by alternating direction and by matrix diagonalization methods. Solutions are sought to both oscillatory and mildly singular problems. The accuracy and efficiency of the Chebyshev approach compare favorably with those of standard second- and fourth-order finite-difference methods.

8. The role of Poisson's binomial distribution in the analysis of TEM images.

PubMed

Tejada, Arturo; den Dekker, Arnold J

2011-11-01

Frank's observation that a TEM bright-field image acquired under non-stationary conditions can be modeled by the time integral of the standard TEM image model [J. Frank, Nachweis von Objektbewegungen im lichtoptischen Diffraktogramm von elektronenmikroskopischen Aufnahmen, Optik 30 (2) (1969) 171-180] is re-derived here using counting statistics based on Poisson's binomial distribution. The approach yields a statistical image model that is suitable for image analysis and simulation.
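Poisson's binomial distribution is the law of a sum of independent Bernoulli trials with unequal success probabilities, and its pmf is computable by a simple convolution recurrence. An illustrative sketch of that recurrence (not the TEM image model itself):

```python
def poisson_binomial_pmf(ps):
    """pmf of the sum of independent Bernoulli(p_i) trials; dynamic
    programming convolves in one trial at a time, O(n^2) overall."""
    pmf = [1.0]                        # zero trials: the sum is 0
    for p in ps:
        nxt = [0.0] * (len(pmf) + 1)
        for k, q in enumerate(pmf):
            nxt[k] += q * (1 - p)      # this trial fails
            nxt[k + 1] += q * p        # this trial succeeds
        pmf = nxt
    return pmf

pmf = poisson_binomial_pmf([0.5, 0.5])
# equal p reduces to the ordinary binomial: [0.25, 0.5, 0.25]
```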

9. Guidelines for Use of the Approximate Beta-Poisson Dose-Response Model.

PubMed

Xie, Gang; Roiko, Anne; Stratton, Helen; Lemckert, Charles; Dunn, Peter K; Mengersen, Kerrie

2016-10-05

For dose-response analysis in quantitative microbial risk assessment (QMRA), the exact beta-Poisson model is a two-parameter mechanistic dose-response model with parameters α > 0 and β > 0 that involves the Kummer confluent hypergeometric function. Evaluation of a hypergeometric function is a computational challenge. Denoting PI(d) as the probability of infection at a given mean dose d, the widely used dose-response model PI(d) = 1 - (1 + d/β)^(-α) is an approximate formula for the exact beta-Poisson model. Notwithstanding the required conditions α ≪ β and β ≫ 1, issues related to the validity and approximation accuracy of this formula have remained largely ignored in practice, partly because these conditions are too general to provide clear guidance. Consequently, this study proposes the probability measure Pr(0 < r < 1 | α̂, β̂) as a validity measure (r is a random variable that follows a gamma distribution; α̂ and β̂ are the maximum likelihood estimates of α and β in the approximate model), and the constraint condition β̂ > (22α̂)^0.50 for 0.02 < α̂ < 2 as a rule of thumb to ensure an accurate approximation (e.g., Pr(0 < r < 1 | α̂, β̂) > 0.99). This validity measure and rule of thumb were validated by application to all the completed beta-Poisson models (related to 85 data sets) from the QMRA community portal (QMRA Wiki). The results showed that the higher the probability Pr(0 < r < 1 | α̂, β̂), the better the approximation. The results further showed that, of the 85 models examined, 68 were identified as valid approximate model applications, all of which had a near-perfect match to the corresponding exact beta-Poisson dose-response curve.
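A small sketch of the approximate beta-Poisson formula discussed above, as it is commonly written, together with the standard median-infectious-dose relation; the parameter values are illustrative, not fitted:

```python
def beta_poisson_approx(d, alpha, beta):
    """Approximate beta-Poisson probability of infection at mean dose d:
    P_I(d) = 1 - (1 + d/beta) ** -alpha."""
    return 1.0 - (1.0 + d / beta) ** -alpha

alpha, beta = 0.2, 10.0
# median infectious dose for this formula: N50 = beta * (2 ** (1/alpha) - 1)
n50 = beta * (2.0 ** (1.0 / alpha) - 1.0)   # here 10 * (2**5 - 1) = 310
```

By construction the response is 0 at zero dose, increases monotonically with d, and passes through 0.5 at d = N50, which is a quick sanity check on any fitted (α, β) pair.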

10. User's guide for the POISSON/SUPERFISH Group of Codes

SciTech Connect

Menzel, M.T.; Stokes, H.K.

1987-01-01

The POISSON/SUPERFISH Group Codes are a set of programs written by Ronald Holsinger, with theoretical assistance from Klaus Halbach, to solve two distinct problems--the calculation of magnetostatic and electrostatic fields, and the computation of the resonant frequencies and fields in radio-frequency cavities--in a two-dimensional Cartesian or three-dimensional cylindrical geometry. These codes are widely used for the design of magnets and radio frequency cavities.

11. Auxetic Materials: An Annotated Bibliography of Materials With Negative Poisson’s Ratio

DTIC Science & Technology

1993-03-31

Naval Research Laboratory, Washington, DC 20375-5000. This report presents an annotated bibliography on the potential of materials with negative Poisson's ratio ("auxetic" materials) for application in hydroacoustics, compiled from a literature search.

12. A note on an attempt at more efficient Poisson series evaluation. [for lunar libration

NASA Technical Reports Server (NTRS)

Shelus, P. J.; Jefferys, W. H., III

1975-01-01

A substantial reduction has been achieved in the time necessary to compute lunar libration series. The method involves eliminating many of the trigonometric function calls by a suitable transformation and applying a short SNOBOL processor to the FORTRAN coding of the transformed series, which obviates many of the multiplication operations during the course of series evaluation. It is possible to accomplish similar results quite easily with other Poisson series.

13. Statistical error in simulations of Poisson processes: Example of diffusion in solids

Nilsson, Johan O.; Leetmaa, Mikael; Vekilova, Olga Yu.; Simak, Sergei I.; Skorodumova, Natalia V.

2016-08-01

Simulations of diffusion in solids often produce poor statistics of diffusion events. We present an analytical expression for the statistical error in ion conductivity obtained in such simulations. The error expression is not restricted to any particular computational method, but is valid for simulation of Poisson processes in general. This analytical error expression is verified numerically for the case of Gd-doped ceria by running a large number of kinetic Monte Carlo calculations.
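The 1/√N character of such errors is easy to see in a toy Poisson-process experiment: a rate estimated from an observation window of length T has standard deviation √(λ/T). A seeded stdlib sketch (not the authors' kinetic Monte Carlo setup):

```python
import math
import random

def count_events(rng, lam, T):
    """Count Poisson(rate=lam) events in [0, T] by summing
    exponential waiting times between events."""
    t, n = rng.expovariate(lam), 0
    while t <= T:
        n += 1
        t += rng.expovariate(lam)
    return n

rng = random.Random(7)
lam, T, reps = 5.0, 10.0, 4000
estimates = [count_events(rng, lam, T) / T for _ in range(reps)]
mean = sum(estimates) / reps
spread = math.sqrt(sum((e - mean) ** 2 for e in estimates) / reps)
# theory: the rate estimate has standard deviation sqrt(lam/T) ~ 0.707,
# i.e., it shrinks like 1/sqrt(number of observed events)
```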

14. Acceptance Control Charts with Stipulated Error Probabilities Based on Poisson Count Data

DTIC Science & Technology

1973-01-01

Mhatre, Suresh; Scheaffer, Richard L.; Leavenworth, Richard S. Department of Industrial and Systems Engineering, University of Florida, Gainesville. ABSTRACT: An acceptance control charting…

15. The limiting problem of the drift-diffusion-Poisson model with discontinuous p-n-junctions

Lian, Songzhe; Yuan, Hongjun; Cao, Chunling; Gao, Wenjie

2008-11-01

In this paper, the authors consider the limiting problem of the drift-diffusion-Poisson model for semiconductors. Unlike previous papers, the model considered involves special doping profiles D that are allowed to have jump discontinuities and sign changes, while D^2 is required to be Lipschitz continuous. The existence, uniqueness and large-time asymptotic behavior of the global (in time) solutions are given.

16. Suppression Situations in Multiple Linear Regression

ERIC Educational Resources Information Center

Shieh, Gwowen

2006-01-01

This article proposes alternative expressions for the two most prevailing definitions of suppression without resorting to the standardized regression modeling. The formulation provides a simple basis for the examination of their relationship. For the two-predictor regression, the author demonstrates that the previous results in the literature are…

17. Regression Analysis: Legal Applications in Institutional Research

ERIC Educational Resources Information Center

Frizell, Julie A.; Shippen, Benjamin S., Jr.; Luna, Andrew L.

2008-01-01

This article reviews multiple regression analysis, describes how its results should be interpreted, and instructs institutional researchers on how to conduct such analyses using an example focused on faculty pay equity between men and women. The use of multiple regression analysis will be presented as a method with which to compare salaries of…

18. Principles of Quantile Regression and an Application

ERIC Educational Resources Information Center

Chen, Fang; Chalhoub-Deville, Micheline

2014-01-01

Newer statistical procedures are typically introduced to help address the limitations of those already in practice or to deal with emerging research needs. Quantile regression (QR) is introduced in this paper as a relatively new methodology, which is intended to overcome some of the limitations of least squares mean regression (LMR). QR is more…

19. Three-Dimensional Modeling in Linear Regression.

ERIC Educational Resources Information Center

Herman, James D.

Linear regression examines the relationship between one or more independent (predictor) variables and a dependent variable. By using a particular formula, regression determines the weights needed to minimize the error term for a given set of predictors. With one predictor variable, the relationship between the predictor and the dependent variable…

20. A Practical Guide to Regression Discontinuity

ERIC Educational Resources Information Center

Jacob, Robin; Zhu, Pei; Somers, Marie-Andrée; Bloom, Howard

2012-01-01

Regression discontinuity (RD) analysis is a rigorous nonexperimental approach that can be used to estimate program impacts in situations in which candidates are selected for treatment based on whether their value for a numeric rating exceeds a designated threshold or cut-point. Over the last two decades, the regression discontinuity approach has…