Science.gov

Sample records for isotonic bivariate statistical

  1. Neural Systems with Numerically Matched Input-Output Statistic: Isotonic Bivariate Statistical Modeling

    PubMed Central

    Fiori, Simone

    2007-01-01

    Bivariate statistical modeling from incomplete data is a useful statistical tool that allows one to discover the model underlying two data sets when the data in the two sets do not correspond in size or in ordering. Such a situation may occur when the sizes of the two data sets do not match (i.e., there are “holes” in the data) or when the data sets have been acquired independently. Also, statistical modeling is useful when the amount of available data is sufficient to reveal the relevant statistical features of the phenomenon underlying the data. We propose to tackle the problem of statistical modeling via a neural (nonlinear) system that is able to match its input-output statistic to the statistic of the available data sets. A key point of the new implementation proposed here is that it is based on look-up-table (LUT) neural systems, which provide a computationally advantageous way of implementing neural systems. A number of numerical experiments, performed on both synthetic and real-world data sets, illustrate the features of the proposed modeling procedure. PMID:18566641
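
    A minimal sketch (not the paper's LUT neural implementation) of the statistic-matching idea described above: a monotone map h = F_Y^{-1} ∘ F_X, built from the two empirical CDFs, sends samples distributed like the first data set to samples distributed like the second, even when the sets differ in size and were acquired independently. Names and data below are illustrative.

      import numpy as np

      def matching_map(x_samples, y_samples):
          """Return a function h with h(X) distributed roughly like Y, built from empirical CDFs.

          x_samples, y_samples: 1-D arrays; they may differ in length and ordering.
          """
          xs = np.sort(np.asarray(x_samples, dtype=float))
          ys = np.sort(np.asarray(y_samples, dtype=float))
          px = (np.arange(1, xs.size + 1) - 0.5) / xs.size   # empirical CDF levels of the x sample
          py = (np.arange(1, ys.size + 1) - 0.5) / ys.size   # quantile levels of the y sample

          def h(x):
              u = np.interp(x, xs, px)      # u = F_X(x)
              return np.interp(u, py, ys)   # h(x) = F_Y^{-1}(u)

          return h

      # Example: map exponential data onto roughly Gaussian data of different size.
      rng = np.random.default_rng(0)
      x = rng.exponential(1.0, 500)
      y = rng.normal(5.0, 2.0, 800)          # acquired independently, different size
      h = matching_map(x, y)
      print(np.mean(h(x)), np.std(h(x)))     # close to 5 and 2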

  2. Statistical Modeling of Bivariate Data.

    DTIC Science & Technology

    1982-08-01

    ...joint density-quantile function, dependence-density, non-parametric bivariate density estimation, entropy, exponential... estimated by autoregressive or exponential model estimators with maximum entropy properties, is investigated in this thesis. The results provide... important and useful procedures for nonparametric bivariate density estimation. The thesis discusses estimators of the entropy H(d)...

  3. Bivariate ensemble model output statistics approach for joint forecasting of wind speed and temperature

    NASA Astrophysics Data System (ADS)

    Baran, Sándor; Möller, Annette

    2017-02-01

    Forecast ensembles are typically employed to account for prediction uncertainties in numerical weather prediction models. However, ensembles often exhibit biases and dispersion errors, so they require statistical post-processing to improve their predictive performance. Two popular univariate post-processing models are Bayesian model averaging (BMA) and ensemble model output statistics (EMOS). In the last few years, interest has grown in developing multivariate post-processing models that incorporate dependencies between weather quantities, such as a bivariate distribution for wind vectors or, more generally, a setting that allows any types of weather variables to be combined. In line with a recently proposed approach to modeling temperature and wind speed jointly by a bivariate BMA model, this paper introduces an EMOS model for these weather quantities based on a bivariate truncated normal distribution. The bivariate EMOS model is applied to temperature and wind speed forecasts of the 8-member University of Washington mesoscale ensemble and the 11-member ALADIN-HUNEPS ensemble of the Hungarian Meteorological Service, and its predictive performance is compared to that of the bivariate BMA model and of a multivariate Gaussian copula approach that post-processes the margins with univariate EMOS. While the predictive skills of the compared methods are similar, the bivariate EMOS model requires considerably lower computation times than the bivariate BMA method.
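
    A rough univariate sketch of the EMOS idea referred to above, under simplifying assumptions: the predictive distribution is N(a + b*(ensemble mean), c + d*(ensemble variance)), with parameters fit on training data by minimizing the negative log score. The paper's actual model is a bivariate truncated normal for temperature and wind speed jointly; that is not reproduced here, and the data below are synthetic.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm

      def fit_emos(ens_mean, ens_var, obs):
          """Fit a Gaussian EMOS model N(a + b*mean, c + d*var) by minimizing the negative log score."""
          def nll(theta):
              a, b, log_c, log_d = theta
              mu = a + b * ens_mean
              sigma = np.sqrt(np.exp(log_c) + np.exp(log_d) * ens_var)
              return -np.sum(norm.logpdf(obs, loc=mu, scale=sigma))
          res = minimize(nll, x0=np.array([0.0, 1.0, 0.0, 0.0]), method="Nelder-Mead")
          a, b, log_c, log_d = res.x
          return a, b, np.exp(log_c), np.exp(log_d)

      # Toy data: a biased, under/over-dispersive 8-member ensemble forecasting temperature.
      rng = np.random.default_rng(1)
      truth = rng.normal(10, 3, 400)
      ens = truth[:, None] + rng.normal(1.0, 2.0, (400, 8))
      a, b, c, d = fit_emos(ens.mean(axis=1), ens.var(axis=1), truth)
      print(a, b, c, d)   # roughly a = -1, b = 1 for this toy bias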

  4. Bivariate ensemble model output statistics approach for joint forecasting of wind speed and temperature

    NASA Astrophysics Data System (ADS)

    Baran, Sándor; Möller, Annette

    2016-06-01

    Forecast ensembles are typically employed to account for prediction uncertainties in numerical weather prediction models. However, ensembles often exhibit biases and dispersion errors, so they require statistical post-processing to improve their predictive performance. Two popular univariate post-processing models are Bayesian model averaging (BMA) and ensemble model output statistics (EMOS). In the last few years, interest has grown in developing multivariate post-processing models that incorporate dependencies between weather quantities, such as a bivariate distribution for wind vectors or, more generally, a setting that allows any types of weather variables to be combined. In line with a recently proposed approach to modeling temperature and wind speed jointly by a bivariate BMA model, this paper introduces an EMOS model for these weather quantities based on a bivariate truncated normal distribution. The bivariate EMOS model is applied to temperature and wind speed forecasts of the 8-member University of Washington mesoscale ensemble and the 11-member ALADIN-HUNEPS ensemble of the Hungarian Meteorological Service, and its predictive performance is compared to that of the bivariate BMA model and of a multivariate Gaussian copula approach that post-processes the margins with univariate EMOS. While the predictive skills of the compared methods are similar, the bivariate EMOS model requires considerably lower computation times than the bivariate BMA method.

  5. INLAND DISSOLVED SALT CHEMISTRY: STATISTICAL EVALUATION OF BIVARIATE AND TERNARY DIAGRAM MODELS FOR SURFACE AND SUBSURFACE WATERS

    EPA Science Inventory

    We compared the use of ternary and bivariate diagrams to distinguish the effects of atmospheric precipitation, rock weathering, and evaporation on inland surface and subsurface water chemistry. The three processes could not be statistically differentiated using bivariate models e...

  6. Application of bivariate statistics to full wine bottle diamagnetic screening data.

    PubMed

    Harley, S J; Lim, V; Augustine, M P

    2012-01-30

    A bivariate correlated Student distribution is applied to full wine bottle diamagnetic screening measurements. Previous work involving a limited number of rare wines indicated that like wines cluster in a plot of the first two principal component scores derived from a covariance matrix of the diamagnetic screening measurements. This study extends the approach to a much larger, statistically meaningful sixty-bottle wine library, where bivariate statistics are used to interpret the measured data. The full-bottle diamagnetic screening of thirty-six identically labeled, sealed bottles of wine obtained from four different sources, combined with principal component analysis data reduction followed by treatment with a bivariate distribution, permits the effect of wine transport and storage to be observed. The usefulness and prospects of the method for the identification of counterfeit wines are discussed.
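
    A minimal sketch, on made-up data, of the data-reduction step described above: project the per-bottle screening measurements onto their first two principal components; the clustering analysis with a bivariate correlated Student distribution would then operate on these score pairs (that fit is not shown here).

      import numpy as np
      from sklearn.decomposition import PCA

      # Hypothetical screening matrix: 36 bottles x 50 diamagnetic screening measurements each.
      rng = np.random.default_rng(2)
      X = rng.normal(size=(36, 50))

      scores = PCA(n_components=2).fit_transform(X)   # first two principal component scores
      centre = scores.mean(axis=0)                    # centre and spread of the score cloud,
      spread = np.cov(scores.T)                       # the ingredients of a bivariate fit
      print(scores.shape, centre, spread)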

  7. Bivariate Normal Wind Statistics model: User’s Manual.

    DTIC Science & Technology

    1980-09-01

    ...(fragments of the FORTRAN listing: COMMON blocks for the means, standard deviations, and correlation coefficient, and a prompt to input the five basic parameters: mean X, st. dev. X, mean Y, st. dev. Y, and the correlation coefficient)... the X-Y axes through a given angle. Subroutine RSPGDR gives the (conditional) probability of a specified range of wind speeds when the wind direction...

  8. Applied Statistics: From Bivariate through Multivariate Techniques [with CD-ROM

    ERIC Educational Resources Information Center

    Warner, Rebecca M.

    2007-01-01

    This book provides a clear introduction to widely used topics in bivariate and multivariate statistics, including multiple regression, discriminant analysis, MANOVA, factor analysis, and binary logistic regression. The approach is applied and does not require formal mathematics; equations are accompanied by verbal explanations. Students are asked…

  9. Testing independence of bivariate interval-censored data using modified Kendall's tau statistic.

    PubMed

    Kim, Yuneung; Lim, Johan; Park, DoHwan

    2015-11-01

    In this paper, we study a nonparametric procedure to test independence of bivariate interval-censored data, for both current status data (case 1 interval-censored data) and case 2 interval-censored data. To do so, we propose a score-based modification of Kendall's tau statistic for bivariate interval-censored data. Our modification defines the Kendall's tau statistic in terms of the expected numbers of concordant and discordant pairs of data. The performance of the modified approach is illustrated by simulation studies and an application to an AIDS study. We compare our method to alternative approaches such as the two-stage estimation method by Sun et al. (Scandinavian Journal of Statistics, 2006) and the multiple imputation method by Betensky and Finkelstein (Statistics in Medicine, 1999b).
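
    For orientation, the classical Kendall's tau that the paper modifies is the normalized difference between concordant and discordant pair counts; the modification replaces observed counts with expected counts under interval censoring, which is not shown in this sketch for exactly observed data.

      import numpy as np

      def kendall_tau(x, y):
          """Classical Kendall's tau: (concordant - discordant) / total number of pairs."""
          x, y = np.asarray(x, float), np.asarray(y, float)
          n = x.size
          conc = disc = 0
          for i in range(n):
              for j in range(i + 1, n):
                  s = np.sign((x[i] - x[j]) * (y[i] - y[j]))
                  if s > 0:
                      conc += 1
                  elif s < 0:
                      disc += 1
          return (conc - disc) / (n * (n - 1) / 2)

      print(kendall_tau([1, 2, 3, 4], [1, 3, 2, 4]))   # 5 concordant, 1 discordant -> 0.667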

  10. Source apportionment advances using polar plots of bivariate correlation and regression statistics

    NASA Astrophysics Data System (ADS)

    Grange, Stuart K.; Lewis, Alastair C.; Carslaw, David C.

    2016-11-01

    This paper outlines the development of enhanced bivariate polar plots that allow the concentrations of two pollutants to be compared using pair-wise statistics for exploring the sources of atmospheric pollutants. The new method combines bivariate polar plots, which provide source characteristic information, with pair-wise statistics that provide information on how two pollutants are related to one another. The pair-wise statistics implemented include weighted Pearson correlation and slope from two linear regression methods. The development uses a Gaussian kernel to locally weight the statistical calculations on a wind speed-direction surface together with variable-scaling. Example applications of the enhanced polar plots are presented by using routine air quality data for two monitoring sites in London, United Kingdom for a single year (2013). The London examples demonstrate that the combination of bivariate polar plots, correlation, and regression techniques can offer considerable insight into air pollution source characteristics, which would be missed if only scatter plots and mean polar plots were used for analysis. Specifically, using correlation and slopes as pair-wise statistics, long-range transport processes were isolated and black carbon (BC) contributions to PM2.5 for a kerbside monitoring location were quantified. Wider applications and future advancements are also discussed.
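
    A rough sketch of the kernel-weighting idea described above: at a chosen wind speed-direction point, compute a Pearson correlation between two pollutants with each observation weighted by a Gaussian kernel centred on that point. Function names, bandwidths, and data are illustrative; the paper's variable scaling and polar plotting are omitted.

      import numpy as np

      def weighted_corr(x, y, w):
          """Pearson correlation of x and y with observation weights w."""
          w = w / w.sum()
          mx, my = np.sum(w * x), np.sum(w * y)
          cov = np.sum(w * (x - mx) * (y - my))
          sx = np.sqrt(np.sum(w * (x - mx) ** 2))
          sy = np.sqrt(np.sum(w * (y - my) ** 2))
          return cov / (sx * sy)

      def local_corr(ws, wd_deg, p1, p2, ws0, wd0_deg, h_ws=1.0, h_wd=20.0):
          """Kernel-weighted correlation of pollutants p1, p2 near wind speed ws0, direction wd0_deg."""
          dwd = (wd_deg - wd0_deg + 180.0) % 360.0 - 180.0   # wrap the direction difference
          w = np.exp(-0.5 * ((ws - ws0) / h_ws) ** 2 - 0.5 * (dwd / h_wd) ** 2)
          return weighted_corr(p1, p2, w)

      # Toy example: two correlated pollutants observed under random wind conditions.
      rng = np.random.default_rng(3)
      ws, wd = rng.uniform(0, 10, 2000), rng.uniform(0, 360, 2000)
      nox = rng.normal(size=2000)
      pm25 = 0.6 * nox + rng.normal(scale=0.8, size=2000)
      print(local_corr(ws, wd, nox, pm25, ws0=5.0, wd0_deg=180.0))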

  11. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    NASA Astrophysics Data System (ADS)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.

    2014-10-01

    Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for the BSA technique is proposed. Three popular BSA techniques, namely frequency ratio, weights-of-evidence, and evidential belief function models, are applied in the newly proposed ArcMAP tool. The tool is programmed in Python and has a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. The area under the curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
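
    A minimal sketch of the simplest of the three techniques named above, the frequency ratio (FR): for each class of a conditioning factor, FR is the proportion of hazard cells falling in that class divided by the proportion of all cells in that class, with FR > 1 flagging classes over-represented among hazard cells. Array names and the toy raster are illustrative; the BSM tool wraps such calculations behind an ArcMAP interface.

      import numpy as np

      def frequency_ratio(factor_classes, hazard_mask):
          """factor_classes: integer class raster (flattened); hazard_mask: boolean raster.
          Returns {class: FR}."""
          fr = {}
          n_total = factor_classes.size
          n_hazard = hazard_mask.sum()
          for c in np.unique(factor_classes):
              in_class = factor_classes == c
              pct_hazard = (hazard_mask & in_class).sum() / n_hazard   # share of hazard cells in class
              pct_class = in_class.sum() / n_total                     # share of all cells in class
              fr[int(c)] = pct_hazard / pct_class
          return fr

      # Toy raster: 3 factor classes, hazards concentrated in class 2.
      rng = np.random.default_rng(4)
      classes = rng.integers(0, 3, 10_000)
      hazard = (classes == 2) & (rng.random(10_000) < 0.3) | (rng.random(10_000) < 0.02)
      print(frequency_ratio(classes, hazard))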

  12. Bivariate distributions in statistical spectroscopy studies: III. Non interacting particle strength densities for one-body transition operators

    NASA Astrophysics Data System (ADS)

    Kota, V. K. B.; Majumdar, D.

    1995-12-01

    In statistical spectroscopy, it was shown by French et al. (Ann. Phys., N.Y. 181, 235 (1988)) that the bivariate strength densities take a convolution form, with the non-interacting particle (NIP) strength density being convoluted with a spreading bivariate Gaussian due to interactions. Leaving aside the question of determining the parameters of the spreading bivariate Gaussian, one needs good methods for constructing the NIP bivariate strength densities I_h^O(E, E') (h is a one-body Hamiltonian and O is a transition operator) in large shell model spaces. A formalism for constructing I_h^O is developed for one-body transition operators by using spherical orbits and spherical configurations. For rapid construction, and also for applying the statistical theory in large shell model spaces, I_h^O is decomposed into partial densities defined by unitary orbit configurations (a unitary orbit is a set of spherical orbits). Trace propagation formulas for the bivariate moments M_rs with r + s ≤ 2 of the partial NIP strength densities, which determine the Gaussian representation, are derived. In a large-space numerical example with the Gamow-Teller β^- transition operator, the superposition of unitary-orbit partial bivariate Gaussian densities is shown to give a good representation of the exact NIP strength densities. Trace propagation formulas for M_rs with r + s ≤ 4 are also derived in m-particle scalar spaces, which are useful for many purposes.

  13. An integrated user-friendly ArcMAP tool for bivariate statistical modelling in geoscience applications

    NASA Astrophysics Data System (ADS)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.

    2015-03-01

    Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, the bivariate statistical modeler (BSM), for the BSA technique is proposed. Three popular BSA techniques, namely the frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are applied in the newly proposed ArcMAP tool. The tool is programmed in Python and has a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. The area under the curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.

  14. Univariate description and bivariate statistical inference: the first step delving into data.

    PubMed

    Zhang, Zhongheng

    2016-03-01

    In observational studies, the first step is usually to explore the data distribution and the baseline differences between groups. Data description includes measures of central tendency (e.g., mean, median, and mode) and dispersion (e.g., standard deviation, range, interquartile range). There is a variety of bivariate statistical inference methods, such as Student's t-test, the Mann-Whitney U test and the Chi-square test, for normally distributed, skewed and categorical data, respectively. The article shows how to perform these analyses with R code. Furthermore, I believe that the automation of the whole workflow is of paramount importance in that (I) it allows others to repeat your results; (II) you can easily find out how you performed the analysis during revision; (III) it spares manual data entry and is less error-prone; and (IV) when you correct your original dataset, the final result can be automatically corrected by executing the code. Therefore, the process of making a publication-quality table incorporating all the abovementioned statistics and P values is provided, allowing readers to customize these codes to their own needs.
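
    The article itself demonstrates the workflow in R; a rough Python equivalent of the same steps (descriptives, then a test chosen by data type) might look as follows, on synthetic data.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      group = rng.integers(0, 2, 200)                    # two study groups
      normal_var = rng.normal(50 + 3 * group, 10)        # roughly normal variable
      skewed_var = rng.exponential(1 + group)            # skewed variable
      cat_var = rng.integers(0, 2, 200)                  # binary categorical variable

      # Univariate description: central tendency and dispersion.
      print(np.mean(normal_var), np.median(normal_var), np.std(normal_var, ddof=1))

      # Bivariate inference, chosen by variable type.
      a, b = normal_var[group == 0], normal_var[group == 1]
      print(stats.ttest_ind(a, b))                                               # normal data
      print(stats.mannwhitneyu(skewed_var[group == 0], skewed_var[group == 1]))  # skewed data
      table = np.array([[np.sum((group == g) & (cat_var == c)) for c in (0, 1)] for g in (0, 1)])
      print(stats.chi2_contingency(table))                                       # categorical data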

  15. Statistical analysis of bivariate failure time data with Marshall-Olkin Weibull models.

    PubMed

    Li, Yang; Sun, Jianguo; Song, Shuguang

    2012-06-01

    This paper discusses parametric analysis of bivariate failure time data, which often occur in medical studies, among others. For this, as in the case of univariate failure time data, exponential and Weibull models are probably the most commonly used ones. However, it is surprising that there seem to be no general estimation procedures available for fitting the bivariate Weibull model to bivariate right-censored failure time data, except some methods for special situations. We present and investigate two general but simple estimation procedures, one being a graphical approach and the other being a marginal approach, for the problem. An extensive simulation study is conducted to assess the performance of the proposed approaches and shows that they work well for practical situations. An illustrative example is provided.

  16. On statistical tests for homogeneity of two bivariate zero-inflated Poisson populations.

    PubMed

    Yuen, Hak-Keung; Chow, Shein-Chung; Tse, Siu-Keung

    2015-01-01

    The problem of testing treatment difference in the occurrence of a study endpoint in a randomized parallel-group comparative clinical trial with repeated responses, under the assumption that the responses follow a bivariate zero-inflated Poisson (ZIP) distribution, is considered. A likelihood ratio test for homogeneity of two bivariate ZIP populations is derived. An approximate formula for sample size calculation is also obtained, which achieves a desired power for detecting a clinically meaningful difference under an alternative hypothesis. An example concerning the comparison of treatment effects in an addiction clinical trial, in terms of the number of days of illicit drug use during a month, is given for illustrative purposes.

  17. Meta-analysis for diagnostic accuracy studies: a new statistical model using beta-binomial distributions and bivariate copulas.

    PubMed

    Kuss, Oliver; Hoyer, Annika; Solms, Alexander

    2014-01-15

    There are still challenges when meta-analyzing data from studies on diagnostic accuracy. This is mainly due to the bivariate nature of the response, where information on sensitivity and specificity must be summarized while accounting for their correlation within a single trial. In this paper, we propose a new statistical model for the meta-analysis of diagnostic accuracy studies. This model uses beta-binomial distributions for the marginal numbers of true positives and true negatives and links these margins by a bivariate copula distribution. The new model comes with all the features of the current standard model, a bivariate logistic regression model with random effects, but has the additional advantages of a closed-form likelihood function and greater flexibility for the correlation structure of sensitivity and specificity. In a simulation study, which compares three copula models and two implementations of the standard model, the Plackett and the Gauss copulas rarely perform worse and frequently perform better than the standard model. We use an example from a meta-analysis to judge the diagnostic accuracy of telomerase (a urinary tumor marker) for the diagnosis of primary bladder cancer for illustration.

  18. The return period analysis of natural disasters with statistical modeling of bivariate joint probability distribution.

    PubMed

    Li, Ning; Liu, Xueqin; Xie, Wei; Wu, Jidong; Zhang, Peng

    2013-01-01

    New features of natural disasters have been observed over the last several years. The factors that influence the disasters' formation mechanisms, regularity of occurrence and main characteristics have been revealed to be more complicated and diverse in nature than previously thought. As the uncertainty involved increases, the variables need to be examined further. This article discusses the importance of, and the lack of, multivariate analysis of natural disasters and presents a method to estimate the joint probability of the return periods and perform a risk analysis. Severe dust storms from 1990 to 2008 in Inner Mongolia were used as a case study to test this new methodology, as they are normal and recurring climatic phenomena on Earth. Based on the 79 investigated events and a bivariate definition of severe dust storms, the joint probability distribution of severe dust storms was established using the observed data on maximum wind speed and duration. The joint return periods of severe dust storms were calculated, and the relevant risk was analyzed according to the joint probability. The copula function is able to simulate severe dust storm disasters accurately. The joint return periods generated are closer to those observed in reality than the univariate return periods and thus have more value in severe dust storm disaster mitigation, strategy making, program design, and improvement of risk management. This research may prove useful in risk-based decision making. The exploration of multivariate analysis methods can also lay the foundation for further applications in natural disaster risk analysis.
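
    A hedged sketch of how a joint return period is obtained from a fitted copula: with marginal non-exceedance probabilities u = F_X(x) and v = F_Y(y) and copula C, the "OR" return period for exceeding (x, y) in either variable is T = mu / (1 - C(u, v)), where mu is the mean interarrival time of events (roughly 19 years / 79 events, about 0.24 years, for the record above). The Gumbel copula and the numbers below are purely illustrative; the paper's fitted marginals and copula family may differ.

      import numpy as np

      def gumbel_copula(u, v, theta):
          """Gumbel copula C(u, v), theta >= 1."""
          return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

      def joint_return_period_or(u, v, theta, mu_years):
          """'OR' joint return period: either maximum wind speed exceeds x or duration exceeds y."""
          return mu_years / (1.0 - gumbel_copula(u, v, theta))

      # Event with marginal non-exceedance probabilities 0.95 and 0.90, dependence theta = 2,
      # mean interarrival time of severe dust storms about 0.24 years (illustrative).
      print(joint_return_period_or(0.95, 0.90, theta=2.0, mu_years=0.24))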

  19. Standardization by Bivariate Adjustment of Internal Assessments: Sex Bias and Other Statistical Matters.

    ERIC Educational Resources Information Center

    Daley, D. J.

    1985-01-01

    ASAT scores are examined statistically with respect to gender differences, and a rationale for removing the gender bias from the aggregate achievement assessments is described. The inadequacy of relying solely on aptitude tests to produce comparisons of achievements is noted. (Author/MLW)

  20. The Effect of Statistical Velocity Variation on the Gaussian Bivariate Probability of Hit for Small Caliber Systems

    DTIC Science & Technology

    1975-05-01

    ...Statistics with Tables, Handbook Publishers, Inc., Sandusky, Ohio, 1953. Burington, R.S., Handbook of Mathematical Tables and Formulas, McGraw-Hill, New York, 1965. Tables of Normal Probability Functions, National Bureau of Standards, Applied...

  1. Flash flood susceptibility analysis and its mapping using different bivariate models in Iran: a comparison between Shannon's entropy, statistical index, and weighting factor models.

    PubMed

    Khosravi, Khabat; Pourghasemi, Hamid Reza; Chapi, Kamran; Bahri, Masoumeh

    2016-12-01

    Flooding is a very common natural hazard worldwide, causing large-scale casualties every year, and Iran is not immune to this threat. Comprehensive flood susceptibility mapping is very important to reduce losses of lives and property. Thus, the aim of this study is to map susceptibility to flooding by different bivariate statistical methods, including Shannon's entropy (SE), the statistical index (SI), and the weighting factor (Wf). In this regard, model performance evaluation is also carried out in the Haraz Watershed, Mazandaran Province, Iran. In the first step, 211 flood locations were identified from documentary sources and field inventories, of which 70% (151 positions) were used for flood susceptibility modeling and 30% (60 positions) for evaluation and verification of the models. In the second step, ten influential factors in flooding were chosen, namely slope angle, plan curvature, altitude, topographic wetness index (TWI), stream power index (SPI), distance from river, rainfall, geology, land use, and normalized difference vegetation index (NDVI). In the next step, flood susceptibility maps were prepared by these methods in ArcGIS. As the last step, the receiver operating characteristic (ROC) curve was drawn and the area under the curve (AUC) was calculated for quantitative assessment of each model. The results showed that the best model to estimate susceptibility to flooding in the Haraz Watershed was the SI model, with prediction and success rates of 99.71 and 98.72%, respectively, followed by the Wf and SE models with AUC values of 98.1 and 96.57% for the success rate, and 97.6 and 92.42% for the prediction rate, respectively. In the SI and Wf models, the most and least important parameters were the distance from river and geology, respectively. Flood susceptibility maps are informative for managers and decision makers in the Haraz Watershed in order to contemplate measures to reduce human and financial losses.

  2. Landslide susceptibility assessment in Lianhua County (China): A comparison between a random forest data mining technique and bivariate and multivariate statistical models

    NASA Astrophysics Data System (ADS)

    Hong, Haoyuan; Pourghasemi, Hamid Reza; Pourtaghi, Zohre Sadat

    2016-04-01

    Landslides are an important natural hazard that causes a great amount of damage around the world every year, especially during the rainy season. The Lianhua area is located in the middle of China's southern mountainous area, west of Jiangxi Province, and is known to be an area prone to landslides. The aim of this study was to evaluate and compare landslide susceptibility maps produced using the random forest (RF) data mining technique with those produced by bivariate (evidential belief function and frequency ratio) and multivariate (logistic regression) statistical models for Lianhua County, China. First, a landslide inventory map was prepared using aerial photograph interpretation, satellite images, and extensive field surveys. In total, 163 landslide events were recognized in the study area, with 114 landslides (70%) used for training and 49 landslides (30%) used for validation. Next, the landslide conditioning factors, including the slope angle, altitude, slope aspect, topographic wetness index (TWI), slope length (LS), plan curvature, profile curvature, distance to rivers, distance to faults, distance to roads, annual precipitation, land use, normalized difference vegetation index (NDVI), and lithology, were derived from the spatial database. Finally, the landslide susceptibility maps of Lianhua County were generated in ArcGIS 10.1 based on the random forest (RF), evidential belief function (EBF), frequency ratio (FR), and logistic regression (LR) approaches and were validated using a receiver operating characteristic (ROC) curve. The ROC plot assessment results showed that for landslide susceptibility maps produced using the EBF, FR, LR, and RF models, the area under the curve (AUC) values were 0.8122, 0.8134, 0.7751, and 0.7172, respectively. Therefore, we can conclude that all four models have an AUC of more than 0.70 and can be used in landslide susceptibility mapping in the study area; meanwhile, the EBF and FR models had the best performance for Lianhua

  3. Bivariate extreme value distributions

    NASA Technical Reports Server (NTRS)

    Elshamy, M.

    1992-01-01

    In certain engineering applications, such as those occurring in the analyses of ascent structural loads for the Space Transportation System (STS), some of the load variables have a lower bound of zero. Thus, the need for practical models of bivariate extreme value probability distribution functions with lower limits was identified. We discuss the Gumbel models and present practical forms of bivariate extreme probability distributions of Weibull and Frechet types with two parameters. Bivariate extreme value probability distribution functions can be expressed in terms of the marginal extreme value distributions and a 'dependence' function subject to certain analytical conditions. Properties of such bivariate extreme distributions, sums and differences of paired extremals, as well as the corresponding forms of conditional distributions, are discussed. Practical estimation techniques are also given.

  4. Bivariate distributions in statistical spectroscopy studies: IV. Interacting particle Gamow-Teller strength densities and β-decay rates of fp-shell nuclei for presupernova stars

    NASA Astrophysics Data System (ADS)

    Kota, V. K. B.; Majumdar, D.

    1995-12-01

    A method to calculate temperature-dependent β-decay rates is developed by writing the expression for the rates explicitly in terms of bivariate GT strength densities I_H^O(GT) for a given Hamiltonian H = h + V and state densities of the parent nucleus, besides the usual phase space factors. The theory developed in the preceding paper (III) for constructing NIP strength densities is applied to generate I_h^O(GT), and then I_H^O(GT) is constructed using the bivariate convolution form I_H^O(GT) = Σ_S I_{h,S}^{O(GT)} ⊗ ρ_{V,S;BIV-G}^{O(GT)}. The spreading bivariate Gaussian ρ_{V;BIV-G}^{O(GT)}, for fp-shell nuclei, is constructed by assuming that the marginal centroids are zero and the marginal variances are the same as the corresponding state density variances, and by fixing the bivariate correlation coefficient ζ̄ using experimental β-decay half-lives. With the deduced values of ζ̄ (ζ̄ ≈ 0.67), β^- decay rates for the ^{61,62}Fe and ^{62-64}Co isotopes are calculated at presupernova matter densities ρ = 10^7-10^9 g/cm^3, temperatures T = (3-5)×10^9 K and electron fractions Y_e = 0.43-0.5. The convolution form for I_H^{O(GT)} leads to a simple expression for calculating the GT non-energy-weighted sum rule strength, and it describes (within 10%) the shell model results of fp-shell nuclei.

  5. Local osmosis and isotonic transport.

    PubMed

    Mathias, R T; Wang, H

    2005-11-01

    Osmotically driven water flow, u (cm/s), between two solutions of identical osmolarity, c_o (300 mM in mammals), has a theoretical isotonic maximum given by u = j/c_o, where j (moles/cm^2/s) is the rate of salt transport. In many experimental studies, transport was found to be indistinguishable from isotonic. The purpose of this work is to investigate the conditions for u to approach isotonic. A necessary condition is that the membrane salt/water permeability ratio, ε, must be small: typical physiological values are ε = 10^-3 to 10^-5, so ε is generally small, but this is not sufficient to guarantee near-isotonic transport. If we consider the simplest model of two series membranes, which secrete a tear or drop of sweat (i.e., there are no externally imposed boundary conditions on the secretion), diffusion is negligible and the predicted osmolarities are: basal = c_o, intracellular ≈ (1 + ε)c_o, secretion ≈ (1 + 2ε)c_o, and u ≈ (1 - 2ε)j/c_o. Note that this model is also appropriate when the transported solution is experimentally collected. Thus, in the absence of external boundary conditions, transport is experimentally indistinguishable from isotonic. However, if external boundary conditions set salt concentrations to c_o on both sides of the epithelium, then fluid transport depends on distributed osmotic gradients in lateral spaces. If lateral spaces are too short and wide, diffusion dominates convection, reduces osmotic gradients and fluid flow is significantly less than isotonic. Moreover, because apical and basolateral membrane water fluxes are linked by the intracellular osmolarity, water flow is maximum when the total water permeability of basolateral membranes equals that of apical membranes. In the context of the renal proximal tubule, data suggest it is transporting at near optimal conditions. Nevertheless, typical physiological values suggest the newly filtered fluid is
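
    A quick numeric check of the relations quoted above, using the typical ε values given in the abstract (the value of j is illustrative): the secreted fluid is hypertonic by only about 2ε and the flow falls short of the isotonic maximum j/c_o by about the same factor.

      # Isotonic transport limit for the two-series-membrane model described above.
      c_o = 3.0e-4              # osmolarity, mol/cm^3 (300 mM)
      j = 1.0e-9                # salt transport rate, mol/cm^2/s (illustrative value)
      u_isotonic = j / c_o      # theoretical isotonic maximum, cm/s
      for eps in (1e-3, 1e-5):  # typical membrane salt/water permeability ratios
          u = (1 - 2 * eps) * u_isotonic       # predicted water flow
          secretion = (1 + 2 * eps) * c_o      # predicted secretion osmolarity
          print(eps, u / u_isotonic, secretion / c_o)   # both within 2*eps of isotonic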

  6. Five-Parameter Bivariate Probability Distribution

    NASA Technical Reports Server (NTRS)

    Tubbs, J.; Brewer, D.; Smith, O. W.

    1986-01-01

    This NASA technical memorandum presents four papers about the five-parameter bivariate gamma class of probability distributions. With some overlap of subject matter, the papers address different aspects of the theory of these distributions and their use in forming statistical models of such phenomena as wind gusts. The class provides acceptable results for defining constraints in problems of designing aircraft and spacecraft to withstand large wind-gust loads.

  7. Covariate analysis of bivariate survival data

    SciTech Connect

    Bennett, L.E.

    1992-01-01

    The methods developed are used to analyze the effects of covariates on bivariate survival data when censoring and ties are present. The proposed method provides models for bivariate survival data that include differential covariate effects and censored observations. The proposed models are based on an extension of the univariate Buckley-James estimators which replace censored data points by their expected values, conditional on the censoring time and the covariates. For the bivariate situation, it is necessary to determine the expectation of the failure times for one component conditional on the failure or censoring time of the other component. Two different methods have been developed to estimate these expectations. In the semiparametric approach these expectations are determined from a modification of Burke's estimate of the bivariate empirical survival function. In the parametric approach censored data points are also replaced by their conditional expected values where the expected values are determined from a specified parametric distribution. The model estimation will be based on the revised data set, comprised of uncensored components and expected values for the censored components. The variance-covariance matrix for the estimated covariate parameters has also been derived for both the semiparametric and parametric methods. Data from the Demographic and Health Survey were analyzed using these methods. The two outcome variables are post-partum amenorrhea and breastfeeding; education and parity were used as the covariates. Both the covariate parameter estimates and the variance-covariance estimates for the semiparametric and parametric models will be compared. In addition, a multivariate test statistic was used in the semiparametric model to examine contrasts. The significance of the statistic was determined from a bootstrap distribution of the test statistic.

  8. Assessment of Mass-Transport Deposits occurrence offshore Espírito Santo Basin (SE Brazil) using a bivariate statistical method

    NASA Astrophysics Data System (ADS)

    Piedade, Aldina; Alves, Tiago; Luís Zêzere, José

    2016-04-01

    Mass-transport deposits (MTDs) are one of the most important processes shaping passive and active margins. They occur frequently, and their characteristics, features and processes have been very well documented through diverse approaches and methodologies. In this work, a methodology for evaluating MTD occurrence is tested in an area offshore the Espírito Santo Basin, SE Brazil. An MTD inventory was compiled on a three-dimensional (3D) seismic volume by interpreting the high-amplitude reflections that correspond to the top and base of the MTDs. The inventory consists of four MTDs, which were integrated into a GIS database. MTD favourability scores are computed using algorithms based on statistical/probabilistic analysis (the Information Value method) over unique-condition terrain units on a raster basis. Terrain attributes derived from the Digital Terrain Model (DTM) are interpreted as proxies of the driving factors of MTDs and are used as predictors in our models, which are based on a set of different MTD inventories. Three models are elaborated independently according to the area of the MTD bodies (Model 1, Model 2 and Model 3). The final result is prepared by sorting all pixels according to their favourability value in descending order. The robustness and accuracy of the MTD favourability models are evaluated by success-rate curves, which are used for the quantitative interpretation of the models and express the goodness of fit of the MTDs. In addition, a sensitivity analysis was performed and the predisposing factors with the highest prediction performance for MTD occurrence were identified. The obtained results allow us to conclude that the method is valid for application to submarine slopes, as demonstrated by the high goodness of fit obtained (0.862). This work is pioneering, as the methodology had not previously been applied to the submarine environment. It is a promising and valid methodology for predicting the failure and instability of submarine slopes for industry. In
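
    A minimal sketch of the Information Value step named above: for each class of a predisposing factor, IV = ln(density of MTD cells in the class / density of MTD cells in the whole area), and the favourability score of a terrain unit is the sum of the IVs of its classes over all factors. Rasters and names below are illustrative.

      import numpy as np

      def information_value(factor_classes, mtd_mask):
          """IV per class: log of the class MTD density over the overall MTD density."""
          overall_density = mtd_mask.mean()
          iv = {}
          for c in np.unique(factor_classes):
              class_density = mtd_mask[factor_classes == c].mean()
              # Classes with no MTD cells get a strongly negative score.
              iv[int(c)] = np.log(class_density / overall_density) if class_density > 0 else -np.inf
          return iv

      # Favourability of each cell = sum of the IVs of its classes across all factors.
      rng = np.random.default_rng(6)
      slope_cls = rng.integers(0, 4, 5000)
      curv_cls = rng.integers(0, 3, 5000)
      mtd = rng.random(5000) < np.where(slope_cls == 3, 0.2, 0.02)
      iv_slope = information_value(slope_cls, mtd)
      iv_curv = information_value(curv_cls, mtd)
      favourability = np.array([iv_slope[s] + iv_curv[c] for s, c in zip(slope_cls, curv_cls)])
      print(favourability[:5])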

  9. Stability of Bivariate GWAS Biomarker Detection

    PubMed Central

    Bedő, Justin; Rawlinson, David; Goudey, Benjamin; Ong, Cheng Soon

    2014-01-01

    Given the difficulty and effort required to confirm candidate causal SNPs detected in genome-wide association studies (GWAS), there is no practical way to definitively filter false positives. Recent advances in algorithmics and statistics have enabled repeated exhaustive search for bivariate features in a practical amount of time using standard computational resources, allowing us to use cross-validation to evaluate stability. We performed 10 trials of 2-fold cross-validation of exhaustive bivariate analysis on seven Wellcome Trust Case-Control Consortium GWAS datasets, comparing the traditional test for association, the high-performance GBOOST method and the recently proposed GSS statistic (available at http://bioinformatics.research.nicta.com.au/software/gwis/). We use Spearman's correlation to measure the similarity between the folds of cross-validation. To compare incomplete lists of ranks, we propose an extension to Spearman's correlation. The extension allows us to consider a natural threshold for feature selection where the correlation is zero. This is the first reported cross-validation study of exhaustive bivariate GWAS feature selection. We found that stability between ranked lists from different cross-validation folds was higher for GSS in the majority of diseases. A thorough analysis of the correlation between SNP frequency and univariate score demonstrated that the test for association is highly confounded by main effects: SNPs with high univariate significance replicably dominate the ranked results. We show that removal of the univariately significant SNPs improves replicability but risks filtering pairs involving SNPs with univariate effects. We empirically confirm that the stability of GSS and GBOOST was not affected by removal of univariately significant SNPs. These results suggest that the GSS and GBOOST tests are successfully targeting bivariate association with phenotype and that GSS is able to reliably detect a larger set of SNP

  10. Some properties of a 5-parameter bivariate probability distribution

    NASA Technical Reports Server (NTRS)

    Tubbs, J. D.; Brewer, D. W.; Smith, O. E.

    1983-01-01

    A five-parameter bivariate gamma distribution having two shape parameters, two location parameters and a correlation parameter was developed. This more general bivariate gamma distribution reduces to the known four-parameter distribution. The five-parameter distribution gives a better fit to the gust data. The statistical properties of this general bivariate gamma distribution and a hypothesis test were investigated. Although these developments have come too late in the Shuttle program to be used directly as design criteria for ascent wind gust loads, the new wind gust model has helped to explain the wind profile conditions which cause large dynamic loads. Other potential applications of the newly developed five-parameter bivariate gamma distribution are in the areas of reliability theory, signal noise, and vibration mechanics.

  11. Collective structure of the N=40 isotones

    SciTech Connect

    Gaudefroy, L.; Peru, S.; Pillet, N.; Hilaire, S.; Delaroche, J.-P.; Girod, M.; Obertelli, A.

    2009-12-15

    The structure of even-even N=40 isotones is studied from drip line to drip line through the systematic investigation of their quadrupole modes of excitation. Calculations are performed within the Hartree-Fock-Bogoliubov approach using the Gogny D1S effective interaction. Where relevant, these calculations are extended beyond mean field within a generator-coordinate-based method. An overall good agreement with available experimental data is reported, showing that collectivity increases from the neutron to the proton drip line. Whereas ^60Ca and ^68Ni display a calculated spherical shape in their ground states, all other isotones show a prolate-deformed ground-state band and a quasi-γ band. Coexistence features are predicted in the neutron-deficient N=40 isotones above ^74Se.

  12. Bivariate Kumaraswamy distribution with an application on earthquake data

    SciTech Connect

    Özel, Gamze

    2015-03-10

    The bivariate Kumaraswamy (BK) distribution, whose marginals are Kumaraswamy distributions, has been introduced recently. However, its statistical properties have not been studied in detail. In this study, statistical properties of the BK distribution are investigated. We suggest that the BK distribution could provide a suitable description of the earthquake characteristics of Turkey. We support this argument using earthquakes that occurred in Turkey between 1900 and 2009. We also find that the BK distribution simulates earthquakes well.

  13. A new bivariate negative binomial regression model

    NASA Astrophysics Data System (ADS)

    Faroughi, Pouya; Ismail, Noriszura

    2014-12-01

    This paper introduces a new form of bivariate negative binomial (BNB-1) regression which can be fitted to bivariate and correlated count data with covariates. The BNB regression discussed in this study can be fitted to bivariate and overdispersed count data with positive, zero or negative correlations. The joint p.m.f. of the BNB-1 distribution is derived from the product of two negative binomial marginals with a multiplicative factor parameter. Several testing methods were used to check the overdispersion and goodness-of-fit of the model. The application of BNB-1 regression is illustrated on a Malaysian motor insurance dataset. The results indicate that BNB-1 regression has a better fit than the bivariate Poisson and BNB-2 models with regard to the Akaike information criterion.

  14. Nonparametric Analysis of Bivariate Gap Time with Competing Risks

    PubMed Central

    Huang, Chiung-Yu; Wang, Chenguang; Wang, Mei-Cheng

    2016-01-01

    Summary: This article considers nonparametric methods for studying recurrent disease and death with competing risks. We first point out that comparisons based on the well-known cumulative incidence function can be confounded by different prevalence rates of the competing events, and that comparisons of the conditional distribution of the survival time given the failure event type are more relevant for investigating the prognosis of different patterns of recurrent disease. We then propose nonparametric estimators for the conditional cumulative incidence function as well as the conditional bivariate cumulative incidence function for the bivariate gap times, that is, the time to disease recurrence and the residual lifetime after recurrence. To quantify the association between the two gap times in the competing risks setting, a modified Kendall's tau statistic is proposed. The proposed estimators for the conditional bivariate cumulative incidence distribution and the association measure account for the induced dependent censoring of the second gap time. Uniform consistency and weak convergence of the proposed estimators are established. Hypothesis testing procedures for two-sample comparisons are discussed. Numerical simulation studies with practical sample sizes are conducted to evaluate the performance of the proposed nonparametric estimators and tests. An application to data from a pancreatic cancer study is presented to illustrate the methods developed in this article. PMID:26990686

  15. [The influence of an isotonic solution containing benzalkonium chloride and a hypertonic seawater solution on the function of ciliary epithelium from the nasal cavity in vitro].

    PubMed

    Laberko, E L; Bogomil'sky, M R; Soldatsky, Yu L; Pogosova, I E

    2016-01-01

    The objective of the present study was to evaluate the influence of an isotonic saline solution containing benzalkonium chloride and of a hypertonic seawater solution on the function of the ciliary epithelium of the nasal cavity in vitro. To this effect, we investigated cytological material obtained from 35 children presenting with adenoid tissue hypertrophy. The tissue samples were taken from the nasal cavity by the standard method. A cellular biopsy obtained from each patient was distributed between three tubes that contained isotonic saline solution supplemented with benzalkonium chloride (0.1 mg/ml), a hypertonic seawater solution, and a standard physiological saline solution. It was shown that the number of viable cells in both isotonic solutions was statistically comparable and significantly higher than in the hypertonic solution (p<0.05). The ciliary beat frequency of the cells embedded in the two isotonic solutions was not significantly different but considerably exceeded that in the hypertonic seawater solution (p<0.05). Thus, the present study has demonstrated the absence of a ciliotoxic influence of the isotonic saline solution containing benzalkonium chloride at a concentration of 0.1 mg/ml and the strong ciliotoxic effect of the hypertonic seawater solution. This finding gives reason to recommend isotonic solutions for regular application, whereas hypertonic solutions can be prescribed only during infectious and/or inflammatory ENT diseases.

  16. The Effects of Selection Strategies for Bivariate Loglinear Smoothing Models on NEAT Equating Functions

    ERIC Educational Resources Information Center

    Moses, Tim; Holland, Paul W.

    2010-01-01

    In this study, eight statistical strategies were evaluated for selecting the parameterizations of loglinear models for smoothing the bivariate test score distributions used in nonequivalent groups with anchor test (NEAT) equating. Four of the strategies were based on significance tests of chi-square statistics (Likelihood Ratio, Pearson,…

  17. Approximation of Bivariate Functions via Smooth Extensions

    PubMed Central

    Zhang, Zhihua

    2014-01-01

    For a smooth bivariate function defined on a general domain with arbitrary shape, it is difficult to do Fourier approximation or wavelet approximation. In order to solve these problems, in this paper, we give an extension of the bivariate function on a general domain with arbitrary shape to a smooth, periodic function in the whole space or to a smooth, compactly supported function in the whole space. These smooth extensions have simple and clear representations which are determined by this bivariate function and some polynomials. After that, we expand the smooth, periodic function into a Fourier series or a periodic wavelet series or we expand the smooth, compactly supported function into a wavelet series. Since our extensions are smooth, the obtained Fourier coefficients or wavelet coefficients decay very fast. Since our extension tools are polynomials, the moment theorem shows that a lot of wavelet coefficients vanish. From this, with the help of well-known approximation theorems, using our extension methods, the Fourier approximation and the wavelet approximation of the bivariate function on the general domain with small error are obtained. PMID:24683316

  18. Mineral Composition and Nutritive Value of Isotonic and Energy Drinks.

    PubMed

    Leśniewicz, Anna; Grzesiak, Magdalena; Żyrnicki, Wiesław; Borkowska-Burnecka, Jolanta

    2016-04-01

    Several very popular brands of isotonic and energy drinks consumed for fluid and electrolyte supplementation and stimulation of mental or physical alertness were chosen for investigation. Liquid beverages available in polyethylene bottles and aluminum cans as well as products in the form of tablets and powder in sachets were studied. The total concentrations of 21 elements (Ag, Al, B, Ba, Ca, Cd, Co, Cr, Cu, Fe, Mg, Mn, Mo, Na, Ni, P, Pb, Sr, Ti, V, and Zn), both essential and toxic, were simultaneously determined in preconcentrated drink samples by inductively coupled plasma-optical emission spectrometry (ICP-OES) equipped with pneumatic and ultrasonic nebulizers. Differences between the mineral compositions of isotonic and energy drinks were evaluated and discussed. The highest content of Na was found in both isotonic and energy drinks, whereas quite high concentrations of Mg were found in isotonic drinks, and the highest amount of calcium was quantified in energy drinks. The concentrations of B, Co, Cu, Ni, and P were higher in isotonic drinks, but energy drinks contained greater quantities of Ag, Cr, Zn, Mn, and Mo as well as the toxic elements Cd and Pb. A comparison of the element contents with micronutrient intakes and tolerable levels was performed to evaluate the contribution of the investigated beverages to the daily diet. The consumption of 250 cm^3 of an isotonic drink provides from 0.32% (for Mn) up to 14.8% (for Na) of the recommended daily intake. For the energy drinks, the maximum recommended daily intake fulfillment ranged from 0.02% (for V) to 19.4 or 19.8% (for Mg and Na, respectively).

  19. Bivariate linkage analysis of cholesterol and triglyceride levels in the Framingham Heart Study

    PubMed Central

    Zhang, Xuyang; Wang, Kai

    2003-01-01

    We performed a bivariate analysis of cholesterol and triglyceride levels on data from the Framingham Heart Study using a new score statistic developed for the detection of potential pleiotropic, or cluster, genes. Univariate score statistics were also computed for each trait. At a significance level of 0.001, linkage signals were found at markers GATA48B01 on chromosome 1, GATA21C12 on chromosome 8, and ATA55A11 on chromosome 16 using the bivariate analysis. At the same significance level, linkage signals were found at markers 036yb8 on chromosome 3 and GATA3F02 on chromosome 12 using the univariate analysis. A strong linkage signal was also found at marker GATA112F07 by both the bivariate analysis and the univariate analysis, a marker for which evidence for linkage had been reported previously in a related study. PMID:14975130

  20. A microscopic explanation of the isotonic multiplet at N=90

    NASA Astrophysics Data System (ADS)

    Gupta, J. B.

    2014-08-01

    The shape phase transition from spherical to soft deformed shapes at N=88-90 was observed long ago. After the prediction of the X(5) symmetry, for which an analytical solution of the nuclear Hamiltonian is given [1], good examples of X(5) nuclei were identified in recent works in the N=90 isotones of Nd, Sm, Gd and Dy. The N=90 isotones have almost the same deformed level structure, forming an isotonic multiplet in the Z=50-66, N=82-104 quadrant. This is explained microscopically in terms of the Nilsson level diagram. Using the dynamic pairing-plus-quadrupole model of Kumar-Baranger, the quadrupole deformation and the occupancies of the neutrons and protons in these nuclei have been calculated, which support the formation of the N=88, 90 isotonic multiplets. The existence of F-spin multiplets in the Z=66-82, N=82-104 quadrant, identified in earlier works on the Interacting Boson Model, is also explained in our study.

  1. Systematics of α decay of even-even isotones

    SciTech Connect

    Poplavskii, I. V.

    1987-02-01

    On the basis of an analysis of experimental data we have investigated for the first time the α decay of even-even isotones. We have established that the α-decay energy of isotones depends on the number of protons approximately according to a linear law. We have shown that the Geiger-Nuttall law is valid both for isotopes and isobars, and also for isotones. The deviations from the Geiger-Nuttall law are due to the shell structure of the nucleus. The regularities observed in the α decay of isotones have been used to estimate the magnitudes of the α-decay energies, the kinetic energies of the emitted α particles, and the partial half-lives for α decay of the known and unknown neutron-deficient nuclei ^{202,204}Ra, ^{210}Th, ^{228,230}Pu, ^{234,236}Cm, ^{242,244}Fm, ^{250,258}No, and ^{254,256}Ku.
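
    For reference (not stated in the record above), the Geiger-Nuttall law referred to here is the empirical relation log10(T_1/2) ≈ a·Z/√(Q_α) + b between the partial α-decay half-life T_1/2, the proton number Z, and the α-decay energy Q_α, with a and b roughly constant along a given sequence of nuclei; the abstract's point is that such a linear relation in 1/√(Q_α) holds along isotone chains as well as along isotope and isobar chains.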

  2. An Annotated Bibliography of Isotonic Weight-Training Methods.

    ERIC Educational Resources Information Center

    Wysong, John V.

    This literature study was conducted to compare and evaluate various types and techniques of weight lifting so that a weight lifting program could be selected or devised for a secondary school. Annotations of 32 research reports, journal articles, and monographs on isotonic strength training are presented. The literature in the first part of the…

  3. A note on mixture bivariate model

    NASA Astrophysics Data System (ADS)

    Cao, Hanwen; Tian, Wei; Deng, Chengzhi

    2011-10-01

    L. Sendur and I. W. Selesnick suggest four jointly non-Gaussian bivariate models to characterize the dependency between a coefficient and its parent, and derive the corresponding MAP estimators based on noisy wavelet coefficients in detail in [6]. Among the four models, the second is a mixture model for which parameter evaluation is quite complicated, so L. Sendur and I. W. Selesnick did not give a concrete method. In this letter, a concrete mixture bivariate model is described, drawing inspiration from Model 2. The expectation-maximization (EM) algorithm is employed to find the parameters of the new model. The simulation results show that the PSNR values improve slightly compared with Model 1. The results can be viewed as a supplement to Model 2 in [6].

  4. Current misuses of multiple regression for investigating bivariate hypotheses: an example from the organizational domain.

    PubMed

    O'Neill, Thomas A; McLarnon, Matthew J W; Schneider, Travis J; Gardner, Robert C

    2014-09-01

    By definition, multiple regression (MR) considers more than one predictor variable, and each variable's beta will depend on both its correlation with the criterion and its correlation with the other predictor(s). Despite ad nauseam coverage of this characteristic in organizational psychology and statistical texts, researchers' applications of MR in bivariate hypothesis testing have been the subject of recent and renewed interest. Accordingly, we conducted a targeted survey of the literature by coding articles, covering a five-year span of two top-tier organizational journals, that employed MR for testing bivariate relations. The results suggest that MR coefficients, rather than correlation coefficients, were most commonly used for testing hypotheses of bivariate relations, yet supporting theoretical rationales were rarely offered. Regarding the potential impact on scientific advancement, in almost half of the articles reviewed (44%), at least one conclusion of each study (i.e., that the hypothesis was or was not supported) would have been different depending on the authors' use of the correlation or the beta to test the bivariate hypothesis. It follows that inappropriate decisions to interpret the correlation versus the beta will affect the accumulation of consistent and replicable scientific evidence. We conclude with recommendations for improving bivariate hypothesis testing.
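
    A small numeric illustration of the issue raised above: with two standardized predictors, the beta for X1 is beta_1 = (r_y1 - r_y2*r_12) / (1 - r_12^2), so a bivariate correlation and its beta can differ sharply, even in sign, once the predictors are correlated. The values below are made up.

      # Correlation of y with X1, y with X2, and X1 with X2 (illustrative values).
      r_y1, r_y2, r_12 = 0.30, 0.50, 0.70

      beta_1 = (r_y1 - r_y2 * r_12) / (1 - r_12 ** 2)
      beta_2 = (r_y2 - r_y1 * r_12) / (1 - r_12 ** 2)
      print(r_y1, beta_1)   # 0.30 vs about -0.10: opposite conclusions about the X1-y relation
      print(r_y2, beta_2)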

  5. Binary Classifier Calibration using an Ensemble of Near Isotonic Regression Models.

    PubMed

    Naeini, Mahdi Pakdaman; Cooper, Gregory F

    2016-12-01

    Learning accurate probabilistic models from data is crucial in many practical tasks in data mining. In this paper we present a new non-parametric calibration method called ensemble of near isotonic regression (ENIR). The method can be considered as an extension of BBQ [20], a recently proposed calibration method, as well as the commonly used calibration method based on isotonic regression (IsoRegC) [27]. ENIR is designed to address the key limitation of IsoRegC which is the monotonicity assumption of the predictions. Similar to BBQ, the method post-processes the output of a binary classifier to obtain calibrated probabilities. Thus it can be used with many existing classification models to generate accurate probabilistic predictions. We demonstrate the performance of ENIR on synthetic and real datasets for commonly applied binary classification models. Experimental results show that the method outperforms several common binary classifier calibration methods. In particular on the real data, ENIR commonly performs statistically significantly better than the other methods, and never worse. It is able to improve the calibration power of classifiers, while retaining their discrimination power. The method is also computationally tractable for large scale datasets, as it is O(N log N) time, where N is the number of samples.
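
    The ENIR ensemble itself is not sketched here, but the baseline isotonic-regression calibration (IsoRegC) that it extends can be illustrated with scikit-learn. The scores and labels below are hypothetical placeholders for a held-out calibration set.

        # Baseline IsoRegC step that ENIR generalizes: fit a monotone map from
        # classifier scores to calibrated probabilities (scikit-learn).
        import numpy as np
        from sklearn.isotonic import IsotonicRegression

        scores = np.array([0.10, 0.30, 0.35, 0.60, 0.80, 0.90])   # classifier outputs
        labels = np.array([0, 0, 1, 0, 1, 1])                      # observed outcomes

        iso = IsotonicRegression(y_min=0.0, y_max=1.0, out_of_bounds="clip")
        iso.fit(scores, labels)
        print(iso.predict(np.array([0.25, 0.70])))                 # calibrated probabilities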

  6. Predicting Number of Zombies in a DDoS Attacks Using Isotonic Regression

    NASA Astrophysics Data System (ADS)

    Gupta, B. B.; Jamali, Nadeem

    Anomaly-based DDoS detection systems construct a profile of the traffic normally seen in the network and identify anomalies whenever traffic deviates from the normal profile beyond a threshold. This deviation in traffic beyond the threshold has been used in the past for DDoS detection but not for estimating the number of zombies. This chapter presents an approach that utilizes this deviation in traffic to predict the number of zombies using an isotonic regression model. A relationship is established between the number of zombies and the observed deviation in sample entropy. Internet-type topologies used for simulation are generated using the Transit-Stub model of the GT-ITM topology generator. The NS-2 network simulator on a Linux platform is used as the simulation test bed for launching DDoS attacks with a varied number of zombies. Various statistical performance measures are used to assess the performance of the regression model. The simulation results are promising, as we are able to predict the number of zombies efficiently with a very low error rate using the isotonic regression model.

  7. Binary Classifier Calibration using an Ensemble of Near Isotonic Regression Models

    PubMed Central

    Naeini, Mahdi Pakdaman; Cooper, Gregory F.

    2017-01-01

    Learning accurate probabilistic models from data is crucial in many practical tasks in data mining. In this paper we present a new non-parametric calibration method called ensemble of near isotonic regression (ENIR). The method can be considered as an extension of BBQ [20], a recently proposed calibration method, as well as the commonly used calibration method based on isotonic regression (IsoRegC) [27]. ENIR is designed to address the key limitation of IsoRegC which is the monotonicity assumption of the predictions. Similar to BBQ, the method post-processes the output of a binary classifier to obtain calibrated probabilities. Thus it can be used with many existing classification models to generate accurate probabilistic predictions. We demonstrate the performance of ENIR on synthetic and real datasets for commonly applied binary classification models. Experimental results show that the method outperforms several common binary classifier calibration methods. In particular on the real data, ENIR commonly performs statistically significantly better than the other methods, and never worse. It is able to improve the calibration power of classifiers, while retaining their discrimination power. The method is also computationally tractable for large scale datasets, as it is O(N log N) time, where N is the number of samples. PMID:28316511

  8. Correlation estimation with singly truncated bivariate data.

    PubMed

    Im, Jongho; Ahn, Eunyong; Beck, Namseon; Kim, Jae Kwang; Park, Taesung

    2017-02-27

    Correlation coefficient estimates are often attenuated for truncated samples in the sense that the estimates are biased towards zero. Motivated by real data collected in South Sudan, we consider correlation coefficient estimation with singly truncated bivariate data. By considering a linear regression model in which a truncated variable is used as an explanatory variable, a consistent estimator for the regression slope can be obtained from the ordinary least squares method. A consistent estimator of the correlation coefficient is then obtained by multiplying the regression slope estimator by the variance ratio of the two variables. Results from two limited simulation studies confirm the validity and robustness of the proposed method. The proposed method is applied to the South Sudanese children's anthropometric and nutritional data collected by World Vision. Copyright © 2017 John Wiley & Sons, Ltd.
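
    A minimal numerical sketch of the slope-rescaling idea described above follows; it assumes the standard deviations of the two variables are available or estimated separately, and the function name is purely illustrative rather than the authors' implementation.

        # Regress y on the truncated x, then rescale the OLS slope by the ratio of
        # standard deviations to recover a correlation estimate (illustration only).
        import numpy as np

        def corr_from_slope(x_trunc, y, sd_x, sd_y):
            slope = np.polyfit(x_trunc, y, 1)[0]   # OLS slope of y on truncated x
            return slope * sd_x / sd_y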

  9. The bivariate combined model for spatial data analysis.

    PubMed

    Neyens, Thomas; Lawson, Andrew B; Kirby, Russell S; Faes, Christel

    2016-08-15

    To describe the spatial distribution of diseases, a number of methods have been proposed to model relative risks within areas. Most models use Bayesian hierarchical methods, in which one models both spatially structured and unstructured extra-Poisson variance present in the data. For modelling a single disease, the conditional autoregressive (CAR) convolution model has been very popular. More recently, a combined model was proposed that 'combines' ideas from the CAR convolution model and the well-known Poisson-gamma model. The combined model was shown to be a good alternative to the CAR convolution model when there was a large amount of uncorrelated extra-variance in the data. Fewer solutions exist for modelling two diseases simultaneously or modelling a disease in two sub-populations simultaneously. Furthermore, existing models are typically based on the CAR convolution model. In this paper, a bivariate version of the combined model is proposed in which the unstructured heterogeneity term is split up into terms that are shared and terms that are specific to the disease or subpopulation, while spatial dependency is introduced via a univariate or multivariate Markov random field. The proposed method is illustrated by analysis of disease data in Georgia (USA) and Limburg (Belgium) and in a simulation study. We conclude that the bivariate combined model constitutes an interesting model when two diseases are possibly correlated. As the choice of the preferred model differs between data sets, we suggest using the new and existing modelling approaches together and choosing the best model via goodness-of-fit statistics. Copyright © 2016 John Wiley & Sons, Ltd.

  10. Isotonic similarities in isotope shifts from Hg to Ra.

    NASA Astrophysics Data System (ADS)

    Stroke, H. H.

    2003-04-01

    Isotope shifts (IS) in atomic spectra of heavy elements largely reflect the variation in the mean-square radius ⟨r²⟩ of the nuclear charge distribution. Our early systematic measurements of IS for an extended range of stable and radioactive isotopes and nuclear isomers in Tl and Hg [1] showed that, by displaying the relative IS normalized to a chosen pair of isotopes, there was a striking similarity for the IS of isotones. This essentially divides out the electronic factor in the IS and allows the comparison of Δ⟨r²⟩ for neighboring Z as N is varied. Following our further studies on Pb and Bi [2] and those on Fr at ISOLDE by the Orsay spectroscopy group [3], we found that the isotonic similarity extended to these elements. Since then, many additional measurements were made, principally at ISOLDE [4], extending the elements studied to Ra. The isotonic shift similarities persist from Z = 80 to 88. We noted that the relative isotope and isomer shifts can be used to investigate the polarization of the nucleus by the added neutrons, a model used in a calculation by Barrett [5]. The new data may serve further in this direction. [1] W. J. Tomlinson, H. H. Stroke, Nucl. Phys. 60, 614 (1964). [2] M. Barboza-Flores et al., Z. Phys. A 321, 85 (1985). [3] S. Liberman et al., Phys. Rev. A 22, 2732 (1980). [4] E.g., M. R. Pearson et al., J. Phys. G 26, 1829 (2000). [5] R. C. Barrett, Nucl. Phys. 88, 128 (1966).

  11. Statistics

    Cancer.gov

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  12. Analysis of some bivariate non-linear interpolatory subdivision schemes

    NASA Astrophysics Data System (ADS)

    Dadourian, Karine; Liandrat, Jacques

    2008-07-01

    This paper is devoted to the convergence analysis of a class of bivariate subdivision schemes that can be defined as a specific perturbation of a linear subdivision scheme. We study successively the univariate and bivariate case and apply the analysis to the so called Powerp scheme (Serna and Marquina, J Comput Phys 194:632-658, 2004).

  13. Evaluation dam overtopping risk based on univariate and bivariate flood frequency analysis

    NASA Astrophysics Data System (ADS)

    Goodarzi, E.; Mirzaei, M.; Shui, L. T.; Ziaei, M.

    2011-11-01

    There is a growing tendency to assess the safety levels of existing dams based on risk and uncertainty analysis using mathematical and statistical methods. This research presents the application of risk and uncertainty analysis to dam overtopping based on univariate and bivariate flood frequency analyses by applying Gumbel logistic distribution for the Doroudzan earth-fill dam in south of Iran. The bivariate frequency analysis resulted in six inflow hydrographs with a joint return period of 100-yr. The overtopping risks were computed for all of those hydrographs considering quantile of flood peak discharge (in particular 100-yr), initial depth of water in the reservoir, and discharge coefficient of spillway as uncertain variables. The maximum height of the water, as most important factor in the overtopping analysis, was evaluated using reservoir routing and the Monte Carlo and Latin hypercube techniques were applied for uncertainty analysis. Finally, the achieved results using both univariate and bivariate frequency analysis have been compared to show the significance of bivariate analyses on dam overtopping.

  14. Bivariate gamma distributions for image registration and change detection.

    PubMed

    Chatelain, Florent; Tourneret, Jean-Yves; Inglada, Jordi; Ferrari, André

    2007-07-01

    This paper evaluates the potential interest of using bivariate gamma distributions for image registration and change detection. The first part of this paper studies estimators for the parameters of bivariate gamma distributions based on the maximum likelihood principle and the method of moments. The performance of both methods is compared in terms of estimated mean square errors and theoretical asymptotic variances. The mutual information is a classical similarity measure which can be used for image registration or change detection. The second part of the paper studies some properties of the mutual information for bivariate gamma distributions. Image registration and change detection techniques based on bivariate gamma distributions are finally investigated. Simulation results conducted on synthetic and real data are very encouraging. Bivariate gamma distributions are good candidates allowing us to develop new image registration algorithms and new change detectors.

  15. A laterality effect in isometric and isotonic labial tracking.

    PubMed

    Sussman, H M; Westbury, J R

    1978-09-01

    Hemispheric dominance for sensorimotor control of lip activity was investigated by use of a pursuit auditory tracking task. This task involves continuous frequency matching of a computer-generated target tone and a subject-controlled cursor tone. Thirty right-handed subjects were tested under isometric lip and hand control, and 20 right-handed subjects under isotonic lip control. Subjects tracked 10 1-min trials under each laterality condition--cursor/right ear, target/left ear, and vice versa. In both experiments tracking performance was better when the lip-controlled cursor tone was presented to the right ear (hence direct contralateral route to left hemisphere). A significant (p less than 0.05) cursor/right-ear advantage was found under isometric hand-tracking. Analysis routines examined relative laterality advantages across several time intervals within each 1-min trial. Consistent lateralization scores in favor of cursor/right-ear presentations (REAs) were independent of the time interval measured. For isometric tracking, 58% of subjects having laterality advantages (p less than 0.10) revealed REAs. For isotonic tracking, 71% of subjects revealed REAs. Implications of the latter finding are discussed relative to a left hemisphere mechanism specialized to integrate movement-generated auditory feedback with dynamic kinesthetic information from the articulators.

  16. Bayesian Isotonic Regression Dose-response (BIRD) Model.

    PubMed

    Li, Wen; Fu, Haoda

    2016-12-21

    Understanding dose-response relationship is a crucial step in drug development. There are a few parametric methods to estimate dose-response curves, such as the Emax model and the logistic model. These parametric models are easy to interpret and, hence, widely used. However, these models often require the inclusion of patients on high-dose levels; otherwise, the model parameters cannot be reliably estimated. To have robust estimation, nonparametric models are used. However, these models are not able to estimate certain important clinical parameters, such as ED50 and Emax. Furthermore, in many therapeutic areas, dose-response curves can be assumed as non-decreasing functions. This creates an additional challenge for nonparametric methods. In this paper, we propose a new Bayesian isotonic regression dose-response model which features advantages from both parametric and nonparametric models. The ED50 and Emax can be derived from this model. Simulations are provided to evaluate the Bayesian isotonic regression dose-response model performance against two parametric models. We apply this model to a data set from a diabetes dose-finding study.
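
    The model above is Bayesian and is not reproduced here; as a minimal illustration of the non-decreasing (isotonic) constraint it builds on, the following sketch fits a monotone dose-response curve with the pool-adjacent-violators algorithm. All values are illustrative.

        # Pool-adjacent-violators (PAVA): least-squares non-decreasing fit to a
        # sequence of observed responses ordered by dose (illustration only).
        def pava(y):
            level = [float(v) for v in y]   # block means
            weight = [1.0] * len(y)         # block sizes
            i = 0
            while i < len(level) - 1:
                if level[i] > level[i + 1]:            # violation: pool the two blocks
                    merged = (level[i] * weight[i] + level[i + 1] * weight[i + 1]) / (weight[i] + weight[i + 1])
                    level[i:i + 2] = [merged]
                    weight[i:i + 2] = [weight[i] + weight[i + 1]]
                    i = max(i - 1, 0)                  # re-check the previous block
                else:
                    i += 1
            fit = []                                   # expand block means per observation
            for m, w in zip(level, weight):
                fit.extend([m] * int(w))
            return fit

        print(pava([0.1, 0.3, 0.2, 0.6, 0.5, 0.9]))    # -> [0.1, 0.25, 0.25, 0.55, 0.55, 0.9]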

  17. The bivariate Rogers Szegö polynomials

    NASA Astrophysics Data System (ADS)

    Chen, William Y. C.; Saad, Husam L.; Sun, Lisa H.

    2007-06-01

    We present an operator approach to deriving Mehler's formula and the Rogers formula for the bivariate Rogers-Szegö polynomials hn(x, y|q). The proof of Mehler's formula can be considered as a new approach to the nonsymmetric Poisson kernel formula for the continuous big q-Hermite polynomials Hn(x; a|q) due to Askey, Rahman and Suslov. Mehler's formula for hn(x, y|q) involves a 3phi2 sum and the Rogers formula involves a 2phi1 sum. The proofs of these results are based on parameter augmentation with respect to the q-exponential operator and the homogeneous q-shift operator in two variables. By extending recent results on the Rogers-Szegö polynomials hn(x|q) due to Hou, Lascoux and Mu, we obtain another Rogers-type formula for hn(x, y|q). Finally, we give a change of base formula for Hn(x; a|q) which can be used to evaluate some integrals by using the Askey-Wilson integral.

  18. An Analysis of Isotonic and Isokinetic Strength-Training Methods and Techniques.

    ERIC Educational Resources Information Center

    Metcalfe, Randall E.

    This annotated bibliography documents traditional isotonic strength training and nontraditional isotonic strength training (isokinetics) to aid the athletic coach in deciding which type and scheme of training will best develop strength. A glossary of terms is provided. Appendices include muscle action charts and tables, body position charts, a…

  19. Meta-analysis of diagnostic test data: a bivariate Bayesian modeling approach.

    PubMed

    Verde, Pablo E

    2010-12-30

    In the last decades, the amount of published results on clinical diagnostic tests has expanded very rapidly. The counterpart to this development has been the formal evaluation and synthesis of diagnostic results. However, published results present substantial heterogeneity, and they can be regarded as so far removed from the classical domain of meta-analysis that they provide a rather severe test of classical statistical methods. Recently, bivariate random effects meta-analytic methods, which model the pairs of sensitivities and specificities, have been presented from the classical point of view. In this work a bivariate Bayesian modeling approach is presented. This approach substantially extends the scope of classical bivariate methods by allowing the structural distribution of the random effects to depend on multiple sources of variability. Meta-analysis is summarized by the predictive posterior distributions for sensitivity and specificity. This new approach also allows substantial model checking, model diagnostics, and model selection to be performed. Statistical computations are implemented in public domain statistical software (WinBUGS and R) and illustrated with real data examples.

  20. Irregularity in Kπ=8- rotational bands of N =150 isotones

    NASA Astrophysics Data System (ADS)

    Fu, X. M.; Xu, F. R.; Jiao, C. F.; Liang, W. Y.; Pei, J. C.; Liu, H. L.

    2014-05-01

    Motivated by new experimental spectra in the transfermium mass region, we have investigated broken-pair high-K multiparticle excited states and their rotational bands for the N = 150 isotones around Z = 100, by using the configuration-constrained pairing-deformation-frequency self-consistent total-Routhian-surface (TRS) model. In order to avoid the spurious phase transition encountered in the Bardeen-Cooper-Schrieffer (BCS) pairing, the particle-number-conserving method has been employed for the pairing calculations. Pairing correlations are remarkably reduced for the rotations of broken-pair multiparticle states. The present configuration-constrained TRS calculations reasonably reproduce the existing experimental data. The abnormal feature in the recently observed Kπ=8- bands of 252No and 250Fm would be associated with configuration mixing.

  1. Evolution of Collectivity in the N = 80 Isotones

    NASA Astrophysics Data System (ADS)

    Bauer, C.; Stegmann, R.; Rainovski, G.; Pietralla, N.; Blazhev, A.; Bönig, S.; Damyanova, A.; Danchev, M.; Gladnishki, K. A.; Lutter, R.; Marsh, B. A.; Möller, T.; Pakarinen, J.; Radeck, D.; Rapisarda, E.; Reiter, P.; Scheck, M.; Seidlitz, M.; Siebeck, B.; Stahl, C.; Thoele, P.; Thomas, T.; Thürauf, M.; Warr, N.; Werner, V.; de Witte, H.

    2015-11-01

    Recent data on transition strengths, namely the hitherto unknown B(E2) values of radioactive 140Nd and 142Sm in the N=80 isotones, have suggested that the proton 1g7/2 subshell closure at Z=58 has an impact on the properties of low-lying collective states. The unstable, neutron-rich nuclei 140Nd and 142Sm were investigated via projectile Coulomb excitation at the REX-ISOLDE facility at CERN with the high-purity Germanium detector array MINIBALL. The measurement of 140Nd and the preliminary result for 142Sm demonstrate that the reduced collectivity of 138Ce is a local effect, possibly due to the Z=58 subshell closure, and requests refined theoretical calculations.

  2. Phenomenological models of the dynamics of muscle during isotonic shortening.

    PubMed

    Yeo, Sang Hoon; Monroy, Jenna A; Lappin, A Kristopher; Nishikawa, Kiisa C; Pai, Dinesh K

    2013-09-27

    We investigated whether simple, Hill-type, phenomenological models of the force-length-velocity relationship are effective for simulating measured length trajectories during muscle shortening and, if so, what forms of the model are most useful. Using isotonic shortening data from mouse soleus and toad depressor mandibulae muscles, we showed that Hill-type models can indeed simulate the shortening trajectories with sufficiently good accuracy. However, we found that the standard form of the Hill-type muscle model, called the force-scaling model, is not a satisfactory choice. Instead, the results support the use of less frequently used models, the f-max scaling model and force-scaling with parallel spring, to simulate the shortening dynamics of muscle.
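
    The specific force-scaling and f-max-scaling variants compared in the paper are not reproduced here; the following sketch only illustrates the classic Hill hyperbolic force-velocity relation, (F + a)(v + b) = (F0 + a) b, that such models are built around. Parameter values are placeholders.

        # Shortening velocity predicted by the classic Hill relation for a given
        # isotonic load (normalized units; a, b, f0 are illustrative values).
        def hill_velocity(load, f0=1.0, a=0.25, b=0.25):
            return b * (f0 - load) / (load + a)

        for frac in (0.05, 0.25, 0.50, 0.75):
            print(f"load {frac:.2f}*F0 -> v = {hill_velocity(frac):.3f}")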

  3. Regime-Switching Bivariate Dual Change Score Model.

    PubMed

    Chow, Sy-Miin; Grimm, Kevin J; Filteau, Guillaume; Dolan, Conor V; McArdle, John J

    2013-07-01

    Mixture structural equation model with regime switching (MSEM-RS) provides one possible way of representing over-time heterogeneities in dynamic processes by allowing a system to manifest qualitatively or quantitatively distinct change processes conditional on the latent "regime" the system is in at a particular time point. Unlike standard mixture structural equation models such as growth mixture models, MSEM-RS allows individuals to transition between latent classes over time. This class of models, often referred to as regime-switching models in the time series and econometric applications, can be specified as regime-switching mixture structural equation models when the number of repeated measures involved is not large. We illustrate the empirical utility of such models using one special case-a regime-switching bivariate dual change score model in which two growth processes are allowed to manifest regime-dependent coupling relations with one another. The proposed model is illustrated using a set of longitudinal reading and arithmetic performance data from the Early Childhood Longitudinal Study, Kindergarten Class of 1998-99 study (ECLS-K; U.S. Department of Education, National Center for Education Statistics, 2010).

  4. A Vehicle for Bivariate Data Analysis

    ERIC Educational Resources Information Center

    Roscoe, Matt B.

    2016-01-01

    Instead of reserving the study of probability and statistics for special fourth-year high school courses, the Common Core State Standards for Mathematics (CCSSM) takes a "statistics for all" approach. The standards recommend that students in grades 6-8 learn to summarize and describe data distributions, understand probability, draw…

  5. Cumulative Incidence Association Models for Bivariate Competing Risks Data.

    PubMed

    Cheng, Yu; Fine, Jason P

    2012-03-01

    Association models, like frailty and copula models, are frequently used to analyze clustered survival data and evaluate within-cluster associations. The assumption of noninformative censoring is commonly applied to these models, though it may not be true in many situations. In this paper, we consider bivariate competing risk data and focus on association models specified for the bivariate cumulative incidence function (CIF), a nonparametrically identifiable quantity. Copula models are proposed which relate the bivariate CIF to its corresponding univariate CIFs, similarly to independently right censored data, and accommodate frailty models for the bivariate CIF. Two estimating equations are developed to estimate the association parameter, permitting the univariate CIFs to be estimated either parametrically or nonparametrically. Goodness-of-fit tests are presented for formally evaluating the parametric models. Both estimators perform well with moderate sample sizes in simulation studies. The practical use of the methodology is illustrated in an analysis of dementia associations.

  6. Properties of the Bivariate Delayed Poisson Process

    DTIC Science & Technology

    1974-07-01

    Initial Conditions. The purpose and methodology of stationary initial conditions for univariate point processes have been described in Lawrance... Lawrance, A. J. (1972). Some models for stationary series of univariate events. In Stochastic Point Processes: Statistical Analysis, Theory and

  7. FUNSTAT and statistical image representations

    NASA Technical Reports Server (NTRS)

    Parzen, E.

    1983-01-01

    General ideas of functional statistical inference for the analysis of one sample and two samples, univariate and bivariate, are outlined. The ONESAM program is applied to analyze the univariate probability distributions of multi-spectral image data.

  8. Statistical modeling of space shuttle environmental data

    NASA Technical Reports Server (NTRS)

    Tubbs, J. D.; Brewer, D. W.

    1983-01-01

    Statistical models which use a class of bivariate gamma distributions are examined. Topics discussed include: (1) the ratio of positively correlated gamma variates; (2) a method to determine if unequal shape parameters are necessary in a bivariate gamma distribution; (3) differential equations for the modal location of a family of bivariate gamma distributions; and (4) analysis of some wind gust data using the analytical results developed for modeling applications.

  9. Bayesian framework for parametric bivariate accelerated lifetime modeling and its application to hospital acquired infections.

    PubMed

    Bilgili, D; Ryu, D; Ergönül, Ö; Ebrahimi, N

    2016-03-01

    Infectious diseases that can be spread directly or indirectly from one person to another are caused by pathogenic microorganisms such as bacteria, viruses, parasites, or fungi. Infectious diseases remain one of the greatest threats to human health, and the analysis of infectious disease data is among the most important applications of statistics. In this article, we develop Bayesian methodology using a parametric bivariate accelerated lifetime model to study the dependency between the colonization and infection times for Acinetobacter baumannii bacteria, a leading cause of infection among hospital infection agents. We also study their associations with covariates such as age, gender, APACHE score, antibiotic use 3 months before admission, and invasive mechanical ventilation use. To account for singularity, we use a singular bivariate extreme value distribution to model residuals in the bivariate accelerated lifetime model under the fully Bayesian framework. We analyze censored data related to colonization and infection collected in five major hospitals in Turkey using our methodology. The data analysis done in this article is for illustration of our proposed method and can be applied to any situation in which our model can be used.

  10. Shape changes in neutron rich N = 43 isotones

    NASA Astrophysics Data System (ADS)

    Sethi, Jasmine; Forney, A.; Walters, W. B.; Harker, J.; Chiara, C. J.; Stefanescu, I.; Janssens, R. V. F.; Zhu, S.; Carpenter, M. P.; et al.

    2016-09-01

    Nuclei in the transitional region with 28 < Z < 50 and 40 < N < 50 are very sensitive to shape changes with the addition of individual nucleons due to close-lying neutron orbitals in the fpg model space. A systematic comparison of the structure of N = 43 isotones, focussing on new results on 75Ge and 73Zn, will be presented. Both nuclei were populated in deep inelastic scattering reactions, 76Ge+208Pb and 76Ge+238U, at 25% above the Coulomb barrier, using Gammasphere and the ATLAS facility at ANL. A number of new transitions and levels have been identified in both nuclei. The experimental results and their comparison to theoretical calculations will be presented. This work is supported by the U.S. Department of Energy, Office of Nuclear Physics under Contract Numbers DE-AC02-06CH11357 and DE-AC02-05CH11231 and under Grant Numbers DE-FG02-94ER40834 and by the Polish Ministry of Science Grant Numbers 1P03B05929 and NN202103333. This research used resources of ANL's ATLAS facility, which is a DOE Office of Science User Facility.

  11. Nonlinear analysis of bivariate data with cross recurrence plots

    NASA Astrophysics Data System (ADS)

    Marwan, Norbert; Kurths, Jürgen

    2002-09-01

    We use the extension of the method of recurrence plots to cross recurrence plots (CRP), which enables a nonlinear analysis of bivariate data. To quantify CRPs, we further develop three measures of complexity based mainly on diagonal structures in CRPs. The CRP analysis of prototypical model systems with nonlinear interactions demonstrates that this technique enables such nonlinear interrelations to be found from bivariate time series, whereas linear correlation tests do not. Applying the CRP analysis to climatological data, we find a complex relationship between rainfall and El Niño data.
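
    A minimal sketch of the cross recurrence plot construction follows: two (already embedded) state sequences are compared pointwise, and a recurrence is marked wherever their distance falls below a threshold. The embedding and the threshold eps are assumptions of the example, not values from the paper.

        # Binary cross recurrence plot: CR[i, j] = 1 when ||x_i - y_j|| < eps.
        import numpy as np

        def cross_recurrence_plot(x, y, eps):
            """x, y: arrays of shape (n_states, dim); returns an (n, m) 0/1 matrix."""
            d = np.linalg.norm(x[:, None, :] - y[None, :, :], axis=-1)
            return (d < eps).astype(int)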

  12. Probabilistic floodplain hazard mapping: managing uncertainty by using a bivariate approach for flood frequency analysis

    NASA Astrophysics Data System (ADS)

    Candela, Angela; Tito Aronica, Giuseppe

    2014-05-01

    Floods are a global problem and are considered the most frequent natural disaster world-wide. Many studies show that the severity and frequency of floods have increased in recent years and underline the difficulty of separating the effects of natural climatic changes from human influences such as land management practices, urbanization, etc. Flood risk analysis and assessment are required to provide information on current or future flood hazard and risks in order to accomplish flood risk mitigation and to propose, evaluate, and select measures to reduce it. Both components of risk can be mapped individually and are affected by multiple uncertainties, as is the joint estimate of flood risk. Major sources of uncertainty include the statistical analysis of extreme events, the definition of the hydrological input, the representation of channel and floodplain topography, and the choice of effective hydraulic roughness coefficients. The classical procedure for estimating the flood discharge for a chosen probability of exceedance is to use a rainfall-runoff model, associating with the risk the same return period as the original rainfall, in accordance with the iso-frequency criterion. Alternatively, a flood frequency analysis is applied to a given record of discharge data, but again the same probability is associated with the flood discharges and the respective risk. Moreover, since flood peaks and corresponding flood volumes are variables of the same phenomenon, they should be directly correlated and, consequently, multivariate statistical analyses must be applied. This study presents an innovative approach to obtain flood hazard maps where the hydrological input (synthetic flood design event) to a 2D hydraulic model has been defined by generating flood peak discharges and volumes from: a) a classical univariate approach, b) a bivariate statistical analysis, through the use of copulas. The univariate approach considers flood hydrograph generation by an indirect approach (rainfall-runoff transformation using input rainfall…
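
    The copula step described above can be sketched as follows: correlated flood peak-volume pairs are generated by sampling a Gaussian copula and mapping the uniform margins through assumed marginal distributions. The correlation value and the Gumbel marginal parameters below are purely illustrative, not values fitted in the study.

        # Generate correlated peak/volume pairs via a Gaussian copula with Gumbel
        # marginals (all parameter values are illustrative placeholders).
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        rho = 0.7                                         # assumed copula correlation
        z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=10000)
        u = stats.norm.cdf(z)                             # dependent uniform margins
        peak = stats.gumbel_r.ppf(u[:, 0], loc=500.0, scale=150.0)   # peak discharge
        volume = stats.gumbel_r.ppf(u[:, 1], loc=30.0, scale=10.0)   # flood volume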

  13. A New Measure Of Bivariate Asymmetry And Its Evaluation

    SciTech Connect

    Ferreira, Flavio Henn; Kolev, Nikolai Valtchev

    2008-11-06

    In this paper we propose a new measure of bivariate asymmetry, based on conditional correlation coefficients. A decomposition of the Pearson correlation coefficient in terms of its conditional versions is studied and an example of application of the proposed measure is given.

  14. Evaluating Univariate, Bivariate, and Multivariate Normality Using Graphical Procedures.

    ERIC Educational Resources Information Center

    Burdenski, Thomas K., Jr.

    This paper reviews graphical and nongraphical procedures for evaluating multivariate normality by guiding the reader through univariate and bivariate procedures that are necessary, but insufficient, indications of a multivariate normal distribution. A data set using three dependent variables for two groups provided by D. George and P. Mallery…

  15. Multimodal Bivariate Thematic Maps: Auditory and Haptic Display.

    ERIC Educational Resources Information Center

    Jeong, Wooseob; Gluck, Myke

    2002-01-01

    Explores the possibility of multimodal bivariate thematic maps by utilizing auditory and haptic (sense of touch) displays. Measured completion time of tasks and the recall (retention) rate in two experiments, and findings confirmed the possibility of using auditory and haptic displays in geographic information systems (GIS). (Author/LRW)

  16. Experimental study of the variation of alpha elastic scattering cross sections along isotopic and isotonic chains at low energies

    SciTech Connect

    Kiss, G. G.; Gyuerky, Gy.; Elekes, Z.; Fueloep, Zs.; Somorjai, E.; Galaviz, D.; Sonnabend, K.; Zilges, A.; Mohr, P.; Goerres, J.; Wiescher, M.; Oezkan, N.; Gueray, T.; Yalcin, C.; Avrigeanu, M.

    2008-05-21

    To improve the reliability of statistical model calculations in the region of heavy proton-rich isotopes, alpha elastic scattering experiments have been performed at ATOMKI, Debrecen, Hungary. The experiments were carried out at several energies above and below the Coulomb barrier with high precision. The measured angular distributions can be used for testing the predictions of the global and regional optical potential parameter sets. Moreover, we derived the variation of the elastic alpha scattering cross section along the Z = 50 (112Sn-124Sn) isotopic and N = 50 (89Y-92Mo) isotonic chains. In this paper we summarize the efforts to provide high-precision experimental angular distributions for several A ≈ 100 nuclei to test the global optical potential parameterizations applied to p-process network calculations.

  17. Constructing a bivariate distribution function with given marginals and correlation: application to the galaxy luminosity function

    NASA Astrophysics Data System (ADS)

    Takeuchi, Tsutomu T.

    2010-08-01

    We provide an analytic method to construct a bivariate distribution function (DF) with given marginal distributions and correlation coefficient. We introduce a convenient mathematical tool, called a copula, to connect two DFs with any prescribed dependence structure. If the correlation of two variables is weak (Pearson's correlation coefficient |ρ| < 1/3), the Farlie-Gumbel-Morgenstern (FGM) copula provides an intuitive and natural way to construct such a bivariate DF. When the linear correlation is stronger, the FGM copula cannot work anymore. In this case, we propose using a Gaussian copula, which connects two given marginals and is directly related to the linear correlation coefficient between two variables. Using the copulas, we construct the bivariate luminosity function (BLF) and discuss its statistical properties. We focus especially on the far-ultraviolet-far-infrared (FUV-FIR) BLF, since these two wavelength regions are related to star-formation (SF) activity. Though both the FUV and FIR are related to SF activity, the univariate LFs have a very different functional form: the former is well described by the Schechter function whilst the latter has a much more extended power-law-like luminous end. We construct the FUV-FIR BLFs using the FGM and Gaussian copulas with different strengths of correlation, and examine their statistical properties. We then discuss some further possible applications of the BLF: the problem of a multiband flux-limited sample selection, the construction of the star-formation rate (SFR) function, and the construction of the stellar mass of galaxies (M*)-specific SFR (SFR/M*) relation. The copulas turn out to be a very useful tool to investigate all these issues, especially for including complicated selection effects.
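
    The FGM construction mentioned above has a simple closed form: the copula density is c(u, v) = 1 + α(1 - 2u)(1 - 2v) with |α| ≤ 1, so a bivariate density can be built as f1(x) f2(y) c(F1(x), F2(y)). The sketch below uses standard normal marginals as illustrative stand-ins for the Schechter and FIR luminosity functions of the paper.

        # Bivariate density built from two marginals through the FGM copula density.
        import numpy as np
        from scipy import stats

        def fgm_bivariate_pdf(x, y, alpha, m1=stats.norm(), m2=stats.norm()):
            u, v = m1.cdf(x), m2.cdf(y)
            return m1.pdf(x) * m2.pdf(y) * (1.0 + alpha * (1.0 - 2.0 * u) * (1.0 - 2.0 * v))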

  18. Work capacity during 30 days of bed rest with isotonic and isokinetic exercise training

    NASA Technical Reports Server (NTRS)

    Greenleaf, J. E.; Bernauer, E. M.; Ertl, A. C.; Trowbridge, T. S.; Wade, C. E.

    1989-01-01

    Results are presented from a study to determine whether or not short-term variable-intensity isotonic and intermittent high-intensity isokinetic short-duration leg exercise is effective for the maintenance of peak oxygen uptake (peak VO2) and muscular strength and endurance, respectively, during 30 days of -6 deg head-down bed rest deconditioning. The results show no significant changes in leg peak torque, leg mean total work, arm total peak torque, or arm mean total work for members of the isotonic, isokinetic, and control groups. Changes are observed, however, in peak VO2 levels. The results suggest that near-peak variable-intensity isotonic leg exercise maintains peak VO2 during 30 days of bed rest, while the peak intermittent isokinetic leg exercise protocol does not.

  19. Effects of isotonic and isometric exercises with mist sauna bathing on cardiovascular, thermoregulatory, and metabolic functions

    NASA Astrophysics Data System (ADS)

    Iwase, Satoshi; Kawahara, Yuko; Nishimura, Naoki; Nishimura, Rumiko; Sugenoya, Junichi; Miwa, Chihiro; Takada, Masumi

    2014-08-01

    To clarify the effects of isometric and isotonic exercise during mist sauna bathing on the cardiovascular function, thermoregulatory function, and metabolism, six healthy young men (22 ± 1 years old, height 173 ± 4 cm, weight 65.0 ± 5.0 kg) were exposed to a mist sauna for 10 min at a temperature of 40 °C, and relative humidity of 100 % while performing or not performing ˜30 W of isometric or isotonic exercise. The effect of the exercise was assessed by measuring tympanic temperature, heart rate, systolic and diastolic blood pressure, chest sweat rate, chest skin blood flow, and plasma catecholamine and cortisol, glucose, lactate, and free fatty acid levels. Repeated measures ANOVA showed no significant differences in blood pressure, skin blood flow, sweat rate, and total amount of sweating. Tympanic temperature increased more during isotonic exercise, and heart rate increase was more marked during isotonic exercise. The changes in lactate indicated that fatigue was not very great during isometric exercise. The glucose level indicated greater energy expenditure during isometric exercise. The free fatty acid and catecholamine levels indicated that isometric exercise did not result in very great energy expenditure and stress, respectively. The results for isotonic exercise of a decrease in lactate level and an increase in plasma free fatty acid level indicated that fatigue and energy expenditure were rather large while the perceived stress was comparatively low. We concluded that isotonic exercise may be a more desirable form of exercise during mist sauna bathing given the changes in glucose and free fatty acid levels.

  20. Effects of isotonic and isometric exercises with mist sauna bathing on cardiovascular, thermoregulatory, and metabolic functions.

    PubMed

    Iwase, Satoshi; Kawahara, Yuko; Nishimura, Naoki; Nishimura, Rumiko; Sugenoya, Junichi; Miwa, Chihiro; Takada, Masumi

    2014-08-01

    To clarify the effects of isometric and isotonic exercise during mist sauna bathing on the cardiovascular function, thermoregulatory function, and metabolism, six healthy young men (22 ± 1 years old, height 173 ± 4 cm, weight 65.0 ± 5.0 kg) were exposed to a mist sauna for 10 min at a temperature of 40 °C, and relative humidity of 100 % while performing or not performing ∼30 W of isometric or isotonic exercise. The effect of the exercise was assessed by measuring tympanic temperature, heart rate, systolic and diastolic blood pressure, chest sweat rate, chest skin blood flow, and plasma catecholamine and cortisol, glucose, lactate, and free fatty acid levels. Repeated measures ANOVA showed no significant differences in blood pressure, skin blood flow, sweat rate, and total amount of sweating. Tympanic temperature increased more during isotonic exercise, and heart rate increase was more marked during isotonic exercise. The changes in lactate indicated that fatigue was not very great during isometric exercise. The glucose level indicated greater energy expenditure during isometric exercise. The free fatty acid and catecholamine levels indicated that isometric exercise did not result in very great energy expenditure and stress, respectively. The results for isotonic exercise of a decrease in lactate level and an increase in plasma free fatty acid level indicated that fatigue and energy expenditure were rather large while the perceived stress was comparatively low. We concluded that isotonic exercise may be a more desirable form of exercise during mist sauna bathing given the changes in glucose and free fatty acid levels.

  1. Irrigation solutions in open fractures of the lower extremities: evaluation of isotonic saline and distilled water

    PubMed Central

    Olufemi, Olukemi Temiloluwa; Adeyeye, Adeolu Ikechukwu

    2017-01-01

    Introduction: Open fractures are widely considered as orthopaedic emergencies requiring immediate intervention. The initial management of these injuries usually affects the ultimate outcome because open fractures may be associated with significant morbidity. Wound irrigation forms one of the pivotal principles in the treatment of open fractures. The choice of irrigation fluid has since been a source of debate. This study aimed to evaluate and compare the effects of isotonic saline and distilled water as irrigation solutions in the management of open fractures of the lower extremities. Wound infection and wound healing rates using both solutions were evaluated. Methods: This was a prospective hospital-based study of 109 patients who presented to the Accident and Emergency department with open lower limb fractures. Approval was sought and obtained from the Ethics Committee of the Hospital. Patients were randomized into either the isotonic saline (NS) or the distilled water (DW) group using a simple ballot technique. Twelve patients were lost to follow-up, while 97 patients were available until conclusion of the study. There were 50 patients in the isotonic saline group and 47 patients in the distilled water group. Results: Forty-one (42.3%) of the patients were in the young and economically productive strata of the population. There was a male preponderance with a 1.7:1 male-to-female ratio. The wound infection rate was 34% in the distilled water group and 44% in the isotonic saline group (p = 0.315). The mean time ± SD to wound healing was 2.7 ± 1.5 weeks in the distilled water group and 3.1 ± 1.8 weeks in the isotonic saline group (p = 0.389). Conclusions: It was concluded from this study that the use of distilled water compares favourably with isotonic saline as an irrigation solution in open fractures of the lower extremities. PMID:28134091

  2. Microscopic study of deformation systematics in some isotones in the A ≈ 100 mass region

    NASA Astrophysics Data System (ADS)

    Bharti, Arun; Sharma, Chetan; Singh, Suram; Khosa, S. K.

    2012-09-01

    Variation after projection (VAP) calculations in conjunction with the Hartree-Bogoliubov (HB) ansatz have been carried out for the N = 60, 62 isotones in the mass region A = 100. In this framework, the yrast spectra with Jπ ≥ 10+, B(E2) transition probabilities, the quadrupole deformation parameter, and occupation numbers for various shell-model orbits have been obtained. The results of the calculations indicate that the simultaneous increase in polarization of the p1/2, p3/2, and f5/2 proton sub-shells is a significant factor in the development of deformation in neutron-rich isotones in the mass region A = 100.

  3. Bivariate Mixed Effects Analysis of Clustered Data with Large Cluster Sizes.

    PubMed

    Zhang, Daowen; Sun, Jie Lena; Pieper, Karen

    2016-10-01

    Linear mixed effects models are widely used to analyze a clustered response variable. Motivated by a recent study to examine and compare the hospital length of stay (LOS) between patients undergoing percutaneous coronary intervention (PCI) and coronary artery bypass graft (CABG) from several international clinical trials, we proposed a bivariate linear mixed effects model for the joint modeling of clustered PCI and CABG LOS's, where each clinical trial is considered a cluster. Due to the large number of patients in some trials, commonly used commercial statistical software for fitting (bivariate) linear mixed models failed to run since it could not allocate enough memory to invert large dimensional matrices during the optimization process. We consider ways to circumvent the computational problem in the maximum likelihood (ML) inference and restricted maximum likelihood (REML) inference. In particular, we developed an expectation-maximization (EM) algorithm for the REML inference and presented an ML implementation using existing software. The new REML EM algorithm is easy to implement and computationally stable and efficient. With this REML EM algorithm, we were able to analyze the LOS data and obtain meaningful results.

  4. Coherent states associated with the wavefunctions and the spectrum of the isotonic oscillator

    NASA Astrophysics Data System (ADS)

    Thirulogasanthar, K.; Saad, Nasser

    2004-04-01

    Classes of coherent states are presented by replacing the labelling parameter z of Klauder-Perelomov type coherent states by confluent hypergeometric functions with specific parameters. Temporally stable coherent states for the isotonic oscillator Hamiltonian are presented and these states are identified as a particular case of the so-called Mittag-Leffler coherent states.

  5. A Comparison of Isotonic, Isokinetic, and Plyometric Training Methods for Vertical Jump Improvement.

    ERIC Educational Resources Information Center

    Miller, Christine D.

    This annotated bibliography documents three training methods used to develop vertical jumping ability and power: isotonic, isokinetics, and plyometric training. Research findings on all three forms of training are summarized and compared. A synthesis of conclusions drawn from the annotated writings is presented. The report includes a glossary of…

  6. Nebulized Isotonic Saline versus Water following a Laryngeal Desiccation Challenge in Classically Trained Sopranos

    ERIC Educational Resources Information Center

    Tanner, Kristine; Roy, Nelson; Merrill, Ray M.; Muntz, Faye; Houtz, Daniel R.; Sauder, Cara; Elstad, Mark; Wright-Costa, Julie

    2010-01-01

    Purpose: To examine the effects of nebulized isotonic saline (IS) versus sterile water (SW) on self-perceived phonatory effort (PPE) and phonation threshold pressure (PTP) following a surface laryngeal dehydration challenge in classically trained sopranos. Method: In a double-blind, within-subject crossover design, 34 sopranos breathed dry air…

  7. PCR diagnosis of Pneumocystis pneumonia: a bivariate meta-analysis.

    PubMed

    Lu, Yuan; Ling, Guoya; Qiang, Chenyi; Ming, Qinshou; Wu, Cong; Wang, Ke; Ying, Zouxiao

    2011-12-01

    We undertook a bivariate meta-analysis to assess the overall accuracy of respiratory specimen PCR assays for diagnosing Pneumocystis pneumonia. The summary sensitivity and specificity were 0.99 (95% confidence interval, 0.96 to 1.00) and 0.90 (0.87 to 0.93). Subgroup analyses showed that quantitative PCR analysis and the major surface glycoprotein gene target had the highest specificity value (0.93). Respiratory specimen PCR results are sufficient to confirm or exclude the disease for at-risk patients suspected of having Pneumocystis pneumonia.

  8. Computational approach to Thornley's problem by bivariate operational calculus

    NASA Astrophysics Data System (ADS)

    Bazhlekova, E.; Dimovski, I.

    2012-10-01

    Thornley's problem is an initial-boundary value problem with a nonlocal boundary condition for the linear one-dimensional reaction-diffusion equation, used as a mathematical model of spiral phyllotaxis in botany. Applying a bivariate operational calculus, we find an explicit representation of the solution, containing two convolution products of special solutions and the arbitrary initial and boundary functions. We use a non-classical convolution with respect to the space variable, extending in this way the classical Duhamel principle. The special solutions involved are represented in the form of fast convergent series. Numerical examples are considered to show the application of the present technique and to analyze the character of the solution.

  9. Bivariate correlation coefficients in family-type clustered studies.

    PubMed

    Luo, Jingqin; D'Angela, Gina; Gao, Feng; Ding, Jimin; Xiong, Chengjie

    2015-11-01

    We propose a unified approach based on a bivariate linear mixed effects model to estimate three types of bivariate correlation coefficients (BCCs), as well as the associated variances between two quantitative variables in cross-sectional data from a family-type clustered design. These BCCs are defined at different levels of experimental units including clusters (e.g., families) and subjects within clusters and assess different aspects on the relationships between two variables. We study likelihood-based inferences for these BCCs, and provide easy implementation using standard software SAS. Unlike several existing BCC estimators in the literature on clustered data, our approach can seamlessly handle two major analytic challenges arising from a family-type clustered design: (1) many families may consist of only one single subject; (2) one of the paired measurements may be missing for some subjects. Hence, our approach maximizes the use of data from all subjects (even those missing one of the two variables to be correlated) from all families, regardless of family size. We also conduct extensive simulations to show that our estimators are superior to existing estimators in handling missing data or/and imbalanced family sizes and the proposed Wald test maintains good size and power for hypothesis testing. Finally, we analyze a real-world Alzheimer's disease dataset from a family clustered study to investigate the BCCs across different modalities of disease markers including cognitive tests, cerebrospinal fluid biomarkers, and neuroimaging biomarkers.

  10. Effect of active pre-shortening on isometric and isotonic performance of single frog muscle fibres.

    PubMed Central

    Granzier, H L; Pollack, G H

    1989-01-01

    1. We studied the effects of shortening history on isometric force and isotonic velocity in single intact frog fibres. Fibres were isometrically tetanized. When force reached a plateau, shortening was imposed, after which the fibre was held isometric again. Isometric force after shortening could then be compared with controls in which no shortening had taken place. 2. Sarcomere length was measured simultaneously with two independent methods: a laser-diffraction method and a segment-length method that detects the distance between two markers attached to the surface of the fibre, about 800 microns apart. 3. The fibre was mounted between two servomotors. One was used to impose the load clamp while the other cancelled the translation that occurred during this load clamp. Thus, translation of the segment under investigation could be minimized. 4. Initial experiments were performed at the fibre level. We found that active preshortening reduced isometric force considerably, thereby confirming earlier work of others. Force reductions as large as 70% were observed. 5. Under conditions in which there were large effects of shortening at the fibre level, we measured sarcomere length changes in the central region of the fibre. These sarcomeres shortened much less than the fibre's average. In fact, when the load was high, these sarcomeres lengthened while the fibre as a whole shortened. Thus, while the fibre-length signal implied that sarcomeres might have shortened to some intermediate length, in reality some sarcomeres were much longer, others much shorter. 6. Experiments performed at the sarcomere level revealed that isometric force was unaffected by previous sarcomere shortening provided the shortening occurred against either a low load or over a short distance. However, if the work done during shortening was high, force after previous shortening was less than if sarcomeres had remained at the final length throughout contraction. The correlation between the force deficit and

  11. Nonlinear deformation of skeletal muscles in a passive state and in isotonic contraction

    NASA Astrophysics Data System (ADS)

    Shil'ko, S. V.; Chernous, D. A.; Pleskachevskii, Yu. M.

    2012-07-01

    A procedure for two-level modeling of the deformation of skeletal muscles is offered. Based on a phenomenological model of an individual muscle fiber, consisting of a viscous, a contractive, and two nonlinearly elastic elements (the first level), various means for describing a skeletal muscle as a whole (the second, macroscopic level) are considered. A method for identification of a muscle model by utilizing experimental elongation diagrams in a passive state and in isotonic contraction is put forward. The results of a biomechanical analysis are compared with known experimental data for the isotonic and isometric activation regimes of the tailor's muscle of a frog. It is established that the description of a muscle that takes into account the different lengths of muscle fibers and their twist is preferable.

  12. Quality and microbial safety evaluation of new isotonic beverages upon thermal treatments.

    PubMed

    Gironés-Vilaplana, Amadeo; Huertas, Juan-Pablo; Moreno, Diego A; Periago, Paula M; García-Viguera, Cristina

    2016-03-01

    In the present study, we evaluated how two different thermal treatments (Mild and Severe) may affect the anthocyanin content, antioxidant capacity (ABTS(+), DPPH, and FRAP), quality (CIELAB colour parameters), and microbiological safety of a new isotonic drink made of lemon and maqui berry over a simulated commercial storage with a shelf life of 56 days at two preservation temperatures (7°C and 37°C). Neither heat treatment drastically affected the anthocyanin content or its percentage of retention. The antioxidant capacity, probably because of the short time, was also not affected. The CIELAB colour parameters were affected by the heat, although the isotonic drinks retained an attractive red colour during shelf life. From a microbiological point of view, the Mild heat treatment with storage at 7°C is ideal for controlling microbial growth, being useful for keeping the quality and safety of the beverages over their commercial life.

  13. Internal dose assessment for 211At α-emitter in isotonic solution as radiopharmaceutical

    NASA Astrophysics Data System (ADS)

    Yuminov, O. A.; Fotina, O. V.; Priselkova, A. B.; Tultaev, A. V.; Platonov, S. Yu.; Eremenko, D. O.; Drozdov, V. A.

    2003-12-01

    The functional fitness of the α-emitter 211At for radiotherapy of the thyroid gland cancer is evaluated. Radiation doses are calculated using the MIRD method and previously obtained pharmacokinetic data for 211At in isotonic solution and for 123I as sodium iodide. Analysis of the 211At radiation dose to the thyroid gland suggests that this radiopharmaceutical may be predominantly used for the treatment of the thyroid cancer.

  14. Microscopic investigation of structural evolution in even-even N = 60 isotones

    SciTech Connect

    Oudih, M. R.; Fellah, M.; Allal, N. H.; Benhamouda, N.

    2012-10-20

    The ground state properties of even-even N = 60 isotones from the neutron-rich to the proton-rich side are investigated within the self-consistent Skyrme-Hartree-Fock-Bogoliubov theory in the triaxial landscape. Quantities such as binding energies and root-mean-square radii are investigated and compared with available experimental data. The evolution of the potential energy surfaces in the (β, γ) deformation plane is presented and discussed.

  15. The Effects of K(+) Channel Blockade on Eccentric and Isotonic Twitch and Fatiguing Contractions in situ.

    PubMed

    van Lunteren, Erik; Moyer, Michelle

    2012-01-01

    K(+) channel blockers like 3,4-diaminopyridine (DAP) can double isometric muscle force. Functional movements require more complex concentric and eccentric contractions; however, the effects of K(+) channel blockade on these types of contractions in situ are unknown. Extensor digitorum longus (EDL) muscles were stimulated in situ with and without DAP in anesthetized rats, and fatigability was addressed using a series of either concentric or eccentric contractions. During isotonic protocols (5-100% load), DAP significantly shifted shortening- and maximum shortening velocity-load curves upward and to the right and increased power and work. Maximum shortening, maximum shortening velocity, and power doubled, while work increased by ∼250% during isotonic contraction at 50% load. During isotonic fatigue, DAP significantly augmented maximum shortening, work, shortening velocity, and power. During constant-velocity eccentric protocols (2-12 mm/s), DAP increased muscle force during eccentric contractions at 6, 8, 10, and 12 mm/s. During eccentric contraction at a constant velocity of 6 mm/s while varying the stimulation frequency, DAP significantly increased muscle force at 20, 40, and 70 Hz. The effects of DAP on muscle contractile performance during eccentric fatigue varied with the level of fatigue. DAP-induced contractile increases during isotonic contractions were similar to those produced during previously studied isometric contractions, while the DAP effect during eccentric contractions was more modest. These findings are especially important in attempting to optimize functional electrical stimulation parameters for spinal cord injury patients while also preventing rapid fatigue of those muscles.

  16. Effect of isotonic and isokinetic exercise on muscle activity and balance of the ankle joint

    PubMed Central

    Kim, Mi-Kyoung; Yoo, Kyung-Tae

    2015-01-01

    [Purpose] This study was performed to examine how the balance of the lower limbs and the muscle activities of the tibialis anterior (TA), the medial gastrocnemius (GCM), and the peroneus longus (PL) are influenced by isotonic and isokinetic exercise of the ankle joint. [Subjects] The subjects of this study were healthy adults (n=20), and they were divided into two groups (isotonic=10, isokinetic=10). [Methods] The isotonic group performed 3 sets of 10 contractions at 50% of MVIC, and the isokinetic group performed 3 sets at 60°/sec. Muscle activity was measured by EMG and balance was measured by the one-leg standing test. [Results] For muscle activity, a main effect of group was found in the non-dominant TA, and the dominant TA, GCM and PL. For balance, a main effect of time was found in both groups for the sway area measured while support was provided by the non-dominant side. [Conclusion] In terms of muscle activity, the two groups showed a significant difference, and the isokinetic group showed higher muscle activities. In terms of balance, there was a significant difference between the pre-test and the post-test. The results of this study may help in the selection of exercises for physical therapy, because they show that muscle activity and balance vary according to the type of exercise. PMID:25729181

  17. Spectrum-based estimators of the bivariate Hurst exponent.

    PubMed

    Kristoufek, Ladislav

    2014-12-01

    We discuss two alternate spectrum-based estimators of the bivariate Hurst exponent in the power-law cross-correlations setting, the cross-periodogram and local X-Whittle estimators, as generalizations of their univariate counterparts. As the spectrum-based estimators are dependent on a part of the spectrum taken into consideration during estimation, a simulation study showing performance of the estimators under varying bandwidth parameter as well as correlation between processes and their specification is provided as well. These estimators are less biased than the already existent averaged periodogram estimator, which, however, has slightly lower variance. The spectrum-based estimators can serve as a good complement to the popular time domain estimators.
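
    As a rough illustration of the cross-periodogram idea described above: under power-law cross-correlations the cross-spectrum behaves like |f_xy(λ)| ~ λ^(1-2H_xy) near the origin, so a log-log regression of the cross-periodogram over the lowest Fourier frequencies yields an estimate of H_xy. The sketch below is a minimal, simplified version of that idea; the bandwidth rule m = sqrt(n) and the function name are my own choices, and the published estimators include refinements not shown here.

```python
import numpy as np

def cross_periodogram_hurst(x, y, m=None):
    """Minimal sketch of a cross-periodogram estimator of the bivariate
    Hurst exponent H_xy.  Assumes |f_xy(lambda)| ~ lambda^(1 - 2*H_xy)
    near the origin, so an OLS fit of log|I_xy| on log(lambda) over the
    m lowest Fourier frequencies gives H_xy = (1 - slope) / 2."""
    n = len(x)
    if m is None:
        m = int(n ** 0.5)                      # common rule-of-thumb bandwidth
    lam = 2.0 * np.pi * np.arange(1, m + 1) / n    # low Fourier frequencies
    wx = np.fft.fft(x - np.mean(x))[1:m + 1]
    wy = np.fft.fft(y - np.mean(y))[1:m + 1]
    ixy = np.abs(wx * np.conj(wy)) / (2.0 * np.pi * n)   # cross-periodogram
    slope = np.polyfit(np.log(lam), np.log(ixy), 1)[0]
    return (1.0 - slope) / 2.0
```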

  18. Spectrum-based estimators of the bivariate Hurst exponent

    NASA Astrophysics Data System (ADS)

    Kristoufek, Ladislav

    2014-12-01

    We discuss two alternate spectrum-based estimators of the bivariate Hurst exponent in the power-law cross-correlations setting, the cross-periodogram and local X-Whittle estimators, as generalizations of their univariate counterparts. As the spectrum-based estimators are dependent on a part of the spectrum taken into consideration during estimation, a simulation study showing performance of the estimators under varying bandwidth parameter as well as correlation between processes and their specification is provided as well. These estimators are less biased than the already existent averaged periodogram estimator, which, however, has slightly lower variance. The spectrum-based estimators can serve as a good complement to the popular time domain estimators.

  19. Predicting the Size of Sunspot Cycle 24 on the Basis of Single- and Bi-Variate Geomagnetic Precursor Methods

    NASA Technical Reports Server (NTRS)

    Wilson, Robert M.; Hathaway, David H.

    2009-01-01

    Examined are single- and bi-variate geomagnetic precursors for predicting the maximum amplitude (RM) of a sunspot cycle several years in advance. The best single-variate fit is one based on the average of the ap index 36 mo prior to cycle minimum occurrence (E(Rm)), having a coefficient of correlation (r) equal to 0.97 and a standard error of estimate (se) equal to 9.3. Presuming cycle 24 not to be a statistical outlier and its minimum in March 2008, the fit suggests cycle 24's RM to be about 69 +/- 20 (the 90% prediction interval). The weighted mean prediction of 11 statistically important single-variate fits is 116 +/- 34. The best bi-variate fit is one based on the maximum and minimum values of the 12-mma of the ap index; i.e., APM# and APm*, where # means the value post-E(RM) for the preceding cycle and * means the value in the vicinity of cycle minimum, having r = 0.98 and se = 8.2. It predicts cycle 24's RM to be about 92 +/- 27. The weighted mean prediction of 22 statistically important bi-variate fits is 112 +/- 32. Thus, cycle 24's RM is expected to lie somewhere within the range of about 82 to 144. Also examined are the late-cycle 23 behaviors of geomagnetic indices and solar wind velocity in comparison to the mean behaviors of cycles 20-23 and the geomagnetic indices of cycle 14 (RM = 64.2), the weakest sunspot cycle of the modern era.

  20. Slowing of velocity during isotonic shortening in single isolated smooth muscle cells. Evidence for an internal load

    PubMed Central

    1990-01-01

    In single smooth muscle cells, shortening velocity slows continuously during the course of an isotonic (fixed force) contraction (Warshaw, D.M. 1987. J. Gen. Physiol. 89:771-789). To distinguish among several possible explanations for this slowing, single smooth muscle cells were isolated from the gastric muscularis of the toad (Bufo marinus) and attached to an ultrasensitive force transducer and a length displacement device. Cells were stimulated electrically and produced maximum stress of 144 mN/mm2. Cell force was then reduced to and maintained at preset fractions of maximum, and cell shortening was allowed to occur. Cell stiffness, a measure of relative numbers of attached crossbridges, was measured during isotonic shortening by imposing 50-Hz sinusoidal force oscillations. Continuous slowing of shortening velocity was observed during isotonic shortening at all force levels. This slowing was not related to the time after the onset of stimulation or due to reduced isometric force generating capacity. Stiffness did not change significantly over the course of an isotonic shortening response, suggesting that the observed slowing was not the result of reduced numbers of cycling crossbridges. Furthermore, isotonic shortening velocity was better described as a function of the extent of shortening than as a function of the time after the onset of the release. Therefore, we propose that slowing during isotonic shortening in single isolated smooth muscle cells is the result of an internal load that opposes shortening and increases as cell length decreases. PMID:2121895

  1. Selected Bivariate Frequency Tables: U.S. Army Men and Women

    DTIC Science & Technology

    1981-01-01

    This report presents paired bivariate frequency tables for U.S. Army men and women, covering dimensions that are comparable between the sexes (e.g., bust/chest circumference versus biceps circumference) as well as dimensions that are not (e.g., ankle circumference versus waist circumference). The tables are intended to give a more graphically visible presentation of the possibilities and problems involved in devising a combined sizing program for both men and women.

  2. Sample size and power calculations for correlations between bivariate longitudinal data.

    PubMed

    Comulada, W Scott; Weiss, Robert E

    2010-11-30

    The analysis of a baseline predictor with a longitudinally measured outcome is well established and sample size calculations are reasonably well understood. Analysis of bivariate longitudinally measured outcomes is gaining in popularity and methods to address design issues are required. The focus in a random effects model for bivariate longitudinal outcomes is on the correlations that arise between the random effects and between the bivariate residuals. In the bivariate random effects model, we estimate the asymptotic variances of the correlations and we propose power calculations for testing and estimating the correlations. We compare asymptotic variance estimates to variance estimates obtained from simulation studies and compare our proposed power calculations for correlations on bivariate longitudinal data to power calculations for correlations on cross-sectional data.
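
    For orientation, the cross-sectional baseline that such longitudinal power calculations are usually compared against is the familiar Fisher-z approximation. The sketch below shows only that baseline; it is not the random-effects calculation proposed in the record, and the function name and defaults are illustrative.

```python
import numpy as np
from scipy.stats import norm

def power_correlation(rho1, n, alpha=0.05):
    """Approximate power of a two-sided test of H0: rho = 0 against a true
    correlation rho1, using the Fisher z transform (cross-sectional case).
    This is the textbook baseline, not the bivariate longitudinal method."""
    z_alpha = norm.ppf(1.0 - alpha / 2.0)
    delta = np.arctanh(rho1) * np.sqrt(n - 3)   # approximate noncentrality
    # probability that the standardized statistic falls in either rejection region
    return norm.cdf(delta - z_alpha) + norm.cdf(-delta - z_alpha)

# e.g. power_correlation(0.3, 100) is roughly 0.86
```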

  3. Joint association analysis of bivariate quantitative and qualitative traits.

    PubMed

    Yuan, Mengdie; Diao, Guoqing

    2011-11-29

    Univariate genome-wide association analysis of quantitative and qualitative traits has been investigated extensively in the literature. In the presence of correlated phenotypes, it is more intuitive to analyze all phenotypes simultaneously. We describe an efficient likelihood-based approach for the joint association analysis of quantitative and qualitative traits in unrelated individuals. We assume a probit model for the qualitative trait, under which an unobserved latent variable and a prespecified threshold determine the value of the qualitative trait. To jointly model the quantitative and qualitative traits, we assume that the quantitative trait and the latent variable follow a bivariate normal distribution. The latent variable is allowed to be correlated with the quantitative phenotype. Simultaneous modeling of the quantitative and qualitative traits allows us to make more precise inference on the pleiotropic genetic effects. We derive likelihood ratio tests for the testing of genetic effects. An application to the Genetic Analysis Workshop 17 data is provided. The new method yields reasonable power and meaningful results for the joint association analysis of the quantitative trait Q1 and the qualitative trait disease status at SNPs with not too small MAF.
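
    A schematic of the kind of joint likelihood described above may make the construction concrete. The notation below is mine, not the paper's: Y is the quantitative trait, L the latent variable behind the qualitative trait D, g a genotype covariate, and the latent variance is fixed at 1 as in a standard probit model.

```latex
% Schematic of the probit-plus-normal joint model (illustrative notation).
% Quantitative trait Y and latent variable L are bivariate normal; the
% qualitative trait is D = 1{L > tau}; genotype g enters both means.
\begin{align*}
\begin{pmatrix} Y \\ L \end{pmatrix}
  &\sim N\!\left(
     \begin{pmatrix} \mu_Y + \beta_Y g \\ \mu_L + \beta_L g \end{pmatrix},
     \begin{pmatrix} \sigma^{2} & \rho\sigma \\ \rho\sigma & 1 \end{pmatrix}
   \right), \qquad D = \mathbf{1}\{L > \tau\},\\[4pt]
P(D=1 \mid Y=y, g)
  &= \Phi\!\left(
     \frac{\mu_L + \beta_L g + \rho\,(y - \mu_Y - \beta_Y g)/\sigma - \tau}
          {\sqrt{1-\rho^{2}}}\right),\\[4pt]
\mathcal{L} &= \prod_{i=1}^{n}
   \frac{1}{\sigma}\,\phi\!\left(\frac{y_i - \mu_Y - \beta_Y g_i}{\sigma}\right)
   P(D=1 \mid y_i, g_i)^{d_i}\,
   \bigl[1 - P(D=1 \mid y_i, g_i)\bigr]^{1-d_i},
\end{align*}
% and the likelihood ratio tests of the genetic effects compare models with
% and without the beta terms.
```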

  4. Chromosome heteromorphism quantified by high-resolution bivariate flow karyotyping.

    PubMed Central

    Trask, B; van den Engh, G; Mayall, B; Gray, J W

    1989-01-01

    Maternal and paternal homologues of many chromosome types can be differentiated on the basis of their peak position in Hoechst 33258 versus chromomycin A3 bivariate flow karyotypes. We demonstrate here the magnitude of DNA content differences among normal chromosomes of the same type. Significant peak-position differences between homologues were observed for an average of four chromosome types in each of the karyotypes of 98 different individuals. The frequency of individuals with differences in homologue peak positions varied among chromosome types: e.g., chromosome 15, 61%; chromosome 3, 4%. Flow karyotypes of 33 unrelated individuals were compared to determine the range of peak position among normal chromosomes. Chromosomes Y, 21, 22, 15, 16, 13, 14, and 19 were most heteromorphic, and chromosomes 2-8 and X were least heteromorphic. The largest chromosome 21 was 45% larger than the smallest 21 chromosome observed. The base composition of the variable regions differed among chromosome types. DNA contents of chromosome variants determined from flow karyotypes were closely correlated to measurements of DNA content made of gallocyanin chrome alum-stained metaphase chromosomes on slides. Fluorescence in situ hybridization with chromosome-specific repetitive sequences indicated that variability in their copy number is partly responsible for peak-position variability in some chromosomes. Heteromorphic chromosomes are identified for which parental flow karyotype information will be essential if de novo rearrangements resulting in small DNA content changes are to be detected with flow karyotyping. PMID:2479266

  5. A Bivariate Test of Goodness of Fit Based on a Gradually Increasing Number of Order Statistics

    DTIC Science & Technology

    1975-03-01

    The vector Y(n) contains the order statistics of interest, selected at gradually increasing indices from the two samples; the remaining derivation in this record survives only as illegible OCR fragments.

  6. Landslide susceptibility mapping using a bivariate statistical model in a tropical hilly area of southeastern Brazil

    NASA Astrophysics Data System (ADS)

    Araújo, J. P. C.; DA Silva, L. M.; Dourado, F. A. D.; Fernandes, N.

    2015-12-01

    Landslides are the most damaging natural hazard in the mountainous region of Rio de Janeiro State in Brazil, responsible for thousands of deaths and important financial and environmental losses. However, this region currently has few landslide susceptibility maps produced at an adequate scale. Identification of landslide-susceptible areas is fundamental for successful land use planning and management practices that reduce risk. This paper applied Bayes' theorem based on weights of evidence (WoE) to 8 landslide-related factors in a geographic information system (GIS) for landslide susceptibility mapping. 378 landslide locations, triggered by the January 2011 rainfall event, were identified and mapped in a selected basin in the city of Nova Friburgo. The landslide scars were divided into two subsets: a training subset and a validation subset. A chi-square test was applied to the 8 WoE-weighted factors to indicate which variables are conditionally independent of each other and can therefore be used in the final map. Finally, the maps of weighted factors were summed to construct the landslide susceptibility map, which was validated against the validation landslide subset. According to the results, slope, aspect and contributing area showed the highest positive spatial correlation with landslides. In the landslide susceptibility map, 21% of the area presented very low and low susceptibility and contained 3% of the validation scars, 41% presented medium susceptibility with 22% of the validation scars, and 38% presented high and very high susceptibility with 75% of the validation scars. The very high susceptibility class accounts for 16% of the basin area and contains 54% of all scars. The approach used in this study can be considered very useful, since 75% of the area affected by landslides was included in the high and very high susceptibility classes.
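
    For readers unfamiliar with the weight-of-evidence step, the sketch below shows how the positive and negative weights for a single binary factor class could be computed from landslide and non-landslide cells; the function and variable names are illustrative, and the study's actual GIS implementation over 8 factors is not reproduced here.

```python
import numpy as np

def woe_weights(factor_present, landslide):
    """Weights of evidence for one binary factor class.
    factor_present, landslide: boolean arrays over the basin's cells.
    Returns (W_plus, W_minus, contrast)."""
    factor_present = np.asarray(factor_present, bool)
    landslide = np.asarray(landslide, bool)
    # conditional probabilities of the factor given landslide / no landslide
    p_b_given_l = factor_present[landslide].mean()
    p_b_given_nl = factor_present[~landslide].mean()
    w_plus = np.log(p_b_given_l / p_b_given_nl)
    w_minus = np.log((1 - p_b_given_l) / (1 - p_b_given_nl))
    return w_plus, w_minus, w_plus - w_minus

# Conceptually, the susceptibility map is the prior log-odds of landslide
# occurrence plus the sum over all factors of W+ (where the factor class is
# present) or W- (where it is absent).
```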

  7. Serotoninergic Modulation of Basal Cardiovascular Responses and Responses Induced by Isotonic Extracellular Volume Expansion in Rats

    PubMed Central

    Semionatto, Isadora Ferraz; Raminelli, Adrieli Oliveira; Alves, Angelica Cristina; Capitelli, Caroline Santos; Chriguer, Rosangela Soares

    2017-01-01

    Background Isotonic blood volume expansion (BVE) induced alterations of sympathetic and parasympathetic activity in the heart and blood vessels, which can be modulated by serotonergic pathways. Objective To evaluate the effect of saline or serotonergic agonist (DOI) administration in the hypothalamic paraventricular nucleus (PVN) on cardiovascular responses after BVE. Methods We recorded pulsatile blood pressure through the femoral artery to obtain the mean arterial pressure (MAP), systolic (SBP) and diastolic blood pressure (DBP), heart rate (HR) and the sympathetic-vagal ratio (LF/HF) of Wistar rats before and after they received bilateral microinjections of saline or DOI into the PVN, followed by BVE. Results No significant differences from the control group were observed in the values of the studied variables across the different treatments. However, when animals were treated with DOI followed by BVE, there was a significant increase relative to the BVE control group in all the studied variables: MAP (114.42±7.85 vs 101.34±9.17); SBP (147.23±14.31 vs 129.39±10.70); DBP (98.01±4.91 vs 87.31±8.61); HR (421.02±43.32 vs 356.35±41.99); and LF/HF ratio (2.32±0.80 vs 0.27±0.32). Discussion The present study showed that the induction of isotonic BVE did not promote alterations in MAP, HR and LF/HF ratio. On the other hand, the injection of DOI into the PVN of the hypothalamus followed by isotonic BVE resulted in a significant increase of all variables. Conclusion These results suggest that serotonin induced a neuromodulation at the PVN level, which promotes an inhibition of the baroreflex response to BVE. Therefore, the present study suggests the involvement of the serotonergic system in the modulation of the vagal reflex response at the PVN in normotensive rats. PMID:28099586

  8. Time course of isotonic shortening and the underlying contraction mechanism in airway smooth muscle.

    PubMed

    Syyong, Harley T; Raqeeb, Abdul; Paré, Peter D; Seow, Chun Y

    2011-09-01

    Although the structure of the contractile unit in smooth muscle is poorly understood, some of the mechanical properties of the muscle suggest that a sliding-filament mechanism, similar to that in striated muscle, is also operative in smooth muscle. To test the applicability of this mechanism to smooth muscle function, we have constructed a mathematical model based on a hypothetical structure of the smooth muscle contractile unit: a side-polar myosin filament sandwiched by actin filaments, each attached to the equivalent of a Z disk. Model prediction of isotonic shortening as a function of time was compared with data from experiments using ovine tracheal smooth muscle. After equilibration and establishment of in situ length, the muscle was stimulated with ACh (100 μM) until force reached a plateau. The muscle was then allowed to shorten isotonically against various loads. From the experimental records, length-force and force-velocity relationships were obtained. Integration of the hyperbolic force-velocity relationship and the linear length-force relationship yielded an exponential function that approximated the time course of isotonic shortening generated by the modeled sliding-filament mechanism. However, to obtain an accurate fit, it was necessary to incorporate a viscoelastic element in series with the sliding-filament mechanism. The results suggest that a large portion of the shortening is due to filament sliding associated with muscle activation and that a small portion is due to continued deformation associated with an element that shows viscoelastic or power-law creep after a step change in force.
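
    The claim that a hyperbolic force-velocity relation combined with a linear length-force relation integrates to an approximately exponential shortening time course can be seen in a few lines; the symbols below are illustrative and are not taken from the paper.

```latex
% Sketch: Hill's hyperbolic force-velocity relation with a linearly
% declining isometric capacity gives exponential isotonic shortening.
% (a, b, k are constants; F is the fixed isotonic load.)
\begin{align*}
v \;=\; \frac{dx}{dt} \;&=\; \frac{b\,\bigl(F_0(x) - F\bigr)}{F + a}
      \qquad\text{(Hill force--velocity)}\\[2pt]
F_0(x) \;&=\; F_0^{\,i} - k\,x
      \qquad\text{(linear length--force, $x$ = amount of shortening)}\\[2pt]
\Rightarrow\quad \frac{dx}{dt} \;&=\; A - B\,x,
      \quad A = \frac{b\,(F_0^{\,i} - F)}{F + a},
      \quad B = \frac{b\,k}{F + a},
      \qquad x(t) = \frac{A}{B}\bigl(1 - e^{-Bt}\bigr).
\end{align*}
% Shortening therefore approaches its final value exponentially; the series
% viscoelastic element mentioned in the abstract adds a further slow creep.
```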

  9. Projected quasiparticle calculations for the N =82 odd-proton isotones

    SciTech Connect

    Losano, L. ); Dias, H. )

    1991-12-01

    The structure of low-lying states in odd-mass N=82 isotones (135 ≤ A ≤ 145) is investigated in terms of a number-projected one- and three-quasiparticle Tamm-Dancoff approximation. A surface-delta interaction is taken as the residual nucleon-nucleon interaction. Excitation energies, dipole and quadrupole moments, and B(M1) and B(E2) values are calculated and compared with the experimental data.

  10. α-decay studies of the exotic N=125, 126, and 127 isotones

    SciTech Connect

    Xu Chang; Ren Zhongzhou

    2007-08-15

    The α-decay half-lives of the exotic N=125, 126, and 127 isotones (Po, Rn, Ra, Th, and U) are systematically studied by the density-dependent cluster model (DDCM). The influence of the neutron shell closure N=126 on the α-cluster formation and penetration probabilities is analyzed and discussed in detail. By combining the DDCM and a two-level microscopic model together, the experimental half-lives of α transitions to both the ground state and the excited state in the daughter nuclei are reproduced very well.

  11. Bonn potential and shell-model calculations for N=126 isotones

    SciTech Connect

    Coraggio, L.; Covello, A.; Gargano, A.; Itaco, N.; Kuo, T. T. S.

    1999-12-01

    We have performed shell-model calculations for the N=126 isotones 210Po, 211At, and 212Rn using a realistic effective interaction derived from the Bonn-A nucleon-nucleon potential by means of a G-matrix folded-diagram method. The calculated binding energies, energy spectra, and electromagnetic properties show remarkably good agreement with the experimental data. The results of this paper complement those of our previous study on neutron-hole Pb isotopes, confirming that realistic effective interactions are now able to reproduce with quantitative accuracy the spectroscopic properties of complex nuclei. (c) 1999 The American Physical Society.

  12. Asymptotics of bivariate generating functions with algebraic singularities

    NASA Astrophysics Data System (ADS)

    Greenwood, Torin

    Flajolet and Odlyzko (1990) derived asymptotic formulae for the coefficients of a class of univariate generating functions with algebraic singularities. Gao and Richmond (1992) and Hwang (1996, 1998) extended these results to classes of multivariate generating functions, in both cases by reducing to the univariate case. Pemantle and Wilson (2013) outlined new multivariate analytic techniques and used them to analyze the coefficients of rational generating functions. After overviewing these methods, we use them to find asymptotic formulae for the coefficients of a broad class of bivariate generating functions with algebraic singularities. Beginning with the Cauchy integral formula, we explicitly deform the contour of integration so that it hugs a set of critical points. The asymptotic contribution to the integral comes from analyzing the integrand near these points, leading to explicit asymptotic formulae. Next, we use this formula to analyze an example from current research. In the following chapter, we apply multivariate analytic techniques to quantum walks. Bressler and Pemantle (2007) found a (d + 1)-dimensional rational generating function whose coefficients described the amplitude of a particle at a position in the integer lattice after n steps. Here, the minimal critical points form a curve on the (d + 1)-dimensional unit torus. We find asymptotic formulae for the amplitude of a particle in a given position, normalized by the number of steps n, as n approaches infinity. Each critical point contributes to the asymptotics for a specific normalized position. Using Groebner bases in Maple again, we compute the explicit locations of peak amplitudes. In a scaling window of size on the order of the square root of n near the peaks, each amplitude is asymptotic to an Airy function.

  13. Search for the Skyrme-Hartree-Fock solutions for chiral rotation in N=75 isotones

    SciTech Connect

    Olbratowski, P.; Dobaczewski, J.; Dudek, J.

    2006-05-15

    A search for self-consistent solutions for the chiral rotational bands in the N=75 isotones 130Cs, 132La, 134Pr, and 136Pm is performed within the Skyrme-Hartree-Fock cranking approach using SKM* and SLy4 parametrizations. The dependence of the solutions on the time-odd contributions in the energy functional is studied. From among the four isotones considered, self-consistent chiral solutions are obtained only in 132La. The microscopic calculations are compared with the 132La experimental data and with results of a classical model that contains all the mechanisms underlying the chirality of the collective rotational motion. Strong similarities between the Hartree-Fock and classical model results are found. The suggestion formulated earlier by the authors that the chiral rotation cannot exist below a certain critical frequency is further illustrated and discussed, together with the microscopic origin of a transition from planar to chiral rotation in nuclei. We also formulate the separability rule by which the tilted-axis-cranking solutions can be inferred from three independent principal-axis-cranking solutions corresponding to three different axes of rotation.

  14. Modeling force-velocity relation in skeletal muscle isotonic contraction using an artificial neural network.

    PubMed

    Dariani, Sharareh; Keshavarz, Mansoor; Parviz, Mohsen; Raoufy, Mohammad Reza; Gharibzadeh, Shahriar

    2007-01-01

    The aim of this study is to design an artificial neural network (ANN) to model force-velocity relation in skeletal muscle isotonic contraction. We obtained the data set, including physiological and morphometric parameters, by myography and morphometric measurements on frog gastrocnemius muscle. Then, we designed a multilayer perceptron ANN, the inputs of which are muscle volume, muscle optimum length, tendon length, preload, and afterload. The output of the ANN is contraction velocity. The experimental data were divided randomly into two parts. The first part was used to train the ANN. In order to validate the model, the second part of experimental data, which was not used in training, was employed to the ANN and then, its output was compared with Hill model and the experimental data. The behavior of ANN in high forces was more similar to experimental data, but in low forces the Hill model had better results. Furthermore, extrapolation of ANN performance showed that our model is more or less able to simulate eccentric contraction. Our results indicate that ANNs represent a powerful tool to capture some essential features of muscle isotonic contraction.
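
    A minimal sketch of the kind of multilayer perceptron described above is given below, using scikit-learn as a stand-in; the hidden-layer size, activation, and placeholder arrays are assumptions for illustration only, not the authors' architecture or data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# X columns: muscle volume, muscle optimum length, tendon length,
# preload, afterload (placeholder data; the real values come from
# myography / morphometry on frog gastrocnemius muscle).
X = np.random.rand(200, 5)
v = np.random.rand(200)          # contraction velocity (placeholder target)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(10,), activation="tanh",
                 max_iter=5000, random_state=0),
)
model.fit(X, v)
predicted_velocity = model.predict(X[:5])
```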

  15. Insulin and glucose responses during bed rest with isotonic and isometric exercise

    NASA Technical Reports Server (NTRS)

    Dolkas, C. B.; Greenleaf, J. E.

    1977-01-01

    The effects of daily intensive isotonic (68% maximum oxygen uptake) and isometric (21% maximum extension force) leg exercise on plasma insulin and glucose responses to an oral glucose tolerance test (OGTT) during 14-day bed-rest (BR) periods were investigated in seven young healthy men. The OGTT was given during ambulatory control and on day 10 of the no-exercise, isotonic, and isometric exercise BR periods during the 15-wk study. The subjects were placed on a controlled diet starting 10 days before each BR period. During BR, basal plasma glucose concentration remained unchanged with no exercise, but increased (P < 0.05) to 87-89 mg/100 ml with both exercise regimens on day 2, and then fell slightly below control levels on day 13. The fall in glucose content during BR was independent of the exercise regimen and was an adjustment for the loss of plasma volume. The intensity of the responses of insulin and glucose to the OGTT was inversely proportional to the total daily energy expenditure during BR. It was estimated that at least 1020 kcal/day must be provided by supplemental exercise to restore the hyperinsulinemia to control levels.

  16. Estimation of flood design hydrographs using bivariate analysis (copula) and distributed hydrological modelling

    NASA Astrophysics Data System (ADS)

    Candela, A.; Brigandí, G.; Aronica, G. T.

    2014-01-01

    In this paper a procedure to derive Flood Design Hydrographs (FDH) is presented, using a bivariate representation of rainfall forcing (rainfall duration and intensity) via copulas, which describe and model the correlation between these two variables independently of the marginal laws involved, coupled with a distributed rainfall-runoff model. Rainfall-runoff modelling for estimating the hydrological response at the outlet of a watershed used a conceptual, fully distributed procedure based on the Soil Conservation Service curve number method as the excess rainfall model and a distributed unit hydrograph with climatic dependencies for the flow routing. Travel time computation, based on the definition of a distributed unit hydrograph, has been performed by implementing a procedure that uses flow paths determined from a digital elevation model (DEM) and roughness parameters obtained from distributed geographical information. In order to estimate the return period of the FDH, which gives the probability of occurrence of a hydrograph, the flood peaks and flow volumes obtained through rainfall-runoff modelling have been statistically treated via copulas. The shape of the hydrograph has been generated on the basis of modelled flood events, via cluster analysis. The procedure described above was applied to a case study of the Imera catchment in Sicily, Italy. The methodology allows a reliable estimation of the Design Flood Hydrograph and can be used for all flood risk applications, i.e., evaluation, management, mitigation, etc.
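
    To make the copula step concrete, the sketch below computes a joint ("AND") return period for a rainfall intensity-duration pair from a Gumbel-Hougaard copula; the copula family, the parameter value in the example, and the mean interarrival time mu are illustrative assumptions, not values from the study.

```python
import numpy as np

def gumbel_hougaard(u, v, theta):
    """Gumbel-Hougaard copula C(u, v), theta >= 1."""
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

def joint_return_period_and(u, v, theta, mu=1.0):
    """Return period of the event {X > x AND Y > y}, where u = F_X(x) and
    v = F_Y(y) are the fitted marginal non-exceedance probabilities and
    mu is the mean interarrival time of rainfall events (in years)."""
    return mu / (1.0 - u - v + gumbel_hougaard(u, v, theta))

# e.g. with theta = 2 and marginal non-exceedance probabilities of 0.99 each,
# joint_return_period_and(0.99, 0.99, 2.0) is roughly 170 (years, if mu = 1).
```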

  17. A bivariate mann-whitney approach for unraveling genetic variants and interactions contributing to comorbidity.

    PubMed

    Wen, Yalu; Schaid, Daniel J; Lu, Qing

    2013-04-01

    Although comorbidity among complex diseases (e.g., drug dependence syndromes) is well documented, genetic variants contributing to the comorbidity are still largely unknown. The discovery of genetic variants and their interactions contributing to comorbidity will likely shed light on underlying pathophysiological and etiological processes, and promote effective treatments for comorbid conditions. For this reason, studies to discover genetic variants that foster the development of comorbidity represent high-priority research projects, as manifested in the behavioral genetics studies now underway. The yield from these studies can be enhanced by adopting novel statistical approaches, with the capacity of considering multiple genetic variants and possible interactions. For this purpose, we propose a bivariate Mann-Whitney (BMW) approach to unravel genetic variants and interactions contributing to comorbidity, as well as those unique to each comorbid condition. Through simulations, we found BMW outperformed two commonly adopted approaches in a variety of underlying disease and comorbidity models. We further applied BMW to datasets from the Study of Addiction: Genetics and Environment, investigating the contribution of 184 known nicotine dependence (ND) and alcohol dependence (AD) single nucleotide polymorphisms (SNPs) to the comorbidity of ND and AD. The analysis revealed a candidate SNP from CHRNA5, rs16969968, associated with both ND and AD, and replicated the findings in an independent dataset with a P-value of 1.06 × 10^(-3).

  18. The Bivariate Plotting Procedure for Hearing Assessment of Adults Who Are Severely to Profoundly Mentally Retarded.

    ERIC Educational Resources Information Center

    Cattey, Tommy J.

    1985-01-01

    Puretone auditory assessment of 21 adults with severe to profound mental retardation indicated that a bivariate plotting procedure of predicting hearing sensitivity from the acoustic reflexes should be included in an audiological test battery for this population. (CL)

  19. Comparison of contractile responses of single human motor units in the toe extensors during unloaded and loaded isotonic and isometric conditions.

    PubMed

    Leitch, Michael; Macefield, Vaughan G

    2015-08-01

    Much of the repertoire of muscle function performed in everyday life involves isotonic dynamic movements, either with or without an additional load, yet most studies of single motor units measure isometric forces. To assess the effects of muscle load on the contractile response, we measured the contractile properties of single motor units supplying the toe extensors, assessed by intraneural microstimulation of single human motor axons, in isotonic, loaded isotonic, and isometric conditions. Tungsten microelectrodes were inserted into the common peroneal nerve, and single motor axons (n = 10) supplying the long toe extensors were electrically stimulated through the microelectrode. Displacement was measured from the distal phalanx of the toe with either an angular displacement transducer for the unloaded (i.e., no additional load) and loaded (addition of a 4-g mass) isotonic conditions or a force transducer for the isometric conditions. Mean twitch profiles were measured at 1 Hz for all conditions: rise time, fall time, and duration were shortest for the unloaded isotonic conditions and longest for the isometric conditions. Peak displacements were lower in the loaded than unloaded isotonic conditions, and the half-maximal response in the loaded condition was achieved at lower frequencies than in the unloaded isotonic condition. We have shown that the contractile responses of single motor units supplying the human toe extensors are influenced by how they are measured: twitches are much slower when measured in loaded than unloaded isotonic conditions and slowest when measured in isometric conditions.

  20. Anthropometry of Women of the U.S. Army--1977. Report Number 3. Bivariate Frequency Tables

    DTIC Science & Technology

    1977-07-01

    The methodology of this survey has been described in the first report of this series. An index of the bivariate tables is included, along with a schematic diagram of a bivariate table. Frequencies or numbers of women may be expressed as percentages of the sample (as in Tables 4 through 100) or may be given as actual numbers (as in

  1. Causal networks clarify productivity-richness interrelations, bivariate plots do not

    USGS Publications Warehouse

    Grace, James B.; Adler, Peter B.; Harpole, W. Stanley; Borer, Elizabeth T.; Seabloom, Eric W.

    2014-01-01

    We urge ecologists to consider productivity–richness relationships through the lens of causal networks to advance our understanding beyond bivariate analysis. Further, we emphasize that models based on a causal network conceptualization can also provide more meaningful guidance for conservation management than can a bivariate perspective. Measuring only two variables does not permit the evaluation of complex ideas nor resolve debates about underlying mechanisms.

  2. SEMG analysis of astronaut upper arm during isotonic muscle actions with normal standing posture

    NASA Astrophysics Data System (ADS)

    Qianxiang, Zhou; Chao, Ma; Xiaohui, Zheng

    Introduction: Research on isotonic muscle actions using surface electromyography (sEMG) is becoming a popular topic in the fields of astronaut life-support training and rehabilitation. Researchers have paid particular attention to sEMG signal processing for reducing the influence of noise introduced during monitoring, and to the estimation of fatigue during isotonic muscle actions at different force levels using parameters obtained from sEMG signals, such as conduction velocity (CV), median frequency (MDF), mean frequency (MNF) and so on. As this research has deepened, more and more studies of muscle fatigue during isotonic muscle actions have combined sEMG analysis with the subjective Borg scale of perceived exertion. In this paper, the relationship between a fatigue variable based on sEMG and the Borg scale during isotonic muscle actions of the upper arm at different contraction levels is investigated. Methods: 13 young male subjects (23.4±2.45 years, 64.7±5.43 kg, 171.7±5.41 cm) with normal standing posture performed isotonic actions of the upper arm at different force levels (10% MVC, 30% MVC and 50% MVC), where MVC denotes the maximal voluntary contraction obtained at the start of the experiment. The sEMG was recorded during the experiments, and the Borg scale was recorded for each contraction level. Using the one-third octave band method, a fatigue variable p based on sEMG was defined as p = Σ_i g(f_i) · F(f_i), where g(f_i) is a frequency weighting factor equal to 0.42 + 0.5 cos(π f_i / f_0) + 0.08 cos(2π f_i / f_0) for 0 < f_i ≤ f_0 and 0 for f_i > f_0. From this definition p can be computed and its relationship with the Borg scale investigated. Results: In the research, three kinds of fitted curves between
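
    Purely as an illustration of the (reconstructed) fatigue variable above, the sketch below sums one-third-octave band powers of an sEMG epoch weighted by the low-frequency-emphasising factor g; the cutoff f0, the band range, and the function name are placeholders and have not been checked against the original paper.

```python
import numpy as np

def fatigue_index(emg, fs, f0=500.0):
    """Sketch of the spectral fatigue variable p = sum_i g(f_i) * F(f_i):
    F(f_i) is the power in one-third-octave bands and g emphasises low
    frequencies.  f0 is the cutoff frequency (placeholder value)."""
    freqs = np.fft.rfftfreq(len(emg), 1.0 / fs)
    psd = np.abs(np.fft.rfft(emg - np.mean(emg))) ** 2
    # one-third-octave band centres covering an assumed sEMG range of ~20-500 Hz
    centres = 20.0 * 2.0 ** (np.arange(15) / 3.0)
    p = 0.0
    for fc in centres:
        lo, hi = fc / 2 ** (1 / 6), fc * 2 ** (1 / 6)   # band edges
        band_power = psd[(freqs >= lo) & (freqs < hi)].sum()
        if fc <= f0:
            g = 0.42 + 0.5 * np.cos(np.pi * fc / f0) + 0.08 * np.cos(2 * np.pi * fc / f0)
        else:
            g = 0.0
        p += g * band_power
    return p
```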

  3. Isotonic and hypertonic saline droplet deposition in a human upper airway model.

    PubMed

    Zhang, Zhe; Kleinstreuer, Clement; Kim, Chong S

    2006-01-01

    The evaporative and hygroscopic effects and deposition of isotonic and hypertonic saline droplets have been simulated from the mouth to the first four generations of the tracheobronchial tree under laminar-transitional-turbulent inspiratory flow conditions. Specifically, the local water vapor transport, droplet evaporation rate, and deposition fractions are analyzed. The effects of inhalation flow rates, thermodynamic air properties and NaCl-droplet concentrations of interest are discussed as well. The validated computer simulation results indicate that the increase of NaCl-solute concentration, increase of inlet relative humidity, or decrease of inlet air temperature may reduce water evaporation and increase water condensation at saline droplet surfaces, resulting in higher droplet depositions due to the increasing particle diameter and density. However, solute concentrations below 10% may not have a very pronounced effect on droplet deposition in the human upper airways.

  4. A quantum exactly solvable nonlinear oscillator related to the isotonic oscillator

    NASA Astrophysics Data System (ADS)

    Cariñena, J. F.; Perelomov, A. M.; Rañada, M. F.; Santander, M.

    2008-02-01

    A nonpolynomial one-dimensional quantum potential representing an oscillator, which can be considered as placed in the middle between the harmonic oscillator and the isotonic oscillator (harmonic oscillator with a centripetal barrier), is studied. First the general case, which depends on a parameter a, is considered and then a particular case is studied in great detail. It is proven that it is Schrödinger solvable, and then the wavefunctions Ψn and the energies En of the bound states are explicitly obtained. Finally, it is proven that the solutions determine a family of orthogonal polynomials P_n(x) related to the Hermite polynomials and such that: (i) every P_n is a linear combination of three Hermite polynomials and (ii) they are orthogonal with respect to a new measure obtained by modifying the classic Hermite measure.
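
    For reference only (a textbook fact, not taken from this record), the isotonic oscillator mentioned above is the harmonic oscillator with an added centripetal barrier; like the harmonic oscillator it has an equally spaced spectrum, which is what makes intermediate solvable potentials such as the one studied here interesting.

```latex
% The "isotonic oscillator" is the harmonic oscillator with a centripetal
% barrier (standard textbook form, shown here for reference):
\begin{align*}
V(x) &= \tfrac{1}{2}\, m\omega^{2} x^{2}
        + \frac{\hbar^{2}\, \ell(\ell+1)}{2 m x^{2}}, \qquad x > 0,\\
E_{n} &= \hbar\omega\left(2n + \ell + \tfrac{3}{2}\right), \qquad n = 0, 1, 2, \dots
\end{align*}
% The levels are equally spaced with spacing 2*hbar*omega, just as for the
% ordinary harmonic oscillator.
```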

  5. Local suppression of collectivity in the N=80 isotones at the Z=58 subshell closure

    NASA Astrophysics Data System (ADS)

    Bauer, C.; Rainovski, G.; Pietralla, N.; Bianco, D.; Blazhev, A.; Bloch, T.; Bönig, S.; Damyanova, A.; Danchev, M.; Gladnishki, K. A.; Kröll, T.; Leske, J.; Lo Iudice, N.; Möller, T.; Moschner, K.; Pakarinen, J.; Reiter, P.; Scheck, M.; Seidlitz, M.; Siebeck, B.; Stahl, C.; Stegmann, R.; Stora, T.; Stoyanov, Ch.; Tarpanov, D.; Vermeulen, M. J.; Voulot, D.; Warr, N.; Wenander, F.; Werner, V.; De Witte, H.

    2013-08-01

    Background: Recent data on N=80 isotones have suggested that the proton π(1g7/2) subshell closure at Z=58 has an impact on the properties of low-lying collective states. Purpose: Knowledge of the B(E2; 21+ → 01+) value of 140Nd is needed in order to test this conjecture. Method: The unstable, neutron-rich nucleus 140Nd was investigated via projectile Coulomb excitation at the REX-ISOLDE facility with the MINIBALL spectrometer. Results: The B(E2) value of 33(2) W.u. expands the N=80 systematics beyond the Z=58 subshell closure. Conclusions: The measurement demonstrates that the reduced collectivity of 138Ce is a local effect, possibly due to the Z=58 subshell closure, and calls for refined theoretical calculations; the latter predict a smoothly increasing trend.

  6. Interplay between pairing and tensor effects in the N = 82 even-even isotone chain

    NASA Astrophysics Data System (ADS)

    Anguiano, M.; Bernard, R. N.; Lallena, A. M.; Co', G.; De Donno, V.

    2016-11-01

    The combined effects of the pairing and tensor terms of the nuclear interaction are investigated by analyzing the ground state properties of the nuclei belonging to the isotonic chain N = 82. The pairing effects have been taken into account by considering both Hartree-Fock-Bogoliubov and Hartree-Fock plus Bardeen-Cooper-Schrieffer approaches using the same finite-range nuclear interaction, specifically a force of Gogny type. Our results reproduce very well the available experimental data of binding energies and charge radii. The study of the particle number fluctuation indicates that the presence of the tensor terms in the interaction reduces the pairing effects and produces new shell closures in some isotopes. The experimental behavior of the energy difference between neutron single particle states up to A = 140 is properly described only if the tensor force is considered.

  7. Evolution of collectivity in the N =100 isotones near 170Yb

    NASA Astrophysics Data System (ADS)

    Karayonchev, V.; Régis, J.-M.; Jolie, J.; Blazhev, A.; Altenkirch, R.; Ansari, S.; Dannhoff, M.; Diel, F.; Esmaylzadeh, A.; Fransen, C.; Gerst, R.-B.; Moschner, K.; Müller-Gatermann, C.; Saed-Samii, N.; Stegemann, S.; Warr, N.; Zell, K. O.

    2017-03-01

    An experiment using the electronic γ-γ fast-timing technique was performed to measure lifetimes of the yrast states in 170Yb. The lifetime of the yrast 2+ state was determined using the slope method. The value of τ = 2.33(3) ns is in good agreement with the lifetimes measured using other techniques. The lifetimes of the first 4+ and 6+ states are determined using the generalized centroid difference method. The derived B(E2) values are compared to calculations done using the confined beta soft model and show good agreement with the experimental values. These calculations were extended to the isotonic chain N = 100 around 170Yb and show a good quantitative description of the collectivity observed along it.

  8. Synchronous behavior of spontaneous oscillations of sarcomeres in skeletal myofibrils under isotonic conditions.

    PubMed

    Yasuda, K; Shindo, Y; Ishiwata, S

    1996-04-01

    An isotonic control system for studying dynamic properties of single myofibrils was developed to evaluate the change of sarcomere lengths in glycerinated skeletal myofibrils under conditions of spontaneous oscillatory contraction (SPOC) in the presence of inorganic phosphate and a high ADP-to-ATP ratio. Sarcomere length oscillated spontaneously with a peak-to-peak amplitude of about 0.5 microns under isotonic conditions in which the external loads were maintained constant at values between 1.5 x 10^4 and 3.5 x 10^4 N/m^2. The shortening and yielding of sarcomeres occurred in concert, in contrast to the previously reported conditions (isometric or auxotonic) under which the myofibrillar tension is allowed to oscillate. This synchronous SPOC appears to be at a higher level of synchrony than in the organized state of SPOC previously observed under auxotonic conditions. The period of sarcomere length oscillation did not largely depend on external load. The active tension under SPOC conditions increased as the sarcomere length increased from 2.1 to 3.2 microns, although it was still smaller than the tension under normal Ca2+ contraction (which is on the order of 10^5 N/m^2). The synchronous SPOC implies that there is a mechanism for transmitting information between sarcomeres such that the state of activation of sarcomeres is affected by the state of adjacent sarcomeres. We conclude that the change of myofibrillar tension is not responsible for the SPOC of each sarcomere but that it affects the level of synchrony of sarcomere oscillations.

  9. Synchronous behavior of spontaneous oscillations of sarcomeres in skeletal myofibrils under isotonic conditions.

    PubMed Central

    Yasuda, K; Shindo, Y; Ishiwata, S

    1996-01-01

    An isotonic control system for studying dynamic properties of single myofibrils was developed to evaluate the change of sarcomere lengths in glycerinated skeletal myofibrils under conditions of spontaneous oscillatory contraction (SPOC) in the presence of inorganic phosphate and a high ADP-to-ATP ratio. Sarcomere length oscillated spontaneously with a peak-to-peak amplitude of about 0.5 microns under isotonic conditions in which the external loads were maintained constant at values between 1.5 x 10^4 and 3.5 x 10^4 N/m^2. The shortening and yielding of sarcomeres occurred in concert, in contrast to the previously reported conditions (isometric or auxotonic) under which the myofibrillar tension is allowed to oscillate. This synchronous SPOC appears to be at a higher level of synchrony than in the organized state of SPOC previously observed under auxotonic conditions. The period of sarcomere length oscillation did not largely depend on external load. The active tension under SPOC conditions increased as the sarcomere length increased from 2.1 to 3.2 microns, although it was still smaller than the tension under normal Ca2+ contraction (which is on the order of 10^5 N/m^2). The synchronous SPOC implies that there is a mechanism for transmitting information between sarcomeres such that the state of activation of sarcomeres is affected by the state of adjacent sarcomeres. We conclude that the change of myofibrillar tension is not responsible for the SPOC of each sarcomere but that it affects the level of synchrony of sarcomere oscillations. PMID:8785342

  10. Nucleon-pair states of even-even N =82 isotones

    NASA Astrophysics Data System (ADS)

    Cheng, Y. Y.; Zhao, Y. M.; Arima, A.

    2016-08-01

    In this paper we study low-lying states of five N =82 isotones, 134Te, 136Xe, 138Ba, 140Ce and 142Nd, within the framework of the nucleon-pair approximation (NPA). For the low-lying yrast states of 136Xe and 138Ba, we calculate the overlaps between the wave functions obtained in the full shell-model (SM) space and those obtained in the truncated NPA space, and find that most of these overlaps are very close to 1. Very interestingly and surprisingly, for most of these yrast states, the SM wave functions are found to be well approximated by one-dimensional, optimized pair basis states, which indicates a simple picture of "nucleon-pair states". The positive-parity yrast states with spin J >6 in these nuclei, as well as the 82+ state, are found to be well described by breaking one or two S pair(s) of the 61+ or 62+ state (low-lying, seniority-two, spin-maximum, and positive-parity); similarly, negative-parity yrast states with spin J >9 are well represented by breaking one or two S pair(s) of the 91- state (low-lying, seniority-two, spin-maximum, and negative-parity). It is shown that the low-lying negative-parity yrast states of 136Xe and 138Ba are reasonably described to be one-octupole-phonon excited states. The evolution of the 61+ and 62+ states for the five isotones are also systematically investigated.

  11. Interpreting Bivariate Regression Coefficients: Going beyond the Average

    ERIC Educational Resources Information Center

    Halcoussis, Dennis; Phillips, G. Michael

    2010-01-01

    Statistics, econometrics, investment analysis, and data analysis classes often review the calculation of several types of averages, including the arithmetic mean, geometric mean, harmonic mean, and various weighted averages. This note shows how each of these can be computed using a basic regression framework. By recognizing when a regression model…
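
    One simple way to exhibit the idea (my own minimal example, not necessarily the note's exact framework): each of the classical means is the intercept of an intercept-only regression after a suitable transformation of the data.

```python
import numpy as np

y = np.array([2.0, 4.0, 8.0])              # illustrative data

def intercept_only_ols(z):
    """Fitted constant of an intercept-only OLS regression of z."""
    X = np.ones((len(z), 1))
    beta, *_ = np.linalg.lstsq(X, z, rcond=None)
    return beta[0]

arithmetic = intercept_only_ols(y)                    # equals y.mean()
geometric = np.exp(intercept_only_ols(np.log(y)))     # regress log(y) on a constant
harmonic = 1.0 / intercept_only_ols(1.0 / y)          # regress 1/y on a constant

# arithmetic ~ 4.667, geometric = 4.0, harmonic ~ 3.43 for the data above
```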

  12. A Comparative Study on Extreme Precipitation of the Han River Basin using a Bivariate Goodness-of-fit Measure for Regional Frequency Analysis

    NASA Astrophysics Data System (ADS)

    Ahn, Hyunjun; Jung, Younghun; Joo, Kyungwon; Kim, Taereem; Heo, Jun-Haeng

    2016-04-01

    In statistical hydrology, frequency analysis has been widely used for the design of water resource systems. The traditional at-site analysis is recommended when the sample size is larger than twice the target return period (2T). However, in reality, the sample size at the subject site is usually smaller than target return periods such as 100 and 200 years. To overcome this weakness, regional frequency analysis has been suggested and performed since 1960. To estimate robust precipitation quantiles in regional frequency analysis, it is important to select an appropriate probability distribution for a given region. Typically, the goodness-of-fit measure developed by Hosking and Wallis, based on the L-moment ratio diagram, is used to select an appropriate probability distribution. Recently, several studies have been carried out on goodness-of-fit tests for regional frequency analysis, such as a bivariate goodness-of-fit measure, to choose a more appropriate probability distribution. In this study, regional frequency analysis is conducted for 1-hour maximum rainfall data (1961-2015) of the Han River basin in Korea. In this application, appropriate probability distributions are selected using the traditional goodness-of-fit measure and a bivariate goodness-of-fit measure, and the resulting extreme precipitation quantiles are compared to suggest the better method. Keywords: regional frequency analysis; goodness-of-fit measure; bivariate goodness-of-fit measure; extreme precipitation events

  13. A Comparative Study on DDF Curve with Bivariate and Univariate Model

    NASA Astrophysics Data System (ADS)

    Joo, K.; Choi, S.; Heo, J.

    2012-12-01

    A DDF (or IDF) curve relates rainfall depth (or intensity), duration and frequency, and it is useful for seeing how rainfall varies under different conditions. Moreover, multivariate frequency analysis has recently been applied in hydrology because of its scalability. In this study, to obtain DDF curves, rainfall quantiles are estimated by both univariate and bivariate (rainfall depth and duration) frequency analysis. For the bivariate model, three copula families, Frank, Gumbel-Hougaard, and Joe, are used. Copula models have been studied widely in various fields and are more flexible with respect to the marginal distributions than other conventional bivariate models. Hourly recorded data (1961-2010) from the Seoul weather station of the Korea Meteorological Administration (KMA) are used for the frequency analysis, and an inter-event time definition is used to identify rainfall events. The copula parameters are estimated by the maximum pseudo-likelihood method, a semi-parametric approach. The Gumbel distribution is examined and used for rainfall depth, and the generalized extreme value (GEV) distribution is examined and used for duration. As a result, four DDF curves are obtained (one univariate and three copula models). Compared with the univariate model, the rainfall quantiles of the bivariate models are largely unaffected by duration. In detail, the Frank model shows the closest trend along the duration, whereas the Joe model shows little change along the duration. The change of the rainfall quantile with duration in the bivariate models is less significant than in the univariate model as the non-exceedance probability varies.
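
    For reference, the three one-parameter copula families named above have the following standard forms; this is a minimal sketch only (parameter estimation, e.g. by maximum pseudo-likelihood as in the study, is not shown).

```python
import numpy as np

def frank(u, v, theta):
    """Frank copula, theta != 0."""
    num = (np.exp(-theta * u) - 1.0) * (np.exp(-theta * v) - 1.0)
    return -np.log(1.0 + num / (np.exp(-theta) - 1.0)) / theta

def gumbel_hougaard(u, v, theta):
    """Gumbel-Hougaard copula, theta >= 1."""
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

def joe(u, v, theta):
    """Joe copula, theta >= 1."""
    a, b = (1.0 - u) ** theta, (1.0 - v) ** theta
    return 1.0 - (a + b - a * b) ** (1.0 / theta)
```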

  14. Bivariable analysis of ventricular late potentials in high resolution ECG records

    NASA Astrophysics Data System (ADS)

    Orosco, L.; Laciar, E.

    2007-11-01

    In this study, a bivariable analysis for the detection of ventricular late potentials in high-resolution electrocardiographic records is proposed. The standard time-domain analysis and the application of a time-frequency technique to high-resolution ECG records are briefly described, together with their corresponding results. In the proposed technique, the time-domain parameter QRSD and the most significant time-frequency index, ENQRS, are used as variables, and a bivariable index combining these two parameters is defined. The proposed technique allows the risk of ventricular tachycardia in post-myocardial-infarction patients to be evaluated. The results show that the bivariable index discriminates between the population of patients with ventricular tachycardia and the subjects of the control group. It was also found that the bivariable technique performs well as a diagnostic test. It is concluded that, as a diagnostic test, the bivariable technique is superior to the time-domain method and to the time-frequency technique evaluated individually.

  15. A rank test for bivariate time-to-event outcomes when one event is a surrogate.

    PubMed

    Shaw, Pamela A; Fay, Michael P

    2016-08-30

    In many clinical settings, improving patient survival is of interest but a practical surrogate, such as time to disease progression, is instead used as a clinical trial's primary endpoint. A time-to-first endpoint (e.g., death or disease progression) is commonly analyzed but may not be adequate to summarize patient outcomes if a subsequent event contains important additional information. We consider a surrogate outcome very generally as one correlated with the true endpoint of interest. Settings of interest include those where the surrogate indicates a beneficial outcome so that the usual time-to-first endpoint of death or surrogate event is nonsensical. We present a new two-sample test for bivariate, interval-censored time-to-event data, where one endpoint is a surrogate for the second, less frequently observed endpoint of true interest. This test examines whether patient groups have equal clinical severity. If the true endpoint rarely occurs, the proposed test acts like a weighted logrank test on the surrogate; if it occurs for most individuals, then our test acts like a weighted logrank test on the true endpoint. If the surrogate is a useful statistical surrogate, our test can have better power than tests based on the surrogate that naively handles the true endpoint. In settings where the surrogate is not valid (treatment affects the surrogate but not the true endpoint), our test incorporates the information regarding the lack of treatment effect from the observed true endpoints and hence is expected to have a dampened treatment effect compared with tests based on the surrogate alone. Published 2016. This article is a U.S. Government work and is in the public domain in the USA.

  16. Investigating NARCCAP Precipitation Extremes via Bivariate Extreme Value Theory (Invited)

    NASA Astrophysics Data System (ADS)

    Weller, G. B.; Cooley, D. S.; Sain, S. R.; Bukovsky, M. S.; Mearns, L. O.

    2013-12-01

    We introduce methodology from statistical extreme value theory to examine the ability of reanalysis-driven regional climate models to simulate past daily precipitation extremes. Going beyond a comparison of summary statistics such as 20-year return values, we study whether the most extreme precipitation events produced by climate model simulations exhibit correspondence to the most extreme events seen in observational records. The extent of this correspondence is formulated via the statistical concept of tail dependence. We examine several case studies of extreme precipitation events simulated by the six models of the North American Regional Climate Change Assessment Program (NARCCAP) driven by NCEP reanalysis. It is found that the NARCCAP models generally reproduce daily winter precipitation extremes along the Pacific coast quite well; in contrast, simulation of past daily summer precipitation extremes in a central US region is poor. Some differences in the strength of extremal correspondence are seen in the central region between models which employ spectral nudging and those which do not. We demonstrate how these techniques may be used to draw a link between extreme precipitation events and large-scale atmospheric drivers, as well as to downscale extreme precipitation simulated by a future run of a regional climate model. Specifically, we examine potential future changes in the nature of extreme precipitation along the Pacific coast produced by the pineapple express (PE) phenomenon. A link between extreme precipitation events and a "PE Index" derived from North Pacific sea-surface pressure fields is found. This link is used to study PE-influenced extreme precipitation produced by a future-scenario climate model run.
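
    The tail-dependence idea used above to formalise the correspondence of the most extreme events can be illustrated with a simple empirical estimator of chi(u) = P(V > u | U > u) on rank-transformed pairs; the function below is a generic sketch with illustrative names, not the diagnostics actually used in the study.

```python
import numpy as np

def chi_u(x, y, u=0.95):
    """Empirical tail-dependence measure chi(u) = P(V > u | U > u), where
    U and V are the rank-transformed (pseudo-uniform) versions of x and y,
    e.g. observed vs. model-simulated daily precipitation at matched times."""
    n = len(x)
    uu = (np.argsort(np.argsort(x)) + 1) / (n + 1.0)   # ranks -> pseudo-uniforms
    vv = (np.argsort(np.argsort(y)) + 1) / (n + 1.0)
    both_exceed = np.mean((uu > u) & (vv > u))
    return both_exceed / np.mean(uu > u)

# chi(u) tending to 0 as u -> 1 suggests asymptotic independence of extremes;
# values bounded away from 0 suggest tail dependence.
```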

  17. Modeling animal-vehicle collisions using diagonal inflated bivariate Poisson regression.

    PubMed

    Lao, Yunteng; Wu, Yao-Jan; Corey, Jonathan; Wang, Yinhai

    2011-01-01

    Two types of animal-vehicle collision (AVC) data are commonly adopted for AVC-related risk analysis research: reported AVC data and carcass removal data. One issue with these two data sets is that they were found to have significant discrepancies by previous studies. In order to model these two types of data together and provide a better understanding of highway AVCs, this study adopts a diagonal inflated bivariate Poisson regression method, an inflated version of bivariate Poisson regression model, to fit the reported AVC and carcass removal data sets collected in Washington State during 2002-2006. The diagonal inflated bivariate Poisson model not only can model paired data with correlation, but also handle under- or over-dispersed data sets as well. Compared with three other types of models, double Poisson, bivariate Poisson, and zero-inflated double Poisson, the diagonal inflated bivariate Poisson model demonstrates its capability of fitting two data sets with remarkable overlapping portions resulting from the same stochastic process. Therefore, the diagonal inflated bivariate Poisson model provides researchers a new approach to investigating AVCs from a different perspective involving the three distribution parameters (λ1, λ2 and λ3). The modeling results show the impacts of traffic elements, geometric design and geographic characteristics on the occurrences of both reported AVC and carcass removal data. It is found that the increase of some associated factors, such as speed limit, annual average daily traffic, and shoulder width, will increase the numbers of reported AVCs and carcass removals. Conversely, the presence of some geometric factors, such as rolling and mountainous terrain, will decrease the number of reported AVCs.
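
    For intuition about the three parameters mentioned at the end of the abstract, the bivariate Poisson model can be built by trivariate reduction: X1 = Y1 + Y3 and X2 = Y2 + Y3 with independent Poisson components, so λ3 is the shared component that induces the correlation (diagonal inflation then adds extra probability mass to the X1 = X2 cells). The simulation sketch below uses illustrative parameter values, not estimates from the study.

```python
import numpy as np

def simulate_bivariate_poisson(lam1, lam2, lam3, size, rng=None):
    """Trivariate-reduction construction of the bivariate Poisson:
    X1 = Y1 + Y3, X2 = Y2 + Y3 with independent Poisson Y_i, so that
    Cov(X1, X2) = lam3 (the shared component, e.g. the overlap between
    reported AVCs and carcass removals)."""
    rng = np.random.default_rng(rng)
    y1 = rng.poisson(lam1, size)
    y2 = rng.poisson(lam2, size)
    y3 = rng.poisson(lam3, size)
    return y1 + y3, y2 + y3

x1, x2 = simulate_bivariate_poisson(2.0, 1.5, 0.8, 10_000)
# np.cov(x1, x2)[0, 1] should be close to 0.8
```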

  18. New K isomers in the neutron-rich N =100 isotones 162Sm, 163Eu, and 164Gd

    NASA Astrophysics Data System (ADS)

    Yokoyama, R.; Go, S.; Kameda, D.; Kubo, T.; Inabe, N.; Fukuda, N.; Takeda, H.; Suzuki, H.; Yoshida, K.; Kusaka, K.; Tanaka, K.; Yanagisawa, Y.; Ohtake, M.; Sato, H.; Shimizu, Y.; Baba, H.; Kurokawa, M.; Nishimura, D.; Ohnishi, T.; Iwasa, N.; Chiba, A.; Yamada, T.; Ideguchi, E.; Fujii, T.; Nishibata, H.; Ieki, K.; Murai, D.; Momota, S.; Sato, Y.; Hwang, J. W.; Kim, S.; Tarasov, O. B.; Morrissey, D. J.; Sherrill, B. M.; Simpson, G.; Praharaj, C. R.

    2017-03-01

    Very neutron-rich Z ≈ 60 isotopes produced by in-flight fission of a 345 MeV/nucleon 238U beam at the RI Beam Factory, RIKEN Nishina Center, have been studied by delayed γ-ray spectroscopy. New isomers were discovered in the neutron-rich N = 100 isotones 162Sm, 163Eu, and 164Gd. Half-lives, γ-ray energies, and relative intensities of these isomers were obtained. Level schemes were proposed for these nuclei and the first 2+ and 4+ states were assigned for the even-even nuclei. The first 2+ and 4+ state energies decrease as the proton number gets smaller. The energies and the half-lives of the new isomers are very similar to those of the 4− isomers known in the less neutron-rich N = 100 isotones 168Er and 170Yb. A deformed Hartree-Fock model with angular momentum projection suggests Kπ = 4− two-quasiparticle states with ν7/2[633] ⊗ ν1/2[521] configurations at similar excitation energies. The results suggest that neutron-rich N = 100 nuclei are well deformed and that the deformation becomes larger as Z decreases to 62. The onset of K isomers with the same configuration at almost the same energy in the N = 100 isotones indicates that the neutron single-particle structures of neutron-rich isotones down to Z = 62 do not change significantly from those of the Z = 70 stable nuclei. The systematics of the excitation energies of the new isomers can be explained without the predicted N = 100 shell gap.

  19. A Review of Classification Techniques of EMG Signals during Isotonic and Isometric Contractions

    PubMed Central

    Nazmi, Nurhazimah; Abdul Rahman, Mohd Azizi; Yamamoto, Shin-Ichiroh; Ahmad, Siti Anom; Zamzuri, Hairi; Mazlan, Saiful Amri

    2016-01-01

    In recent years, there has been major interest in physical therapy during rehabilitation. Several publications have demonstrated its usefulness in clinical/medical and human machine interface (HMI) applications. An automated system will guide the user to perform the training during rehabilitation independently. Advances in engineering have extended electromyography (EMG) beyond the traditional diagnostic applications to also include applications in diverse areas such as movement analysis. This paper gives an overview of the numerous methods available to recognize motion patterns of EMG signals for both isotonic and isometric contractions. Various signal analysis methods are compared by illustrating their applicability in real-time settings. This paper will be of interest to researchers who would like to select the most appropriate methodology for classifying motion patterns, especially during different types of contractions. For feature extraction, the probability density function (PDF) of EMG signals is the main interest of this study. Following that, the different methods for pre-processing, feature extraction and classification of EMG signals are briefly explained and compared in terms of their performance. The crux of this paper is to review the most recent developments and research studies related to the issues mentioned above. PMID:27548165
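
    Time-domain quantities such as mean absolute value, root-mean-square amplitude, and zero crossings are among the features routinely extracted from windowed EMG before classification. The sketch below computes them on a synthetic signal; the window length, threshold, and signal are illustrative assumptions and not the specific feature set compared in the review.

```python
import numpy as np

def emg_features(window, zc_threshold=0.01):
    """Common time-domain EMG features for a single analysis window."""
    mav = np.mean(np.abs(window))        # mean absolute value
    rms = np.sqrt(np.mean(window ** 2))  # root-mean-square amplitude
    signs = np.sign(window)
    # Count sign changes whose amplitude step exceeds a small noise threshold.
    zc = np.sum((signs[:-1] * signs[1:] < 0) &
                (np.abs(np.diff(window)) > zc_threshold))
    return mav, rms, zc

rng = np.random.default_rng(2)
emg = rng.normal(0, 0.2, 2000) * (1 + np.sin(np.linspace(0, 3 * np.pi, 2000)))
print(emg_features(emg[:200]))
```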

  20. Role of atrial natriuretic peptide in systemic responses to acute isotonic volume expansion

    NASA Technical Reports Server (NTRS)

    Watenpaugh, Donald E.; Yancy, Clyde W.; Buckey, Jay C.; Lane, Lynda D.; Hargens, Alan R.; Blomqvist, C. G.

    1992-01-01

    A hypothesis is proposed that a temporal relationship exists between increases in cardiac filling pressure and plasma atrial natriuretic peptide (ANP) concentration, and also between ANP elevation and vasodilation, fluid movement from plasma to interstitium, and increased urine volume (UV). To test the hypothesis, 30 ml/kg isotonic saline was infused in supine male subjects over 24 min and responses were monitored for 3 h postinfusion. Results show that at end infusion, right atrial pressure (RAP), heart rate and plasma volume exhibited peak increases of 146, 23, and 27 percent, respectively. Mean plasma ANP and UV peaked (45 and 390 percent, respectively) at 30 min postinfusion. Most cardiovascular variables had returned toward control levels by 1 h postinfusion, and net reabsorption of extravascular fluid ensued. It is concluded that since ANP was not significantly increased until 30 min postinfusion, factors other than ANP initiate responses to intravascular fluid loading. These factors include increased vascular pressures, baroreceptor-mediated vasodilation, and hemodilution of plasma proteins. ANP is suggested to mediate, in part, the renal response to saline infusion.

  1. A Model of Peritubular Capillary Control of Isotonic Fluid Reabsorption by the Renal Proximal Tubule

    PubMed Central

    Deen, W. M.; Robertson, C. R.; Brenner, B. M.

    1973-01-01

    A mathematical model of peritubular transcapillary fluid exchange has been developed to investigate the role of the peritubular environment in the regulation of net isotonic fluid transport across the mammalian renal proximal tubule. The model, derived from conservation of mass and the Starling transcapillary driving forces, has been used to examine the quantitative effects on proximal reabsorption of changes in efferent arteriolar protein concentration and plasma flow rate. Under normal physiological conditions, relatively small perturbations in protein concentration are predicted to influence reabsorption more than even large variations in plasma flow, a prediction in close accord with recent experimental observations in the rat and dog. Changes either in protein concentration or plasma flow have their most pronounced effects when the opposing transcapillary hydrostatic and osmotic pressure differences are closest to equilibrium. Comparison of these theoretical results with variations in reabsorption observed in micropuncture studies makes it possible to place upper and lower bounds on the difference between interstitial oncotic and hydrostatic pressures in the renal cortex of the rat. PMID:4696761
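
    The model described above is built on the Starling balance of transcapillary driving forces. A minimal sketch of that balance is given below; the filtration coefficient and pressure values are illustrative assumptions, not the paper's fitted parameters.

```python
def peritubular_uptake(Kf, Pc, Pi, pi_c, pi_i, sigma=1.0):
    """Starling relation: uptake is proportional to the imbalance between the
    oncotic pressure difference favouring reabsorption and the hydrostatic
    pressure difference opposing it (pressures in mmHg)."""
    net_pressure = sigma * (pi_c - pi_i) - (Pc - Pi)  # > 0 favours uptake
    return Kf * net_pressure

# Illustrative values: efferent capillary oncotic pressure rises with filtration
# fraction, which is why uptake is sensitive to protein concentration.
print(peritubular_uptake(Kf=0.05, Pc=20.0, Pi=6.0, pi_c=30.0, pi_i=8.0))
```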

  2. Cytotoxicity and colloidal behavior of polystyrene latex nanoparticles toward filamentous fungi in isotonic solutions.

    PubMed

    Nomura, Toshiyuki; Tani, Shuji; Yamamoto, Makoto; Nakagawa, Takumi; Toyoda, Shunsuke; Fujisawa, Eri; Yasui, Akiko; Konishi, Yasuhiro

    2016-04-01

    The effects of the surface physicochemical properties of functionalized polystyrene latex (PSL) nanoparticles (NPs), and of the model filamentous fungi Aspergillus oryzae and Aspergillus nidulans cultivated in different environments (aqueous and atmospheric), on colloidal behavior and cytotoxicity were investigated in two isotonic solutions (154 mM NaCl and 292 mM sucrose). When the liquid-cultivated fungal cells were exposed to positively charged PSL NPs in 154 mM NaCl solution, the NPs were taken up by A. oryzae but not by A. nidulans. Atomic force microscopy revealed that NP uptake occurred more readily through the cell wall of A. oryzae because its cell wall is relatively softer than that of A. nidulans. In contrast, in 292 mM sucrose solution the positively charged PSL NPs entirely covered the liquid-cultivated fungal cell surfaces and induced cell death, because the electrostatic attractive force between the cells and NPs is stronger than in 154 mM NaCl. When the agar-cultivated fungal cells were exposed to the positively charged PSL NPs, neither fungal species took up the NPs. Contact angle measurement revealed that the hydrophobin on the agar-cultivated cell surfaces inhibited NP uptake, because the cell surface is relatively more hydrophobic than that of the liquid-cultivated cells.

  3. ELECTRON MICROSCOPICAL STUDY OF SKELETAL MUSCLE DURING ISOTONIC (AFTERLOAD) AND ISOMETRIC CONTRACTION

    PubMed Central

    Knappeis, G. G.; Carlsen, F.

    1956-01-01

    Bundles of the curarized semitendinosus muscle of the frog were fixed during isotonic (afterload) and isometric contraction and the length of the A and I bands investigated by electron microscopy. The sarcomere length, during afterload contraction initiated at 25 per cent stretch, varied depending on the afterload applied between 3.0 and 1.2 µ, i.e. the shortening amounted to 5 to 50 per cent. The shortening involved both the A and I bands. Between a sarcomere length of 3.0 to 1.7 µ (shortening 5 to 35 per cent) the A bands remained practically constant at about 1.5 µ (6 to 8 per cent shortening); the length of the I bands decreased from 1.4 to 0.3 µ (80 per cent shortening). Below a sarcomere length of 1.7 to 1.2 µ the A bands shortened from 1.5 to 1.0 µ (from 6 to 8 to 25 per cent). At sarcomere lengths 1.6 to 1.2 µ the I band was replaced by a contraction band. During isometric contraction the A bands shortened by about 8 to 10 per cent; the I bands were correspondingly elongated. PMID:13319381

  4. Tendon organ sensitivity to steady-state isotonic contraction of in-series motor units in feline peroneus tertius muscle.

    PubMed Central

    Petit, J; Scott, J J; Reynolds, K J

    1997-01-01

    1. Measurements have been made of the sensitivity of tendon organs to steady-state, isotonic contractions of single and groups of in-series motor units in the peroneus tertius muscle of the cat hindlimb. 2. Linear relationships were found between the Ib afferent discharge and the contractile tension generated by tetanic stimulation of single motor units. These relationships held for the fast, fatiguable (FF) units and for all but the lowest tensions generated by the slow (S) and some fast, fatigue resistant (FR) units. The sensitivity of the organs was independent of the contractile properties of the units. 3. Groups of three motor units were stimulated isotonically at low rates (around 30 Hz), but asynchronously to produce a smooth tension profile. Again, linear relationships pertained between the discharge rate and the tension, and the sensitivity was the same for different motor unit types. 4. Under isotonic conditions, therefore, the tendon organs showed linear responses to the tension with similar sensitivities, indicating that tendon organs may have the capacity to signal faithfully steady-state contractile tensions. PMID:9097946

  5. Calculation of muscle maximal shortening velocity by extrapolation of the force-velocity relationship: afterloaded versus isotonic release contractions.

    PubMed

    Bullimore, Sharon R; Saunders, Travis J; Herzog, Walter; MacIntosh, Brian R

    2010-10-01

    The maximal shortening velocity of a muscle (V(max)) provides a link between its macroscopic properties and the underlying biochemical reactions and is altered in some diseases. Two methods that are widely used for determining V(max) are afterloaded and isotonic release contractions. To determine whether these two methods give equivalent results, we calculated V(max) in 9 intact single fibres from the lumbrical muscles of the frog Xenopus laevis (9.5-15.5 °C, stimulation frequency 35-70 Hz). The data were modelled using a 3-state cross-bridge model in which the states were inactive, detached, and attached. Afterloaded contractions gave lower predictions of Vmax than did isotonic release contractions in all 9 fibres (3.20 ± 0.84 versus 4.11 ± 1.08 lengths per second, respectively; means ± SD, p = 0.001) and underestimated unloaded shortening velocity measured with the slack test by an average of 29% (p = 0.001, n = 6). Excellent model predictions could be obtained by assuming that activation is inhibited by shortening. We conclude that under the experimental conditions used in this study, afterloaded and isotonic release contractions do not give equivalent results. When a change in the V(max) measured with afterloaded contractions is observed in diseased muscle, it is important to consider that this may reflect differences in either activation kinetics or cross-bridge cycling rates.
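
    Extrapolating maximal shortening velocity from force-velocity data is often done with Hill's hyperbolic relation (P + a)(v + b) = (P0 + a)b. The sketch below fits that relation to illustrative data and extrapolates to zero load; the paper's own 3-state cross-bridge model is not reproduced, and the data points and starting values are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill_velocity(P, P0, a, b):
    """Hill force-velocity relation solved for shortening velocity v(P)."""
    return b * (P0 - P) / (P + a)

# Illustrative force (relative to P0 = 1) and velocity (lengths/s) measurements.
force = np.array([0.05, 0.1, 0.2, 0.3, 0.5, 0.7, 0.85])
velocity = np.array([3.6, 3.0, 2.2, 1.7, 1.0, 0.55, 0.25])

popt, _ = curve_fit(hill_velocity, force, velocity,
                    p0=[1.0, 0.25, 1.0], bounds=(1e-6, np.inf))
P0, a, b = popt
vmax = hill_velocity(0.0, P0, a, b)  # extrapolated unloaded shortening velocity
print(f"Estimated V(max) = {vmax:.2f} lengths/s")
```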

  6. Analysis of Blood Transfusion Data Using Bivariate Zero-Inflated Poisson Model: A Bayesian Approach

    PubMed Central

    Mohammadi, Tayeb; Sedehi, Morteza

    2016-01-01

    Recognizing the factors affecting the number of blood donations and blood deferrals has a major impact on blood transfusion. There is a positive correlation between the variables “number of blood donation” and “number of blood deferral”: as the number of returns for donation increases, so does the number of blood deferrals. On the other hand, because many donors never return to donate, there is an excess of zeros for both of the above-mentioned variables. In this study, to account for the correlation and to explain the excess zero frequency, a bivariate zero-inflated Poisson regression model was used for joint modeling of the number of blood donations and the number of blood deferrals. The data were analyzed using a Bayesian approach with noninformative priors, both in the presence and in the absence of covariates. The model parameters, that is, the correlation, the zero-inflation parameter, and the regression coefficients, were estimated through MCMC simulation. Finally, the double Poisson, bivariate Poisson, and bivariate zero-inflated Poisson models were fitted to the data and compared using the deviance information criterion (DIC). The results showed that the bivariate zero-inflated Poisson regression model fitted the data better than the other models. PMID:27703493

  7. Contributions to the Underlying Bivariate Normal Method for Factor Analyzing Ordinal Data

    ERIC Educational Resources Information Center

    Xi, Nuo; Browne, Michael W.

    2014-01-01

    A promising "underlying bivariate normal" approach was proposed by Jöreskog and Moustaki for use in the factor analysis of ordinal data. This was a limited information approach that involved the maximization of a composite likelihood function. Its advantage over full-information maximum likelihood was that very much less computation was…

  8. Analysis of Blood Transfusion Data Using Bivariate Zero-Inflated Poisson Model: A Bayesian Approach.

    PubMed

    Mohammadi, Tayeb; Kheiri, Soleiman; Sedehi, Morteza

    2016-01-01

    Recognizing the factors affecting the number of blood donations and blood deferrals has a major impact on blood transfusion. There is a positive correlation between the variables "number of blood donation" and "number of blood deferral": as the number of returns for donation increases, so does the number of blood deferrals. On the other hand, because many donors never return to donate, there is an excess of zeros for both of the above-mentioned variables. In this study, to account for the correlation and to explain the excess zero frequency, a bivariate zero-inflated Poisson regression model was used for joint modeling of the number of blood donations and the number of blood deferrals. The data were analyzed using a Bayesian approach with noninformative priors, both in the presence and in the absence of covariates. The model parameters, that is, the correlation, the zero-inflation parameter, and the regression coefficients, were estimated through MCMC simulation. Finally, the double Poisson, bivariate Poisson, and bivariate zero-inflated Poisson models were fitted to the data and compared using the deviance information criterion (DIC). The results showed that the bivariate zero-inflated Poisson regression model fitted the data better than the other models.
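
    A bivariate zero-inflated Poisson of the kind described above mixes a point mass at (0, 0) with a bivariate Poisson built by trivariate reduction. The simulation sketch below illustrates the data-generating mechanism; the inflation probability and rates are assumptions, and the Bayesian MCMC fitting used in the paper is not reproduced.

```python
import numpy as np

def simulate_bzip(p_zero, lam1, lam2, lam3, size, rng):
    """With probability p_zero the pair is a structural (0, 0); otherwise it is
    drawn from a bivariate Poisson whose covariance is lam3."""
    y1 = rng.poisson(lam1, size)
    y2 = rng.poisson(lam2, size)
    y3 = rng.poisson(lam3, size)
    structural_zero = rng.random(size) < p_zero
    x1 = np.where(structural_zero, 0, y1 + y3)
    x2 = np.where(structural_zero, 0, y2 + y3)
    return x1, x2

rng = np.random.default_rng(3)
donations, deferrals = simulate_bzip(0.4, 1.2, 0.6, 0.3, 20_000, rng)
print((donations == 0).mean(), (deferrals == 0).mean(),
      np.corrcoef(donations, deferrals)[0, 1])
```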

  9. BIVARIATE MODELLING OF CLUSTERED CONTINUOUS AND ORDERED CATEGORICAL OUTCOMES. (R824757)

    EPA Science Inventory

    Simultaneous observation of continuous and ordered categorical outcomes for each subject is common in biomedical research but multivariate analysis of the data is complicated by the multiple data types. Here we construct a model for the joint distribution of bivariate continuous ...

  10. Representing Topography with Second-Degree Bivariate Polynomial Functions Fitted by Least Squares.

    ERIC Educational Resources Information Center

    Neuman, Arthur Edward

    1987-01-01

    There is a need for abstracting topography other than for mapping purposes. The method employed should be simple and available to non-specialists, thereby ruling out spline representations. Generalizing from univariate first-degree least squares and from multiple regression, this article introduces bivariate polynomial functions fitted by least…
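
    Fitting a second-degree bivariate polynomial surface z ≈ a + bx + cy + dx² + exy + fy² by ordinary least squares only requires building the corresponding design matrix. The sketch below does this for synthetic elevations; the coefficients and sample points are illustrative assumptions.

```python
import numpy as np

def fit_quadratic_surface(x, y, z):
    """Least-squares fit of z = a + b*x + c*y + d*x**2 + e*x*y + f*y**2."""
    A = np.column_stack([np.ones_like(x), x, y, x ** 2, x * y, y ** 2])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coeffs

rng = np.random.default_rng(4)
x = rng.uniform(0, 10, 200)
y = rng.uniform(0, 10, 200)
z = (5 + 0.4 * x - 0.2 * y + 0.03 * x ** 2 - 0.05 * x * y + 0.02 * y ** 2
     + rng.normal(0, 0.1, 200))  # synthetic topography with noise
print(fit_quadratic_surface(x, y, z).round(3))
```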

  11. Meta-analysis of studies with bivariate binary outcomes: a marginal beta-binomial model approach.

    PubMed

    Chen, Yong; Hong, Chuan; Ning, Yang; Su, Xiao

    2016-01-15

    When conducting a meta-analysis of studies with bivariate binary outcomes, challenges arise when the within-study correlation and between-study heterogeneity should be taken into account. In this paper, we propose a marginal beta-binomial model for the meta-analysis of studies with binary outcomes. This model is based on the composite likelihood approach and has several attractive features compared with existing models such as the bivariate generalized linear mixed model (Chu and Cole, 2006) and the Sarmanov beta-binomial model (Chen et al., 2012). The advantages of the proposed marginal model include modeling the probabilities in the original scale, not requiring any transformation of probabilities or any link function, having a closed-form expression of the likelihood function, and imposing no constraints on the correlation parameter. More importantly, because the marginal beta-binomial model is only based on the marginal distributions, it does not suffer from potential misspecification of the joint distribution of bivariate study-specific probabilities. Such misspecification is difficult to detect and can lead to biased inference using current methods. We compare the performance of the marginal beta-binomial model with the bivariate generalized linear mixed model and the Sarmanov beta-binomial model by simulation studies. Interestingly, the results show that the marginal beta-binomial model performs better than the Sarmanov beta-binomial model, whether or not the true model is Sarmanov beta-binomial, and the marginal beta-binomial model is more robust than the bivariate generalized linear mixed model under model misspecifications. Two meta-analyses of diagnostic accuracy studies and a meta-analysis of case-control studies are conducted for illustration.
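
    One margin of such a model is an ordinary beta-binomial likelihood for the within-study event counts. The sketch below evaluates that marginal log-likelihood with scipy.stats.betabinom (available in recent SciPy releases); the counts and the mean/dispersion parameterisation are illustrative assumptions, and the composite-likelihood machinery linking the two margins is not reproduced.

```python
import numpy as np
from scipy.stats import betabinom

# Illustrative per-study counts for one margin (e.g. true positives out of diseased).
events = np.array([18, 42, 7, 55, 23])
totals = np.array([20, 50, 10, 60, 30])

def neg_loglik(mean_p, dispersion):
    """Negative beta-binomial log-likelihood, parameterised by a marginal mean
    probability and a dispersion factor (a = mean/dispersion, b = (1 - mean)/dispersion)."""
    a = mean_p / dispersion
    b = (1.0 - mean_p) / dispersion
    return -np.sum(betabinom.logpmf(events, totals, a, b))

print(neg_loglik(0.85, 0.05))
```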

  12. CI2 for creating and comparing confidence-intervals for time-series bivariate plots.

    PubMed

    Mullineaux, David R

    2017-02-01

    Currently no method exists for calculating and comparing the confidence-intervals (CI) for the time-series of a bivariate plot. The study's aim was to develop 'CI2' as a method to calculate the CI on time-series bivariate plots, and to identify if the CI between two bivariate time-series overlap. The test data were the knee and ankle angles from 10 healthy participants running on a motorised standard-treadmill and non-motorised curved-treadmill. For a recommended 10+ trials, CI2 involved calculating 95% confidence-ellipses at each time-point, then taking as the CI the points on the ellipses that were perpendicular to the direction vector between the means of two adjacent time-points. Consecutive pairs of CI created convex quadrilaterals, and any overlap of these quadrilaterals at the same time or ±1 frame as a time-lag calculated using cross-correlations, indicated where the two time-series differed. CI2 showed no group differences between left and right legs on both treadmills, but the same legs between treadmills for all participants showed differences of less knee extension on the curved-treadmill before heel-strike. To improve and standardise the use of CI2 it is recommended to remove outlier time-series, use 95% confidence-ellipses, and scale the ellipse by the fixed Chi-square value as opposed to the sample-size dependent F-value. For practical use, and to aid in standardisation or future development of CI2, Matlab code is provided. CI2 provides an effective method to quantify the CI of bivariate plots, and to explore the differences in CI between two bivariate time-series.
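
    The building block of CI2 is a confidence ellipse at each time point, scaled by a fixed chi-square quantile rather than the sample-size-dependent F value. The sketch below computes such an ellipse for the mean of a small bivariate sample; the data and the choice of an ellipse for the mean are illustrative assumptions, and the perpendicular-CI and quadrilateral-overlap steps of the method are not reproduced.

```python
import numpy as np
from scipy.stats import chi2

def confidence_ellipse(points, level=0.95):
    """Centre, semi-axis lengths and axis directions of a chi-square-scaled
    confidence ellipse for the mean of a bivariate sample."""
    points = np.asarray(points)
    centre = points.mean(axis=0)
    cov = np.cov(points, rowvar=False) / len(points)  # covariance of the mean
    scale = chi2.ppf(level, df=2)                     # fixed chi-square scaling
    eigval, eigvec = np.linalg.eigh(cov)
    return centre, np.sqrt(scale * eigval), eigvec

rng = np.random.default_rng(5)
# Illustrative knee/ankle angle pairs (degrees) from 12 trials at one time point.
knee_ankle = rng.multivariate_normal([40.0, 10.0], [[4.0, 1.5], [1.5, 2.0]], size=12)
print(confidence_ellipse(knee_ankle))
```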

  13. Exploratory Causal Analysis in Bivariate Time Series Data

    NASA Astrophysics Data System (ADS)

    McCracken, James M.

    Many scientific disciplines rely on observational data of systems for which it is difficult (or impossible) to implement controlled experiments, and data analysis techniques are required for identifying causal information and relationships directly from observational data. This need has led to the development of many different time series causality approaches and tools including transfer entropy, convergent cross-mapping (CCM), and Granger causality statistics. In this thesis, the existing time series causality method of CCM is extended by introducing a new method called pairwise asymmetric inference (PAI). It is found that CCM may provide counter-intuitive causal inferences for simple dynamics with strong intuitive notions of causality, and the CCM causal inference can be a function of physical parameters that are seemingly unrelated to the existence of a driving relationship in the system. For example, a CCM causal inference might alternate between "voltage drives current" and "current drives voltage" as the frequency of the voltage signal is changed in a series circuit with a single resistor and inductor. PAI is introduced to address both of these limitations. Many of the current approaches in the time series causality literature are not computationally straightforward to apply, do not follow directly from assumptions of probabilistic causality, depend on assumed models for the time series generating process, or rely on embedding procedures. A new approach, called causal leaning, is introduced in this work to avoid these issues. The leaning is found to provide causal inferences that agree with intuition for both simple systems and more complicated empirical examples, including space weather data sets. The leaning may provide a clearer interpretation of the results than those from existing time series causality tools. A practicing analyst can explore the literature to find many proposals for identifying drivers and causal connections in time series data.

  14. Fracture phenomena in an isotonic salt solution during freezing and their elimination using glycerol.

    PubMed

    Gao, D Y; Lin, S; Watson, P F; Critser, J K

    1995-06-01

    Thermal stress and consequent fracture in frozen organs or cell suspensions have been proposed to be two causes of cell cryoinjury. A specific device was developed to study the thermal stress and the fracture phenomena during a slow cooling process of isotonic NaCl solutions with different concentrations of glycerol (cryoprotectant) in a cylindrical tube. It was shown from the experimental results that glycerol significantly influenced the solidification process of the ternary solutions and reduced the thermal stress. The higher the initial glycerol concentration, the lower the thermal stress in the frozen solutions. Glycerol concentrations over 0.3 M were sufficient to eliminate the fracture of the frozen solutions under the present experimental conditions. To explain the action of glycerol in reducing the thermal stress and preventing the ice fracture, a further study on ice crystal formation and growth of ice in these solutions was undertaken using cryomicroscopy. It is known from previous studies that an increase of initial glycerol concentration reduced frozen fraction of water in the solution at any given low temperature due to colligative properties of solution, which reduced the total ice volume expansion during water solidification. The present cryomicroscopic investigation showed that under a fixed cooling condition the different initial glycerol concentrations induced the different microstructures of the frozen solutions at not only a given low temperature but also a given frozen fraction of water. It has been known that ice volume expansion during solidification is a major factor causing the thermal stress and the interior microstructure is critical for the mechanical strength of a solid. Therefore, functions of glycerol in reducing the total ice volume expansion during water solidification and in influencing interior microstructure of the ice may contribute to reduce the thermal stress and prevent the fracture in the frozen solutions.

  15. Nonlinear Coupling between Cortical Oscillations and Muscle Activity during Isotonic Wrist Flexion

    PubMed Central

    Yang, Yuan; Solis-Escalante, Teodoro; van de Ruit, Mark; van der Helm, Frans C. T.; Schouten, Alfred C.

    2016-01-01

    Coupling between cortical oscillations and muscle activity facilitates neuronal communication during motor control. The linear part of this coupling, known as corticomuscular coherence, has received substantial attention, even though neuronal communication underlying motor control has been demonstrated to be highly nonlinear. A full assessment of corticomuscular coupling, including the nonlinear part, is essential to understand the neuronal communication within the sensorimotor system. In this study, we applied the recently developed n:m coherence method to assess nonlinear corticomuscular coupling during isotonic wrist flexion. The n:m coherence is a generalized metric for quantifying nonlinear cross-frequency coupling as well as linear iso-frequency coupling. By using independent component analysis (ICA) and equivalent current dipole source localization, we identify four sensorimotor-related brain areas based on the locations of the dipoles, i.e., the contralateral primary sensorimotor areas, supplementary motor area (SMA), prefrontal area (PFA) and posterior parietal cortex (PPC). For all these areas, linear coupling between electroencephalogram (EEG) and electromyogram (EMG) is present with peaks in the beta band (15–35 Hz), while nonlinear coupling is detected with both integer (1:2, 1:3, 1:4) and non-integer (2:3) harmonics. Significant differences between brain areas are shown in linear coupling, with stronger coherence for the primary sensorimotor areas and motor association cortices (SMA, PFA) compared to the sensory association area (PPC), but not for the nonlinear coupling. Moreover, the detected nonlinear coupling is similar to previously reported nonlinear coupling of cortical activity to somatosensory stimuli. We suggest that the descending motor pathways mainly contribute to linear corticomuscular coupling, while nonlinear coupling likely originates from sensory feedback. PMID:27999537
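
    The linear (iso-frequency) part of the coupling discussed above can be computed as Welch-based magnitude-squared coherence between EEG and EMG. The sketch below does this for synthetic signals sharing a beta-band component; the sampling rate, segment length, and signals are illustrative assumptions, and the n:m cross-frequency metric of the paper is not reproduced.

```python
import numpy as np
from scipy.signal import coherence

fs = 1000                      # Hz, assumed sampling rate
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(6)

# Synthetic EEG and EMG sharing a common 20 Hz (beta-band) component.
common = np.sin(2 * np.pi * 20 * t)
eeg = common + rng.normal(0, 1.0, t.size)
emg = 0.5 * common + rng.normal(0, 1.0, t.size)

f, Cxy = coherence(eeg, emg, fs=fs, nperseg=2048)
beta = (f >= 15) & (f <= 35)
print("Peak beta-band coherence:", round(float(Cxy[beta].max()), 3))
```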

  16. Nonlinear Coupling between Cortical Oscillations and Muscle Activity during Isotonic Wrist Flexion.

    PubMed

    Yang, Yuan; Solis-Escalante, Teodoro; van de Ruit, Mark; van der Helm, Frans C T; Schouten, Alfred C

    2016-01-01

    Coupling between cortical oscillations and muscle activity facilitates neuronal communication during motor control. The linear part of this coupling, known as corticomuscular coherence, has received substantial attention, even though neuronal communication underlying motor control has been demonstrated to be highly nonlinear. A full assessment of corticomuscular coupling, including the nonlinear part, is essential to understand the neuronal communication within the sensorimotor system. In this study, we applied the recently developed n:m coherence method to assess nonlinear corticomuscular coupling during isotonic wrist flexion. The n:m coherence is a generalized metric for quantifying nonlinear cross-frequency coupling as well as linear iso-frequency coupling. By using independent component analysis (ICA) and equivalent current dipole source localization, we identify four sensorimotor-related brain areas based on the locations of the dipoles, i.e., the contralateral primary sensorimotor areas, supplementary motor area (SMA), prefrontal area (PFA) and posterior parietal cortex (PPC). For all these areas, linear coupling between electroencephalogram (EEG) and electromyogram (EMG) is present with peaks in the beta band (15-35 Hz), while nonlinear coupling is detected with both integer (1:2, 1:3, 1:4) and non-integer (2:3) harmonics. Significant differences between brain areas are shown in linear coupling, with stronger coherence for the primary sensorimotor areas and motor association cortices (SMA, PFA) compared to the sensory association area (PPC), but not for the nonlinear coupling. Moreover, the detected nonlinear coupling is similar to previously reported nonlinear coupling of cortical activity to somatosensory stimuli. We suggest that the descending motor pathways mainly contribute to linear corticomuscular coupling, while nonlinear coupling likely originates from sensory feedback.

  17. A Bivariate Mixture Model for Natural Antibody Levels to Human Papillomavirus Types 16 and 18: Baseline Estimates for Monitoring the Herd Effects of Immunization

    PubMed Central

    Vink, Margaretha A.; Berkhof, Johannes; van de Kassteele, Jan; van Boven, Michiel; Bogaards, Johannes A.

    2016-01-01

    Post-vaccine monitoring programs for human papillomavirus (HPV) have been introduced in many countries, but HPV serology is still an underutilized tool, partly owing to the weak antibody response to HPV infection. Changes in antibody levels among non-vaccinated individuals could be employed to monitor herd effects of immunization against HPV vaccine types 16 and 18, but inference requires an appropriate statistical model. The authors developed a four-component bivariate mixture model for jointly estimating vaccine-type seroprevalence from correlated antibody responses against HPV16 and -18 infections. This model takes account of the correlation between HPV16 and -18 antibody concentrations within subjects, caused e.g. by heterogeneity in exposure level and immune response. The model was fitted to HPV16 and -18 antibody concentrations as measured by a multiplex immunoassay in a large serological survey (3,875 females) carried out in the Netherlands in 2006/2007, before the introduction of mass immunization. Parameters were estimated by Bayesian analysis. We used the deviance information criterion for model selection; performance of the preferred model was assessed through simulation. Our analysis uncovered elevated antibody concentrations in doubly as compared to singly seropositive individuals, and a strong clustering of HPV16 and -18 seropositivity, particularly around the age of sexual debut. The bivariate model resulted in a more reliable classification of singly and doubly seropositive individuals than achieved by a combination of two univariate models, and suggested a higher pre-vaccine HPV16 seroprevalence than previously estimated. The bivariate mixture model provides valuable baseline estimates of vaccine-type seroprevalence and may prove useful in seroepidemiologic assessment of the herd effects of HPV vaccination. PMID:27537200
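
    A four-component bivariate mixture over (log) antibody concentrations, with components standing in for the four serostatus combinations, can be illustrated with a Gaussian mixture. The sketch below fits one to synthetic data; the cluster locations and the use of maximum-likelihood Gaussian components are assumptions, and the paper's Bayesian estimation and specific parameterisation are not reproduced.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(7)
# Synthetic log antibody concentrations for HPV16 (x) and HPV18 (y): four clusters
# standing in for the -/-, +/-, -/+ and +/+ serostatus combinations.
means = np.array([[0.0, 0.0], [2.5, 0.2], [0.2, 2.5], [3.0, 3.0]])
sizes = [2500, 500, 400, 475]
data = np.vstack([rng.normal(m, 0.5, size=(n, 2)) for m, n in zip(means, sizes)])

gmm = GaussianMixture(n_components=4, covariance_type="full", random_state=0)
gmm.fit(data)
print("Estimated mixing weights:", gmm.weights_.round(3))
# Posterior probability of each component for a new, apparently doubly positive sample.
print(gmm.predict_proba([[2.8, 2.9]]).round(3))
```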

  18. The effects of the intake of an isotonic sports drink before orienteering competitions on skeletal muscle damage.

    PubMed

    Colakoglu, Filiz Fatma; Cayci, Banu; Yaman, Metin; Karacan, Selma; Gonulateş, Suleyman; Ipekoglu, Gökhan; Er, Fatmanur

    2016-11-01

    [Purpose] The purpose of this study was to investigate the effects of the intake of an isotonic sports drink (500 ml water, 32 g carbohydrate, 120 mg calcium, 248 mg chloride, 230 mg sodium) on the level of skeletal muscle damage in orienteering athletes. [Subjects and Methods] The study was carried out on 21 male elite orienteering athletes. The athletes were divided into two groups by randomized double-blind selection. The experimental group (n=11) was given the isotonic sports drink, while the placebo group (n=10) was given 500 ml of pure water. Blood samples were taken pre-competition, post-competition, 2 hours post-competition and 24 hours post-competition. [Results] The pre-competition troponin, myoglobin and creatine kinase serum levels of the placebo group were significantly lower than the post-competition and 2 hours post-competition values. The 24 hours post-competition levels of the same markers were also significantly lower than the post-competition and 2 hours post-competition values. The pre-competition troponin, myoglobin and creatine kinase levels of the experimental group were found to be significantly lower than the post-competition, 2 hours post-competition and 24 hours post-competition values. The present results suggest that the intake of supportive sports drinks before exercising significantly prevents the observed muscle damage. The study showed that serum myoglobin levels differed significantly between the experimental and placebo groups during the 2 hours post-competition period. [Conclusion] The levels of serum creatine kinase and myoglobin accurately show the extent of the muscle damage. However, further studies on the effect of isotonic sports drinks in different training programs on the cell membrane and muscle damage are needed.

  19. The effects of the intake of an isotonic sports drink before orienteering competitions on skeletal muscle damage

    PubMed Central

    Colakoglu, Filiz Fatma; Cayci, Banu; Yaman, Metin; Karacan, Selma; Gonulateş, Suleyman; Ipekoglu, Gökhan; Er, Fatmanur

    2016-01-01

    [Purpose] The purpose of this study was to investigate the effects of the intake of an isotonic sports drink (500 ml water, 32 g carbohydrate, 120 mg calcium, 248 mg chloride, 230 mg sodium) on the level of skeletal muscle damage in orienteering athletes. [Subjects and Methods] The study was carried out on 21 male elite orienteering athletes. The athletes were divided into two groups by randomized double-blind selection. The experimental group (n=11) was given the isotonic sports drink, while the placebo group (n=10) was given 500 ml of pure water. Blood samples were taken pre-competition, post-competition, 2 hours post-competition and 24 hours post-competition. [Results] The pre-competition troponin, myoglobin and creatine kinase serum levels of the placebo group were significantly lower than the post-competition and 2 hours post-competition values. The 24 hours post-competition levels of the same markers were also significantly lower than the post-competition and 2 hours post-competition values. The pre-competition troponin, myoglobin and creatine kinase levels of the experimental group were found to be significantly lower than the post-competition, 2 hours post-competition and 24 hours post-competition values. The present results suggest that the intake of supportive sports drinks before exercising significantly prevents the observed muscle damage. The study showed that serum myoglobin levels differed significantly between the experimental and placebo groups during the 2 hours post-competition period. [Conclusion] The levels of serum creatine kinase and myoglobin accurately show the extent of the muscle damage. However, further studies on the effect of isotonic sports drinks in different training programs on the cell membrane and muscle damage are needed. PMID:27942149

  20. Knee-joint proprioception during 30-day 6 degrees head-down bed rest with isotonic and isokinetic exercise training

    NASA Technical Reports Server (NTRS)

    Bernauer, E. M.; Walby, W. F.; Ertl, A. C.; Dempster, P. T.; Bond, M.; Greenleaf, J. E.

    1994-01-01

    To determine if daily isotonic exercise or isokinetic exercise training coupled with daily leg proprioceptive training would influence leg proprioceptive tracking responses during bed rest (BR), 19 men (36 +/- SD 4 years, 178 +/- 7 cm, 76.8 +/- 7.8 kg) were allocated into a no-exercise (NOE) training control group (n = 5), and isotonic exercise (ITE, n = 7) and isokinetic exercise (IKE, n = 7) training groups. Exercise training was conducted during BR for two 30-min periods per day, 5 days per week. Only the IKE group performed proprioceptive training using a new isokinetic procedure with each lower extremity for 2.5 min before and after the daily exercise training sessions; proprioceptive testing occurred weekly for all groups. There were no significant differences in proprioceptive tracking scores, expressed as a percentage of the perfect score of 100, in the pre-BR ambulatory control period between the three groups. Knee extension and flexion tracking responses were unchanged with NOE during BR, but were significantly greater (*p < 0.05) at the end of BR in both exercise groups when compared with NOE responses (extension: NOE 80.7 +/- 0.7%, ITE 82.9* +/- 0.6%, IKE 86.5* +/- 0.7%; flexion: NOE 77.6 +/- 1.5%, ITE 80.0 +/- 0.8% (NS), IKE 83.6* +/- 0.8%). Although proprioceptive tracking was unchanged during BR with NOE, both isotonic exercise training (without additional proprioceptive training) and especially isokinetic exercise training combined with daily proprioceptive training significantly improved knee proprioceptive tracking responses after 30 d of BR.

  1. Simultaneous determination of nifuroxazide and drotaverine hydrochloride in pharmaceutical preparations by bivariate and multivariate spectral analysis.

    PubMed

    Metwally, Fadia H

    2008-02-01

    The quantitative predictive abilities of the new and simple bivariate spectrophotometric method are compared with the results obtained by the use of multivariate calibration methods [the classical least squares (CLS), principal component regression (PCR) and partial least squares (PLS)], using the information contained in the absorption spectra of the appropriate solutions. Mixtures of the two drugs Nifuroxazide (NIF) and Drotaverine hydrochloride (DRO) were resolved by application of the bivariate method. The different chemometric approaches were also applied, with prior optimization of the calibration matrix, as they are useful for the simultaneous inclusion of many spectral wavelengths. The results found by application of the bivariate, CLS, PCR and PLS methods for the simultaneous determination of mixtures of both components containing 2-12 μg ml(-1) of NIF and 2-8 μg ml(-1) of DRO are reported. Both approaches were satisfactorily applied to the simultaneous determination of NIF and DRO in pure form and in pharmaceutical formulation. The results were in accordance with those given by the EVA Pharma reference spectrophotometric method.
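
    Classical least squares (CLS) calibration models a mixture spectrum as a concentration-weighted sum of pure-component spectra and recovers the concentrations by linear least squares. The sketch below illustrates the idea with synthetic Gaussian absorption bands; the band positions, concentrations, and noise level are assumptions, not the NIF/DRO spectra or the paper's calibration set.

```python
import numpy as np

wavelengths = np.linspace(200, 400, 201)

def band(center, width):
    """Synthetic Gaussian absorption band."""
    return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

# Pure-component spectra (absorbance per unit concentration), one column per component.
K = np.column_stack([band(260, 15) + 0.4 * band(330, 20),
                     band(300, 18) + 0.6 * band(240, 12)])

true_conc = np.array([6.0, 4.0])  # e.g. micrograms per millilitre
rng = np.random.default_rng(8)
mixture = K @ true_conc + rng.normal(0, 0.002, wavelengths.size)

# CLS step: solve the overdetermined system K c = mixture in the least-squares sense.
est_conc, *_ = np.linalg.lstsq(K, mixture, rcond=None)
print(est_conc.round(3))
```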

  2. Bivariate Poisson models with varying offsets: an application to the paired mitochondrial DNA dataset.

    PubMed

    Su, Pei-Fang; Mau, Yu-Lin; Guo, Yan; Li, Chung-I; Liu, Qi; Boice, John D; Shyr, Yu

    2017-03-01

    To assess the effect of chemotherapy on mitochondrial genome mutations in cancer survivors and their offspring, a study sequenced the full mitochondrial genome and determined the mitochondrial DNA (mtDNA) heteroplasmic mutation rate. To build a model for counts of heteroplasmic mutations in mothers and their offspring, bivariate Poisson regression was used to examine the relationship between the mutation count and clinical information while accounting for the paired correlation. However, if the sequencing depth is not adequate, only a limited fraction of the mtDNA will be available for variant calling. The classical bivariate Poisson regression model treats the offset term as equal within pairs; thus, it cannot be applied directly. In this research, we propose an extended bivariate Poisson regression model with a more general offset term that adjusts for the length of the accessible genome for each observation. We evaluate the performance of the proposed method with comprehensive simulations, and the results show that the regression model provides unbiased parameter estimates. The use of the model is also demonstrated using the paired mtDNA dataset.

  3. A Case Study of Bivariate Rainfall Frequency Analysis Using Copula in South Korea

    NASA Astrophysics Data System (ADS)

    Joo, K.; Shin, J.; Kim, W.; Heo, J.

    2011-12-01

    A given rainfall event can be characterized by properties such as rainfall depth (amount), duration, and intensity. By considering these factors simultaneously, the actual rainfall phenomenon can be explained better than with a univariate model, and rainfall quantiles can be obtained for a given return period without being restricted to a specific rainfall duration. In this study, a copula model was used for bivariate (depth and duration) frequency analysis. Copula models have recently been studied widely in hydrology and are more flexible with respect to the marginal distributions than other conventional bivariate models. Frequency analysis was performed for five Korea Meteorological Administration (KMA) weather stations: Seoul, Chuncheon, Gangneung, Wonju, and Chungju. These sites have 38-50 years of hourly precipitation data. An inter-event time definition is used to identify rainfall events, and three copula models (Gumbel-Hougaard, Frank, and Joe) are applied. The maximum pseudo-likelihood method is used to estimate the copula parameter (θ). The normal, generalized extreme value, Gumbel, 3-parameter gamma, and generalized logistic distributions are examined as marginal distributions. As a result, rainfall quantiles can be obtained for any rainfall duration and a given return period by calculating conditional probabilities. In addition, rainfall quantiles from the copula models are compared with those from a univariate model.
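
    The Gumbel-Hougaard copula named above has the closed form C(u, v) = exp(-[(-ln u)^θ + (-ln v)^θ]^{1/θ}). The sketch below evaluates a joint non-exceedance probability for a depth-duration pair through assumed marginal CDFs; the marginal families, their parameters, and θ are illustrative, not the study's fitted values.

```python
import numpy as np
from scipy.stats import gamma, genextreme

def gumbel_hougaard(u, v, theta):
    """Gumbel-Hougaard copula C(u, v); theta >= 1, with theta = 1 giving independence."""
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

# Assumed marginals: GEV for event depth (mm), gamma for event duration (h).
depth_cdf = genextreme(c=-0.1, loc=60.0, scale=25.0).cdf
duration_cdf = gamma(a=2.0, scale=6.0).cdf

theta = 2.0
depth, duration = 150.0, 24.0
u, v = depth_cdf(depth), duration_cdf(duration)
print(f"P(depth <= {depth} mm, duration <= {duration} h) ~ "
      f"{gumbel_hougaard(u, v, theta):.3f}")
```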

  4. Simultaneous determination of Nifuroxazide and Drotaverine hydrochloride in pharmaceutical preparations by bivariate and multivariate spectral analysis

    NASA Astrophysics Data System (ADS)

    Metwally, Fadia H.

    2008-02-01

    The quantitative predictive abilities of the new and simple bivariate spectrophotometric method are compared with the results obtained by the use of multivariate calibration methods [the classical least squares (CLS), principle component regression (PCR) and partial least squares (PLS)], using the information contained in the absorption spectra of the appropriate solutions. Mixtures of the two drugs Nifuroxazide (NIF) and Drotaverine hydrochloride (DRO) were resolved by application of the bivariate method. The different chemometric approaches were applied also with previous optimization of the calibration matrix, as they are useful in simultaneous inclusion of many spectral wavelengths. The results found by application of the bivariate, CLS, PCR and PLS methods for the simultaneous determinations of mixtures of both components containing 2-12 μg ml -1 of NIF and 2-8 μg ml -1 of DRO are reported. Both approaches were satisfactorily applied to the simultaneous determination of NIF and DRO in pure form and in pharmaceutical formulation. The results were in accordance with those given by the EVA Pharma reference spectrophotometric method.

  5. Multiresolution transmission of the correlation modes between bivariate time series based on complex network theory

    NASA Astrophysics Data System (ADS)

    Huang, Xuan; An, Haizhong; Gao, Xiangyun; Hao, Xiaoqing; Liu, Pengpeng

    2015-06-01

    This study introduces an approach for studying the multiscale transmission characteristics of the correlation modes between bivariate time series. The correlation between the two series fluctuates over time, and the transmission among the correlation modes exhibits a multiscale phenomenon, which provides richer information. To investigate the multiscale transmission of the correlation modes, this paper describes a hybrid model integrating wavelet analysis and complex network theory to decompose and reconstruct the original bivariate time series into sequences in a joint time-frequency domain and defines the correlation modes in each time-frequency domain. We chose the crude oil spot and futures prices as the sample data. The empirical results indicate that the main duration of volatility (32-64 days) for the strongly positive correlation between the crude oil spot price and the futures price provides more useful information for investors. Moreover, the weighted degree, weighted indegree and weighted outdegree of the correlation modes follow power-law distributions. The correlation fluctuation strengthens the extent of persistence over the long term, whereas persistence weakens over the short and medium term. The primary correlation modes dominating the transmission process and the major intermediary modes in the transmission process are clustered in both the short and long term.

  6. Technology-enhanced Interactive Teaching of Marginal, Joint and Conditional Probabilities: The Special Case of Bivariate Normal Distribution.

    PubMed

    Dinov, Ivo D; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas

    2013-01-01

    Data analysis requires subtle probability reasoning to answer questions like What is the chance of event A occurring, given that event B was observed? This generic question arises in discussions of many intriguing scientific questions such as What is the probability that an adolescent weighs between 120 and 140 pounds given that they are of average height? and What is the probability of (monetary) inflation exceeding 4% and housing price index below 110? To address such problems, learning some applied, theoretical or cross-disciplinary probability concepts is necessary. Teaching such courses can be improved by utilizing modern information technology resources. Students' understanding of multivariate distributions, conditional probabilities, correlation and causation can be significantly strengthened by employing interactive web-based science educational resources. Independent of the type of a probability course (e.g. majors, minors or service probability course, rigorous measure-theoretic, applied or statistics course) student motivation, learning experiences and knowledge retention may be enhanced by blending modern technological tools within the classical conceptual pedagogical models. We have designed, implemented and disseminated a portable open-source web-application for teaching multivariate distributions, marginal, joint and conditional probabilities using the special case of bivariate Normal distribution. A real adolescent height and weight dataset is used to demonstrate the classroom utilization of the new web-application to address problems of parameter estimation, univariate and multivariate inference.
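
    For a bivariate normal, the conditional distribution of one variable given the other is again normal, with mean shifted along the regression line and variance reduced by a factor of 1 - ρ². The sketch below answers the height-weight style of question quoted above; the means, standard deviations, and correlation are illustrative assumptions.

```python
from scipy.stats import norm

def conditional_normal(mu_x, mu_y, sd_x, sd_y, rho, y_given):
    """Mean and SD of X | Y = y for a bivariate normal (X, Y)."""
    cond_mean = mu_x + rho * sd_x / sd_y * (y_given - mu_y)
    cond_sd = sd_x * (1.0 - rho ** 2) ** 0.5
    return cond_mean, cond_sd

# Illustrative adolescent data: weight (lb) given height (in), correlation 0.6.
m, s = conditional_normal(mu_x=125.0, mu_y=64.0, sd_x=18.0, sd_y=3.0,
                          rho=0.6, y_given=64.0)
p = norm.cdf(140, m, s) - norm.cdf(120, m, s)
print(f"P(120 < weight < 140 | height = 64 in) ~ {p:.3f}")
```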

  7. Lower Extremity Muscle Thickness During 30-Day 6 degrees Head-Down Bed Rest with Isotonic and Isokinetic Exercise Training

    NASA Technical Reports Server (NTRS)

    Ellis, S.; Kirby, L. C.; Greenleaf, J. E.

    1993-01-01

    Muscle thickness was measured in 19 bed-rested (BR) men (32-42 years) subjected to isotonic (ITE, cycle ergometer) and isokinetic (IKE, torque ergometer) lower extremity exercise training, and no exercise (NOE) training. Thickness was measured with ultrasonography in the anterior thigh rectus femoris (RF) and vastus intermedius (VI) muscles, and in the combined posterior leg soleus, flexor hallucis longus, and tibialis posterior (S + FHL + TP) muscles. Compared with ambulatory control values, thickness of the S + FHL + TP decreased by 9%-12% (p less than 0.05) in all three test groups. The RF thickness was unchanged in the two exercise groups, but decreased by 10% (p less than 0.05) in the NOE group. The VI thickness was unchanged in the ITE group, but decreased by 12%-16% (p less than 0.05) in the IKE and NOE groups. Thus, intensive, alternating, isotonic cycle ergometer exercise training is as effective as intensive, intermittent, isokinetic exercise training for maintaining the thicknesses of the rectus femoris and vastus intermedius anterior thigh muscles, but not of the posterior leg muscles, during prolonged BR deconditioning.

  8. Test of the Grodzins product rule in N = 88 isotones and the role of the Z = 64 subshell

    NASA Astrophysics Data System (ADS)

    Gupta, J. B.

    2014-03-01

    Background: The increase of collectivity in nuclear spectra with increasing numbers of valence proton and neutron pairs is a well-known phenomenon, yielding a decreasing E(2₁⁺) and an increasing B(E2)↑, which is the basis of the Grodzins E(2₁⁺) × B(E2)↑ product constancy rule. Purpose: In the N = 88 isotones, this product varies sharply with Z. This breakdown of the product rule in the Ba to Dy region is illustrated and its origin is analyzed. Method: Empirical data on the energy level structure in various forms, along with the E2 transition rates, vis-à-vis the level structure and the nature of the Z = 64 subshell effect, are presented. Results: The complex nuclear structure of the N = 88 isotones is highlighted and the genesis of the underlying physics is made more transparent. Conclusion: Besides the static shape of the nucleus, it involves the dynamics of the nucleus, as reproduced in the dynamic pairing plus quadrupole model.

  9. Altered corticomuscular coherence elicited by paced isotonic contractions in individuals with cerebral palsy: a case-control study.

    PubMed

    Riquelme, Inmaculada; Cifre, Ignacio; Muñoz, Miguel A; Montoya, Pedro

    2014-12-01

    The purpose of the study was to analyze corticomuscular coherence during planning and execution of simple hand movements in individuals with cerebral palsy (CP) and healthy controls (HC). Fourteen individuals with CP and 15 HC performed voluntary paced movements (opening and closing the fist) in response to a warning signal. Simultaneous scalp EEG and surface EMG of extensor carpi radialis brevis were recorded during 15 isotonic contractions. Time-frequency corticomuscular coherence (EMG-C3/C4) before and during muscular contraction, as well as EMG intensity, onset latency and duration were analyzed. Although EMG intensity was similar in both groups, individuals with CP exhibited longer onset latency and increased duration of the muscular contraction than HC. CP also showed higher corticomuscular coherence in beta EEG band during both planning and execution of muscular contraction, as well as lower corticomuscular coherence in gamma EEG band at the beginning of the contraction as compared with HC. In conclusion, our results suggest that individuals with CP are characterized by an altered functional coupling between primary motor cortex and effector muscles during planning and execution of isotonic contractions. In addition, the usefulness of corticomuscular coherence as a research tool for exploring deficits in motor central processing in persons with early brain damage is discussed.

  10. Long-lead station-scale prediction of hydrological droughts in South Korea based on bivariate pattern-based downscaling

    NASA Astrophysics Data System (ADS)

    Sohn, Soo-Jin; Tam, Chi-Yung

    2016-05-01

    Capturing climatic variations in boreal winter to spring (December-May) is essential for properly predicting droughts in South Korea. This study investigates the variability and predictability of the South Korean climate during this extended season, based on observations from 60 station locations and multi-model ensemble (MME) hindcast experiments (1983/1984-2005/2006) archived at the APEC Climate Center (APCC). Multivariate empirical orthogonal function (EOF) analysis results based on observations show that the first two leading modes of winter-to-spring precipitation and temperature variability, which together account for ~80 % of the total variance, are characterized by regional-scale anomalies covering the whole South Korean territory. These modes were also closely related to some of the recurrent large-scale circulation changes in the northern hemisphere during the same season. Consistent with the above, examination of the standardized precipitation evapotranspiration index (SPEI) indicates that drought conditions in South Korea tend to be accompanied by regional-to-continental-scale circulation anomalies over East Asia to the western north Pacific. Motivated by the aforementioned findings on the spatial-temporal coherence among station-scale precipitation and temperature anomalies, a new bivariate and pattern-based downscaling method was developed. The novelty of this method is that precipitation and temperature data were first filtered using multivariate EOFs to enhance their spatial-temporal coherence, before being linked to large-scale circulation variables using canonical correlation analysis (CCA). To test its applicability and to investigate its related potential predictability, a perfect empirical model was first constructed with observed datasets as predictors. Next, a model output statistics (MOS)-type hybrid dynamical-statistical model was developed, using products from nine one-tier climate models as inputs. It was found that, with model sea
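
    The downscaling step described above links EOF-filtered station fields to large-scale predictors through canonical correlation analysis (CCA). The sketch below runs a CCA on synthetic predictor and predictand matrices that share one common mode; the field sizes, number of years, and shared signal are illustrative assumptions, and the EOF pre-filtering and multi-model inputs of the study are not reproduced.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(9)
n_years = 23

# Synthetic large-scale predictors (10 columns) and station predictands (15 columns)
# sharing a single common mode of variability.
signal = rng.normal(size=(n_years, 1))
X = signal @ rng.normal(size=(1, 10)) + 0.5 * rng.normal(size=(n_years, 10))
Y = signal @ rng.normal(size=(1, 15)) + 0.5 * rng.normal(size=(n_years, 15))

cca = CCA(n_components=2)
Xc, Yc = cca.fit_transform(X, Y)
# Correlation of the leading pair of canonical variates.
print(round(float(np.corrcoef(Xc[:, 0], Yc[:, 0])[0, 1]), 3))
```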

  11. Bivariate Genome-Wide Association Analysis of the Growth and Intake Components of Feed Efficiency

    PubMed Central

    Beever, Jonathan E.; Bollero, Germán A.; Southey, Bruce R.; Faulkner, Daniel B.; Rodriguez-Zas, Sandra L.

    2013-01-01

    Single nucleotide polymorphisms (SNPs) associated with average daily gain (ADG) and dry matter intake (DMI), two major components of feed efficiency in cattle, were identified in a genome-wide association study (GWAS). Uni- and multi-SNP models were used to describe feed efficiency in a training data set and the results were confirmed in a validation data set. Results from the univariate and bivariate analyses of ADG and DMI, adjusted by the feedlot beef steer maintenance requirements, were compared. The bivariate uni-SNP analysis identified (P-value <0.0001) 11 SNPs, meanwhile the univariate analyses of ADG and DMI identified 8 and 9 SNPs, respectively. Among the six SNPs confirmed in the validation data set, five SNPs were mapped to KDELC2, PHOX2A, and TMEM40. Findings from the uni-SNP models were used to develop highly accurate predictive multi-SNP models in the training data set. Despite the substantially smaller size of the validation data set, the training multi-SNP models had slightly lower predictive ability when applied to the validation data set. Six Gene Ontology molecular functions related to ion transport activity were enriched (P-value <0.001) among the genes associated with the detected SNPs. The findings from this study demonstrate the complementary value of the uni- and multi-SNP models, and univariate and bivariate GWAS analyses. The identified SNPs can be used for genome-enabled improvement of feed efficiency in feedlot beef cattle, and can aid in the design of empirical studies to further confirm the associations. PMID:24205251

  12. Bivariate Linkage Study of Proximal Hip Geometry and Body Size Indices: The Framingham Study

    PubMed Central

    Dupuis, J.; Cupples, L. A.; Beck, T. J.; Mahaney, M. C.; Havill, L. M.; Kiel, D. P.; Demissie, S.

    2008-01-01

    Femoral geometry and body size are both characterized by substantial heritability. The purpose of this study was to discern whether hip geometry and body size (height and body mass index, BMI) share quantitative trait loci (QTL). Dual-energy X-ray absorptiometric scans of the proximal femur from 1,473 members in 323 pedigrees (ages 31–96 years) from the Framingham Osteoporosis Study were studied. We measured femoral neck length, neck-shaft angle, subperiosteal width (outer diameter), cross-sectional bone area, and section modulus, at the narrowest section of the femoral neck (NN), intertrochanteric (IT), and femoral shaft (S) regions. In variance component analyses, genetic correlations (ρG) between hip geometry traits and height ranged 0.30–0.59 and between hip geometry and BMI ranged 0.11–0.47. In a genomewide linkage scan with 636 markers, we obtained nominally suggestive linkages (bivariate LOD scores ≥ 1.9) for geometric traits and either height or BMI at several chromosomes (4, 6, 9, 15, and 21). Two loci, on chr. 2 (80 cM, BMI/shaft section modulus) and chr. X (height/shaft outer diameter), yielded bivariate LOD scores ≥ 3.0; although these loci were linked in univariate analyses with a geometric trait, neither was linked with either height or BMI. In conclusion, substantial genetic correlations were found between the femoral geometric traits, height and BMI. Linkage signals from bivariate linkage analyses of bone geometric indices and body size were similar to those obtained in univariate linkage analyses of femoral geometric traits, suggesting that most of the detected QTL primarily influence geometry of the hip. PMID:17674073

  13. An Application of Endpoint Detection to Bivariate Data in Tau-Path Order.

    PubMed

    Sampath, Srinath; Verducci, Joseph S

    2014-08-01

    The Fligner and Verducci (1988) multistage model for rankings is modified to create the moving average maximum likelihood estimator (MAMLE), a locally smooth estimator that measures stage-wise agreement between two long ranked lists, and provides a stopping rule for the detection of the endpoint of agreement. An application of this MAMLE stopping rule to a bivariate data set in tau-path order (Yu, Verducci and Blower (2011)) is discussed. Data from the National Cancer Institute measuring associations between gene expression and compound potency are studied using this application, providing insights into the length of the relationship between the variables.

  14. An efficient algorithm for generating random number pairs drawn from a bivariate normal distribution

    NASA Technical Reports Server (NTRS)

    Campbell, C. W.

    1983-01-01

    An efficient algorithm for generating random number pairs from a bivariate normal distribution was developed. Any desired value of the two means, two standard deviations, and correlation coefficient can be selected. Theoretically the technique is exact and in practice its accuracy is limited only by the quality of the uniform distribution random number generator, inaccuracies in computer function evaluation, and arithmetic. A FORTRAN routine was written to check the algorithm and good accuracy was obtained. Some small errors in the correlation coefficient were observed to vary in a surprisingly regular manner. A simple model was developed which explained the qualitative aspects of the errors.
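
    As an illustration of the kind of generator described above, the following minimal Python sketch draws correlated normal pairs from independent standard normals; it is not the report's FORTRAN routine, and the means, standard deviations, and correlation value are arbitrary example parameters.

      import numpy as np

      def bivariate_normal_pairs(n, mu1, mu2, sigma1, sigma2, rho, rng=None):
          # Standard construction: y is built from the same z1 that drives x,
          # plus an independent component, so corr(x, y) = rho in theory.
          rng = np.random.default_rng() if rng is None else rng
          z1 = rng.standard_normal(n)
          z2 = rng.standard_normal(n)
          x = mu1 + sigma1 * z1
          y = mu2 + sigma2 * (rho * z1 + np.sqrt(1.0 - rho**2) * z2)
          return x, y

      x, y = bivariate_normal_pairs(100000, mu1=1.0, mu2=-2.0,
                                    sigma1=2.0, sigma2=0.5, rho=0.7)
      print(np.corrcoef(x, y)[0, 1])  # sample correlation, close to 0.7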

  15. Statistical analysis of multivariate atmospheric variables. [cloud cover

    NASA Technical Reports Server (NTRS)

    Tubbs, J. D.

    1979-01-01

    Topics covered include: (1) estimation in discrete multivariate distributions; (2) a procedure to predict cloud cover frequencies in the bivariate case; (3) a program to compute conditional bivariate normal parameters; (4) the transformation of nonnormal multivariate data to near-normality; (5) a test of fit for the extreme value distribution based upon the generalized minimum chi-square; (6) a test of fit for continuous distributions based upon the generalized minimum chi-square; (7) the effect of correlated observations on confidence sets based upon chi-square statistics; and (8) the generation of random variates from specified distributions.

  16. Randomised clinical study comparing the effectiveness and physiological effects of hypertonic and isotonic polyethylene glycol solutions for bowel cleansing

    PubMed Central

    Yamano, Hiro-o; Matsushita, Hiro-o; Yoshikawa, Kenjiro; Takagi, Ryo; Harada, Eiji; Tanaka, Yoshihito; Nakaoka, Michiko; Himori, Ryogo; Yoshida, Yuko; Satou, Kentarou; Imai, Yasushi

    2016-01-01

    Objectives Bowel cleansing is necessary before colonoscopy, but is a burden to patients because of the long cleansing time and large dose volume. A low-volume (2 L) hypertonic polyethylene glycol-ascorbic acid solution (PEG-Asc) has been introduced, but its possible dehydration effects have not been quantitatively studied. We compared the efficacy and safety including the dehydration risk between hypertonic PEG-Asc and isotonic PEG regimens. Design This was an observer-blinded randomised study. Participants (n=310) were allocated to receive 1 of 3 regimens on the day of colonoscopy: PEG-Asc (1.5 L) and water (0.75 L) dosed with 1 split (PEG-Asc-S) or 4 splits (PEG-Asc-M), or PEG-electrolyte solution (PEG-ES; 2.25 L) dosed with no split. Dehydration was analysed by measuring haematocrit (Ht). Results The cleansing time using the hypertonic PEG-Asc-S (3.33±0.48 hours) was significantly longer than that with isotonic PEG-ES (3.05±0.56 hours; p<0.001). PEG-Asc-M (3.00±0.53 hours) did not have this same disadvantage. Successful cleansing was achieved in more than 94% of participants using each of the 3 regimens. The percentage changes in Ht from baseline (before dosing) to the end of dosing with PEG-Asc-S (3.53±3.32%) and PEG-Asc-M (4.11±3.07%) were significantly greater than that with PEG-ES (1.31±3.01%). Conclusions These 3 lower volume regimens were efficacious and had no serious adverse effects. Even patients cleansed with isotonic PEG-ES showed significant physiological dehydration at the end of dosing. The four-split PEG-Asc-M regimen is recommended because of its shorter cleansing time without causing serious nausea. Trial registration number UMIN000013103; Results. PMID:27547443

  17. Swine chromosomal DNA quantification by bivariate flow karyotyping and karyotype interpretation.

    PubMed

    Schmitz, A; Chaput, B; Fouchet, P; Guilly, M N; Frelat, G; Vaiman, M

    1992-01-01

    Human and swine chromosomes were analyzed separately and as a mix to obtain bivariate flow karyotypes. They were normalized to each other in order to use the human chromosomal DNA content as standard. Our results led to the characterization of the "DNA line" in swine identical to the human "DNA line." Estimation of the DNA content in mega-base pairs of the swine chromosomes is proposed. Chromosomal assignment to the various resolved peaks on the bivariate swine flow karyotype is suggested from the relation between DNA content quantified by flow cytometry and chromosomal size. Swine chromosomes 1, 13, 6, 5, 10, 16, 11, 18, and Y were assigned to peaks A, B, C, K, L, N, O, Q, and Y, respectively. Peaks D and E were assumed to contain chromosomes 2 and 14, but without specific assignment. Similarly, P and M peaks were expected to correspond to chromosomes 12 and 17. Of the remaining chromosomes (3, 7, X, 8, 15, 9, and 4), chromosomes 3, 7, and X, which were assigned previously to peaks F, G, and H, respectively, led us to deduce that chromosomes 15 and 8 belonged to peaks I and J, and chromosomes 9, 4, and X to peak H.

  18. A semiparametric approach to simultaneous covariance estimation for bivariate sparse longitudinal data.

    PubMed

    Das, Kiranmoy; Daniels, Michael J

    2014-03-01

    Estimation of the covariance structure for irregular sparse longitudinal data has been studied by many authors in recent years but typically using fully parametric specifications. In addition, when data are collected from several groups over time, it is known that assuming the same or completely different covariance matrices over groups can lead to loss of efficiency and/or bias. Nonparametric approaches have been proposed for estimating the covariance matrix for regular univariate longitudinal data by sharing information across the groups under study. For the irregular case, with longitudinal measurements that are bivariate or multivariate, modeling becomes more difficult. In this article, to model bivariate sparse longitudinal data from several groups, we propose a flexible covariance structure via a novel matrix stick-breaking process for the residual covariance structure and a Dirichlet process mixture of normals for the random effects. Simulation studies are performed to investigate the effectiveness of the proposed approach over more traditional approaches. We also analyze a subset of Framingham Heart Study data to examine how the blood pressure trajectories and covariance structures differ for the patients from different BMI groups (high, medium, and low) at baseline.

  19. Recurrent Major Depression and Right Hippocampal Volume: A Bivariate Linkage and Association Study

    PubMed Central

    Mathias, Samuel R.; Knowles, Emma E. M.; Kent, Jack W.; Mckay, D. Reese; Curran, Joanne E.; de Almeida, Marcio A. A.; Dyer, Thomas D.; Göring, Harald H. H.; Olvera, Rene L.; Duggirala, Ravi; Fox, Peter T.; Almasy, Laura; Blangero, John; Glahn, David. C.

    2016-01-01

    Previous work has shown that the hippocampus is smaller in the brains of individuals suffering from major depressive disorder (MDD) than those of healthy controls. Moreover, right hippocampal volume specifically has been found to predict the probability of subsequent depressive episodes. This study explored the utility of right hippocampal volume as an endophenotype of recurrent MDD (rMDD). We observed a significant genetic correlation between the two traits in a large sample of Mexican American individuals from extended pedigrees (ρg = –0.34, p = 0.013). A bivariate linkage scan revealed a significant pleiotropic quantitative trait locus on chromosome 18p11.31-32 (LOD = 3.61). Bivariate association analysis conducted under the linkage peak revealed a variant (rs574972) within an intron of the gene SMCHD1 meeting the corrected significance level (χ2 = 19.0, p = 7.4 × 10–5). Univariate association analyses of each phenotype separately revealed that the same variant was significant for right hippocampal volume alone, and also revealed a suggestively significant variant (rs12455524) within the gene DLGAP1 for rMDD alone. The results implicate right-hemisphere hippocampal volume as a possible endophenotype of rMDD, and in so doing highlight a potential gene of interest for rMDD risk. PMID:26485182

  20. Inheritance of dermatoglyphic traits in twins: univariate and bivariate variance decomposition analysis.

    PubMed

    Karmakar, Bibha; Malkin, Ida; Kobyliansky, Eugene

    2012-01-01

    Dermatoglyphic traits in a sample of twins were analyzed to estimate the resemblance between MZ and DZ twins and to evaluate the mode of inheritance using maximum likelihood-based variance decomposition analysis. The additive genetic variance component was significant in both sexes for four traits--PII, AB_RC, RC_HB, and ATD_L. AB_RC and RC_HB had significant sex differences in means, whereas PII and ATD_L did not. The bivariate variance decomposition analysis revealed that PII and RC_HB were significantly correlated in both the genetic and residual components, and a significant correlation in the additive genetic variance was observed between AB_RC and ATD_L. The same analysis restricted to the female subsample for the three traits RBL, RBR, and AB_DIS showed that the additive genetic component of RBR was significant and the sibling component of AB_DIS was not, while the other components could not be constrained to zero. The additive, sibling, and residual components were all significantly correlated between each pair of traits in the bivariate variance decomposition analysis.

  1. Bivariate pointing movements on large touch screens: investigating the validity of a refined Fitts' Law.

    PubMed

    Bützler, Jennifer; Vetter, Sebastian; Jochems, Nicole; Schlick, Christopher M

    2012-01-01

    On the basis of three empirical studies, Fitts' Law was refined for bivariate pointing tasks on large touch screens. In the first study different target width parameters were investigated. The second study considered the effect of the motion angle. Based on the results of the two studies, a refined model for movement time in human-computer interaction was formulated. A third study, which is described here in detail, concerns the validation of the refined model. For the validation study 20 subjects had to execute a bivariate pointing task on a large touch screen. In the experimental task 250 rectangular target objects were displayed at a randomly chosen position on the screen, covering a broad range of ID values (ID = [1.01; 4.88]). Compared to existing refinements of Fitts' Law, the new model shows the highest predictive validity. A promising field of application of the model is the ergonomic design and evaluation of project management software. By using the refined model, software designers can calculate a priori the appropriate angular position and the size of buttons, menus or icons.
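
    The refined bivariate model itself is not reproduced in the abstract; as a point of reference, the sketch below evaluates the classical (univariate) Fitts' Law in its Shannon formulation, MT = a + b * log2(D/W + 1), with hypothetical regression coefficients a and b.

      import math

      def fitts_movement_time(distance, width, a=0.2, b=0.15):
          # Classical Fitts' Law: MT = a + b * ID with ID = log2(D/W + 1) in bits.
          # a and b are hypothetical regression coefficients (seconds, seconds/bit).
          index_of_difficulty = math.log2(distance / width + 1.0)
          return a + b * index_of_difficulty

      # e.g. a 300 mm reach to a 20 mm wide target
      print(round(fitts_movement_time(300, 20), 3), "s")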

  2. A bivariate space-time downscaler under space and time misalignment

    PubMed Central

    Berrocal, Veronica J.; Gelfand, Alan E.; Holland, David M.

    2010-01-01

    Ozone and particulate matter PM2.5 are co-pollutants that have long been associated with increased public health risks. Information on concentration levels for both pollutants comes from two sources: monitoring sites and output from complex numerical models that produce concentration surfaces over large spatial regions. In this paper, we offer a fully model-based approach for fusing these two sources of information for the pair of co-pollutants which is computationally feasible over large spatial regions and long periods of time. Due to the association between concentration levels of the two environmental contaminants, it is expected that information regarding one will help to improve prediction of the other. Misalignment is an obvious issue since the monitoring networks for the two contaminants only partly intersect and because the collection rate for PM2.5 is typically less frequent than that for ozone. Extending previous work in Berrocal et al. (2009), we introduce a bivariate downscaler that provides a flexible class of bivariate space-time assimilation models. We discuss computational issues for model fitting and analyze a dataset for ozone and PM2.5 for the ozone season during year 2002. We show a modest improvement in predictive performance, not surprising in a setting where we can anticipate only a small gain. PMID:21853015

  3. Xp21 contiguous gene syndromes: Deletion quantitation with bivariate flow karyotyping allows mapping of patient breakpoints

    SciTech Connect

    McCabe, E.R.B.; Towbin, J.A. ); Engh, G. van den; Trask, B.J. )

    1992-12-01

    Bivariate flow karyotyping was used to estimate the deletion sizes for a series of patients with Xp21 contiguous gene syndromes. The deletion estimates were used to develop an approximate scale for the genomic map in Xp21. The bivariate flow karyotype results were compared with clinical and molecular genetic information on the extent of the patients' deletions, and these various types of data were consistent. The resulting map spans >15 Mb, from the telomeric interval between DXS41 (99-6) and DXS68 (1-4) to a position centromeric to the ornithine transcarbamylase locus. The deletion sizing was considered to be accurate to ±1 Mb. The map provides information on the relative localization of genes and markers within this region. For example, the map suggests that the adrenal hypoplasia congenita and glycerol kinase genes are physically close to each other, are within 1-2 Mb of the telomeric end of the Duchenne muscular dystrophy (DMD) gene, and are nearer to the DMD locus than to the more distal marker DXS28 (C7). Information of this type is useful in developing genomic strategies for positional cloning in Xp21. These investigations demonstrate that the DNA from patients with Xp21 contiguous gene syndromes can be valuable reagents, not only for ordering loci and markers but also for providing an approximate scale to the map of the Xp21 region surrounding DMD. 44 refs., 3 figs.

  4. Bivariate Heritability of Total and Regional Brain Volumes: the Framingham Study

    PubMed Central

    DeStefano, Anita L.; Seshadri, Sudha; Beiser, Alexa; Atwood, Larry D.; Massaro, Joe M.; Au, Rhoda; Wolf, Philip A.; DeCarli, Charles

    2009-01-01

    Heritability and genetic and environmental correlations of total and regional brain volumes were estimated from a large, generally healthy, community-based sample, to determine if there are common elements to the genetic influence of brain volumes and white matter hyperintensity volume. There were 1538 Framingham Heart Study participants with brain volume measures from quantitative magnetic resonance imaging (MRI) who were free of stroke and other neurological disorders that might influence brain volumes and who were members of families with at least two Framingham Heart Study participants. Heritability was estimated using variance component methodology and adjusting for the components of the Framingham stroke risk profile. Genetic and environmental correlations between traits were obtained from bivariate analysis. Heritability estimates ranging from 0.46 to 0.60, were observed for total brain, white matter hyperintensity, hippocampal, temporal lobe, and lateral ventricular volumes. Moderate, yet significant, heritability was observed for the other measures. Bivariate analyses demonstrated that relationships between brain volume measures, except for white matter hyperintensity, reflected both moderate to strong shared genetic and shared environmental influences. This study confirms strong genetic effects on brain and white matter hyperintensity volumes. These data extend current knowledge by showing that these two different types of MRI measures do not share underlying genetic or environmental influences. PMID:19812462

  5. Ultrafast-timing lifetime measurements in 94Ru and 96Pd: Breakdown of the seniority scheme in N =50 isotones

    NASA Astrophysics Data System (ADS)

    Mach, H.; Korgul, A.; Górska, M.; Grawe, H.; Matea, I.; Stǎnoiu, M.; Fraile, L. M.; Penionzkevich, Yu. E.; Santos, F. De Oliviera; Verney, D.; Lukyanov, S.; Cederwall, B.; Covello, A.; Dlouhý, Z.; Fogelberg, B.; De France, G.; Gargano, A.; Georgiev, G.; Grzywacz, R.; Lisetskiy, A. F.; Mrazek, J.; Nowacki, F.; Płóciennik, W. A.; Podolyák, Zs.; Ray, S.; Ruchowska, E.; Saint-Laurent, M.-G.; Sawicka, M.; Stodel, Ch.; Tarasov, O.

    2017-01-01

    The advanced time-delayed γγ(t) method has been applied to determine half-lives of low-lying states in the N = 50 isotones 94Ru and 96Pd. The inferred experimental E2 strengths for the 4+ → 2+ transitions in the two nuclei show a dramatic deviation with respect to the shell model predictions in the (f5/2, p, g9/2) proton hole space in 100Sn. The anomalous behavior can be ascribed to a breakdown of the seniority quantum number in the π(g9/2)^n configuration due to particle-hole excitations across the N = Z = 50 shell, as confirmed by large-scale shell model calculations.

  6. Spin-orbit splittings of neutron states in N =20 isotones from covariant density functionals and their extensions

    NASA Astrophysics Data System (ADS)

    Karakatsanis, Konstantinos; Lalazissis, G. A.; Ring, Peter; Litvinova, Elena

    2017-03-01

    Spin-orbit splitting is an essential ingredient for our understanding of the shell structure in nuclei. One of the most important advantages of relativistic mean-field (RMF) models in nuclear physics is the fact that the large spin-orbit (SO) potential emerges automatically from the inclusion of Lorentz-scalar and -vector potentials in the Dirac equation. It is therefore of great importance to compare the results of such models with experimental data. We investigate the size of the 2p and 1f splittings for the isotone chain 40Ca, 38Ar, 36S, and 34Si in the framework of various relativistic and nonrelativistic density functionals. They are compared with the results of nonrelativistic models and with recent experimental data.

  7. Effect of spaceflight on the isotonic contractile properties of single skeletal muscle fibers in the rhesus monkey

    NASA Technical Reports Server (NTRS)

    Fitts, R. H.; Romatowski, J. G.; Blaser, C.; De La Cruz, L.; Gettelman, G. J.; Widrick, J. J.

    2000-01-01

    Experiments from both Cosmos and Space Shuttle missions have shown weightlessness to result in a rapid decline in the mass and force of rat hindlimb extensor muscles. Additionally, despite an increased maximal shortening velocity, peak power was reduced in rat soleus muscle post-flight. In humans, declines in voluntary peak isometric ankle extensor torque ranging from 15-40% have been reported following long- and short-term spaceflight and prolonged bed rest. Complete understanding of the cellular events responsible for the fiber atrophy and the decline in force, as well as the development of effective countermeasures, will require detailed knowledge of how the physiological and biochemical processes of muscle function are altered by spaceflight. The specific purpose of this investigation was to determine the extent to which the isotonic contractile properties of the slow- and fast-twitch fiber types of the soleus and gastrocnemius muscles of rhesus monkeys (Macaca mulatta) were altered by a 14-day spaceflight.

  8. Impact of Isotonic Beverage on the Hydration Status of Healthy Chinese Adults in Air-Conditioned Environment

    PubMed Central

    Siow, Phei Ching; Tan, Wei Shuan Kimberly; Henry, Christiani Jeyakumar

    2017-01-01

    People living in tropical climates spend much of their time in confined air-conditioned spaces, performing normal daily activities. This study investigated the effect of distilled water (W) or isotonic beverage (IB) on the hydration status in subjects living under these conditions. In a randomized crossover design, forty-nine healthy male subjects either consumed W or IB over a period of 8 hours (8 h) in a controlled air-conditioned environment. Blood, urine, and saliva samples were collected at baseline and after 8 h. Hydration status was assessed by body mass, urine output, blood and plasma volume, fluid retention, osmolality, electrolyte concentration and salivary flow rate. In the IB group, urine output (1862 ± 86 mL vs. 2104 ± 98 mL) was significantly lower and more fluids were retained (17% ± 3% vs. 7% ± 3%) as compared to W (p < 0.05) after 8 h. IB also resulted in body mass gain (0.14 ± 0.06 kg), while W led to body mass loss (−0.04 ± 0.05 kg) (p = 0.01). A significantly smaller drop in blood volume and lower free water clearance was observed in IB (−1.18% ± 0.43%; 0.55 ± 0.26 mL/min) compared to W (−2.11% ± 0.41%; 1.35 ± 0.24 mL/min) (p < 0.05). IB increased salivary flow rate (0.54 ± 0.05 vs. 0.62 ± 0.04 g/min). In indoor environments, performing routine activities and even without excessive sweating, isotonic beverages may be more effective at retaining fluids and maintaining hydration status by up to 10% compared to distilled water. PMID:28272337

  9. Impact of Isotonic Beverage on the Hydration Status of Healthy Chinese Adults in Air-Conditioned Environment.

    PubMed

    Siow, Phei Ching; Tan, Wei Shuan Kimberly; Henry, Christiani Jeyakumar

    2017-03-07

    People living in tropical climates spend much of their time in confined air-conditioned spaces, performing normal daily activities. This study investigated the effect of distilled water (W) or isotonic beverage (IB) on the hydration status in subjects living under these conditions. In a randomized crossover design, forty-nine healthy male subjects either consumed W or IB over a period of 8 hours (8 h) in a controlled air-conditioned environment. Blood, urine, and saliva samples were collected at baseline and after 8 h. Hydration status was assessed by body mass, urine output, blood and plasma volume, fluid retention, osmolality, electrolyte concentration and salivary flow rate. In the IB group, urine output (1862 ± 86 mL vs. 2104 ± 98 mL) was significantly lower and more fluids were retained (17% ± 3% vs. 7% ± 3%) as compared to W (p < 0.05) after 8 h. IB also resulted in body mass gain (0.14 ± 0.06 kg), while W led to body mass loss (-0.04 ± 0.05 kg) (p = 0.01). A significantly smaller drop in blood volume and lower free water clearance was observed in IB (-1.18% ± 0.43%; 0.55 ± 0.26 mL/min) compared to W (-2.11% ± 0.41%; 1.35 ± 0.24 mL/min) (p < 0.05). IB increased salivary flow rate (0.54 ± 0.05 vs. 0.62 ± 0.04 g/min). In indoor environments, performing routine activities and even without excessive sweating, isotonic beverages may be more effective at retaining fluids and maintaining hydration status by up to 10% compared to distilled water.

  10. Unilateral fluid absorption and effects on peak power after ingestion of commercially available hypotonic, isotonic, and hypertonic sports drinks.

    PubMed

    Rowlands, David S; Bonetti, Darrell L; Hopkins, Will G

    2011-12-01

    Isotonic sports drinks are often consumed to offset the effects of dehydration and improve endurance performance, but hypotonic drinks may be more advantageous. The purpose of the study was to compare absorption and effects on performance of a commercially available hypotonic sports drink (Mizone Rapid: 3.9% carbohydrate [CHO], 218 mOsmol/kg) with those of an isotonic drink (PowerAde: 7.6% CHO, 281 mOsmol/kg), a hypertonic drink (Gatorade: 6% CHO, 327 mOsmol/kg), and a noncaloric placebo (8 mOsmol/kg). In a crossover, 11 cyclists consumed each drink on separate days at 250 ml/15 min during a 2-hr preload ride at 55% peak power followed by an incremental test to exhaustion. Small to moderate increases in deuterium oxide enrichment in the preload were observed with Mizone Rapid relative to PowerAde, Gatorade, and placebo (differences of 88, 45, and 42 parts per million, respectively; 90% confidence limits ±28). Serum osmolality was moderately lower with Mizone Rapid than with PowerAde and Gatorade (-1.9, -2.4; mOsmol/L; ±1.2 mOsmol/L) but not clearly different vs. placebo. Plasma volume reduction was small to moderate with Mizone Rapid, PowerAde, and Gatorade relative to placebo (-1.9%, -2.5%, -2.9%; ± 2.5%). Gut comfort was highest with Mizone Rapid but clearly different (8.4% ± 4.8%) only vs. PowerAde. Peak power was highest with Mizone Rapid (380 W) vs. placebo and other drinks (1.2-3.0%; 99% confidence limits ±4.7%), but differences were inconclusive with reference to the smallest important effect (~1.2%). The outcomes are consistent with fastest fluid absorption with the hypotonic sports drink. Further research should determine whether the effect has a meaningful impact on performance.

  11. The bovine bivariate flow karyotype and peak identification by chromosome painting with PCR-generated probes.

    PubMed

    Schmitz, A; Oustry, A; Chaput, B; Bahri-Darwich, I; Yerle, M; Millan, D; Frelat, G; Cribiu, E P

    1995-06-01

    A bovine bivariate flow karyotype has been established from a primary fibroblast cell culture carrying a 4;10 Robertsonian translocation. From 27 to 36 populations could be resolved by flow cytometry although the anticipated number was 31. Separation of chromosomal pairs into two populations explains this high resolution and confirms the high level of heteromorphism previously observed. We used a PARM-PCR (Priming Authorizing Random Mismatches) procedure for the production of paint probes from flow-sorted chromosome fractions. These probes were used for chromosome identification by fluorescence in situ hybridization (FISH) on R-banded metaphase spreads. We present the localization of all the bovine chromosome types on the flow karyotype. Twenty-two chromosome types including the translocated chromosome were sorted as pure fractions.

  12. Bayesian bivariate generalized Lindley model for survival data with a cure fraction.

    PubMed

    Martinez, Edson Z; Achcar, Jorge A

    2014-11-01

    The cure fraction models have been widely used to analyze survival data in which a proportion of the individuals is not susceptible to the event of interest. In this article, we introduce a bivariate model for survival data with a cure fraction based on the three-parameter generalized Lindley distribution. The joint distribution of the survival times is obtained by using copula functions. We consider three types of copula function models, the Farlie-Gumbel-Morgenstern (FGM), Clayton and Gumbel-Barnett copulas. The model is implemented under a Bayesian framework, where the parameter estimation is based on Markov Chain Monte Carlo (MCMC) techniques. To illustrate the utility of the model, we consider an application to a real data set related to an invasive cervical cancer study.
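
    The following sketch illustrates, under simplifying assumptions, how a copula couples two cure-fraction survival margins: each margin has the form S(t) = pi + (1 - pi) * S0(t), and a Clayton copula joins them. Exponential latency distributions are used as stand-ins for the generalized Lindley margins, and all parameter values are hypothetical.

      import numpy as np

      def cure_survival(t, pi, rate):
          # Population survival with cure fraction pi and an exponential
          # (stand-in) latency distribution for the susceptible subjects.
          return pi + (1.0 - pi) * np.exp(-rate * t)

      def clayton_copula(u, v, theta):
          # Clayton copula C(u, v) for theta > 0.
          return (u**(-theta) + v**(-theta) - 1.0)**(-1.0 / theta)

      def joint_survival(t1, t2, pi1, pi2, rate1, rate2, theta):
          # Couple the two marginal survival functions with the copula.
          s1 = cure_survival(t1, pi1, rate1)
          s2 = cure_survival(t2, pi2, rate2)
          return clayton_copula(s1, s2, theta)

      print(joint_survival(2.0, 3.0, pi1=0.3, pi2=0.2,
                           rate1=0.5, rate2=0.4, theta=1.5))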

  13. On the joint spectral density of bivariate random sequences. Thesis Technical Report No. 21

    NASA Technical Reports Server (NTRS)

    Aalfs, David D.

    1995-01-01

    For univariate random sequences, the power spectral density acts like a probability density function of the frequencies present in the sequence. This dissertation extends that concept to bivariate random sequences. For this purpose, a function called the joint spectral density is defined that represents a joint probability weighing of the frequency content of pairs of random sequences. Given a pair of random sequences, the joint spectral density is not uniquely determined in the absence of any constraints. Two approaches to constraining the sequences are suggested: (1) assume the sequences are the margins of some stationary random field, (2) assume the sequences conform to a particular model that is linked to the joint spectral density. For both approaches, the properties of the resulting sequences are investigated in some detail, and simulation is used to corroborate theoretical results. It is concluded that under either of these two constraints, the joint spectral density can be computed from the non-stationary cross-correlation.
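
    The dissertation's joint spectral density is its own construction and is not reproduced here; for background, the sketch below only computes the ordinary Welch cross-spectral density between two sequences with SciPy, i.e. the routine frequency-domain counterpart of their cross-correlation.

      import numpy as np
      from scipy import signal

      rng = np.random.default_rng(0)
      fs = 100.0                                  # sampling frequency (Hz), arbitrary
      t = np.arange(0, 60, 1 / fs)
      shared = np.sin(2 * np.pi * 5 * t)          # 5 Hz component common to both series
      x = shared + rng.standard_normal(t.size)
      y = 0.8 * shared + rng.standard_normal(t.size)

      # Welch-averaged cross-spectral density between the two sequences
      f, pxy = signal.csd(x, y, fs=fs, nperseg=1024)
      print(f[np.argmax(np.abs(pxy))])            # peaks near the shared 5 Hz component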

  14. Laser capillary spectrophotometric acquisition of bivariate drop size and concentration data for liquid-liquid dispersion

    DOEpatents

    Tavlarides, L.L.; Bae, J.H.

    1991-12-24

    A laser capillary spectrophotometric technique measures real time or near real time bivariate drop size and concentration distribution for a reactive liquid-liquid dispersion system. The dispersion is drawn into a precision-bore glass capillary and an appropriate light source is used to distinguish the aqueous phase from slugs of the organic phase at two points along the capillary whose separation is precisely known. The suction velocity is measured, as is the length of each slug from which the drop free diameter is calculated. For each drop, the absorptivity at a given wavelength is related to the molar concentration of a solute of interest, and the concentration of given drops of the organic phase is derived from pulse heights of the detected light. This technique permits on-line monitoring and control of liquid-liquid dispersion processes. 17 figures.

  15. Laser capillary spectrophotometric acquisition of bivariate drop size and concentration data for liquid-liquid dispersion

    DOEpatents

    Tavlarides, Lawrence L.; Bae, Jae-Heum

    1991-01-01

    A laser capillary spectrophotometric technique measures real time or near real time bivariate drop size and concentration distribution for a reactive liquid-liquid dispersion system. The dispersion is drawn into a precision-bore glass capillary and an appropriate light source is used to distinguish the aqueous phase from slugs of the organic phase at two points along the capillary whose separation is precisely known. The suction velocity is measured, as is the length of each slug from which the drop free diameter is calculated. For each drop, the absorptivity at a given wavelength is related to the molar concentration of a solute of interest, and the concentration of given drops of the organic phase is derived from pulse heights of the detected light. This technique permits on-line monitoring and control of liquid-liquid dispersion processes.

  16. A bivariate Chebyshev spectral collocation quasilinearization method for nonlinear evolution parabolic equations.

    PubMed

    Motsa, S S; Magagula, V M; Sibanda, P

    2014-01-01

    This paper presents a new method for solving higher order nonlinear evolution partial differential equations (NPDEs). The method combines quasilinearisation, the Chebyshev spectral collocation method, and bivariate Lagrange interpolation. In this paper, we use the method to solve several nonlinear evolution equations, such as the modified KdV-Burgers equation, highly nonlinear modified KdV equation, Fisher's equation, Burgers-Fisher equation, Burgers-Huxley equation, and the Fitzhugh-Nagumo equation. The results are compared with known exact analytical solutions from literature to confirm accuracy, convergence, and effectiveness of the method. There is congruence between the numerical results and the exact solutions to a high order of accuracy. Tables were generated to present the order of accuracy of the method; convergence graphs to verify convergence of the method and error graphs are presented to show the excellent agreement between the results from this study and the known results from literature.

  17. A Bivariate Chebyshev Spectral Collocation Quasilinearization Method for Nonlinear Evolution Parabolic Equations

    PubMed Central

    Motsa, S. S.; Magagula, V. M.; Sibanda, P.

    2014-01-01

    This paper presents a new method for solving higher order nonlinear evolution partial differential equations (NPDEs). The method combines quasilinearisation, the Chebyshev spectral collocation method, and bivariate Lagrange interpolation. In this paper, we use the method to solve several nonlinear evolution equations, such as the modified KdV-Burgers equation, highly nonlinear modified KdV equation, Fisher's equation, Burgers-Fisher equation, Burgers-Huxley equation, and the Fitzhugh-Nagumo equation. The results are compared with known exact analytical solutions from literature to confirm accuracy, convergence, and effectiveness of the method. There is congruence between the numerical results and the exact solutions to a high order of accuracy. Tables were generated to present the order of accuracy of the method; convergence graphs to verify convergence of the method and error graphs are presented to show the excellent agreement between the results from this study and the known results from literature. PMID:25254252

  18. A composite likelihood method for bivariate meta-analysis in diagnostic systematic reviews

    PubMed Central

    Liu, Yulun; Ning, Jing; Nie, Lei; Zhu, Hongjian; Chu, Haitao

    2014-01-01

    Diagnostic systematic review is a vital step in the evaluation of diagnostic technologies. In many applications, it involves pooling pairs of sensitivity and specificity of a dichotomized diagnostic test from multiple studies. We propose a composite likelihood method for bivariate meta-analysis in diagnostic systematic reviews. This method provides an alternative way to make inference on diagnostic measures such as sensitivity, specificity, likelihood ratios and diagnostic odds ratio. Its main advantages over the standard likelihood method are the avoidance of the non-convergence problem, which is non-trivial when the number of studies is relatively small, its computational simplicity, and some robustness to model mis-specification. Simulation studies show that the composite likelihood method maintains high relative efficiency compared to that of the standard likelihood method. We illustrate our method in a diagnostic review of the performance of contemporary diagnostic imaging technologies for detecting metastases in patients with melanoma. PMID:25512146

  19. Bivariate flow cytometric analysis of murine intestinal epithelial cells for cytokinetic studies

    SciTech Connect

    Pallavicini, M.G.; Ng, C.R.; Gray, J.W.

    1984-01-01

    The heterogeneous nature of the small intestine and the lack of methods to obtain pure crypt populations have, in the past, limited the application of standard flow cytometric analysis for cytokinetic studies of the proliferating crypts. The authors describe a flow cytometric technique to discriminate crypt and villus cells in an epithelial cell suspension on the basis of cell length, and to measure the DNA content of the discriminated subpopulations. These data indicate that bivariate analysis of a mixed epithelial cell suspension can be used to distinguish mature villus cells, G1 crypt cells, and S-phase crypt cells. In addition, continuous labeling studies suggest that the position of a cell on the cell length axis reflects epithelial cell maturity. The authors applied this flow cytometric technique to determine the cytokinetic nature of epithelial cells obtained by sequential digestion of the small intestine. 22 references, 4 figures, 2 tables.

  20. Estimating the Correlation in Bivariate Normal Data with Known Variances and Small Sample Sizes

    PubMed Central

    Fosdick, Bailey K.; Raftery, Adrian E.

    2013-01-01

    We consider the problem of estimating the correlation in bivariate normal data when the means and variances are assumed known, with emphasis on the small sample case. We consider eight different estimators, several of them considered here for the first time in the literature. In a simulation study, we found that Bayesian estimators using the uniform and arc-sine priors outperformed several empirical and exact or approximate maximum likelihood estimators in small samples. The arc-sine prior did better for large values of the correlation. For testing whether the correlation is zero, we found that Bayesian hypothesis tests outperformed significance tests based on the empirical and exact or approximate maximum likelihood estimators considered in small samples, but that all tests performed similarly for sample size 50. These results lead us to suggest using the posterior mean with the arc-sine prior to estimate the correlation in small samples when the variances are assumed known. PMID:23378667
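
    A minimal sketch of the known-variance setting discussed above, assuming the data have already been standardized: the posterior of the correlation under an arc-sine prior is evaluated on a grid and its mean is returned. This is only an illustrative numerical-integration version, not the paper's exact estimators.

      import numpy as np

      def posterior_mean_correlation(x, y, grid_size=2001):
          # Bivariate normal likelihood with known means (0) and variances (1),
          # arc-sine prior p(rho) proportional to 1 / sqrt(1 - rho^2).
          rho = np.linspace(-0.999, 0.999, grid_size)
          n = len(x)
          sxx, syy, sxy = np.sum(x * x), np.sum(y * y), np.sum(x * y)
          loglik = (-0.5 * n * np.log(1.0 - rho**2)
                    - (sxx - 2.0 * rho * sxy + syy) / (2.0 * (1.0 - rho**2)))
          logprior = -0.5 * np.log(1.0 - rho**2)
          logpost = loglik + logprior
          w = np.exp(logpost - logpost.max())       # rescale for numerical stability
          return np.sum(rho * w) / np.sum(w)

      rng = np.random.default_rng(1)
      data = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=10)
      print(posterior_mean_correlation(data[:, 0], data[:, 1]))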

  1. A bivariate mixture model analysis of body weight and ascites traits in broilers.

    PubMed

    Zerehdaran, S; van Grevehof, E M; van der Waaij, E H; Bovenhuis, H

    2006-01-01

    The objective of the present study was to use bivariate mixture models to study the relationships between body weight (BW) and ascites indicator traits. Existing data were used from an experiment in which birds were housed in 2 groups under different climate conditions. In the first group, BW, the ratio of right ventricular weight to total ventricular weight (RV:TV), and hematocrit value (HCT) were measured in 4,202 broilers under cold conditions; in the second group, the same traits were measured in 795 birds under normal temperature conditions. Cold-stress conditions were applied to identify individuals that were susceptible to ascites. The RV:TV and HCT were approximately normally distributed under normal temperature conditions, whereas the distributions of these traits were skewed under cold temperature conditions, suggesting different underlying distributions. Fitting a bivariate mixture model to the observations showed that there was only one homogeneous population for ascites traits under normal temperature conditions, whereas there was a mixture of two distributions under cold conditions. One distribution contained nonascitic birds and the other distribution contained ascitic birds. In the distribution of nonascitic birds, the inferred phenotypic correlations (phenotypic correlations with 2 distinguishing underlying distributions) of BW with RV:TV and HCT were close to zero (0.10 and -0.07, respectively), whereas in the distribution of ascitic birds, the inferred phenotypic correlations of BW with RV:TV and HCT were negative (-0.39 and -0.40, respectively). The negative inferred correlations of BW with RV:TV and HCT in the distribution of ascitic birds resulted in negative overall correlations (correlations without 2 distinguishing distributions) of BW with RV:TV (-0.30) and HCT (-0.37) under cold conditions. The present results indicate that the overall correlations between BW and ascites traits are dependent on the relative frequency of ascitic and nonascitic birds.
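
    As a rough illustration of the two-component idea described above (not the authors' model, which handles the skewed cold-stress distributions differently), the sketch below fits a two-component bivariate Gaussian mixture to simulated body weight and hematocrit data and reports each component's weight and within-component correlation.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(2)
      # Hypothetical data: a "nonascitic" cluster with near-zero BW-HCT correlation
      # and a smaller "ascitic" cluster with a negative BW-HCT correlation.
      healthy = rng.multivariate_normal([2500, 30], [[200**2, 0], [0, 3**2]], 800)
      ascitic = rng.multivariate_normal([2300, 38],
                                        [[180**2, -0.4 * 180 * 4],
                                         [-0.4 * 180 * 4, 4**2]], 200)
      data = np.vstack([healthy, ascitic])

      gm = GaussianMixture(n_components=2, covariance_type="full",
                           random_state=0).fit(data)
      for k, cov in enumerate(gm.covariances_):
          corr = cov[0, 1] / np.sqrt(cov[0, 0] * cov[1, 1])
          print(f"component {k}: weight={gm.weights_[k]:.2f}, BW-HCT corr={corr:.2f}")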

  2. Pleiotropic locus for emotion recognition and amygdala volume identified using univariate and bivariate linkage

    PubMed Central

    Knowles, Emma E. M.; McKay, D. Reese; Kent, Jack W.; Sprooten, Emma; Carless, Melanie A.; Curran, Joanne E.; de Almeida, Marcio A. A.; Dyer, Thomas D.; Göring, Harald H. H.; Olvera, Rene; Duggirala, Ravi; Fox, Peter; Almasy, Laura; Blangero, John; Glahn, David. C.

    2014-01-01

    The role of the amygdala in emotion recognition is well established and separately each trait has been shown to be highly heritable, but the potential role of common genetic influences on both traits has not been explored. Here we present an investigation of the pleiotropic influences of amygdala and emotion recognition in a sample of randomly selected, extended pedigrees (N = 858). Using a combination of univariate and bivariate linkage we found a pleiotropic region for amygdala and emotion recognition on 4q26 (LOD = 4.34). Association analysis conducted in the region underlying the bivariate linkage peak revealed a variant meeting the corrected significance level (pBonferroni = 5.01 × 10−5) within an intron of PDE5A (rs2622497, χ2 = 16.67, p = 4.4 × 10−5) as being jointly influential on both traits. PDE5A has been implicated previously in recognition-memory deficits and is expressed in subcortical structures that are thought to underlie memory ability including the amygdala. The present paper extends our understanding of the shared etiology between amygdala and emotion recognition by showing that the overlap between the two traits is due, at least in part, to common genetic influences. Moreover, the present paper identifies a pleiotropic locus for the two traits and an associated variant, which localizes the genetic signal even more precisely. These results, when taken in the context of previous research, highlight the potential utility of PDE5-inhibitors for ameliorating emotion-recognition deficits in populations including, but not exclusively, those individuals suffering from mental or neurodegenerative illness. PMID:25322361

  3. Pathway analysis using random forests with bivariate node-split for survival outcomes

    PubMed Central

    Pang, Herbert; Datta, Debayan; Zhao, Hongyu

    2010-01-01

    Motivation: There is great interest in pathway-based methods for genomics data analysis in the research community. Although machine learning methods, such as random forests, have been developed to correlate survival outcomes with a set of genes, no study has assessed the abilities of these methods in incorporating pathway information for analyzing microarray data. In general, genes that are identified without incorporating biological knowledge are more difficult to interpret. Correlating pathway-based gene expression with survival outcomes may lead to biologically more meaningful prognosis biomarkers. Thus, a comprehensive study on how these methods perform in a pathway-based setting is warranted. Results: In this article, we describe a pathway-based method using random forests to correlate gene expression data with survival outcomes and introduce a novel bivariate node-splitting random survival forests. The proposed method allows researchers to identify important pathways for predicting patient prognosis and time to disease progression, and discover important genes within those pathways. We compared different implementations of random forests with different split criteria and found that bivariate node-splitting random survival forests with log-rank test is among the best. We also performed simulation studies that showed random forests outperforms several other machine learning algorithms and has comparable results with a newly developed component-wise Cox boosting model. Thus, pathway-based survival analysis using machine learning tools represents a promising approach in dissecting pathways and for generating new biological hypothesis from microarray studies. Availability: R package Pwayrfsurvival is available from URL: http://www.duke.edu/∼hp44/pwayrfsurvival.htm Contact: pathwayrf@gmail.com Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19933158

  4. Effects of isotonic and isometric hand exercises on pain, hand functions, dexterity and quality of life in women with rheumatoid arthritis.

    PubMed

    Dogu, Beril; Sirzai, Hulya; Yilmaz, Figen; Polat, Basak; Kuran, Banu

    2013-10-01

    The primary objective of our study was to evaluate the effect of 6-week-long isotonic and isometric hand exercises on pain, hand functions, dexterity and quality of life in women diagnosed as rheumatoid arthritis (RA). Our secondary objective was to assess the changes in handgrip strength and disease activity. This randomized, parallel, single-blinded 6-week intervention study enrolled 52 female patients between 40 and 70 years of age, who were diagnosed with RA according to American College of Rheumatology criteria, had disease duration of at least 1 year and had a stage 1-3 disease according to Steinbrocker's functional evaluation scale. Patients were randomized into isotonic (isotonics) and isometric (isometrics) exercise groups. Exercises were performed over the 6-week study period. All patients received wax therapy in the first 2 weeks. Pain was assessed with a visual analog scale (VAS), hand functions with the Duruöz Hand Index (DHI), dexterity with the nine-hole peg test (NHPT), and quality of life with the Rheumatoid Arthritis Quality of Life questionnaire (RAQoL). Dominant and non-dominant handgrip strengths (HS) were measured. Disease activity was determined by disease activity score (DAS 28). We evaluated the difference in the above parameters between baseline and 6 weeks by Wilcoxon paired test. The study was completed with 47 patients (isotonics n = 23; isometrics n = 24). VAS, DHI, NHPT, and RAQoL scores significantly improved in both groups by the end of 6th week compared to the baseline scores of the study (for isotonics p = 0.036, p = 0.002, p = 0.0001, p = 0.003; for isometrics p = 0.021, p = 0.002, p = 0.005, p = 0.01, respectively). DAS 28 scores decreased in both exercise groups (p = 0.002; p = 0.0001, respectively), while isometrics showed a significant increase in dominant HS (p = 0.029), and isotonics showed a significant increase in non-dominant HS (p = 0.013). This study showed that isometric and isotonic hand exercises decrease pain and disease activity and improve hand functions.

  5. Handgrip and general muscular strength and endurance during prolonged bedrest with isometric and isotonic leg exercise training

    NASA Technical Reports Server (NTRS)

    Greenleaf, J. E.; Starr, J. C.; Van Beaumont, W.; Convertino, V. A.

    1983-01-01

    Measurements of maximal grip strength and endurance at 40 percent max strength were obtained for 7 men 19-21 years of age, 1-2 days before and on the first recovery day during three 2-week bedrest (BR) periods, each separated by a 3-week ambulatory recovery period. The subjects performed isometric exercise (IME) for 1 hr/day, isotonic exercise (ITE) for 1 hr/day, and no exercise (NOE) in the three BR periods. It was found that the mean maximal grip strength was unchanged after all three BR periods. Mean grip endurance was found to be unchanged after IME and ITE training, but was significantly reduced after NOE. These results indicate that IME and ITE training during BR do not increase or decrease maximal grip strength, although they prevent loss of grip endurance, while the maximal strength of all other major muscle groups decreases in proportion to the length of BR to 70 days. The maximal strength reduction of the large muscle groups was found to be about twice that of the small muscle groups during BR. In addition, it is shown that changes in maximal strength after spaceflight, BR, or water immersion deconditioning cannot be predicted from changes in submaximal or maximal oxygen uptake values.

  6. Isokinetic Strength and Endurance During 30-day 6 deg Head-Down Bed Rest with Isotonic and Isokinetic Exercise Training

    NASA Technical Reports Server (NTRS)

    Greenleaf, J. E.; Bernauer, E. M.; Ertl, A. C.; Bond, M.; Bulbulian, R.

    1994-01-01

    The purpose of our study was to determine if an intensive, intermittent, isokinetic, lower extremity exercise training program would attenuate or eliminate the decrease of muscular strength and endurance during prolonged bed rest (BR). The 19 male subjects (36 +/- 1 yr, 178 +/- 2 cm, 76.5 +/- 1.7 kg) were allocated into a no exercise (NOE) training group (N = 5), an isotonic (lower extremity cycle ergometer) exercise (ITE) training group (N = 7), and an isokinetic (isokinetic knee flexion-extension) exercise (IKE) training group (N = 7). Peak knee (flexion and extension) and shoulder (abduction-adduction) functions were measured weekly in all groups with one 5-repetition set. After BR, average knee extension total work decreased by 16% with NOE, increased by 27% with IKE, and was unchanged with ITE. Average knee flexion total work and peak torque (strength) responses were unchanged in all groups. Force production increased by 20% with IKE and was unchanged with NOE and ITE. Shoulder total work was unchanged in all groups, while gross average peak torque increased by 27% with ITE and by 22% with IKE, and was unchanged with NOE. Thus, while ITE training can maintain some isokinetic functions during BR, maximal intermittent IKE training can increase other functions above pre-BR control levels.

  7. Inhaled clemastine, an H1 antihistamine inhibits airway narrowing caused by aerosols of non-isotonic saline.

    PubMed

    Rodwell, L T; Anderson, S D; Seale, J P

    1991-10-01

    Asthmatic subjects were challenged with aerosols of hyper- and hypotonic saline 15 min (Group A) and 90 min (Group B) after inhaling clemastine. Measurements were made of forced expiratory volume in one second (FEV1) before and after medication and after challenge. When the FEV1 values (% predicted) were compared on the active and placebo days they were higher 15 min after clemastine (p less than 0.05) for both challenges and higher 90 min after clemastine inhalation (p less than 0.05) for the hypertonic challenge. The % fall in FEV1 was compared after the same concentration of saline aerosol had been given on both active and placebo days. In Group A the % fall in FEV1 on the clemastine day was reduced after challenge with hypertonic (p less than 0.02) and hypotonic (p less than 0.03) aerosol. In Group B there was a reduction in the % fall in FEV1 on the clemastine day only after hypertonic challenge (p less than 0.04). The protective effect afforded by clemastine was unrelated to change in baseline lung function. We conclude that histamine is an important mediator of the airway response to non-isotonic aerosols and suggest that the aerosol route of administration may be useful for delivering antihistamines.

  8. Comparison of hypertonic and isotonic reference electrode junctions for measuring ionized calcium in whole blood: a clinical study.

    PubMed

    Masters, P W; Payne, R B

    1993-06-01

    We measured ionized calcium concentrations in whole blood from 91 patients who had no clinical or biochemical evidence of disturbed calcium homeostasis and who had a wide range of serum albumin concentrations. We used both a standard Ciba-Corning 634 analyzer, which has a membrane-restricted saturated KCl reference electrode bridge, and a modified instrument with a 150 mmol/L NaCl bridge. After adjusting the externally standardized values from each instrument for their least-squares regressions on pH, there was a significant correlation between ionized calcium and albumin only with the standard analyzer. In contrast, only values from the modified instrument correlated with serum chloride; this was not explained by ionic strength or organic anion interferences. We conclude that there is unlikely to be any major advantage in using a membrane-restricted isotonic NaCl reference electrode for in vitro clinical measurements, although it may be of value for in vivo monitoring. The importance of measuring serum albumin when using most commercial ionized calcium analyzers is emphasized.

  9. Level lifetimes and quadrupole moments from Coulomb excitation in the Ba chain and the N = 80 isotones

    NASA Astrophysics Data System (ADS)

    Bauer, C.; Guastalla, G.; Leske, J.; Möller, O.; Möller, T.; Pakarinen, J.; Pietralla, N.; Rainovski, G.; Rapisarda, E.; Seweryniak, D.; Stahl, C.; Stegmann, R.; Wiederhold, J.; Zhu, S.

    2012-12-01

    The chain of barium isotopes enables us to study experimentally the evolution of nuclear quadrupole collectivity from the shell closure at N = 82 towards neutron-deficient or neutron-rich deformed nuclei. The TU Darmstadt group has investigated several nuclei from stable 130,132Ba up to radioactive 140,142Ba with the projectile Coulomb excitation technique, including the use of the Doppler-shift attenuation method (DSAM). Lifetimes of quadrupole-collective states of 132Ba and 140Ba were obtained for the first time, as well as the static electric quadrupole moments Q(2_1^+) for 130,132Ba and 140,142Ba. The results are compared to Monte Carlo shell model and beyond-mean-field calculations. The phenomenon of shell stabilization in the N = 80 isotones is further investigated by measurements of the B(E2; 2_1^+ → 0_1^+) values of 140Nd and 142Sm and comparison to the quasi-particle phonon model and shell-model calculations.

  10. Isokinetic strength and endurance during 30-day 6 degrees head-down bed rest with isotonic and isokinetic exercise training

    NASA Technical Reports Server (NTRS)

    Greenleaf, J. E.; Bernauer, E. M.; Ertl, A. C.; Bulbulian, R.; Bond, M.

    1994-01-01

    The purpose of our study was to determine if an intensive, intermittent, isokinetic, lower extremity exercise training program would attenuate or eliminate the decrease of muscular strength and endurance during prolonged bed rest (BR). The 19 male subjects (36 +/- 1 yr, 178 +/- 2 cm, 76.5 +/- 1.7 kg) were allocated into a no exercise (NOE) training group (N = 5), an isotonic (lower extremity cycle ergometer) exercise (ITE) training group (N = 7), and an isokinetic (isokinetic knee flexion-extension) exercise (IKE) training group (N = 7). Peak knee (flexion and extension) and shoulder (abduction-adduction) functions were measured weekly in all groups with one 5-repetition set. After BR, average knee extension total work decreased by 16% with NOE, increased by 27% with IKE, and was unchanged with ITE. Average knee flexion total work and peak torque (strength) responses were unchanged in all groups. Force production increased by 20% with IKE and was unchanged with NOE and ITE. Shoulder total work was unchanged in all groups, while gross average peak torque increased by 27% with ITE and by 22% with IKE, and was unchanged with NOE. Thus, while ITE training can maintain some isokinetic functions during BR, maximal intermittent IKE training can increase other functions above pre-BR control levels.

  11. Antioxidant Activity, Total Phenolics Content, Anthocyanin, and Color Stability of Isotonic Model Beverages Colored with Andes Berry (Rubus glaucus Benth) Anthocyanin Powder

    PubMed Central

    Estupiñan, D.C.; Schwartz, S.J.; Garzón, G.A.

    2013-01-01

    The stability of anthocyanin (ACN) freeze-dried powders from Andes berry (Rubus glaucus Benth), as affected by storage, addition of maltodextrin as a carrier agent, and illumination, was evaluated in isotonic model beverages. The ethanolic ACN extract was freeze-dried with and without maltodextrin DE 20. Isotonic model beverages were colored with freeze-dried ACN powder (FDA), freeze-dried ACN powder with maltodextrin (MFDA), and red No. 40. Beverages were stored in the dark and under the effect of illumination. Half-life of the ACNs, changes in color, total phenolics content (TPC), and antioxidant activity were analyzed for 71 d. Addition of maltodextrin and absence of light stabilized the color of beverages and improved ACN and TPC stability during storage. The antioxidant activity of the beverages was higher when they were colored with MFDA and highly correlated with ACN content. There was no correlation between antioxidant activity and TPC. It is concluded that addition of maltodextrin DE 20 as a carrier agent during freeze-drying improves the color and stability of nutraceutical antioxidants present in Andes berry extract. This suggests a protective enclosing of ACNs within a maltodextrin matrix, with a resulting powder that could serve as a supplement or additive to naturally color and to enhance the antioxidant capacity of isotonic beverages. PMID:21535712

  12. Analysis of meteorological droughts for the Saskatchewan River Basin using univariate and bivariate approaches

    NASA Astrophysics Data System (ADS)

    Masud, M. B.; Khaliq, M. N.; Wheater, H. S.

    2015-03-01

    This study is focused on the Saskatchewan River Basin (SRB) that spans southern parts of Alberta, Saskatchewan and Manitoba, the three Prairie Provinces of Canada, where most of the country's agricultural activities are concentrated. The SRB is confronted with immense water-related challenges and is now one of the ten GEWEX (Global Energy and Water Exchanges) Regional Hydroclimate Projects in the world. In the past, various multi-year droughts have been observed in this part of Canada that impacted agriculture, energy and socio-economic sectors. Therefore, proper understanding of the spatial and temporal characteristics of historical droughts is important for many water resources planning and management related activities across the basin. In the study, observed gridded data of daily precipitation and temperature and conventional univariate and copula-based bivariate frequency analyses are used to characterize drought events in terms of drought severity and duration on the basis of two drought indices, the Standardized Precipitation Index (SPI) and the Standardized Precipitation Evapotranspiration Index (SPEI). Within the framework of univariate and bivariate analyses, drought risk indicators are developed and mapped across the SRB to delineate the most vulnerable parts of the basin. Based on the results obtained, southern parts of the SRB (i.e., western part of the South Saskatchewan River, Seven Persons Creek and Bigstick Lake watersheds) are associated with a higher drought risk, while moderate risk is noted for the North Saskatchewan River (except its eastern parts), Red Deer River, Oldman River, Bow River, Sounding Creek, Carrot River and Battle River watersheds. Lower drought risk is found for the areas surrounding the Saskatchewan-Manitoba border (particularly, the Saskatchewan River watershed). It is also found that the areas characterized by higher drought severity are also associated with higher drought duration. A comparison of SPI- and SPEI-based drought characteristics is also presented.
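
    The SPI referred to above is obtained by mapping an accumulated precipitation series through a fitted gamma distribution onto standard-normal quantiles. The sketch below is a deliberately simplified version (operational SPI is fitted per calendar month and treats zero totals separately) applied to hypothetical monthly data.

      import numpy as np
      from scipy import stats

      def spi(precip):
          # Fit a gamma distribution to the accumulated precipitation series and
          # map its CDF values to standard-normal quantiles (simplified SPI).
          shape, loc, scale = stats.gamma.fit(precip, floc=0.0)
          cdf = stats.gamma.cdf(precip, shape, loc=loc, scale=scale)
          return stats.norm.ppf(cdf)

      rng = np.random.default_rng(3)
      monthly_precip = rng.gamma(shape=2.0, scale=25.0, size=360)  # hypothetical 30 years
      print(spi(monthly_precip)[:5])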

  13. Dynamics of intracranial electroencephalographic recordings from epilepsy patients using univariate and bivariate recurrence networks

    NASA Astrophysics Data System (ADS)

    Subramaniyam, Narayan Puthanmadam; Hyttinen, Jari

    2015-02-01

    Recently Andrzejak et al. combined the randomness and nonlinear independence test with iterative amplitude adjusted Fourier transform (iAAFT) surrogates to distinguish between the dynamics of seizure-free intracranial electroencephalographic (EEG) signals recorded from epileptogenic (focal) and nonepileptogenic (nonfocal) brain areas of epileptic patients. However, stationarity is a part of the null hypothesis for iAAFT surrogates and thus nonstationarity can violate the null hypothesis. In this work we first propose the application of the randomness and nonlinear independence test based on recurrence network measures to distinguish between the dynamics of focal and nonfocal EEG signals. Furthermore, we combine these tests with both iAAFT and truncated Fourier transform (TFT) surrogate methods; the TFT method also preserves the nonstationarity of the original data in the surrogates along with its linear structure. Our results indicate that focal EEG signals exhibit an increased degree of structural complexity and interdependency compared to nonfocal EEG signals. In general, we find higher rejections for randomness and nonlinear independence tests for focal EEG signals compared to nonfocal EEG signals. In particular, the univariate recurrence network measures, the average clustering coefficient C and assortativity R, and the bivariate recurrence network measure, the average cross-clustering coefficient Ccross, can successfully distinguish between the focal and nonfocal EEG signals, even when the analysis is restricted to nonstationary signals, irrespective of the type of surrogates used. On the other hand, we find that the univariate recurrence network measures, the average path length L and the average betweenness centrality BC, fail to distinguish between the focal and nonfocal EEG signals when iAAFT surrogates are used. However, these two measures can distinguish between focal and nonfocal EEG signals when TFT surrogates are used for nonstationary signals. We also
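
    A minimal sketch (not the authors' code) of how the univariate recurrence-network measures named above, such as the average clustering coefficient C and the assortativity R, can be computed from a scalar signal. The synthetic series, the 5% recurrence-rate threshold, and the omission of time-delay embedding are all simplifying assumptions for illustration.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
x = np.sin(0.2 * np.arange(500)) + 0.3 * rng.standard_normal(500)  # synthetic "EEG-like" series

# recurrence matrix: two samples are linked if their states are closer than epsilon
dist = np.abs(x[:, None] - x[None, :])
eps = np.quantile(dist[np.triu_indices_from(dist, k=1)], 0.05)     # fixed 5% recurrence rate (assumed)
A = (dist <= eps).astype(int)
np.fill_diagonal(A, 0)                                             # no self-loops

G = nx.from_numpy_array(A)
C = nx.average_clustering(G)                                       # average clustering coefficient
R = nx.degree_assortativity_coefficient(G)                         # assortativity
L = nx.average_shortest_path_length(G) if nx.is_connected(G) else float("nan")
print(f"C = {C:.3f}, R = {R:.3f}, L = {L}")
```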

  14. Application of continuous normal-lognormal bivariate density functions in a sensitivity analysis of municipal solid waste landfill.

    PubMed

    Petrovic, Igor; Hip, Ivan; Fredlund, Murray D

    2016-09-01

    The variability of untreated municipal solid waste (MSW) shear strength parameters, namely cohesion and shear friction angle, with respect to waste stability problems, is of primary concern due to the strong heterogeneity of MSW. A large number of MSW shear strength parameters (friction angle and cohesion) were collected from published literature and analyzed. The basic statistical analysis has shown that the central tendency of both shear strength parameters fits reasonably well within the ranges of recommended values proposed by different authors. In addition, it was established that the correlation between shear friction angle and cohesion is not strong but it still remained significant. Through use of a distribution fitting method, it was found that the shear friction angle could be fitted by a normal probability density function, while cohesion follows a log-normal density function. The continuous normal-lognormal bivariate density function was therefore selected as an adequate model to ascertain rational boundary values ("confidence interval") for MSW shear strength parameters. It was concluded that a curve with a 70% confidence level generates a "confidence interval" within reasonable limits. With respect to the decomposition stage of the waste material, three different ranges of appropriate shear strength parameters were indicated. The defined parameters were then used as input parameters for an Alternative Point Estimated Method (APEM) stability analysis on a real case scenario of the Jakusevec landfill. The Jakusevec landfill is the disposal site of the capital of Croatia - Zagreb. The analysis shows that in the case of a dry landfill the most significant factor influencing the safety factor was the shear friction angle of old, decomposed waste material, while in the case of a landfill with significant leachate level the most significant factor influencing the safety factor was the cohesion of old, decomposed waste material. The
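
    As a toy illustration of the bivariate model above, the sketch below samples a normally distributed friction angle together with a lognormally distributed cohesion, coupling them through a bivariate normal draw on the (friction angle, log-cohesion) scale. All means, standard deviations, and the correlation are assumed placeholder values, not the fitted parameters of the study.

```python
import numpy as np

rng = np.random.default_rng(0)

mu_phi, sd_phi = 28.0, 7.0            # friction angle [deg], modeled as normal (assumed values)
mu_lnc, sd_lnc = 2.5, 0.6             # log of cohesion [kPa], so cohesion is lognormal (assumed)
rho = 0.3                             # correlation on the (phi, ln c) scale (assumed sign and size)

cov = [[sd_phi**2,             rho * sd_phi * sd_lnc],
       [rho * sd_phi * sd_lnc, sd_lnc**2]]
z = rng.multivariate_normal([mu_phi, mu_lnc], cov, size=10_000)

phi = z[:, 0]                         # normal friction angle
c = np.exp(z[:, 1])                   # lognormal cohesion

# crude marginal bounds as a stand-in for the joint "confidence interval" discussed above
print("phi 15th-85th pct:", np.percentile(phi, [15, 85]).round(1))
print("c   15th-85th pct:", np.percentile(c, [15, 85]).round(1))
```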

  15. Improving the chi-squared approximation for bivariate normal tolerance regions

    NASA Technical Reports Server (NTRS)

    Feiveson, Alan H.

    1993-01-01

    Let X be a two-dimensional random variable distributed according to N2(mu,Sigma) and let bar-X and S be the respective sample mean and covariance matrix calculated from N observations of X. Given a containment probability beta and a level of confidence gamma, we seek a number c, depending only on N, beta, and gamma such that the ellipsoid R = (x: (x - bar-X)'S(exp -1) (x - bar-X) less than or = c) is a tolerance region of content beta and level gamma; i.e., R has probability gamma of containing at least 100 beta percent of the distribution of X. Various approximations for c exist in the literature, but one of the simplest to compute -- a multiple of the ratio of certain chi-squared percentage points -- is badly biased for small N. For the bivariate normal case, most of the bias can be removed by simple adjustment using a factor A which depends on beta and gamma. This paper provides values of A for various beta and gamma so that the simple approximation for c can be made viable for any reasonable sample size. The methodology provides an illustrative example of how a combination of Monte-Carlo simulation and simple regression modelling can be used to improve an existing approximation.
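
    The constant c can also be calibrated directly by simulation, which is in the spirit of the Monte Carlo component of the paper (though not its regression-based adjustment). The sketch below, with assumed N, beta, and gamma, estimates the smallest c such that the sample ellipsoid contains at least 100 beta percent of the population with probability gamma.

```python
import numpy as np

rng = np.random.default_rng(0)
N, beta, gamma = 10, 0.90, 0.95                 # sample size, content, confidence (assumed values)
reps, pop = 2000, 20000                         # Monte Carlo repetitions and population draws

pop_sample = rng.standard_normal((pop, 2))      # WLOG the true distribution is N2(0, I)
c_required = np.empty(reps)
for r in range(reps):
    x = rng.standard_normal((N, 2))             # one sample of size N
    xbar, S = x.mean(axis=0), np.cov(x, rowvar=False)
    d = pop_sample - xbar
    md2 = np.einsum('ij,jk,ik->i', d, np.linalg.inv(S), d)   # (x - xbar)' S^-1 (x - xbar)
    c_required[r] = np.quantile(md2, beta)      # smallest c giving content >= beta for this sample
c_mc = np.quantile(c_required, gamma)           # smallest c achieving confidence gamma overall
print(f"Monte Carlo tolerance factor c ~ {c_mc:.2f}")
```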

  16. Modeling both of the number of pausibacillary and multibacillary leprosy patients by using bivariate poisson regression

    NASA Astrophysics Data System (ADS)

    Winahju, W. S.; Mukarromah, A.; Putri, S.

    2015-03-01

    Leprosy is a chronic infectious disease caused by the leprosy bacterium (Mycobacterium leprae). Leprosy has become an important issue in Indonesia because its morbidity is quite high. Based on WHO data from 2014, in 2012 Indonesia had the highest number of new leprosy patients after India and Brazil, with a contribution of 18,994 people (8.7% of the world total). This automatically places Indonesia as the country with the highest leprosy morbidity among ASEAN countries. The province that contributes most to the number of leprosy patients in Indonesia is East Java. There are two kinds of leprosy: paucibacillary and multibacillary. The morbidity of multibacillary leprosy is higher than that of paucibacillary leprosy. This paper discusses modeling the numbers of multibacillary and paucibacillary leprosy patients as response variables. These responses are count variables, so the modeling is conducted using the bivariate Poisson regression method. The observation units are located in East Java, and the predictors involved are environment, demography, and poverty. The model uses data from 2012, and the results indicate that all predictors have a significant influence.
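
    For readers unfamiliar with the bivariate Poisson model, the sketch below simulates the usual common-shock ("trivariate reduction") construction with log-linear regressions for each count; the covariates, coefficients, and shared-component rate are hypothetical values, not the paper's fitted East Java model.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
x = rng.uniform(size=(n, 2))            # two hypothetical area-level covariates (e.g. poverty, sanitation)

beta1 = np.array([0.5, 0.8])            # assumed coefficients for paucibacillary counts
beta2 = np.array([0.2, 1.1])            # assumed coefficients for multibacillary counts
lam1, lam2 = np.exp(x @ beta1), np.exp(x @ beta2)   # log links, as in Poisson regression
lam0 = 0.3                              # shared component that induces correlation between the counts

y0 = rng.poisson(lam0, size=n)
y1 = rng.poisson(lam1) + y0             # paucibacillary count: independent part + common shock
y2 = rng.poisson(lam2) + y0             # multibacillary count: independent part + common shock
print("correlation between the two counts:", np.corrcoef(y1, y2)[0, 1].round(3))
```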

  17. Bivariate Frequency Analysis with Nonstationary Gumbel/GEV Marginal Distributions for Rainfall Event

    NASA Astrophysics Data System (ADS)

    Joo, Kyungwon; Kim, Sunghun; Kim, Hanbeen; Ahn, Hyunjun; Heo, Jun-Haeng

    2016-04-01

    Multivariate frequency analysis has been developing for hydrological data recently. In particular, the copula model has been used as an effective method which has no limitation on the choice of marginal distributions. Time-series rainfall data can be characterized as rainfall events by an inter-event time definition, and each rainfall event has a rainfall depth and duration. In addition, changes in rainfall depth have been studied recently due to climate change. Nonstationary (time-varying) Gumbel and Generalized Extreme Value (GEV) distributions have been developed and their performances have been investigated in many studies. In the current study, bivariate frequency analysis has been performed for rainfall depth and duration using an Archimedean copula on stationary and nonstationary hourly rainfall data to consider the effect of climate change. The parameter of the copula model is estimated by the inference functions for margins (IFM) method, and stationary/nonstationary Gumbel and GEV distributions are used as marginal distributions. As a result, the level curve of the copula model is obtained and a goodness-of-fit test is performed to choose the appropriate marginal distribution among the applied stationary and nonstationary Gumbel and GEV distributions.
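
    A compressed sketch of that workflow on synthetic data: stationary GEV margins are fitted by maximum likelihood, the observations are transformed to pseudo-uniform scores, and a Gumbel-Hougaard copula parameter is recovered from Kendall's tau (a moment-type shortcut standing in for the IFM likelihood step). The data, the stationary margins, and the copula family are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
depth = stats.genextreme.rvs(c=-0.1, loc=40, scale=15, size=200, random_state=rng)
duration = 0.2 * depth + stats.genextreme.rvs(c=0.0, loc=6, scale=2, size=200, random_state=rng)

# 1) marginal GEV fits (maximum likelihood)
par_d = stats.genextreme.fit(depth)
par_t = stats.genextreme.fit(duration)

# 2) probability-integral transform to pseudo-uniform margins
u = stats.genextreme.cdf(depth, *par_d)
v = stats.genextreme.cdf(duration, *par_t)

# 3) copula dependence parameter from Kendall's tau (Gumbel-Hougaard: theta = 1/(1 - tau), tau >= 0)
tau, _ = stats.kendalltau(depth, duration)
theta = 1.0 / (1.0 - tau)

# joint non-exceedance probability at the sample medians, from the fitted copula
C_med = np.exp(-((-np.log(np.median(u))) ** theta + (-np.log(np.median(v))) ** theta) ** (1.0 / theta))
print(f"tau = {tau:.2f}, theta = {theta:.2f}, C(median, median) = {C_med:.2f}")
```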

  18. Pig standard bivariate flow karyotype and peak assignment for chromosomes X, Y, 3, and 7.

    PubMed

    Schmitz, A; Chardon, P; Gainche, I; Chaput, B; Guilly, M N; Frelat, G; Vaiman, M

    1992-10-01

    A standard pig flow karyotype (2N = 38 chromosomes) was defined by standardization of several flow karyotypes obtained from stimulated peripheral blood lymphocytes of normal male and female pigs. Depending on the animals under study, the flow analysis of their chromosome suspensions gave rise to bivariate flow karyotypes comprising from 15 to 17 peaks, of which 11 to 15 represented single chromosomes. The results were used to propose a peak nomenclature. In addition, a male miniature pig lymphoblastoid cell line was characterized by flow cytogenetics. A very high-resolution flow karyotype, in which all peaks but one superimposed on those of the standard karyotype, was obtained. Peaks were assigned for chromosomes X and Y. Analysis of flow karyotypes obtained from translocated t(3,7)(p1.3;q2.1) pigs combined with polymerase chain reaction (PCR) studies of major histocompatibility complex (MHC)-linked sequences on flow-sorted chromosomes allowed identification of peaks 3 and 7 of normal pig chromosomes and of the derivative chromosomes associated with the t(3,7)(p1.3;q2.1) translocation.

  19. Semiparametric bivariate zero-inflated Poisson models with application to studies of abundance for multiple species

    USGS Publications Warehouse

    Arab, Ali; Holan, Scott H.; Wikle, Christopher K.; Wildhaber, Mark L.

    2012-01-01

    Ecological studies involving counts of abundance, presence–absence or occupancy rates often produce data having a substantial proportion of zeros. Furthermore, these types of processes are typically multivariate and only adequately described by complex nonlinear relationships involving externally measured covariates. Ignoring these aspects of the data and implementing standard approaches can lead to models that fail to provide adequate scientific understanding of the underlying ecological processes, possibly resulting in a loss of inferential power. One method of dealing with data having excess zeros is to consider the class of univariate zero-inflated generalized linear models. However, this class of models fails to address the multivariate and nonlinear aspects associated with the data usually encountered in practice. Therefore, we propose a semiparametric bivariate zero-inflated Poisson model that takes into account both of these data attributes. The general modeling framework is hierarchical Bayes and is suitable for a broad range of applications. We demonstrate the effectiveness of our model through a motivating example on modeling catch per unit area for multiple species using data from the Missouri River Benthic Fishes Study, implemented by the United States Geological Survey.

  20. A bivariate binormal ROC methodology for comparing new methods to an existing standard for screening applications

    NASA Astrophysics Data System (ADS)

    Abbey, Craig K.; Insana, Michael F.; Eckstein, Miguel P.; Boone, John M.

    2007-03-01

    Validating the use of new imaging technologies for screening large patient populations is an important and very challenging area of diagnostic imaging research. A particular concern in ROC studies evaluating screening technologies is the problem of verification bias, in which an independent verification of disease status is only available for a subpopulation of patients, typically those with positive results by a current screening standard. For example, in screening mammography, a study might evaluate a new approach using a sample of patients that have undergone needle biopsy following a standard mammogram and subsequent work-up. This case sampling approach provides accurate independent verification of ground truth and increases the prevalence of disease cases. However, the selection criteria will likely bias results of the study. In this work we present an initial exploration of an approach to correcting this bias within the parametric framework of binormal assumptions. We posit conditionally bivariate normal distributions on the latent decision variable for both the new methodology as well as the screening standard. In this case, verification bias can be seen as the effect of missing data from an operating point in the screening standard. We examine the magnitude of this bias in the setting of breast cancer screening with mammography, and we derive a maximum likelihood approach to estimating bias corrected ROC curves in this model.
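
    A toy simulation of the mechanism, under assumed binormal parameters: the screening standard and the new test are conditionally bivariate normal, only cases positive on the standard are verified, and the AUC of the new test computed on verified cases is compared with its population value. Every number below is an assumption for illustration.

```python
import numpy as np

def auc(pos, neg):
    """Empirical AUC via the Mann-Whitney statistic."""
    pos, neg = np.asarray(pos)[:, None], np.asarray(neg)[None, :]
    return (pos > neg).mean() + 0.5 * (pos == neg).mean()

rng = np.random.default_rng(0)
rho = 0.5                                     # assumed correlation between standard and new test
cov = np.array([[1.0, rho], [rho, 1.0]])
dis = rng.multivariate_normal([1.2, 1.5], cov, size=2000)     # [standard, new], diseased cases
non = rng.multivariate_normal([0.0, 0.0], cov, size=20000)    # [standard, new], non-diseased cases

auc_population = auc(dis[:, 1], non[:, 1])    # new test evaluated on everyone

t = 1.0                                       # operating point of the screening standard
dis_v, non_v = dis[dis[:, 0] > t], non[non[:, 0] > t]         # verified (e.g. biopsied) subsets
auc_verified = auc(dis_v[:, 1], non_v[:, 1])  # naive estimate subject to verification bias

print(f"population AUC = {auc_population:.3f}, verification-biased AUC = {auc_verified:.3f}")
```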

  1. On a bivariate spectral relaxation method for unsteady magneto-hydrodynamic flow in porous media.

    PubMed

    Magagula, Vusi M; Motsa, Sandile S; Sibanda, Precious; Dlamini, Phumlani G

    2016-01-01

    The paper presents a significant improvement to the implementation of the spectral relaxation method (SRM) for solving nonlinear partial differential equations that arise in the modelling of fluid flow problems. Previously the SRM utilized the spectral method to discretize derivatives in space and finite differences to discretize in time. In this work we seek to improve the performance of the SRM by applying the spectral method to discretize derivatives in both space and time variables. The new approach combines the relaxation scheme of the SRM, bivariate Lagrange interpolation as well as the Chebyshev spectral collocation method. The technique is tested on a system of four nonlinear partial differential equations that model unsteady three-dimensional magneto-hydrodynamic flow and mass transfer in a porous medium. Computed solutions are compared with previously published results obtained using the SRM, the spectral quasilinearization method and the Keller-box method. There is clear evidence that the new approach produces results that are as good as, if not better than, published results determined using the other methods. The main advantage of the new approach is that it offers better accuracy on coarser grids, which significantly improves the computational speed of the method. The technique also leads to faster convergence to the required solution.
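
    The building block of such a space-time spectral discretization is the Chebyshev differentiation matrix. The sketch below constructs it with the standard formula (as in Trefethen's Spectral Methods in MATLAB) and checks it on a smooth function; it is generic collocation machinery, not the authors' SRM implementation.

```python
import numpy as np

def cheb(N):
    """Chebyshev points on [-1, 1] and the (N+1)x(N+1) differentiation matrix."""
    if N == 0:
        return np.zeros((1, 1)), np.array([1.0])
    x = np.cos(np.pi * np.arange(N + 1) / N)
    c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** np.arange(N + 1)
    X = np.tile(x, (N + 1, 1)).T
    dX = X - X.T
    D = np.outer(c, 1.0 / c) / (dX + np.eye(N + 1))   # off-diagonal entries
    D -= np.diag(D.sum(axis=1))                        # diagonal: negative row sums
    return D, x

D, x = cheb(16)
u = np.exp(np.sin(np.pi * x))
du_exact = np.pi * np.cos(np.pi * x) * u
print("max derivative error:", np.abs(D @ u - du_exact).max())   # spectrally small
```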

  2. Copula-based regression modeling of bivariate severity of temporary disability and permanent motor injuries.

    PubMed

    Ayuso, Mercedes; Bermúdez, Lluís; Santolino, Miguel

    2016-04-01

    The analysis of factors influencing the severity of the personal injuries suffered by victims of motor accidents is an issue of major interest. Yet, most of the extant literature has tended to address this question by focusing on either the severity of temporary disability or the severity of permanent injury. In this paper, a bivariate copula-based regression model for temporary disability and permanent injury severities is introduced for the joint analysis of the relationship with the set of factors that might influence both categories of injury. Using a motor insurance database with 21,361 observations, the copula-based regression model is shown to give a better performance than that of a model based on the assumption of independence. The inclusion of the dependence structure in the analysis has a higher impact on the variance estimates of the injury severities than it does on the point estimates. By taking into account the dependence between temporary and permanent severities a more extensive factor analysis can be conducted. We illustrate that the conditional distribution functions of injury severities may be estimated, thus, providing decision makers with valuable information.

  3. Bayesian Data Analysis with the Bivariate Hierarchical Ornstein-Uhlenbeck Process Model.

    PubMed

    Oravecz, Zita; Tuerlinckx, Francis; Vandekerckhove, Joachim

    2016-01-01

    In this paper, we propose a multilevel process modeling approach to describing individual differences in within-person changes over time. To characterize changes within an individual, repeated measures over time are modeled in terms of three person-specific parameters: a baseline level, intraindividual variation around the baseline, and regulatory mechanisms adjusting toward baseline. Variation due to measurement error is separated from meaningful intraindividual variation. The proposed model allows for the simultaneous analysis of longitudinal measurements of two linked variables (bivariate longitudinal modeling) and captures their relationship via two person-specific parameters. Relationships between explanatory variables and model parameters can be studied in a one-stage analysis, meaning that model parameters and regression coefficients are estimated simultaneously. Mathematical details of the approach, including a description of the core process model-the Ornstein-Uhlenbeck model-are provided. We also describe a user friendly, freely accessible software program that provides a straightforward graphical interface to carry out parameter estimation and inference. The proposed approach is illustrated by analyzing data collected via self-reports on affective states.
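
    As a minimal illustration of the core process model (not of the Bayesian estimation), the sketch below simulates a bivariate Ornstein-Uhlenbeck process with an Euler-Maruyama scheme; the baseline, regulatory matrix, and diffusion matrix are assumed values.

```python
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([0.0, 0.0])                 # person-specific baseline ("home base")
B = np.array([[1.0, 0.3],                 # regulatory force pulling the process back to baseline
              [0.3, 1.5]])
S = np.array([[0.8, 0.2],                 # diffusion matrix (intraindividual variation)
              [0.2, 0.6]])

dt, n = 0.01, 5000
X = np.empty((n, 2))
X[0] = mu
for t in range(1, n):                     # dX = -B (X - mu) dt + S dW
    dW = rng.standard_normal(2) * np.sqrt(dt)
    X[t] = X[t - 1] - B @ (X[t - 1] - mu) * dt + S @ dW

print("empirical correlation between the two linked variables:",
      np.corrcoef(X[:, 0], X[:, 1])[0, 1].round(2))
```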

  4. IDF relationships using bivariate copula for storm events in Peninsular Malaysia

    NASA Astrophysics Data System (ADS)

    Ariff, N. M.; Jemain, A. A.; Ibrahim, K.; Wan Zin, W. Z.

    2012-11-01

    Intensity-duration-frequency (IDF) curves are used in many hydrologic designs for the purposes of water management and flood prevention. The IDF curves available in Malaysia are those obtained from the univariate analysis approach, which only considers the intensity of rainfall at fixed time intervals. As several rainfall variables are correlated with each other, such as intensity and duration, this paper aims to derive IDF points for storm events in Peninsular Malaysia by means of bivariate frequency analysis. This is achieved by utilizing the relationship between storm intensities and durations using the copula method. Four types of copulas, namely the Ali-Mikhail-Haq (AMH), Frank, Gaussian and Farlie-Gumbel-Morgenstern (FGM) copulas, are considered because the correlation between storm intensity, I, and duration, D, is negative and these copulas are appropriate when the relationship between the variables is negative. The correlations are attained by means of Kendall's τ estimation. The analysis was performed on twenty rainfall stations with hourly data across Peninsular Malaysia. Using Akaike's Information Criterion (AIC) for testing goodness-of-fit, both the Frank and Gaussian copulas are found to be suitable to represent the relationship between I and D. The IDF points found by the copula method are compared to the IDF curves yielded by the typical IDF empirical formula of the univariate approach. This study indicates that storm intensities obtained from both methods are in agreement with each other for any given storm duration and for various return periods.
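
    Because storm intensity and duration are negatively correlated, the Frank copula parameter is conveniently recovered from Kendall's tau through the Debye-function relation tau(theta) = 1 - (4/theta)[1 - D1(theta)]. The sketch below inverts this relation numerically; the sample tau is an assumed value, not one of the twenty stations' estimates.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

def debye1(theta):
    """First-order Debye function D1(theta) = (1/theta) * integral_0^theta t / (e^t - 1) dt."""
    integral, _ = quad(lambda t: t / np.expm1(t), 0.0, theta)
    return integral / theta

def frank_tau(theta):
    """Kendall's tau implied by a Frank copula with parameter theta (theta != 0)."""
    return 1.0 - 4.0 / theta * (1.0 - debye1(theta))

tau_obs = -0.35                                    # assumed sample Kendall's tau for (I, D)
theta = brentq(lambda th: frank_tau(th) - tau_obs, -50.0, -1e-6)
print(f"Frank copula parameter for tau = {tau_obs}: theta = {theta:.3f}")
```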

  5. The Bivariate Luminosity--HI Mass Distribution Function of Galaxies based on the NIBLES Survey

    NASA Astrophysics Data System (ADS)

    Butcher, Zhon; Schneider, Stephen E.; van Driel, Wim; Lehnert, Matt

    2016-01-01

    We use 21cm HI line observations for 2610 galaxies from the Nançay Interstellar Baryons Legacy Extragalactic Survey (NIBLES) to derive a bivariate luminosity--HI mass distribution function. Our HI survey was selected to randomly probe the local (900 < cz < 12,000 km/s) galaxy population in each 0.5 mag wide bin for the absolute z-band magnitude range of -13.5 < Mz < -24 without regard to morphology or color. This targeted survey allowed more on-source integration time for weak and non-detected sources, enabling us to probe lower HI mass fractions and apply lower upper limits for non-detections than would be possible with the larger blind HI surveys. Additionally, we obtained a factor of four higher sensitivity follow-up observations at Arecibo of 90 galaxies from our non-detected and marginally detected categories to quantify the underlying HI distribution of sources not detected at Nançay. Using the optical luminosity function and our higher sensitivity follow up observations as priors, we use a 2D stepwise maximum likelihood technique to derive the two dimensional volume density distribution of luminosity and HI mass in each SDSS band.

  6. Improved deadzone modeling for bivariate wavelet shrinkage-based image denoising

    NASA Astrophysics Data System (ADS)

    DelMarco, Stephen

    2016-05-01

    Modern image processing performed on-board low Size, Weight, and Power (SWaP) platforms must provide high performance while simultaneously reducing memory footprint, power consumption, and computational complexity. Image preprocessing, along with downstream image exploitation algorithms such as object detection and recognition, and georegistration, place a heavy burden on power and processing resources. Image preprocessing often includes image denoising to improve data quality for downstream exploitation algorithms. High-performance image denoising is typically performed in the wavelet domain, where noise generally spreads and the wavelet transform compactly captures high information-bearing image characteristics. In this paper, we improve modeling fidelity of a previously-developed, computationally-efficient wavelet-based denoising algorithm. The modeling improvements enhance denoising performance without significantly increasing computational cost, thus making the approach suitable for low-SWaP platforms. Specifically, this paper presents modeling improvements to the Sendur-Selesnick model (SSM), which implements a bivariate wavelet shrinkage denoising algorithm that exploits interscale dependency between wavelet coefficients. We formulate optimization problems for parameters controlling deadzone size, which leads to improved denoising performance. Two formulations are provided; one with a simple, closed form solution which we use for numerical result generation, and the second as an integral equation formulation involving elliptic integrals. We generate image denoising performance results over different image sets drawn from public domain imagery, and investigate the effect of wavelet filter tap length on denoising performance. We demonstrate denoising performance improvement when using the enhanced modeling over performance obtained with the baseline SSM model.
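
    For context, the baseline SSM rule being improved here is the Sendur-Selesnick bivariate MAP shrinkage, which attenuates a wavelet coefficient using both the coefficient and its parent at the next coarser scale; coefficient pairs whose magnitude falls inside the deadzone are set to zero. The sketch below shows only this baseline rule with assumed noise and signal deviations, not the paper's optimized deadzone parameters.

```python
import numpy as np

def bivariate_shrink(w1, w2, sigma_n, sigma):
    """Sendur-Selesnick bivariate shrinkage of child coefficients w1 given parents w2.

    sigma_n is the noise standard deviation, sigma the (local) signal standard deviation.
    """
    r = np.sqrt(w1**2 + w2**2)
    gain = np.maximum(r - np.sqrt(3.0) * sigma_n**2 / sigma, 0.0) / np.maximum(r, 1e-12)
    return gain * w1

rng = np.random.default_rng(0)
w_child = rng.standard_normal(8) * 2.0     # toy wavelet coefficients (assumed)
w_parent = rng.standard_normal(8) * 2.0
print(bivariate_shrink(w_child, w_parent, sigma_n=1.0, sigma=1.5).round(3))
```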

  7. Trimmed weighted Simes' test for two one-sided hypotheses with arbitrarily correlated test statistics.

    PubMed

    Brannath, Werner; Bretz, Frank; Maurer, Willi; Sarkar, Sanat

    2009-12-01

    The two-sided Simes test is known to control the type I error rate with bivariate normal test statistics. For one-sided hypotheses, control of the type I error rate requires that the correlation between the bivariate normal test statistics is non-negative. In this article, we introduce a trimmed version of the one-sided weighted Simes test for two hypotheses which rejects if (i) the one-sided weighted Simes test rejects and (ii) both p-values are below one minus the respective weighted Bonferroni adjusted level. We show that the trimmed version controls the type I error rate at nominal significance level alpha if (i) the common distribution of test statistics is point symmetric and (ii) the two-sided weighted Simes test at level 2alpha controls the level. These assumptions apply, for instance, to bivariate normal test statistics with arbitrary correlation. In a simulation study, we compare the power of the trimmed weighted Simes test with the power of the weighted Bonferroni test and the untrimmed weighted Simes test. An additional result of this article ensures type I error rate control of the usual weighted Simes test under a weak version of the positive regression dependence condition for the case of two hypotheses. This condition is shown to apply to the two-sided p-values of one- or two-sample t-tests for bivariate normal endpoints with arbitrary correlation and to the corresponding one-sided p-values if the correlation is non-negative. The Simes test for such types of bivariate t-tests has not been considered before. According to our main result, the trimmed version of the weighted Simes test then also applies to the one-sided bivariate t-test with arbitrary correlation.
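
    Read literally from the abstract (this is our transcription of the stated rule, not the authors' code), the trimmed test rejects when the one-sided weighted Simes test rejects and, in addition, each p-value lies below one minus its own weighted Bonferroni adjusted level:

```python
def weighted_simes_reject(p1, p2, w1, w2, alpha):
    """One-sided weighted Simes test for two hypotheses (weights w1 + w2 = 1)."""
    (pa, wa), (pb, wb) = sorted([(p1, w1), (p2, w2)])     # order by p-value, weights follow
    return pa <= wa * alpha or pb <= (wa + wb) * alpha

def trimmed_weighted_simes_reject(p1, p2, w1, w2, alpha):
    """Trimmed version: also require both p-values below 1 - (respective weighted Bonferroni level)."""
    trimmed = (p1 <= 1.0 - w1 * alpha) and (p2 <= 1.0 - w2 * alpha)
    return weighted_simes_reject(p1, p2, w1, w2, alpha) and trimmed

# assumed example values
print(trimmed_weighted_simes_reject(0.012, 0.30, w1=0.5, w2=0.5, alpha=0.025))
```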

  8. Leg muscle volume during 30-day 6-degree head-down bed rest with isotonic and isokinetic exercise training

    NASA Technical Reports Server (NTRS)

    Greenleaf, J. E.; Lee, P. L.; Ellis, S.; Selzer, R. H.; Ortendahl, D. A.

    1994-01-01

    Magnetic resonance imaging (MRI) was used to compare the effect of two modes of lower-extremity exercise training on the mass (volume) of posterior leg group (PLG) muscles (soleus, flexor hallucis longus, tibialis posterior, lateral and medial gastrocnemius, and flexor digitorum longus) on 19 men (ages 32-42 years) subjected to intense dynamic-isotonic (ITE, cycle ergometer, number of subjects (N) = 7), isokinetic (IKE, torque ergometer, N = 7), and no exercise (NOE, N = 5) training for 60 min/day during head-down bed rest (HDBR). Total volume of the PLG muscles decreased (p less than 0.05) similarly: ITE = 4.3 +/- SE 1.6%, IKE = 7.7 +/- 1.6%, and NOE = 6.3 +/- 0.8%; combined volume (N = 19) loss was 6.1 +/- 0.9%. Ranges of volume changes were 2.6% to -9.0% (ITE), -2.1% to -14.9% (IKE), and -3.4% to -8.1% (NOE). Correlation coefficients (r) of muscle volume versus thickness measured with ultrasonography were: ITE r = 0.79 (p less than 0.05), IKE r = 0.27 (not significant (NS)), and NOE r = 0.63 (NS). Leg-muscle volume and thickness were highly correlated (r = 0.79) when plasma volume was maintained during HDBR with ITE. Thus, neither intensive lower extremity ITE nor IKE training influenced the normal non-exercised posterior leg muscle atrophy during HDBR. The relationship of muscle volume and thickness may depend on the mode of exercise training associated with the maintenance of plasma volume.

  9. Isometric force development, isotonic shortening, and elasticity measurements from Ca(2+)-activated ventricular muscle of the guinea pig

    PubMed Central

    Maughan, DW; Low, ES; Alpert, NR

    1978-01-01

    Isometric tension and isotonic shortening were measured at constant levels of calcium activation of varying magnitude in mechanically disrupted EGTA-treated ventricular bundles from guinea pigs. The results were as follows: (a) The effect of creatine phosphate (CP) on peak tension and rate of shortening saturated at a CP concentration of more than 10 mM; below that level tension was increased and shortening velocity decreased. We interpreted this to mean that CP above 10 mM was sufficient to buffer MgATP(2-) intracellularly. (b) The activated bundles exhibited an exponential stress-strain relationship and the series elastic properties did not vary appreciably with degree of activation or creatine phosphate level. (c) At a muscle length 20 percent beyond just taut, peak tension increased with Ca(2+) concentration over the range slightly below 10(-6) to slightly above 10(-4)M. (d) By releasing the muscle, length-active tension curves were constructed. Force declined to 20 percent peak tension with a decrease in muscle length (after the recoil) of only 11 percent at 10(-4)M Ca(2+) and 6 percent at 4x10(-6)M Ca(2+). (e) The rate of shortening after a release was greater at lower loads. At identical loads (relative to maximum force at a given Ca(2+) level), velocity at a given time after the release was less at lower Ca(2+) concentrations; at 10(-5) M, velocity was 72 percent of that at 10(-4)M, and at 4x10(-6)M, active shortening was usually delayed and was 40 percent of the velocity at 10(-4) M. Thus, under the conditions of these experiments, both velocity and peak tension depend on the level of Ca(2+) activation over a similar range of Ca(2+) concentration. PMID:149182

  10. Knee-Joint Proprioception During 30-Day 6 deg Head-Down Bed Rest with Isotonic and Isokinetic Exercise Training

    NASA Technical Reports Server (NTRS)

    Bernauer, E. M.; Walby, W. F.; Ertl, A. C.; Dempster, P. T.; Bond, M.; Greenleaf, J. E.

    1994-01-01

    To determine if daily isotonic exercise or isokinetic exercise training, coupled with daily leg proprioceptive training, would influence leg proprioceptive tracking responses during Bed Rest (BR), 19 men (36 +/- SD 4 years, 178 +/- 7 cm, 76.8 +/- 7.8 kg) were allocated into a NO-Exercise (NOE) training control group (n = 5), and IsoTonic Exercise (ITE, n = 7) and IsoKinetic Exercise (IKE, n = 7) training groups. Exercise training was conducted during BR for two 30-min periods/d, 5 d/week. Only the IKE group performed proprioceptive training, using a new isokinetic procedure with each lower extremity for 2.5 min before and after the daily exercise training sessions; proprioceptive testing occurred weekly for all groups. There were no significant differences in proprioceptive tracking scores, expressed as a percentage of the perfect score of 100, in the pre-BR ambulatory control period between the three groups. Knee extension and flexion tracking responses were unchanged with NOE during BR, but were significantly greater (*p less than 0.05) at the end of BR in both exercise groups when compared with NOE responses (extension: NOE 80.7 +/- 0.7%, ITE 82.9 +/- 0.6%, IKE 86.5* +/- 0.7%; flexion: NOE 77.6 +/- 1.5%, ITE 80.0 +/- 0.8% (NS), IKE 83.6* +/- 0.8%). Although proprioceptive tracking was unchanged during BR with NOE, both isotonic exercise training (without additional proprioceptive training) and especially isokinetic exercise training, when combined with daily proprioceptive training, significantly improved knee proprioceptive tracking responses after 30 d of BR.

  11. Alterations in Skeletal Muscle Function with Microgravity, and the Protective Effects of High Resistance Isometric and Isotonic Exercise

    NASA Technical Reports Server (NTRS)

    Fitts, R. H.; Hurst, J. E.; Norenberg, K. M.; Widrick, J. J.; Riley, D. A.; Bain, J. L. W.; Trappe, S. W.; Trappe, T. A.; Costill, D. L.

    1999-01-01

    Exposure to microgravity or models designed to mimic the unloaded condition, such as bed rest in humans and hindlimb unloading (HU) in rats, leads to skeletal muscle atrophy, a loss in peak force and power, and an increased susceptibility to fatigue. The posterior compartment muscles of the lower leg (calf muscle group) appear to be particularly susceptible. Following only 1 wk in space or HU, rat soleus muscle showed a 30 to 40% loss in wet weight. After 3 wk of HU, almost all of the atrophied soleus fibers showed a significant increase in maximal shortening velocity (V(sub 0)), while only 25 to 30% actually transitioned to fast fibers. The increased V(sub 0) was protective in that it reduced the decline in peak power associated with the reduced peak force. When the soleus is stimulated in situ following HU or zero-g, one observes an increased rate and extent of fatigue, and in the former the increased fatigue is associated with a more rapid depletion of muscle glycogen and lactate production. Our working hypothesis is that following HU or spaceflight in rats and bed rest or spaceflight in humans, limb skeletal muscles during contractile activity depend more on carbohydrates and less on fatty acids for their substrate supply. Baldwin et al. found 9 days of spaceflight to reduce by 37% the ability of both the high and low oxidative regions of the vastus muscle to oxidize long-chain fatty acids. This decline was not associated with any change in the enzymes of the tricarboxylic acid cycle or oxidation pathway. The purpose of the current research was to establish the extent of functional change in the slow type I and fast type II fibers of the human calf muscle following 17 days of spaceflight, and determine the cellular mechanisms of the observed changes. A second goal was to study the effectiveness of high resistance isotonic and isometric exercise in preventing the deleterious functional changes associated with unloading.

  12. On the level of skill in predicting maximum sunspot number - A comparative study of single variate and bivariate precursor techniques

    NASA Technical Reports Server (NTRS)

    Wilson, Robert M.

    1990-01-01

    The level of skill in predicting the size of the sunspot cycle is investigated for the two types of precursor techniques, single variate and bivariate fits, both applied to cycle 22. The present level of growth in solar activity is compared to the mean level of growth (cycles 10-21) and to the predictions based on the precursor techniques. It is shown that, for cycle 22, both single variate methods (based on geomagnetic data) and bivariate methods suggest a maximum amplitude smaller than that observed for cycle 19, and possibly for cycle 21. Compared to the mean cycle, cycle 22 is presently behaving as if it were a +2.6 sigma cycle (maximum amplitude of about 225), which means that either it will be the first cycle not to be reliably predicted by the combined precursor techniques or its deviation relative to the mean cycle will substantially decrease over the next 18 months.

  13. Efficient independent planar dose calculation for FFF IMRT QA with a bivariate Gaussian source model.

    PubMed

    Li, Feifei; Park, Ji-Yeon; Barraclough, Brendan; Lu, Bo; Li, Jonathan; Liu, Chihray; Yan, Guanghua

    2017-03-01

    The aim of this study is to perform a direct comparison of the source model for photon beams with and without flattening filter (FF) and to develop an efficient independent algorithm for planar dose calculation for FF-free (FFF) intensity-modulated radiotherapy (IMRT) quality assurance (QA). The source model consisted of a point source modeling the primary photons and extrafocal bivariate Gaussian functions modeling the head scatter, monitor chamber backscatter, and collimator exchange effect. The model parameters were obtained by minimizing the difference between the calculated and measured in-air output factors (Sc). The fluence of IMRT beams was calculated from the source model using a backprojection and integration method. The off-axis ratio in FFF beams was modeled with a fourth-degree polynomial. An analytical kernel consisting of the sum of three Gaussian functions was used to describe the dose deposition process. A convolution-based method was used to account for the ionization chamber volume averaging effect when commissioning the algorithm. The algorithm was validated by comparing the calculated planar dose distributions of FFF head-and-neck IMRT plans with measurements performed with a 2D diode array. Good agreement between the measured and calculated Sc was achieved for both FF beams (<0.25%) and FFF beams (<0.10%). The relative contribution of the head-scattered photons decreased by 34.7% for 6 MV and 49.3% for 10 MV due to the removal of the FF. Superior agreement between the calculated and measured dose distribution was also achieved for FFF IMRT. In the gamma comparison with a 2%/2 mm criterion, the average passing rate was 96.2 ± 1.9% for 6 MV FFF and 95.5 ± 2.6% for 10 MV FFF. The efficient independent planar dose calculation algorithm is easy to implement and can be valuable in FFF IMRT QA.
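
    A much-simplified sketch of the source-model idea: the relative in-air output can be approximated as the primary point-source weight plus the fraction of an isotropic extrafocal bivariate Gaussian that is visible through the rectangular collimator opening, evaluated with error functions. The extrafocal weight, Gaussian width, and field sizes below are assumed toy numbers, not commissioned model parameters.

```python
import numpy as np
from scipy.special import erf

def visible_fraction(fx, fy, sigma):
    """Fraction of a centred isotropic bivariate Gaussian (std sigma) seen through
    a centred fx-by-fy rectangular opening, all expressed in the same plane and units."""
    return erf(fx / (2.0 * np.sqrt(2.0) * sigma)) * erf(fy / (2.0 * np.sqrt(2.0) * sigma))

w_extrafocal, sigma = 0.15, 30.0                 # assumed extrafocal weight and width [mm]
for field in (40.0, 100.0, 200.0, 400.0):        # square field sides [mm] projected to the source plane
    sc = (1.0 - w_extrafocal) + w_extrafocal * visible_fraction(field, field, sigma)
    print(f"field {field:5.0f} mm: relative output {sc:.3f}")
```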

  14. New Colors for Histology: Optimized Bivariate Color Maps Increase Perceptual Contrast in Histological Images

    PubMed Central

    Kather, Jakob Nikolas; Weis, Cleo-Aron; Marx, Alexander; Schuster, Alexander K.; Schad, Lothar R.; Zöllner, Frank Gerrit

    2015-01-01

    Background Accurate evaluation of immunostained histological images is required for reproducible research in many different areas and forms the basis of many clinical decisions. The quality and efficiency of histopathological evaluation is limited by the information content of a histological image, which is primarily encoded as perceivable contrast differences between objects in the image. However, the colors of chromogen and counterstain used for histological samples are not always optimally distinguishable, even under optimal conditions. Methods and Results In this study, we present a method to extract the bivariate color map inherent in a given histological image and to retrospectively optimize this color map. We use a novel, unsupervised approach based on color deconvolution and principal component analysis to show that the commonly used blue and brown color hues in Hematoxylin—3,3’-Diaminobenzidine (DAB) images are poorly suited for human observers. We then demonstrate that it is possible to construct improved color maps according to objective criteria and that these color maps can be used to digitally re-stain histological images. Validation To validate whether this procedure improves distinguishability of objects and background in histological images, we re-stain phantom images and N = 596 large histological images of immunostained samples of human solid tumors. We show that perceptual contrast is improved by a factor of 2.56 in phantom images and up to a factor of 2.17 in sets of histological tumor images. Context Thus, we provide an objective and reliable approach to measure object distinguishability in a given histological image and to maximize visual information available to a human observer. This method could easily be incorporated in digital pathology image viewing systems to improve accuracy and efficiency in research and diagnostics. PMID:26717571
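
    A rough sketch of the first two steps of such a pipeline with off-the-shelf tools: colour deconvolution of an H-DAB image into stain channels (here via scikit-image's rgb2hed) followed by principal component analysis of the per-pixel chromogen/counterstain concentrations. The random image and the specific functions used are illustrative stand-ins for the unsupervised method described above.

```python
import numpy as np
from skimage.color import rgb2hed
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
rgb = rng.uniform(0.3, 1.0, size=(64, 64, 3))   # random "H-DAB-like" image standing in for a real slide

hed = rgb2hed(rgb)                              # per-pixel Hematoxylin, Eosin, DAB concentrations
hd = hed[..., [0, 2]].reshape(-1, 2)            # keep the Hematoxylin and DAB channels

pca = PCA(n_components=2).fit(hd)               # principal axes of the bivariate stain distribution
print("explained variance ratio:", pca.explained_variance_ratio_.round(3))
```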

  15. Cost-effectiveness analysis using data from multinational trials: The use of bivariate hierarchical modelling

    PubMed Central

    Manca, Andrea; Lambert, Paul C; Sculpher, Mark; Rice, Nigel

    2008-01-01

    Healthcare cost-effectiveness analysis (CEA) often uses individual patient data (IPD) from multinational randomised controlled trials. Although designed to account for between-patient sampling variability in the clinical and economic data, standard analytical approaches to CEA ignore the presence of between-location variability in the study results. This is a restrictive limitation given that countries often differ in factors that could affect the results of CEAs, such as the availability of healthcare resources, their unit costs, clinical practice, and patient case-mix. We advocate the use of Bayesian bivariate hierarchical modelling to analyse multinational cost-effectiveness data. This analytical framework explicitly recognises that patient-level costs and outcomes are nested within countries. Using real life data, we illustrate how the proposed methods can be applied to obtain (a) more appropriate estimates of overall cost-effectiveness and associated measure of sampling uncertainty compared to standard CEA; and (b) country-specific cost-effectiveness estimates which can be used to assess the between-location variability of the study results, while controlling for differences in country-specific and patient-specific characteristics. It is demonstrated that results from standard CEA using IPD from multinational trials display a large degree of variability across the 17 countries included in the analysis, producing potentially misleading results. In contrast, ‘shrinkage estimates’ obtained from the modelling approach proposed here facilitate the appropriate quantification of country-specific cost-effectiveness estimates, while weighting the results based on the level of information available within each country. We suggest that the methods presented here represent a general framework for the analysis of economic data collected from different locations. PMID:17641141

  16. Bivariate segmentation of SNP-array data for allele-specific copy number analysis in tumour samples

    PubMed Central

    2013-01-01

    Background SNP arrays output two signals that reflect the total genomic copy number (LRR) and the allelic ratio (BAF), which in combination allow the characterisation of allele-specific copy numbers (ASCNs). While methods based on hidden Markov models (HMMs) have been extended from array comparative genomic hybridisation (aCGH) to jointly handle the two signals, only one method based on change-point detection, ASCAT, performs bivariate segmentation. Results In the present work, we introduce a generic framework for bivariate segmentation of SNP array data for ASCN analysis. To this end, we discuss the characteristics of the typically applied BAF transformation and how they affect segmentation, introduce concepts of multivariate time series analysis that are of concern in this field and discuss the appropriate formulation of the problem. The framework is implemented in a method named CnaStruct, the bivariate form of the structural change model (SCM), which has been successfully applied to transcriptome mapping and aCGH. Conclusions On a comprehensive synthetic dataset, we show that CnaStruct outperforms the segmentation of existing ASCN analysis methods. Furthermore, CnaStruct can be integrated into the workflows of several ASCN analysis tools in order to improve their performance, especially on tumour samples highly contaminated by normal cells. PMID:23497144

  17. Nonlinear bivariate dependency of price-volume relationships in agricultural commodity futures markets: A perspective from Multifractal Detrended Cross-Correlation Analysis

    NASA Astrophysics Data System (ADS)

    He, Ling-Yun; Chen, Shu-Peng

    2011-01-01

    Nonlinear dependency between characteristic financial and commodity market quantities (variables) is crucially important, especially between trading volume and market price. Studies on nonlinear dependency between price and volume can provide practical insights into market trading characteristics, as well as the theoretical understanding of market dynamics. Actually, nonlinear dependency and its underlying dynamical mechanisms between price and volume can help researchers and technical analysts in understanding the market dynamics by integrating the market variables instead of investigating them separately, as in the current literature. Therefore, for investigating nonlinear dependency of price-volume relationships in agricultural commodity futures markets in China and the US, we perform a new statistical test to detect cross-correlations and apply a new methodology called Multifractal Detrended Cross-Correlation Analysis (MF-DCCA), which is an efficient algorithm to analyze two spatially or temporally correlated time series. We discuss theoretically the relationship between the bivariate cross-correlation exponent and the generalized Hurst exponents for time series of respective variables. We also perform an empirical study and find that there exists a power-law cross-correlation between them, and that multifractal features are significant in all the analyzed agricultural commodity futures markets.

  18. Descriptive statistics.

    PubMed

    Shi, Runhua; McLarty, Jerry W

    2009-10-01

    In this article, we introduced basic concepts of statistics, type of distributions, and descriptive statistics. A few examples were also provided. The basic concepts presented herein are only a fraction of the concepts related to descriptive statistics. Also, there are many commonly used distributions not presented herein, such as Poisson distributions for rare events and exponential distributions, F distributions, and logistic distributions. More information can be found in many statistics books and publications.

  19. Statistical Software.

    ERIC Educational Resources Information Center

    Callamaras, Peter

    1983-01-01

    This buyer's guide to seven major types of statistics software packages for microcomputers reviews Edu-Ware Statistics 3.0; Financial Planning; Speed Stat; Statistics with DAISY; Human Systems Dynamics package of Stats Plus, ANOVA II, and REGRESS II; Maxistat; and Moore-Barnes' MBC Test Construction and MBC Correlation. (MBR)

  20. Statistical Diversions

    ERIC Educational Resources Information Center

    Petocz, Peter; Sowey, Eric

    2008-01-01

    As a branch of knowledge, Statistics is ubiquitous and its applications can be found in (almost) every field of human endeavour. In this article, the authors track down the possible source of the link between the "Siren song" and applications of Statistics. Answers to their previous five questions and five new questions on Statistics are presented.

  1. Reducing uncertainty in the selection of bi-variate distributions of flood peaks and volumes using copulas and hydrological process-based model selection

    NASA Astrophysics Data System (ADS)

    Szolgay, Jan; Gaál, Ladislav; Bacigál, Tomáš; Kohnová, Silvia; Blöschl, Günter

    2016-04-01

    Bi-variate distributions of flood peaks and flood event volumes are needed for a range of practical purposes including e.g. retention basin design and identifying extent and duration of flooding in flood hazard zones. However, the selection of the types of bi-variate distributions and estimating their parameters from observed peak-volume pairs are associated with far larger uncertainties compared to uni-variate distributions, since observed flood records of required length are rarely available. This poses a serious problem to reliable flood risk estimation in bi-variate design cases. The aim of this contribution was to shed light on the possibility of reducing uncertainties in the estimation of the dependence models/parameters from a regional perspective. The peak-volume relationships were modeled in terms of copulas. Flood events were classified according to their origin. In order to reduce the uncertainty in estimating flood risk, pooling and analyzing catchments of similar behavior according to flood process types was attempted. Most of the work reported in the literature so far did not direct the multivariate analysis toward discriminating certain types of models regionally according to specific runoff generation processes. Specifically, the contribution addresses these problems: - Are the peak-volume relationships of different flood types for a given catchment similar? - Are the peak-volume dependence structures between catchments in a larger region for given flood types similar? - Are some copula types more suitable for given flood process types and does this have consequences for reliable risk estimation? The target region is located in the northern parts of Austria, and consists of 72 small and mid-sized catchments. Instead of the traditional approach that deals with annual maximum floods, the current analysis includes all independent flood events in the region. 24 872 flood events from the period 1976-2007 were identified, and classified as synoptic, flash

  2. Bivariate hydrologic risk analysis based on a coupled entropy-copula method for the Xiangxi River in the Three Gorges Reservoir area, China

    NASA Astrophysics Data System (ADS)

    Fan, Y. R.; Huang, W. W.; Huang, G. H.; Huang, K.; Li, Y. P.; Kong, X. M.

    2016-07-01

    In this study, a bivariate hydrologic risk framework is proposed based on a coupled entropy-copula method. In the proposed risk analysis framework, bivariate flood frequency would be analyzed for different flood variable pairs (i.e., flood peak-volume, flood peak-duration, flood volume-duration). The marginal distributions of flood peak, volume, and duration are quantified through both parametric (i.e., gamma, general extreme value (GEV), and lognormal distributions) and nonparametric (i.e., entropy) approaches. The joint probabilities of flood peak-volume, peak-duration, and volume-duration are established through copulas. The bivariate hydrologic risk is then derived based on the joint return period to reflect the interactive effects of flood variables on the final hydrologic risk values. The proposed method is applied to the risk analysis for the Xiangxi River in the Three Gorges Reservoir area, China. The results indicate the entropy method performs best in quantifying the distribution of flood duration. Bivariate hydrologic risk would then be generated to characterize the impacts of flood volume and duration on the occurrence of a flood. The results suggest that the bivariate risk for flood peak-volume would not decrease significantly for the flood volume less than 1000 m3/s. Moreover, a flood in the Xiangxi River may last at least 5 days without significant decrease of the bivariate risk for flood peak-duration.
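
    For reference, once marginal distributions and a copula are in hand, the bivariate "AND" joint return period follows from T = mu / P(X > x, Y > y) = mu / (1 - u - v + C(u, v)), with u and v the marginal non-exceedance probabilities and mu the mean interarrival time of flood events. The Gumbel copula and all numbers below are assumptions for illustration, not the study's fitted entropy-copula model.

```python
import numpy as np

def gumbel_copula(u, v, theta):
    """Gumbel-Hougaard copula C(u, v) for theta >= 1."""
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

mu = 1.0            # mean interarrival time between flood events [years] (assumed)
theta = 2.0         # assumed copula parameter
u, v = 0.98, 0.95   # marginal non-exceedance probabilities of a chosen peak and volume

joint_exceedance = 1.0 - u - v + gumbel_copula(u, v, theta)   # P(X > x and Y > y)
print(f"joint 'AND' return period: {mu / joint_exceedance:.1f} years")
```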

  3. A Statistical Method for Estimating Luminosity Functions Using Truncated Data

    NASA Astrophysics Data System (ADS)

    Schafer, Chad M.

    2007-06-01

    The observational limitations of astronomical surveys lead to significant statistical inference challenges. One such challenge is the estimation of luminosity functions given redshift (z) and absolute magnitude (M) measurements from an irregularly truncated sample of objects. This is a bivariate density estimation problem; we develop here a statistically rigorous method which (1) does not assume a strict parametric form for the bivariate density; (2) does not assume independence between redshift and absolute magnitude (and hence allows evolution of the luminosity function with redshift); (3) does not require dividing the data into arbitrary bins; and (4) naturally incorporates a varying selection function. We accomplish this by decomposing the bivariate density φ(z,M) via log φ(z,M) = f(z) + g(M) + h(z,M,θ), where f and g are estimated nonparametrically and h takes an assumed parametric form. There is a simple way of estimating the integrated mean squared error of the estimator; smoothing parameters are selected to minimize this quantity. Results are presented from the analysis of a sample of quasars.

  4. Low-energy dipole excitations in neon isotopes and N=16 isotones within the quasiparticle random-phase approximation and the Gogny force

    SciTech Connect

    Martini, M.; Peru, S.; Dupuis, M.

    2011-03-15

    Low-energy dipole excitations in neon isotopes and N=16 isotones are calculated with a fully consistent axially-symmetric-deformed quasiparticle random phase approximation (QRPA) approach based on Hartree-Fock-Bogolyubov (HFB) states. The same Gogny D1S effective force has been used both in HFB and QRPA calculations. The microscopical structure of these low-lying resonances, as well as the behavior of proton and neutron transition densities, are investigated in order to determine the isoscalar or isovector nature of the excitations. It is found that the N=16 isotones ²⁴O, ²⁶Ne, ²⁸Mg, and ³⁰Si are characterized by a similar behavior. The occupation of the 2s1/2 neutron orbit turns out to be crucial, leading to nontrivial transition densities and to small but finite collectivity. Some low-lying dipole excitations of ²⁸Ne and ³⁰Ne, characterized by transitions involving the ν1d3/2 state, present a more collective behavior and isoscalar transition densities. A collective proton low-lying excitation is identified in the ¹⁸Ne nucleus.

  5. The systematic study of the electroporation and electrofusion of B16-F1 and CHO cells in isotonic and hypotonic buffer.

    PubMed

    Usaj, Marko; Kanduser, Masa

    2012-09-01

    The fusogenic state of the cell membrane can be induced by external electric field. When two fusogenic membranes are in close contact, cell fusion takes place. An appropriate hypotonic treatment of cells before the application of electric pulses significantly improves electrofusion efficiency. How hypotonic treatment improves electrofusion is still not known in detail. Our results indicate that at given induced transmembrane potential electroporation was not affected by buffer osmolarity. In contrast to electroporation, cells' response to hypotonic treatment significantly affects their electrofusion. High fusion yield was observed when B16-F1 cells were used; this cell line in hypotonic buffer resulted in 41 ± 9 % yield, while in isotonic buffer 32 ± 11 % yield was observed. Based on our knowledge, these fusion yields determined in situ by dual-color fluorescence microscopy are among the highest in electrofusion research field. The use of hypotonic buffer was more crucial for electrofusion of CHO cells; the fusion yield increased from below 1 % in isotonic buffer to 10 ± 4 % in hypotonic buffer. Since the same degree of cell permeabilization was achieved in both buffers, these results indicate that hypotonic treatment significantly improves fusion yield. The effect could be attributed to improved physical contact of cell membranes or to enhanced fusogenic state of the cell membrane itself.

  6. A tutorial on Bayesian bivariate meta-analysis of mixed binary-continuous outcomes with missing treatment effects.

    PubMed

    Gajic-Veljanoski, Olga; Cheung, Angela M; Bayoumi, Ahmed M; Tomlinson, George

    2016-05-30

    Bivariate random-effects meta-analysis (BVMA) is a method of data synthesis that accounts for treatment effects measured on two outcomes. BVMA gives more precise estimates of the population mean and predicted values than two univariate random-effects meta-analyses (UVMAs). BVMA also addresses bias from incomplete reporting of outcomes. A few tutorials have covered technical details of BVMA of categorical or continuous outcomes. Limited guidance is available on how to analyze datasets that include trials with mixed continuous-binary outcomes where treatment effects on one outcome or the other are not reported. Given the advantages of Bayesian BVMA for handling missing outcomes, we present a tutorial for Bayesian BVMA of incompletely reported treatment effects on mixed bivariate outcomes. This step-by-step approach can serve as a model for our intended audience, the methodologist familiar with Bayesian meta-analysis, looking for practical advice on fitting bivariate models. To facilitate application of the proposed methods, we include our WinBUGS code. As an example, we use aggregate-level data from published trials to demonstrate the estimation of the effects of vitamin K and bisphosphonates on two correlated bone outcomes, fracture, and bone mineral density. We present datasets where reporting of the pairs of treatment effects on both outcomes was 'partially' complete (i.e., pairs completely reported in some trials), and we outline steps for modeling the incompletely reported data. To assess what is gained from the additional work required by BVMA, we compare the resulting estimates to those from separate UVMAs. We discuss methodological findings and make four recommendations. Copyright © 2015 John Wiley & Sons, Ltd.

  7. Anthropometric Survey of US Army Personnel (1988): Correlation Coefficients and Regression Equations. Part 4. Bivariate Regression Tables

    DTIC Science & Technology

    1990-05-01

    [OCR fragment of the report's "Simple Bivariate Regressions -- Male" tables: rows list anthropometric variable codes (e.g., MAXFRONH, WSHTSTOM, WRISHTST) with regression intercepts, slopes, standard errors, and correlation coefficients; the tabular content is not recoverable from this extract.]

  8. Statistical Diversions

    ERIC Educational Resources Information Center

    Petocz, Peter; Sowey, Eric

    2008-01-01

    In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…

  9. Use of statistical tests and statistical software choice in 2014: tale from three Medline indexed Pakistani journals.

    PubMed

    Shaikh, Masood Ali

    2016-04-01

    Statistical tests help infer meaningful conclusions from studies conducted and data collected. This descriptive study analyzed the type of statistical tests used and the statistical software utilized for analysis reported in the original articles published in 2014 by the three Medline-indexed journals of Pakistan. Cumulatively, 466 original articles were published in 2014. The most frequently reported statistical tests for original articles by all three journals were bivariate parametric and non-parametric tests i.e. involving comparisons between two groups e.g. Chi-square test, t-test, and various types of correlations. Cumulatively, 201 (43.1%) articles used these tests. SPSS was the primary choice for statistical analysis, as it was exclusively used in 374 (80.3%) original articles. There has been a substantial increase in the number of articles published, and in the sophistication of statistical tests used in the articles published in the Pakistani Medline indexed journals in 2014, compared to 2007.

  10. Critical Evaluation of Internet Resources for Teaching Trend and Variability in Bivariate Data

    ERIC Educational Resources Information Center

    Forster, Pat

    2007-01-01

    A search on the Internet for resources for teaching statistics yields multiple sites with data sets, projects, worksheets, applets, and software. Often these are made available without information on how they might benefit learning. This paper addresses potential benefits from resources that target trend and variability relationships in bivariate…

  11. Body mass estimates of an exceptionally complete Stegosaurus (Ornithischia: Thyreophora): comparing volumetric and linear bivariate mass estimation methods.

    PubMed

    Brassey, Charlotte A; Maidment, Susannah C R; Barrett, Paul M

    2015-03-01

    Body mass is a key biological variable, but difficult to assess from fossils. Various techniques exist for estimating body mass from skeletal parameters, but few studies have compared outputs from different methods. Here, we apply several mass estimation methods to an exceptionally complete skeleton of the dinosaur Stegosaurus. Applying a volumetric convex-hulling technique to a digital model of Stegosaurus, we estimate a mass of 1560 kg (95% prediction interval 1082-2256 kg) for this individual. By contrast, bivariate equations based on limb dimensions predict values between 2355 and 3751 kg and require implausible amounts of soft tissue and/or high body densities. When corrected for ontogenetic scaling, however, volumetric and linear equations are brought into close agreement. Our results raise concerns regarding the application of predictive equations to extinct taxa with no living analogues in terms of overall morphology and highlight the sensitivity of bivariate predictive equations to the ontogenetic status of the specimen. We emphasize the significance of rare, complete fossil skeletons in validating widely applied mass estimation equations based on incomplete skeletal material and stress the importance of accurately determining specimen age prior to further analyses.

  12. Body mass estimates of an exceptionally complete Stegosaurus (Ornithischia: Thyreophora): comparing volumetric and linear bivariate mass estimation methods

    PubMed Central

    Brassey, Charlotte A.; Maidment, Susannah C. R.; Barrett, Paul M.

    2015-01-01

    Body mass is a key biological variable, but difficult to assess from fossils. Various techniques exist for estimating body mass from skeletal parameters, but few studies have compared outputs from different methods. Here, we apply several mass estimation methods to an exceptionally complete skeleton of the dinosaur Stegosaurus. Applying a volumetric convex-hulling technique to a digital model of Stegosaurus, we estimate a mass of 1560 kg (95% prediction interval 1082–2256 kg) for this individual. By contrast, bivariate equations based on limb dimensions predict values between 2355 and 3751 kg and require implausible amounts of soft tissue and/or high body densities. When corrected for ontogenetic scaling, however, volumetric and linear equations are brought into close agreement. Our results raise concerns regarding the application of predictive equations to extinct taxa with no living analogues in terms of overall morphology and highlight the sensitivity of bivariate predictive equations to the ontogenetic status of the specimen. We emphasize the significance of rare, complete fossil skeletons in validating widely applied mass estimation equations based on incomplete skeletal material and stress the importance of accurately determining specimen age prior to further analyses. PMID:25740841

  13. Bivariate mass-size relation as a function of morphology as determined by Galaxy Zoo 2 crowdsourced visual classifications

    NASA Astrophysics Data System (ADS)

    Beck, Melanie; Scarlata, Claudia; Fortson, Lucy; Willett, Kyle; Galloway, Melanie

    2016-01-01

    It is well known that the mass-size distribution evolves as a function of cosmic time and that this evolution is different between passive and star-forming galaxy populations. However, the devil is in the details and the precise evolution is still a matter of debate since this requires careful comparison between similar galaxy populations over cosmic time while simultaneously taking into account changes in image resolution, rest-frame wavelength, and surface brightness dimming in addition to properly selecting representative morphological samples.Here we present the first step in an ambitious undertaking to calculate the bivariate mass-size distribution as a function of time and morphology. We begin with a large sample (~3 x 105) of SDSS galaxies at z ~ 0.1. Morphologies for this sample have been determined by Galaxy Zoo crowdsourced visual classifications and we split the sample not only by disk- and bulge-dominated galaxies but also in finer morphology bins such as bulge strength. Bivariate distribution functions are the only way to properly account for biases and selection effects. In particular, we quantify the mass-size distribution with a version of the parametric Maximum Likelihood estimator which has been modified to account for measurement errors as well as upper limits on galaxy sizes.

  14. Genetic regulation of plasma and red blood cell magnesium concentrations in man. I. Univariate and bivariate path analyses.

    PubMed Central

    Darlu, P; Rao, D C; Henrotte, J G; Lalouel, J M

    1982-01-01

    This paper concerns an analysis of family resemblance for magnesium concentrations, based on data from nuclear families and twins. Neither red blood cell magnesium nor plasma magnesium varies with age in children (under 20 years of age). Whereas adult plasma magnesium varies linearly with age, the red cell magnesium clearly showed a nonlinear trend: quadratic for males and a fifth-degree polynomial for females. Transformed magnesium concentrations generated six correlations in nuclear families and twins for each of the two traits. Separate univariate analyses, using a simple linear model with four parameters, strongly suggested that genetic factors are primarily responsible for the observed family resemblance. Both traits were then analyzed simultaneously using a simple bivariate model. We found that one common genetic factor alone could not explain all the 24 correlations generated for the bivariate analysis. The most parsimonious model involved only three parameters: genetic heritability for red blood cell magnesium (.922 +/- .014), genetic heritability for plasma magnesium (.721 +/- .040), and the genetic correlation between the two traits (.233 +/- .040). PMID:6891178

  15. Operator identities involving the bivariate Rogers-Szegö polynomials and their applications to the multiple q-series identities

    NASA Astrophysics Data System (ADS)

    Zhang, Zhizheng; Wang, Tianze

    2008-07-01

    In this paper, we first give several operator identities involving the bivariate Rogers-Szegö polynomials. By applying the technique of parameter augmentation to the multiple q-binomial theorems given by Milne [S.C. Milne, Balanced summation theorems for U(n) basic hypergeometric series, Adv. Math. 131 (1997) 93-187], we obtain several new multiple q-series identities involving the bivariate Rogers-Szegö polynomials. These include multiple extensions of Mehler's formula and Rogers's formula. Our U(n+1) generalizations are quite natural as they are also a direct and immediate consequence of their (often classical) known one-variable cases and Milne's fundamental theorem for An or U(n+1) basic hypergeometric series in Theorem 1.49 of [S.C. Milne, An elementary proof of the Macdonald identities for , Adv. Math. 57 (1985) 34-70], as rewritten in Lemma 7.3 on p. 163 of [S.C. Milne, Balanced summation theorems for U(n) basic hypergeometric series, Adv. Math. 131 (1997) 93-187] or Corollary 4.4 on pp. 768-769 of [S.C. Milne, M. Schlosser, A new An extension of Ramanujan's summation with applications to multilateral An series, Rocky Mountain J. Math. 32 (2002) 759-792].

  16. Towards an accurate model of redshift-space distortions: a bivariate Gaussian description for the galaxy pairwise velocity distributions

    NASA Astrophysics Data System (ADS)

    Bianchi, Davide; Chiesa, Matteo; Guzzo, Luigi

    2016-10-01

    As a step towards a more accurate modelling of redshift-space distortions (RSD) in galaxy surveys, we develop a general description of the probability distribution function of galaxy pairwise velocities within the framework of the so-called streaming model. For a given galaxy separation r, such function can be described as a superposition of virtually infinite local distributions. We characterize these in terms of their moments and then consider the specific case in which they are Gaussian functions, each with its own mean μ and variance σ². Based on physical considerations, we make the further crucial assumption that these two parameters are in turn distributed according to a bivariate Gaussian, with its own mean and covariance matrix. Tests using numerical simulations explicitly show that with this compact description one can correctly model redshift-space distortions on all scales, fully capturing the overall linear and nonlinear dynamics of the galaxy flow at different separations. In particular, we naturally obtain Gaussian/exponential, skewed/unskewed distribution functions, depending on separation as observed in simulations and data. Also, the recently proposed single-Gaussian description of redshift-space distortions is included in this model as a limiting case, when the bivariate Gaussian is collapsed to a two-dimensional Dirac delta function. More work is needed, but these results indicate a very promising path to make definitive progress in our program to improve RSD estimators.
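
    As a numerical illustration of the construction just described (with arbitrary parameter values, not taken from the paper), the Python sketch below draws (μ, σ) pairs from a bivariate Gaussian and averages the corresponding local Gaussians, showing how a skewed, heavier-tailed pairwise-velocity distribution can emerge from purely Gaussian building blocks.

      # Sketch: P(v) as a superposition of Gaussians N(v; mu, sigma^2),
      # with (mu, sigma) drawn from a bivariate Gaussian. Illustrative values only.
      import numpy as np

      rng = np.random.default_rng(0)

      mean = np.array([-1.0, 3.0])            # mean of (mu, sigma), arbitrary units
      std = np.array([2.0, 1.0])
      rho = 0.5                               # correlation between mu and sigma
      cov = np.array([[std[0]**2, rho*std[0]*std[1]],
                      [rho*std[0]*std[1], std[1]**2]])

      draws = rng.multivariate_normal(mean, cov, size=10_000)
      mu = draws[:, 0]
      sigma = np.clip(np.abs(draws[:, 1]), 0.1, None)   # keep dispersions positive

      v = np.linspace(-15, 15, 301)
      local = np.exp(-0.5*((v[None, :] - mu[:, None])/sigma[:, None])**2) \
              / (np.sqrt(2*np.pi)*sigma[:, None])
      pdf = local.mean(axis=0)                # average over the (mu, sigma) draws

      dv = v[1] - v[0]
      print("normalization check:", pdf.sum()*dv)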

  17. Statistics Clinic

    NASA Technical Reports Server (NTRS)

    Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James

    2014-01-01

    Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.

  18. Quick Statistics

    MedlinePlus

    ... population, or about 25 million Americans, has experienced tinnitus lasting at least five minutes in the past ... by NIDCD Epidemiology and Statistics Program staff: (1) tinnitus prevalence was obtained from the 2008 National Health ...

  19. Bivariate versus multivariate smart spectrophotometric calibration methods for the simultaneous determination of a quaternary mixture of mosapride, pantoprazole and their degradation products.

    PubMed

    Hegazy, M A; Yehia, A M; Moustafa, A A

    2013-05-01

    The ability of bivariate and multivariate spectrophotometric methods was demonstrated in the resolution of a quaternary mixture of mosapride, pantoprazole and their degradation products. The bivariate calibrations include the bivariate spectrophotometric method (BSM) and the H-point standard addition method (HPSAM), which were able to determine the two drugs simultaneously, but not in the presence of their degradation products. The results showed that simultaneous determinations could be performed in the concentration ranges of 5.0-50.0 μg/ml for mosapride and 10.0-40.0 μg/ml for pantoprazole by the bivariate spectrophotometric method, and in the concentration ranges of 5.0-45.0 μg/ml for both drugs by the H-point standard addition method. Moreover, the applied multivariate calibration methods, concentration residuals augmented classical least squares (CRACLS) and partial least squares (PLS), were able to determine mosapride, pantoprazole and their degradation products. The proposed multivariate methods were applied to 17 synthetic samples in the concentration ranges of 3.0-12.0 μg/ml mosapride, 8.0-32.0 μg/ml pantoprazole, 1.5-6.0 μg/ml mosapride degradation products and 2.0-8.0 μg/ml pantoprazole degradation products. The proposed bivariate and multivariate calibration methods were successfully applied to the determination of mosapride and pantoprazole in their pharmaceutical preparations.
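
    For readers unfamiliar with the bivariate spectrophotometric idea, the central step is solving two Beer-Lambert equations, one per wavelength, for the two analyte concentrations. The sketch below uses hypothetical calibration slopes and absorbances (not values from this study) and shows only that underlying linear algebra, not the published BSM or HPSAM procedures.

      # Bivariate (two-wavelength) calibration sketch: A = K c  =>  c = solve(K, A).
      # Hypothetical slopes and absorbances; not data from the cited work.
      import numpy as np

      # Calibration slopes (absorbance per ug/ml) of drug X and drug Y
      # at two wavelengths l1 and l2.
      K = np.array([[0.030, 0.004],    # [slope_X(l1), slope_Y(l1)]
                    [0.006, 0.025]])   # [slope_X(l2), slope_Y(l2)]

      A = np.array([0.52, 0.80])       # measured mixture absorbances at l1, l2

      c = np.linalg.solve(K, A)        # estimated concentrations of X and Y
      print(f"drug X ~ {c[0]:.1f} ug/ml, drug Y ~ {c[1]:.1f} ug/ml")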

  20. A Bivariate Genetic Analysis of Drug Abuse Ascertained Through Medical and Criminal Registries in Swedish Twins, Siblings and Half-Siblings.

    PubMed

    Maes, Hermine H; Neale, Michael C; Ohlsson, Henrik; Zahery, Mahsa; Lichtenstein, Paul; Sundquist, Kristina; Sundquist, Jan; Kendler, Kenneth S

    2016-11-01

    Using Swedish nationwide registry data, the authors investigated the correlation of genetic and environmental risk factors in the etiology of drug abuse as ascertained from medical and criminal registries by modeling twin and sibling data. Medical drug abuse was defined using public inpatient and outpatient records, while criminal drug abuse was ascertained through legal records. Twin, full and half sibling pairs were obtained from the national twin and genealogical registers. Information about sibling pair residence within the same household was obtained from Statistics Sweden. Standard bivariate genetic structural equation modeling was applied to the population-based data on drug abuse ascertained through medical and crime registries, using OpenMx. Analyses of all possible pairs of twins (MZ: N = 4482; DZ: N = 9838 pairs), full- (N = 1,278,086) and half-siblings (paternal: N = 7767; maternal N = 70,553) who grew up together suggested that factors explaining familial resemblance for drug abuse as defined through medical or criminal registries were mostly the same. Results showed substantial heritability and moderate contributions of shared environmental factors to drug abuse; both were higher in males versus females, and higher for drug abuse ascertained through criminal than medical records. Because of the low prevalence of both assessments of drug abuse, having access to population data was crucial to obtain stable estimates. Using objective registry data, the authors found that drug abuse-whether ascertained through medical versus criminal records-was highly heritable. Furthermore, shared environmental factors contributed significantly to the liability of drug abuse. Genetic and shared environmental risk factors for these two forms of drug abuse were highly correlated.

  1. Effects of Statistical Dependence.

    DTIC Science & Technology

    1985-01-01

    Some research being carried on deals with concepts of dependence for multicomponent systems. Such dependence arises naturally in reliability work...because of common environmental factors and common sources of material. This article mainly explores modes of positive dependence for bivariate and

  2. Effects of abrupt load alterations on force—velocity—length and time relations during isotonic contractions of heart muscle: load clamping

    PubMed Central

    Brutsaert, D. L.; Claes, V. A.; Sonnenblick, E. H.

    1971-01-01

    1. Abrupt alterations in load (load-clamping) have been imposed on cat papillary muscles during the course of isotonic shortening, between the onset of shortening and peak shortening. 2. For any given total load, whether imposed during the course of shortening or before stimulation, the velocity of shortening is determined solely by the instantaneous length, and not by the sequence of length and tension changes through which it arrived at that length. 3. This unique force—velocity—length relation is independent of time from just after the onset of shortening until just prior to peak shortening. 4. These results suggest that a steady state exists for the maximum intensity of active state in heart muscle over a major portion of the time during which isometric force is rising, and that heart muscle always senses total load while shortening. PMID:5559625

  3. Magnetopause shape as a bivariate function of interplanetary magnetic field Bz and solar wind dynamic pressure

    SciTech Connect

    Roelof, E.C.; Sibeck, D.G.

    1993-12-01

    The authors present a new method for determining the shape of the magnetopause as a bivariate function of the hourly averaged solar wind dynamic pressure (p) and the north-south component of the interplanetary magnetic field (IMF) Bz. They represent the magnetopause (for X_GSE > -40 R_E) as an ellipsoid of revolution in solar-wind-aberrated coordinates and express the (p, Bz) dependence of each of the three ellipsoid parameters as a second-order (6-term) bivariate expansion in ln p and Bz. The authors define 12 overlapping bins in a normalized dimensionless (p, Bz) "control space" and fit an ellipsoid to those magnetopause crossings having (p, Bz) values within each bin. They also calculate the bivariate (ln p, Bz) moments to second order over each bin in control space. They can then calculate the six control-space expansion coefficients for each of the three ellipsoid parameters in configuration space. From these coefficients they can derive useful diagnostics of the magnetopause shape as joint functions of p and Bz: the aspect ratio of the ellipsoid's minor-to-major axes; the flank distance, radius of curvature, and flaring angle (at X_GSE = 0); and the subsolar distance and radius of curvature. The authors confirm and quantify previous results that during periods of southward Bz the subsolar magnetopause moves inward, while at X_GSE = 0 the flank magnetopause moves outward and the flaring angle increases. These changes are most pronounced during periods of low pressure, wherein all have a dependence on Bz that is stronger and functionally different for Bz southward as compared to Bz northward. In contrast, all these changes are much less sensitive to IMF Bz at the highest pressures. 44 refs., 22 figs., 6 tabs.
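
    The six-term, second-order bivariate expansion in (ln p, Bz) described above can be illustrated in simplified form as ordinary least squares on a quadratic design matrix. The sketch below uses entirely synthetic inputs and stands in for neither the authors' binning scheme nor their ellipsoid fits.

      # Six-term bivariate quadratic expansion
      #   f(lnp, Bz) ~ c0 + c1*lnp + c2*Bz + c3*lnp**2 + c4*lnp*Bz + c5*Bz**2
      # fitted by least squares on synthetic data; illustrative only.
      import numpy as np

      rng = np.random.default_rng(1)
      lnp = rng.uniform(np.log(0.5), np.log(8.0), size=200)   # ln dynamic pressure
      bz = rng.uniform(-8.0, 8.0, size=200)                    # IMF Bz (nT)

      # Synthetic "ellipsoid parameter" (e.g., a standoff distance) plus noise.
      r0 = 10.5 - 1.2*lnp + 0.15*bz + rng.normal(0, 0.3, size=200)

      X = np.column_stack([np.ones_like(lnp), lnp, bz, lnp**2, lnp*bz, bz**2])
      coef, *_ = np.linalg.lstsq(X, r0, rcond=None)
      print("expansion coefficients c0..c5:", np.round(coef, 3))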

  4. Isomers and high-spin structures in the N = 81 isotones 135Xe and 137Ba

    NASA Astrophysics Data System (ADS)

    Vogt, A.; Birkenbach, B.; Reiter, P.; Blazhev, A.; Siciliano, M.; Hadyńska-Klek, K.; Valiente-Dobón, J. J.; Wheldon, C.; Teruya, E.; Yoshinaga, N.; Arnswald, K.; Bazzacco, D.; Bowry, M.; Bracco, A.; Bruyneel, B.; Chakrawarthy, R. S.; Chapman, R.; Cline, D.; Corradi, L.; Crespi, F. C. L.; Cromaz, M.; de Angelis, G.; Eberth, J.; Fallon, P.; Farnea, E.; Fioretto, E.; Freeman, S. J.; Fu, B.; Gadea, A.; Geibel, K.; Gelletly, W.; Gengelbach, A.; Giaz, A.; Görgen, A.; Gottardo, A.; Hayes, A. B.; Hess, H.; Hirsch, R.; Hua, H.; John, P. R.; Jolie, J.; Jungclaus, A.; Kaya, L.; Korten, W.; Lee, I. Y.; Leoni, S.; Lewandowski, L.; Liang, X.; Lunardi, S.; Macchiavelli, A. O.; Menegazzo, R.; Mengoni, D.; Michelagnoli, C.; Mijatović, T.; Montagnoli, G.; Montanari, D.; Müller-Gatermann, C.; Napoli, D.; Pearson, C. J.; Pellegri, L.; Podolyák, Zs.; Pollarolo, G.; Pullia, A.; Queiser, M.; Radeck, F.; Recchia, F.; Regan, P. H.; Rosiak, D.; Saed-Samii, N.; Şahin, E.; Scarlassara, F.; Schneiders, D.; Seidlitz, M.; Siebeck, B.; Sletten, G.; Smith, J. F.; Söderström, P.-A.; Stefanini, A. M.; Steinbach, T.; Stezowski, O.; Szilner, S.; Szpak, B.; Teng, R.; Ur, C.; Vandone, V.; Warner, D. D.; Wiens, A.; Wu, C. Y.; Zell, K. O.

    2017-02-01

    The high-spin structures and isomers of the N = 81 isotones 135Xe and 137Ba are investigated after multinucleon-transfer (MNT) and fusion-evaporation reactions. Both nuclei are populated (i) in 136Xe+238U and (ii) 136Xe+208Pb MNT reactions employing the high-resolution Advanced Gamma Tracking Array (AGATA) coupled to the magnetic spectrometer PRISMA, (iii) in the 136Xe+198Pt MNT reaction employing the γ-ray array GAMMASPHERE in combination with the gas-detector array CHICO, and (iv) via a 11B+130Te fusion-evaporation reaction with the HORUS γ-ray array at the University of Cologne. The high-spin level schemes of 135Xe and 137Ba are considerably extended to higher energies. The 2058-keV (19/2-) state in 135Xe is identified as an isomer, closing a gap in the systematics along the N = 81 isotones. Its half-life is measured to be 9.0(9) ns, corresponding to a reduced transition probability of B(E2; 19/2- → 15/2-) = 0.52(6) W.u. The experimentally deduced reduced transition probabilities of the isomeric states are compared to shell-model predictions. Latest shell-model calculations reproduce the experimental findings generally well and provide guidance to the interpretation of the new levels.

  5. Further studies on the partial double Donnan. Is isosmotic KCl solution isotonic with cells of respiratory trees of the holothurian Isostichopus badionotus Selenka?

    PubMed

    Herrera; Herrera; López

    2000-05-02

    As potassium, chloride and water traverse cell membranes, the cells of stenohaline marine invertebrates should swell if exposed to sea water mixed with an isosmotic KCl solution as they do when exposed to sea water diluted with water. To test this hypothesis respiratory tree fragments of the holothurian Isostichopus badionotus were exposed to five isosmotic media prepared by mixing artificial sodium sea water with isosmotic (611 mmol/l) KCl solution to obtain 100, 83, 71, 60 and 50% sea water, with and without 2 mmol/l ouabain. For comparison, respiratory tree fragments were incubated in sea water diluted to the same concentrations with distilled water, with and without ouabain. Cell water contents and potassium and sodium concentrations were unaffected by KCl-dilution or ouabain in isosmotic KCl-sea water mixtures. In tissues exposed to H(2)O-diluted sea water, cell water increased osmometrically and potassium, sodium and chloride concentrations decreased with dilution; ouabain caused a decrease in potassium and an increase in sodium but no effect on chloride concentrations. The isotonicity of the isosmotic KCl solution cannot be ascribed to impermeability of the cell membrane to KCl as both ions easily traverse the cell membrane. Rather, operationally immobilized extracellular sodium ions, which electrostatically hold back anions and consequently water, together with the lack of a cellward electrochemical gradient for potassium, resulting from membrane depolarization caused by high external potassium concentration, would explain the isotonicity of isosmotic KCl solution. The high external potassium concentration also antagonizes the inhibitory effect of ouabain on the Na(+)/K(+) ATPase responsible for sodium and potassium active transport.

  6. Mechanisms Underlying Activation of α1-Adrenergic Receptor-Induced Trafficking of AQP5 in Rat Parotid Acinar Cells under Isotonic or Hypotonic Conditions

    PubMed Central

    Bragiel, Aneta M.; Wang, Di; Pieczonka, Tomasz D.; Shono, Masayuki; Ishikawa, Yasuko

    2016-01-01

    Defective cellular trafficking of aquaporin-5 (AQP5) to the apical plasma membrane (APM) in salivary glands is associated with the loss of salivary fluid secretion. To examine mechanisms of α1-adrenoceptor (AR)-induced trafficking of AQP5, immunoconfocal microscopy and Western blot analysis were used to analyze AQP5 localization in parotid tissues stimulated with phenylephrine under different osmolality. Phenylephrine-induced trafficking of AQP5 to the APM and lateral plasma membrane (LPM) was mediated via the α1A-AR subtype, but not the α1B- and α1D-AR subtypes. Phenylephrine-induced trafficking of AQP5 was inhibited by ODQ and KT5823, inhibitors of nitric oxide (NO)-stimulated guanylyl cyclase (GC) and protein kinase (PK) G, respectively, indicating the involvement of the NO/soluble GC/PKG signaling pathway. Under isotonic conditions, phenylephrine-induced trafficking was inhibited by La3+, implying the participation of a store-operated Ca2+ channel. Under hypotonic conditions, phenylephrine-induced trafficking of AQP5 to the APM was higher than that under isotonic conditions. Under non-stimulated conditions, hypotonicity-induced trafficking of AQP5 to the APM was inhibited by ruthenium red and La3+, suggesting the involvement of extracellular Ca2+ entry. Thus, α1A-AR activation induced the trafficking of AQP5 to the APM and LPM via the Ca2+/cyclic guanosine monophosphate (cGMP)/PKG signaling pathway, which is associated with store-operated Ca2+ entry.

  7. Symbolic transfer entropy rate is equal to transfer entropy rate for bivariate finite-alphabet stationary ergodic Markov processes

    NASA Astrophysics Data System (ADS)

    Haruna, Taichi; Nakajima, Kohei

    2013-05-01

    Transfer entropy is a measure of the magnitude and the direction of information flow between jointly distributed stochastic processes. In recent years, its permutation analogues are considered in the literature to estimate the transfer entropy by counting the number of occurrences of orderings of values, not the values themselves. It has been suggested that the method of permutation is easy to implement, computationally low cost and robust to noise when applying to real world time series data. In this paper, we initiate a theoretical treatment of the corresponding rates. In particular, we consider the transfer entropy rate and its permutation analogue, the symbolic transfer entropy rate, and show that they are equal for any bivariate finite-alphabet stationary ergodic Markov process. This result is an illustration of the duality method introduced in [T. Haruna, K. Nakajima, Physica D 240, 1370 (2011)]. We also discuss the relationship among the transfer entropy rate, the time-delayed mutual information rate and their permutation analogues.

  8. A bivariate twin study of regional brain volumes and verbal and nonverbal intellectual skills during childhood and adolescence.

    PubMed

    Wallace, Gregory L; Lee, Nancy Raitano; Prom-Wormley, Elizabeth C; Medland, Sarah E; Lenroot, Rhoshel K; Clasen, Liv S; Schmitt, James E; Neale, Michael C; Giedd, Jay N

    2010-03-01

    Twin studies indicate that both intelligence and brain structure are moderately to highly heritable. Recent bivariate studies of adult twins also suggest that intelligence and brain morphometry are influenced by shared genetic factors. The current study examines shared genetic and environmental factors between brain morphometry and intelligence in a sample of children and adolescents (twins, twin siblings, and singletons; n = 649, ages 4-19). To extend previous studies, brain morphometric data were parsed into subregions (lobar gray/white matter volumes, caudate nucleus, lateral ventricles) and intelligence into verbal and nonverbal skills (Wechsler Vocabulary and Block Design subtests). Phenotypic relationships between brain volumes and intelligence were small. Verbal skills shared unique environmental effects with gray matter volumes while nonverbal skills shared genetic effects with both global and regional gray and white matter. These results suggest that distinct mechanisms contribute to the small phenotypic relationships between brain volumes and verbal versus nonverbal intelligence.

  9. Statistics Revelations

    ERIC Educational Resources Information Center

    Chicot, Katie; Holmes, Hilary

    2012-01-01

    The use, and misuse, of statistics is commonplace, yet in the printed format data representations can be either over simplified, supposedly for impact, or so complex as to lead to boredom, supposedly for completeness and accuracy. In this article the link to the video clip shows how dynamic visual representations can enliven and enhance the…

  10. Improving the modelling of redshift-space distortions - I. A bivariate Gaussian description for the galaxy pairwise velocity distributions

    NASA Astrophysics Data System (ADS)

    Bianchi, Davide; Chiesa, Matteo; Guzzo, Luigi

    2015-01-01

    As a step towards a more accurate modelling of redshift-space distortions (RSD) in galaxy surveys, we develop a general description of the probability distribution function of galaxy pairwise velocities within the framework of the so-called streaming model. For a given galaxy separation r, such function can be described as a superposition of virtually infinite local distributions. We characterize these in terms of their moments and then consider the specific case in which they are Gaussian functions, each with its own mean μ and dispersion σ. Based on physical considerations, we make the further crucial assumption that these two parameters are in turn distributed according to a bivariate Gaussian, with its own mean and covariance matrix. Tests using numerical simulations explicitly show that with this compact description one can correctly model redshift-space distortions on all scales, fully capturing the overall linear and non-linear dynamics of the galaxy flow at different separations. In particular, we naturally obtain Gaussian/exponential, skewed/unskewed distribution functions, depending on separation as observed in simulations and data. Also, the recently proposed single-Gaussian description of RSD is included in this model as a limiting case, when the bivariate Gaussian is collapsed to a two-dimensional Dirac delta function. We also show how this description naturally allows for the Taylor expansion of 1 + ξS(s) around 1 + ξR(r), which leads to the Kaiser linear formula when truncated to second order, explicating its connection with the moments of the velocity distribution functions. More work is needed, but these results indicate a very promising path to make definitive progress in our programme to improve RSD estimators.

  11. Statistical Inference

    NASA Astrophysics Data System (ADS)

    Khan, Shahjahan

    Often scientific information on various data generating processes are presented in the form of numerical and categorical data. Except for some very rare occasions, generally such data represent a small part of the population, or selected outcomes of any data generating process. Although valuable and useful information is lurking in the array of scientific data, generally, they are unavailable to the users. Appropriate statistical methods are essential to reveal the hidden "jewels" in the mess of the raw data. Exploratory data analysis methods are used to uncover such valuable characteristics of the observed data. Statistical inference provides techniques to make valid conclusions about the unknown characteristics or parameters of the population from which scientifically drawn sample data are selected. Usually, statistical inference includes estimation of population parameters as well as performing tests of hypotheses on the parameters. However, prediction of future responses and determining the prediction distributions are also part of statistical inference. Both Classical or Frequentist and Bayesian approaches are used in statistical inference. The commonly used Classical approach is based on the sample data alone. In contrast, the increasingly popular Bayesian approach uses a prior distribution on the parameters along with the sample data to make inferences. The non-parametric and robust methods are also being used in situations where commonly used model assumptions are unsupported. In this chapter, we cover the philosophical and methodological aspects of both the Classical and Bayesian approaches. Moreover, some aspects of predictive inference are also included. In the absence of any evidence to support assumptions regarding the distribution of the underlying population, or if the variable is measured only in ordinal scale, non-parametric methods are used. Robust methods are employed to avoid any significant changes in the results due to deviations from the model

  12. [Descriptive statistics].

    PubMed

    Rendón-Macías, Mario Enrique; Villasís-Keever, Miguel Ángel; Miranda-Novales, María Guadalupe

    2016-01-01

    Descriptive statistics is the branch of statistics that gives recommendations on how to summarize clearly and simply research data in tables, figures, charts, or graphs. Before performing a descriptive analysis it is paramount to summarize its goal or goals, and to identify the measurement scales of the different variables recorded in the study. Tables or charts aim to provide timely information on the results of an investigation. The graphs show trends and can be histograms, pie charts, "box and whiskers" plots, line graphs, or scatter plots. Images serve as examples to reinforce concepts or facts. The choice of a chart, graph, or image must be based on the study objectives. Usually it is not recommended to use more than seven in an article, also depending on its length.

  13. Order Statistics and Nonparametric Statistics.

    DTIC Science & Technology

    2014-09-26

    Topics investigated include the following: Probability that a fuze will fire; moving order statistics; distribution theory and properties of the...problem posed by an Army Scientist: A fuze will fire when at least n-1 (or n-2) of n detonators function within time span t. What is the probability of

  14. Statistical Optics

    NASA Astrophysics Data System (ADS)

    Goodman, Joseph W.

    2000-07-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I RIchard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research

  15. Spatial prediction of flood susceptible areas using rule based decision tree (DT) and a novel ensemble bivariate and multivariate statistical models in GIS

    NASA Astrophysics Data System (ADS)

    Tehrany, Mahyat Shafapour; Pradhan, Biswajeet; Jebur, Mustafa Neamah

    2013-11-01

    A decision tree (DT) machine learning algorithm was used to map flood-susceptible areas in Kelantan, Malaysia. We used an ensemble frequency ratio (FR) and logistic regression (LR) model in order to overcome the weak points of LR. The combined FR and LR method was used to map the susceptible areas in Kelantan, Malaysia. The results of both methods were compared and their efficiency was assessed. The most influential conditioning factors for flooding were identified.
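
    The ensemble pairs class-wise frequency ratios (FR), the share of flood cells falling in a factor class divided by the share of all cells in that class, with a logistic regression fitted on the FR-coded factors. The sketch below illustrates that two-step idea on synthetic raster values; it is not the authors' Kelantan workflow, and a real application would stack many conditioning factors rather than one.

      # Frequency-ratio coding of one conditioning factor followed by logistic
      # regression: a sketch of the FR + LR ensemble idea on synthetic data.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(2)
      n = 5000
      slope_class = rng.integers(0, 5, size=n)                        # binned slope classes
      flood = (rng.random(n) < 0.10 + 0.05*slope_class).astype(int)   # synthetic labels

      def frequency_ratio(classes, labels):
          """FR per class = (% of flood cells in class) / (% of all cells in class)."""
          fr = {}
          for c in np.unique(classes):
              in_class = classes == c
              pct_flood = labels[in_class].sum() / labels.sum()
              pct_cells = in_class.sum() / len(classes)
              fr[c] = pct_flood / pct_cells
          return fr

      fr = frequency_ratio(slope_class, flood)
      x_fr = np.array([fr[c] for c in slope_class]).reshape(-1, 1)

      # Logistic regression on the FR-coded factor; additional factors would be
      # added as extra columns.
      model = LogisticRegression().fit(x_fr, flood)
      print("LR coefficient on the FR-coded factor:", model.coef_.ravel())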

  16. Statistical properties of phase-shift algorithms

    NASA Astrophysics Data System (ADS)

    Rathjen, C.

    1995-09-01

    Statistical properties of phase-shift algorithms are investigated for the case of additive Gaussian intensity noise. Based on a bivariate normal distribution, a generally valid probability-density function for the random phase error is derived. This new description of the random phase error shows properties that cannot be obtained through Gaussian error propagation. The assumption of a normally distributed phase error is compared with the derived probability-density function. For small signal-to-noise ratios the assumption of a normally distributed phase error is not valid. Additionally, it is shown that some advanced systematic-error-compensating algorithms have a disadvantageous effect on the random phase error. Keywords: random phase error, systematic error, additive Gaussian noise, phase-measuring interferometry.
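
    To make the setting concrete, the generic N-step phase-shifting estimator recovers the phase from intensities I_k = A + B*cos(phi + 2*pi*k/N) via an arctangent of sine- and cosine-weighted sums, and its random phase error under additive Gaussian intensity noise can be examined by Monte Carlo, as in the sketch below. This is the plain N-step algorithm, not the specific error-compensating algorithms analyzed in the paper.

      # Monte Carlo of the random phase error of the generic N-step
      # phase-shifting algorithm under additive Gaussian intensity noise.
      import numpy as np

      rng = np.random.default_rng(3)
      N, A, B, sigma = 4, 1.0, 0.8, 0.05     # frames, background, modulation, noise
      phi_true = 0.7                          # radians
      shifts = 2*np.pi*np.arange(N)/N

      trials = 100_000
      I = A + B*np.cos(phi_true + shifts) + rng.normal(0, sigma, size=(trials, N))

      # Synchronous-detection (least-squares) phase estimate.
      num = -(I*np.sin(shifts)).sum(axis=1)
      den = (I*np.cos(shifts)).sum(axis=1)
      phi_hat = np.arctan2(num, den)

      err = np.angle(np.exp(1j*(phi_hat - phi_true)))   # wrap to (-pi, pi]
      print("bias:", err.mean(), " standard deviation:", err.std())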

  17. Aquaculture in artificially developed wetlands in urban areas: an application of the bivariate relationship between soil and surface water in landscape ecology.

    PubMed

    Paul, Abhijit

    2011-01-01

    Wetlands show a strong bivariate relationship between soil and surface water. Artificially developed wetlands help to build landscape ecology and make built environments sustainable. The bheries, wetlands of eastern Calcutta (India), utilize the city sewage to develop urban aquaculture that supports the local fish industries and opens a new frontier in sustainable environmental planning research.

  18. Sequential Temporal Dependencies in Associations between Symptoms of Depression and Posttraumatic Stress Disorder: An Application of Bivariate Latent Difference Score Structural Equation Modeling

    ERIC Educational Resources Information Center

    King, Daniel W.; King, Lynda A.; McArdle, John J.; Shalev, Arieh Y.; Doron-LaMarca, Susan

    2009-01-01

    Depression and posttraumatic stress disorder (PTSD) are highly comorbid conditions that may arise following exposure to psychological trauma. This study examined their temporal sequencing and mutual influence using bivariate latent difference score structural equation modeling. Longitudinal data from 182 emergency room patients revealed level of…

  19. The role of drop velocity in statistical spray description

    NASA Technical Reports Server (NTRS)

    Groeneweg, J. F.; El-Wakil, M. M.; Myers, P. S.; Uyehara, O. A.

    1978-01-01

    The justification for describing a spray by treating drop velocity as a random variable on an equal statistical basis with drop size was studied experimentally. A double exposure technique using fluorescent drop photography was used to make size and velocity measurements at selected locations in a steady ethanol spray formed by a swirl atomizer. The size velocity data were categorized to construct bivariate spray density functions to describe the spray immediately after formation and during downstream propagation. Bimodal density functions were formed by environmental interaction during downstream propagation. Large differences were also found between spatial mass density and mass flux size distribution at the same location.

  1. Order-Constrained Reference Priors with Implications for Bayesian Isotonic Regression, Analysis of Covariance and Spatial Models

    NASA Astrophysics Data System (ADS)

    Gong, Maozhen

    Selecting an appropriate prior distribution is a fundamental issue in Bayesian Statistics. In this dissertation, under the framework provided by Berger and Bernardo, I derive the reference priors for several models which include: Analysis of Variance (ANOVA)/Analysis of Covariance (ANCOVA) models with a categorical variable under common ordering constraints, the conditionally autoregressive (CAR) models and the simultaneous autoregressive (SAR) models with a spatial autoregression parameter rho considered. The performances of reference priors for ANOVA/ANCOVA models are evaluated by simulation studies with comparisons to Jeffreys' prior and Least Squares Estimation (LSE). The priors are then illustrated in a Bayesian model of the "Risk of Type 2 Diabetes in New Mexico" data, where the relationship between the type 2 diabetes risk (through Hemoglobin A1c) and different smoking levels is investigated. In both simulation studies and real data set modeling, the reference priors that incorporate internal order information show good performances and can be used as default priors. The reference priors for the CAR and SAR models are also illustrated in the "1999 SAT State Average Verbal Scores" data with a comparison to a Uniform prior distribution. Due to the complexity of the reference priors for both CAR and SAR models, only a portion (12 states in the Midwest) of the original data set is considered. The reference priors can give a different marginal posterior distribution compared to a Uniform prior, which provides an alternative for prior specifications for areal data in Spatial statistics.

  2. Statistical Neurodynamics.

    NASA Astrophysics Data System (ADS)

    Paine, Gregory Harold

    1982-03-01

    The primary objective of the thesis is to explore the dynamical properties of small nerve networks by means of the methods of statistical mechanics. To this end, a general formalism is developed and applied to elementary groupings of model neurons which are driven by either constant (steady state) or nonconstant (nonsteady state) forces. Neuronal models described by a system of coupled, nonlinear, first-order, ordinary differential equations are considered. A linearized form of the neuronal equations is studied in detail. A Lagrange function corresponding to the linear neural network is constructed which, through a Legendre transformation, provides a constant of motion. By invoking the Maximum-Entropy Principle with the single integral of motion as a constraint, a probability distribution function for the network in a steady state can be obtained. The formalism is implemented for some simple networks driven by a constant force; accordingly, the analysis focuses on a study of fluctuations about the steady state. In particular, a network composed of N noninteracting neurons, termed Free Thinkers, is considered in detail, with a view to interpretation and numerical estimation of the Lagrange multiplier corresponding to the constant of motion. As an archetypical example of a net of interacting neurons, the classical neural oscillator, consisting of two mutually inhibitory neurons, is investigated. It is further shown that in the case of a network driven by a nonconstant force, the Maximum-Entropy Principle can be applied to determine a probability distribution functional describing the network in a nonsteady state. The above examples are reconsidered with nonconstant driving forces which produce small deviations from the steady state. Numerical studies are performed on simplified models of two physical systems: the starfish central nervous system and the mammalian olfactory bulb. Discussions are given as to how statistical neurodynamics can be used to gain a better

  3. Bivariate piecewise stationary segmentation; improved pre-treatment for synchronization measures used on non-stationary biological signals.

    PubMed

    Terrien, Jérémy; Germain, Guy; Marque, Catherine; Karlsson, Brynjar

    2013-08-01

    Analysis of synchronization between biological signals can be helpful in characterization of biological functions. Many commonly used measures of synchronicity assume that the signal is stationary. Biomedical signals are however often strongly non stationary. We propose to use a bivariate piecewise stationary pre-segmentation (bPSP) of the signals of interest, before the computation of synchronization measures on biomedical signals to improve the performance of standard synchronization measures. In prior work we have shown how this can be achieved by using the auto-spectrum of either one of the signals under investigation. In this work we show how major improvements of the performance of synchronization measures can be achieved using the cross-spectrum of the signals to detect stationary changes which occur independently in either signal. We show on synthetic as well as on real biological signals (epileptic EEG and uterine EMG) that the proposed bPSP approach increases the accuracy of the measures by making a good tradeoff between the stationarity assumption and the length of the analyzed segments, when compared to the classical windowing method.

  4. On the sources of the height-intelligence correlation: new insights from a bivariate ACE model with assortative mating.

    PubMed

    Beauchamp, Jonathan P; Cesarini, David; Johannesson, Magnus; Lindqvist, Erik; Apicella, Coren

    2011-03-01

    A robust positive correlation between height and intelligence, as measured by IQ tests, has been established in the literature. This paper makes several contributions toward establishing the causes of this association. First, we extend the standard bivariate ACE model to account for assortative mating. The more general theoretical framework provides several key insights, including formulas to decompose a cross-trait genetic correlation into components attributable to assortative mating and pleiotropy and to decompose a cross-trait within-family correlation. Second, we use a large dataset of male twins drawn from Swedish conscription records and examine how well genetic and environmental factors explain the association between (i) height and intelligence and (ii) height and military aptitude, a professional psychologist's assessment of a conscript's ability to deal with wartime stress. For both traits, we find suggestive evidence of a shared genetic architecture with height, but we demonstrate that point estimates are very sensitive to assumed degrees of assortative mating. Third, we report a significant within-family correlation between height and intelligence (ρ̂ = 0.10), suggesting that pleiotropy might be at play.

  5. Evaluating Dynamic Bivariate Correlations in Resting-state fMRI: A comparison study and a new approach

    PubMed Central

    Lindquist, Martin A.; Xu, Yuting; Nebel, Mary Beth; Caffo, Brian S.

    2014-01-01

    To date, most functional Magnetic Resonance Imaging (fMRI) studies have assumed that the functional connectivity (FC) between time series from distinct brain regions is constant across time. However, recently, there has been increased interest in quantifying possible dynamic changes in FC during fMRI experiments, as it is thought this may provide insight into the fundamental workings of brain networks. In this work we focus on the specific problem of estimating the dynamic behavior of pair-wise correlations between time courses extracted from two different regions of the brain. We critique the commonly used sliding-windows technique, and discuss some alternative methods used to model volatility in the finance literature that could also prove useful in the neuroimaging setting. In particular, we focus on the Dynamic Conditional Correlation (DCC) model, which provides a model-based approach towards estimating dynamic correlations. We investigate the properties of several techniques in a series of simulation studies and find that DCC achieves the best overall balance between sensitivity and specificity in detecting dynamic changes in correlations. We also investigate its scalability beyond the bivariate case to demonstrate its utility for studying dynamic correlations between more than two brain regions. Finally, we illustrate its performance in an application to test-retest resting state fMRI data. PMID:24993894

  6. Evaluating dynamic bivariate correlations in resting-state fMRI: a comparison study and a new approach.

    PubMed

    Lindquist, Martin A; Xu, Yuting; Nebel, Mary Beth; Caffo, Brian S

    2014-11-01

    To date, most functional Magnetic Resonance Imaging (fMRI) studies have assumed that the functional connectivity (FC) between time series from distinct brain regions is constant across time. However, recently, there has been an increased interest in quantifying possible dynamic changes in FC during fMRI experiments, as it is thought that this may provide insight into the fundamental workings of brain networks. In this work we focus on the specific problem of estimating the dynamic behavior of pair-wise correlations between time courses extracted from two different regions of the brain. We critique the commonly used sliding-window technique, and discuss some alternative methods used to model volatility in the finance literature that could also prove to be useful in the neuroimaging setting. In particular, we focus on the Dynamic Conditional Correlation (DCC) model, which provides a model-based approach towards estimating dynamic correlations. We investigate the properties of several techniques in a series of simulation studies and find that DCC achieves the best overall balance between sensitivity and specificity in detecting dynamic changes in correlations. We also investigate its scalability beyond the bivariate case to demonstrate its utility for studying dynamic correlations between more than two brain regions. Finally, we illustrate its performance in an application to test-retest resting state fMRI data.
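
    The DCC estimator referenced in both records has a simple core recursion: lagged standardized residuals update a quasi-correlation matrix Q_t, which is rescaled to a proper correlation matrix R_t at every time step. The sketch below fixes the DCC parameters (a, b) instead of estimating them by quasi-maximum likelihood and runs on synthetic data, so it illustrates the recursion only, not a full DCC fit.

      # Core DCC(1,1) recursion for bivariate standardized residuals eps_t:
      #   Q_t = (1 - a - b)*S + a*eps_{t-1} eps_{t-1}' + b*Q_{t-1}
      #   R_t = D_t^{-1} Q_t D_t^{-1},  D_t = diag(sqrt(diag(Q_t)))
      # Parameters (a, b) are fixed here, not estimated.
      import numpy as np

      rng = np.random.default_rng(4)
      T = 500
      eps = rng.standard_normal((T, 2))    # synthetic standardized residuals
      a, b = 0.05, 0.90                    # illustrative DCC parameters (a + b < 1)

      S = np.corrcoef(eps, rowvar=False)   # unconditional correlation target
      Q = S.copy()
      dyn_corr = np.empty(T)

      for t in range(T):
          if t > 0:
              e = eps[t - 1][:, None]
              Q = (1 - a - b)*S + a*(e @ e.T) + b*Q
          d = np.sqrt(np.diag(Q))
          R = Q / np.outer(d, d)
          dyn_corr[t] = R[0, 1]            # time-varying correlation estimate

      print("mean dynamic correlation:", dyn_corr.mean())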

  7. Bivariate and multivariate analyses of the correlations between stability of the erythrocyte membrane, serum lipids and hematological variables.

    PubMed

    Bernardino Neto, M; de Avelar, E B; Arantes, T S; Jordão, I A; da Costa Huss, J C; de Souza, T M T; de Souza Penha, V A; da Silva, S C; de Souza, P C A; Tavares, M; Penha-Silva, N

    2013-01-01

    The observation that fluidity must remain within a critical interval, outside which the stability and functionality of the cell tend to decrease, shows that stability, fluidity and function are related and that a measure of erythrocyte stability allows inferences about the fluidity or functionality of these cells. This study determined the biochemical and hematological variables that are directly or indirectly related to erythrocyte stability in a population of 71 volunteers. Data were evaluated by bivariate and multivariate analysis. Erythrocyte stability showed a greater association with the hematological variables than with the biochemical variables. The RDW stands out for its strong correlation with the stability of the erythrocyte membrane, without being heavily influenced by other factors. Among the biochemical variables, erythrocyte stability was most sensitive to LDL-C. Erythrocyte stability was significantly associated with RDW and LDL-C. Thus, the level of LDL-C is a consistent link between stability and functionality, suggesting that a measure of stability could be one more indirect parameter for assessing the risk of degenerative processes associated with high levels of LDL-C.

  8. A bivariate genome-wide association study identifies ADAM12 as a novel susceptibility gene for Kashin-Beck disease

    PubMed Central

    Hao, Jingcan; Wang, Wenyu; Wen, Yan; Xiao, Xiao; He, Awen; Guo, Xiong; Yang, Tielin; Liu, Xiaogang; Shen, Hui; Chen, Xiangding; Tian, Qing; Deng, Hong-Wen; Zhang, Feng

    2016-01-01

    Kashin-Beck disease (KBD) is a chronic osteoarthropathy, which manifests as joint deformities and growth retardation. Only a few genetic studies of growth retardation associated with the KBD have been carried out by now. In this study, we conducted a two-stage bivariate genome-wide association study (BGWAS) of the KBD using joint deformities and body height as study phenotypes, totally involving 2,417 study subjects. Articular cartilage specimens from 8 subjects were collected for immunohistochemistry. In the BGWAS, ADAM12 gene achieved the most significant association (rs1278300 p-value = 9.25 × 10−9) with the KBD. Replication study observed significant association signal at rs1278300 (p-value = 0.007) and rs1710287 (p-value = 0.002) of ADAM12 after Bonferroni correction. Immunohistochemistry revealed significantly decreased expression level of ADAM12 protein in the KBD articular cartilage (average positive chondrocyte rate = 47.59 ± 7.79%) compared to healthy articular cartilage (average positive chondrocyte rate = 64.73 ± 5.05%). Our results suggest that ADAM12 gene is a novel susceptibility gene underlying both joint destruction and growth retardation of the KBD. PMID:27545300

  9. A bivariate genome-wide association study identifies ADAM12 as a novel susceptibility gene for Kashin-Beck disease.

    PubMed

    Hao, Jingcan; Wang, Wenyu; Wen, Yan; Xiao, Xiao; He, Awen; Guo, Xiong; Yang, Tielin; Liu, Xiaogang; Shen, Hui; Chen, Xiangding; Tian, Qing; Deng, Hong-Wen; Zhang, Feng

    2016-08-22

    Kashin-Beck disease (KBD) is a chronic osteoarthropathy, which manifests as joint deformities and growth retardation. Only a few genetic studies of growth retardation associated with the KBD have been carried out by now. In this study, we conducted a two-stage bivariate genome-wide association study (BGWAS) of the KBD using joint deformities and body height as study phenotypes, totally involving 2,417 study subjects. Articular cartilage specimens from 8 subjects were collected for immunohistochemistry. In the BGWAS, ADAM12 gene achieved the most significant association (rs1278300 p-value = 9.25 × 10(-9)) with the KBD. Replication study observed significant association signal at rs1278300 (p-value = 0.007) and rs1710287 (p-value = 0.002) of ADAM12 after Bonferroni correction. Immunohistochemistry revealed significantly decreased expression level of ADAM12 protein in the KBD articular cartilage (average positive chondrocyte rate = 47.59 ± 7.79%) compared to healthy articular cartilage (average positive chondrocyte rate = 64.73 ± 5.05%). Our results suggest that ADAM12 gene is a novel susceptibility gene underlying both joint destruction and growth retardation of the KBD.

  10. Systematics of the Electric and Magnetic Dipole Response in N=82 Isotones Below the Neutron Separation Energy

    NASA Astrophysics Data System (ADS)

    Tonchev, A. P.; Kwan, E.; Raut, R.; Rusev, G.; Tornow, W.; Hammond, S.; Kelley, J. H.; Tsoneva, N.; Lenske, H.

    2013-03-01

    In stable and weakly bound neutron-rich nuclei, a resonance-like concentration of dipole states has been observed for excitation energies around the neutron separation energy. This clustering of strong dipole states has been named the pygmy dipole resonance in contrast to the giant dipole resonance that dominates the E1 response. Understanding the pygmy resonance is presently of great interest in nuclear structure and nuclear astrophysics. High-sensitivity studies of E1 and M1 transitions in N=82 nuclei using the quasi monoenergetic and 100% linearly-polarized photon beams from High-Intensity-Gamma-Ray Source facility is presented. The nuclear dipole-strength distribution of the pygmy resonance has been measured and novel information about the character of this mode of excitation has been obtained. The data are compared with predictions from statistical and quasiparticle random-phase approximation models.

  11. isocir: An R Package for Constrained Inference using Isotonic Regression for Circular Data, with an Application to Cell Biology.

    PubMed

    Barragán, Sandra; Fernández, Miguel A; Rueda, Cristina; Peddada, Shyamal Das

    2013-08-01

    In many applications one may be interested in drawing inferences regarding the order of a collection of points on a unit circle. Due to the underlying geometry of the circle, standard constrained inference procedures developed for Euclidean space data are not applicable. Recently, statistical inference for parameters under such order constraints on a unit circle was discussed in Rueda et al. (2009) and Fernández et al. (2012). In this paper we introduce an R package called isocir which provides a set of functions for analyzing angular data subject to order constraints on a unit circle. Since this work is motivated by applications in cell biology, we illustrate the proposed package using relevant cell cycle data.
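
    The isocir package itself is written in R; as a rough illustration of the underlying building block, the sketch below fits an ordinary (Euclidean, non-circular) isotonic regression with scikit-learn. The data are synthetic and the setup is only an assumption about how such an order-constrained fit is typically run; it is not the package's circular algorithm.

    ```python
    # Minimal sketch of ordinary (Euclidean) isotonic regression, the building
    # block that isocir extends to angular data; uses scikit-learn, not isocir.
    import numpy as np
    from sklearn.isotonic import IsotonicRegression

    rng = np.random.default_rng(0)
    x = np.linspace(0, 10, 50)
    y = np.log1p(x) + rng.normal(scale=0.2, size=x.size)  # noisy nondecreasing trend

    iso = IsotonicRegression(increasing=True)
    y_fit = iso.fit_transform(x, y)          # monotone least-squares fit (PAVA)

    print(np.all(np.diff(y_fit) >= 0))       # fitted values respect the order constraint
    ```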

  12. Submaximal Exercise VO2 and Q During 30-Day 6 degree Head-Down Bed Rest with Isotonic and Isokinetic Exercise Training

    NASA Technical Reports Server (NTRS)

    Greenleaf, J. E.; Bernauer, E. M.; Ertl, A. C.

    1995-01-01

    Submaximal exercise (61 ± 3% peak VO2) metabolism was measured before bed rest (AC day-2) and on bed rest days 4, 11, and 25 in 19 healthy men (32-42 yr) allocated into no-exercise (NOE, N=5) control, isotonic exercise (ITE, N=7), and isokinetic exercise (IKE, N=7) training groups. Training was conducted supine for two 30-min periods/d for 6 d/wk: ITE was 60-90% peak VO2; IKE was peak knee flexion-extension at 100 deg/s. Supine submaximal exercise VO2 decreased significantly (*p<0.05) by 10.3%* with ITE and by 7.3%* with IKE, similar to the submaximal cardiac output (Q) changes of -14.5%* (ITE) and -20.3%* (IKE), but different from the changes in peak VO2 (+1.4% with ITE and -10.2%* with IKE) and plasma volume (-3.7% with ITE and -18.0%* with IKE). Thus, the reduction of submaximal VO2 during prolonged bed rest appears to parallel the change in submaximal Q but is not related to changes in peak VO2 or plasma volume.

  13. Evolution of the one-phonon 2+_{1,ms} mixed-symmetry state in N = 80 isotones as a local measure for the proton-neutron quadrupole interaction

    NASA Astrophysics Data System (ADS)

    Ahn, T.; Coquard, L.; Pietralla, N.; Rainovski, G.; Costin, A.; Janssens, R. V. F.; Lister, C. J.; Carpenter, M.; Zhu, S.; Heyde, K.

    2009-08-01

    An inverse kinematics Coulomb excitation experiment was performed to obtain absolute E2 and M1 transition strengths in 134Xe. The measured transition strengths indicate that the 2+_3 state of 134Xe is the dominant fragment of the one-phonon 2+_{1,ms} mixed-symmetry state. Comparing the energy of the 2+_{1,ms} mixed-symmetry state in 134Xe to that of the 2+_{1,ms} levels in the N = 80 isotonic chain indicates that the separation in energy between the fully symmetric 2+_1 state and the 2+_{1,ms} level increases as a function of the number of proton pairs outside the Z = 50 shell closure. This behavior can be understood as resulting from the mixing of the basic components of a two-fluid quantum system. A phenomenological fit based on this concept was performed. It provides the first experimental estimate of the strength of the proton-neutron quadrupole interaction derived from nuclear collective states with symmetric and antisymmetric nature.

  14. Stupid statistics!

    PubMed

    Tellinghuisen, Joel

    2008-01-01

    The method of least squares is probably the most powerful data analysis tool available to scientists. Toward a fuller appreciation of that power, this work begins with an elementary review of statistics fundamentals, and then progressively increases in sophistication as the coverage is extended to the theory and practice of linear and nonlinear least squares. The results are illustrated in application to data analysis problems important in the life sciences. The review of fundamentals includes the role of sampling and its connection to probability distributions, the Central Limit Theorem, and the importance of finite variance. Linear least squares are presented using matrix notation, and the significance of the key probability distributions-Gaussian, chi-square, and t-is illustrated with Monte Carlo calculations. The meaning of correlation is discussed, including its role in the propagation of error. When the data themselves are correlated, special methods are needed for the fitting, as they are also when fitting with constraints. Nonlinear fitting gives rise to nonnormal parameter distributions, but the 10% Rule of Thumb suggests that such problems will be insignificant when the parameter is sufficiently well determined. Illustrations include calibration with linear and nonlinear response functions, the dangers inherent in fitting inverted data (e.g., Lineweaver-Burk equation), an analysis of the reliability of the van't Hoff analysis, the problem of correlated data in the Guggenheim method, and the optimization of isothermal titration calorimetry procedures using the variance-covariance matrix for experiment design. The work concludes with illustrations on assessing and presenting results.
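
    As a hedged numerical companion to the matrix-notation treatment described above, the sketch below performs a weighted linear least-squares fit and extracts the parameter variance-covariance matrix used for error propagation. The data, model, and per-point uncertainties are invented for illustration.

    ```python
    # Sketch: weighted linear least squares in matrix notation, with the
    # parameter variance-covariance matrix used for error propagation.
    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
    sigma = np.full_like(y, 0.2)               # assumed per-point uncertainties

    X = np.column_stack([np.ones_like(x), x])  # design matrix for y = b0 + b1*x
    W = np.diag(1.0 / sigma**2)                # weights = 1 / variance

    cov = np.linalg.inv(X.T @ W @ X)           # parameter variance-covariance matrix
    beta = cov @ X.T @ W @ y                   # weighted least-squares estimates

    print("b0, b1 =", beta)
    print("standard errors =", np.sqrt(np.diag(cov)))
    ```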

  15. Plastic Surgery Statistics

    MedlinePlus


  16. Submaximal exercise VO2 and Qc during 30-day 6 degrees head-down bed rest with isotonic and isokinetic exercise training

    NASA Technical Reports Server (NTRS)

    Greenleaf, J. E.; Ertl, A. C.; Bernauer, E. M.

    1996-01-01

    BACKGROUND: Maintaining intermediary metabolism is necessary for the health and well-being of astronauts on long-duration spaceflights. While peak oxygen uptake (VO2) is consistently decreased during prolonged bed rest, submaximal VO2 is either unchanged or decreased. METHODS: Submaximal exercise metabolism (61 +/- 3% peak VO2) was measured during ambulation (AMB day-2) and on bed rest days 4, 11, and 25 in 19 healthy men (32-42 yr) allocated into no exercise (NOE, N = 5) control, and isotonic exercise (ITE, N = 7) and isokinetic exercise (IKE, N = 7) training groups. Exercise training was conducted supine for two 30-min periods per day for 6 d per week: ITE training was intermittent at 60-90% peak VO2; IKE training was 10 sets of 5 repetitions of peak knee flexion-extension force at a velocity of 100 degrees s-1. Cardiac output was measured with the indirect Fick CO2 method, and plasma volume with Evans blue dye dilution. RESULTS: Supine submaximal exercise VO2 decreased significantly (*p < 0.05) by 10.3%* with ITE and by 7.3%* with IKE; similar to the submaximal cardiac output decrease of 14.5%* (ITE) and 20.3%* (IKE), but different from change in peak VO2 (+1.4% with ITE and -10.2%* with IKE) and decrease in plasma volume of -3.7% (ITE) and -18.0%* (IKE). Reduction of submaximal VO2 during bed rest correlated 0.79 (p < 0.01) with submaximal Qc, but was not related to change in peak VO2 or plasma volume. CONCLUSION: Reduction in submaximal oxygen uptake during prolonged bed rest is related to decrease in exercise but not resting cardiac output; perturbations in active skeletal muscle metabolism may be involved.

  17. Bivariate and multivariate analyses of the influence of blood variables of patients submitted to Roux-en-Y gastric bypass on the stability of erythrocyte membrane against the chaotropic action of ethanol.

    PubMed

    de Arvelos, Leticia Ramos; Rocha, Vanessa Custódio Afonso; Felix, Gabriela Pereira; da Cunha, Cleine Chagas; Bernardino Neto, Morun; da Silva Garrote Filho, Mario; de Fátima Pinheiro, Conceição; Resende, Elmiro Santos; Penha-Silva, Nilson

    2013-03-01

    The stability of the erythrocyte membrane, which is essential for the maintenance of cell functions, occurs in a critical region of fluidity that depends largely on the membrane's composition and on the composition and characteristics of the medium. Because the composition of the erythrocyte membrane is influenced by several blood variables, its stability is expected to be related to them. The present study aimed to evaluate, by bivariate and multivariate statistical analyses, the correlations and causal relationships between hematologic and biochemical variables and the stability of the erythrocyte membrane against the chaotropic action of ethanol. The validity of this type of analysis depends on the homogeneity of the population and on the variability of the studied parameters, conditions that can be met by patients who undergo bariatric surgery by the Roux-en-Y gastric bypass technique, since they face feeding restrictions that have a great impact on their blood composition. Pathway analysis revealed that an increase in hemoglobin leads to decreased stability of the cell, probably through a process mediated by an increase in mean corpuscular volume. Furthermore, an increase in mean corpuscular hemoglobin (MCH) leads to an increase in erythrocyte membrane stability, probably because higher values of MCH are associated with smaller quantities of red blood cells and a larger contact area between the cell membrane and the ethanol present in the medium.

  18. Statistical analyses in the physiology of exercise and kinanthropometry.

    PubMed

    Winter, E M; Eston, R G; Lamb, K L

    2001-10-01

    Research into the physiology of exercise and kinanthropometry is intended to improve our understanding of how the body responds and adapts to exercise. If such studies are to be meaningful, they have to be well designed and analysed. Advances in personal computing have made available statistical analyses that were previously the preserve of elaborate mainframe systems and have increased opportunities for investigation. However, the ease with which analyses can be performed can mask underlying philosophical and epistemological shortcomings. The aim of this review is to examine the use of four techniques that are especially relevant to physiological studies: (1) bivariate correlation and linear and non-linear regression, (2) multiple regression, (3) repeated-measures analysis of variance and (4) multi-level modelling. The importance of adhering to underlying statistical assumptions is emphasized and ways to accommodate violations of these assumptions are identified.

  19. MQSA National Statistics

    MedlinePlus


  20. g-factor and spin-parity assignments of excited states in the N=83 isotones {sup 135}Te, {sup 136}I, {sup 137}Xe, and {sup 138}Cs

    SciTech Connect

    Liu, S. H.; Hamilton, J. H.; Ramayya, A. V.; Hwang, J. K.; Covello, A.; Itaco, N.; Gargano, A.; Stone, N. J.; Daniel, A. V.; Luo, Y. X.; Rasmussen, J. O.; Ter-Akopian, G. M.; Zhu, S. J.; Ma, W. C.

    2010-01-15

    The g factor of the 15/2{sup -} state in {sup 137}Xe was measured for the first time by using a newly developed technique for measuring angular correlations with Gammasphere. Spins and parities were assigned to several levels in the N=83 isotones {sup 135}Te, {sup 136}I, {sup 137}Xe, and {sup 138}Cs. The calculated g factor in the shell-model frame is in good agreement with the measured one in the present work. Shell-model calculations also support our spin-parity assignments.

  1. Interday Reliability of Peak Muscular Power Outputs on an Isotonic Dynamometer and Assessment of Active Trunk Control Using the Chop and Lift Tests

    PubMed Central

    Palmer, Thomas G.; Uhl, Timothy L.

    2011-01-01

    Abstract Context: Assessment techniques used to measure functional tasks involving active trunk control are restricted to linear movements that lack the explosive movements and dynamic tasks associated with activities of daily living and sport. Reliable clinical methods used to assess the diagonal and ballistic movements about the trunk are lacking. Objective: To assess the interday reliability of peak muscular power outputs while participants performed diagonal chop and lift tests and maintained a stable trunk. Design: Controlled laboratory study. Setting: University research laboratory. Patients or Other Participants: Eighteen healthy individuals (10 men and 8 women; age  =  32 ± 11 years, height  =  168 ± 12 cm, mass  =  80 ± 19 kg) from the general population participated. Intervention(s): Participants performed 2 power tests (chop, lift) using an isotonic dynamometer and 3 endurance tests (Biering-Sørensen, side-plank left, side-plank right) to assess active trunk control. Testing was performed on 3 different days separated by at least 1 week. Reliability was compared between days 1 and 2 and between days 2 and 3. Correlations between the power and endurance tests were evaluated to determine the degree of similarity. Main Outcome Measure(s): Peak muscular power outputs (watts) derived from a 1-repetition maximum protocol for the chop and lift tests were collected for both the right and left sides. Results: Intraclass correlation coefficients for peak muscular power were highly reliable for the chop (range, 0.87–0.98), lift (range, 0.83–0.96), and endurance (range, 0.80–0.98) tests between test sessions. The correlations between the power assessments and the Biering-Sørensen test (r range, −0.008 to 0.017) were low. The side-plank tests were moderately correlated with the chop (r range, 0.528–0.590) and the lift (r range, 0.359–0.467) tests. Conclusions: The diagonal chop and lift power protocol generated reliable data and

  2. [Cell kinetic analysis of human brain tumors by bivariate flow cytometric measurement of cellular DNA content and amount of incorporated bromodeoxyuridine].

    PubMed

    Okuda, Y; Taomoto, K; Saya, H; Ijichi, A; Kudo, H; Kokunai, T; Tamaki, N; Matsumoto, S

    1989-04-01

    Cell kinetics of 91 human brain tumors obtained from 88 patients were analyzed with the following two methods: 1) bivariate (two-color) flow cytometric measurement of cellular DNA content and the amount of bromodeoxyuridine (BrdU) incorporated into cellular DNA, in 66 specimens; 2) immunohistochemical detection of BrdU-incorporated S-phase cells, in 34 specimens. Patients were given an intravenous 1-hour infusion of 200 mg/sq. m. of BrdU 1-2 hours before surgical removal. The excised tumor specimen was divided into several portions. One was fixed with 70% ethanol and embedded in paraffin, and another was digested mechanically and/or chemically to obtain a single cell suspension, which was then fixed in 70% ethanol. Paraffin-embedded tissue sections were stained by the peroxidase-antiperoxidase immunohistochemical method using anti-BrdU monoclonal antibody (MoAb). Single cell suspensions were reacted with fluorescein isothiocyanate (FITC)-conjugated anti-BrdU MoAb, or with anti-BrdU MoAb and an FITC-conjugated second antibody, followed by staining with propidium iodide for flow cytometry (FCM). Rates of S-phase fraction in single cell suspensions calculated by bivariate FCM correlated well with labeling indexes (LI, i.e. the percentage of BrdU-incorporated cells) calculated in tissue sections, but not with the results of DNA histogram analysis by Dean's method. This discrepancy is probably due to large coefficient values in several samples. Histological malignancy of the tumors was reflected both in the proliferating index (PI, i.e. % S+G2M phase) calculated by bivariate FCM and in the LI obtained by the immunohistochemical method. PI tended to be high in primitive neuroectodermal tumors and metastatic carcinomas, moderately high in gliomas, and low in benign tumor groups. (ABSTRACT TRUNCATED AT 250 WORDS)

  3. Statistics Poker: Reinforcing Basic Statistical Concepts

    ERIC Educational Resources Information Center

    Leech, Nancy L.

    2008-01-01

    Learning basic statistical concepts does not need to be tedious or dry; it can be fun and interesting through cooperative learning in the small-group activity of Statistics Poker. This article describes a teaching approach for reinforcing basic statistical concepts that can help students who have high anxiety and makes learning and reinforcing…

  4. Predict! Teaching Statistics Using Informational Statistical Inference

    ERIC Educational Resources Information Center

    Makar, Katie

    2013-01-01

    Statistics is one of the most widely used topics for everyday life in the school mathematics curriculum. Unfortunately, the statistics taught in schools focuses on calculations and procedures before students have a chance to see it as a useful and powerful tool. Researchers have found that a dominant view of statistics is as an assortment of tools…

  5. Examining Bivariate Item-Criterion Associations: A Method of Exploring Personality Correlates of Job-Related Behavior

    DTIC Science & Technology

    1988-07-05

    ...the sample size, the criterion substantially exceeded the magnitude required for statistical significance as estimated in simulation studies (Gorsuch ...). Gorsuch, R.L. (1974). Factor Analysis. Philadelphia, PA: W. B. Saunders Company.

  6. Neuroendocrine Tumor: Statistics

    MedlinePlus


  7. Adrenal Gland Tumors: Statistics

    MedlinePlus


  8. Time-resolved X-ray diffraction studies of myosin head movements in live frog sartorius muscle during isometric and isotonic contractions.

    PubMed

    Martin-Fernandez, M L; Bordas, J; Diakun, G; Harries, J; Lowy, J; Mant, G R; Svensson, A; Towns-Andrews, E

    1994-06-01

    Using the facilities at the Daresbury Synchrotron Radiation Source, meridional diffraction patterns of muscles at ca 8 degrees C were recorded with a time resolution of 2 or 4 ms. In isometric contractions tetanic peak tension (P0) is reached in ca 400 ms. Under such conditions, following stimulation from rest, the timing of changes in the major reflections (the 38.2 nm troponin reflection, and the 21.5 and 14.34/14.58 nm myosin reflections) can be explained in terms of four types of time courses: K1, K2, K3 and K4. The onset of K1 occurs immediately after stimulation, but that of K2, K3 and K4 is delayed by a latent period of ca 16 ms. Relative to the end of their own latent periods the half-times for K1, K2, K3 and K4 are 14-16, 16, 32 and 52 ms, respectively. In half-times, K1, K2, K3 lead tension rise by 52, 36 and 20 ms, respectively. K4 parallels the time course of tension rise. From an analysis of the data we conclude that K1 reflects thin filament activation which involves the troponin system; K2 arises from an order-disorder transition during which the register between the filaments is lost; K3 is due to the formation of an acto-myosin complex which (at P0) causes 70% or more of the heads to diffract with actin-based periodicities; and K4 is caused by a change in the axial orientation of the myosin heads (relative to thin filament axis) which is estimated to be from 65-70 degrees at rest to ca 90 degrees at P0. Isotonic contraction experiments showed that during shortening under a load of ca 0.27 P0, at least 85% of the heads (relative to those forming an acto-myosin complex at P0) diffract with actin-based periodicities, whilst their axial orientation does not change from that at rest. During shortening under a negligible load, at most 5-10% of the heads (relative to those forming an acto-myosin complex at P0) diffract with actin-based periodicities, and their axial orientation also remains the same as that at rest. This suggests that in isometric

  9. PROBABILITY AND STATISTICS.

    DTIC Science & Technology

    STATISTICAL ANALYSIS, PROBABILITY, REPORTS, INFORMATION THEORY, DIFFERENTIAL EQUATIONS, STATISTICAL PROCESSES, STOCHASTIC PROCESSES, MULTIVARIATE ANALYSIS, DISTRIBUTION THEORY, DECISION THEORY, MEASURE THEORY, OPTIMIZATION

  10. Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.

    PubMed

    Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg

    2009-11-01

    G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
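
    G*Power is a standalone program, but the power computation for a test of a bivariate correlation can be approximated with the textbook Fisher z method. The sketch below is such an approximation in Python; it is not G*Power's exact routine, and the example effect size and sample size are arbitrary.

    ```python
    # Approximate power of a two-sided test of H0: rho = 0 for a Pearson
    # correlation, via the Fisher z transformation (a textbook approximation,
    # not G*Power's exact routine).
    import numpy as np
    from scipy.stats import norm

    def correlation_power(r, n, alpha=0.05):
        z_effect = np.arctanh(r) * np.sqrt(n - 3)      # noncentrality under H1
        z_crit = norm.ppf(1 - alpha / 2)
        return norm.sf(z_crit - z_effect) + norm.cdf(-z_crit - z_effect)

    print(round(correlation_power(r=0.3, n=84), 3))    # roughly 0.80 for this setup
    ```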

  11. Statistical Reference Datasets

    National Institute of Standards and Technology Data Gateway

    Statistical Reference Datasets (Web, free access)   The Statistical Reference Datasets project is also supported by the Standard Reference Data Program. The purpose of this project is to improve the accuracy of statistical software by providing reference datasets with certified computational results that enable the objective evaluation of statistical software.

  12. Explorations in statistics: statistical facets of reproducibility.

    PubMed

    Curran-Everett, Douglas

    2016-06-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eleventh installment of Explorations in Statistics explores statistical facets of reproducibility. If we obtain an experimental result that is scientifically meaningful and statistically unusual, we would like to know that our result reflects a general biological phenomenon that another researcher could reproduce if (s)he repeated our experiment. But more often than not, we may learn this researcher cannot replicate our result. The National Institutes of Health and the Federation of American Societies for Experimental Biology have created training modules and outlined strategies to help improve the reproducibility of research. These particular approaches are necessary, but they are not sufficient. The principles of hypothesis testing and estimation are inherent to the notion of reproducibility in science. If we want to improve the reproducibility of our research, then we need to rethink how we apply fundamental concepts of statistics to our science.

  13. Association of Supply Type with Fecal Contamination of Source Water and Household Stored Drinking Water in Developing Countries: A Bivariate Meta-analysis

    PubMed Central

    Bain, Robert E.S.; Cronk, Ryan; Wright, Jim A.; Bartram, Jamie

    2015-01-01

    Background Access to safe drinking water is essential for health. Monitoring access to drinking water focuses on water supply type at the source, but there is limited evidence on whether quality differences at the source persist in water stored in the household. Objectives We assessed the extent of fecal contamination at the source and in household stored water (HSW) and explored the relationship between contamination at each sampling point and water supply type. Methods We performed a bivariate random-effects meta-analysis of 45 studies, identified through a systematic review, that reported either the proportion of samples free of fecal indicator bacteria and/or individual sample bacteria counts for source and HSW, disaggregated by supply type. Results Water quality deteriorated substantially between source and stored water. The mean percentage of contaminated samples (noncompliance) at the source was 46% (95% CI: 33, 60%), whereas mean noncompliance in HSW was 75% (95% CI: 64, 84%). Water supply type was significantly associated with noncompliance at the source (p < 0.001) and in HSW (p = 0.03). Source water (OR = 0.2; 95% CI: 0.1, 0.5) and HSW (OR = 0.3; 95% CI: 0.2, 0.8) from piped supplies had significantly lower odds of contamination compared with non-piped water, potentially due to residual chlorine. Conclusions Piped water is less likely to be contaminated compared with other water supply types at both the source and in HSW. A focus on upgrading water services to piped supplies may help improve safety, including for those drinking stored water. Citation Shields KF, Bain RE, Cronk R, Wright JA, Bartram J. 2015. Association of supply type with fecal contamination of source water and household stored drinking water in developing countries: a bivariate meta-analysis. Environ Health Perspect 123:1222–1231; http://dx.doi.org/10.1289/ehp.1409002 PMID:25956006
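
    The paper pools source and household noncompliance jointly with a bivariate random-effects model. As a much simpler, hedged stand-in, the sketch below pools hypothetical log odds ratios with a univariate DerSimonian-Laird random-effects model; all study estimates below are made up.

    ```python
    # Univariate DerSimonian-Laird random-effects pooling of log odds ratios
    # (a simplified stand-in for the bivariate model used in the paper);
    # the per-study estimates are invented for illustration.
    import numpy as np

    log_or = np.array([-1.2, -0.8, -1.6, -0.5])   # hypothetical per-study log ORs
    var    = np.array([0.20, 0.15, 0.30, 0.10])   # their within-study variances

    w_fixed = 1.0 / var
    q = np.sum(w_fixed * (log_or - np.average(log_or, weights=w_fixed))**2)
    df = len(log_or) - 1
    c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
    tau2 = max(0.0, (q - df) / c)                 # between-study variance estimate

    w_random = 1.0 / (var + tau2)
    pooled = np.average(log_or, weights=w_random)
    se = np.sqrt(1.0 / np.sum(w_random))
    print("pooled OR = %.2f (95%% CI %.2f-%.2f)"
          % tuple(np.exp([pooled, pooled - 1.96 * se, pooled + 1.96 * se])))
    ```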

  14. A spatial scan statistic for nonisotropic two-level risk cluster.

    PubMed

    Li, Xiao-Zhou; Wang, Jin-Feng; Yang, Wei-Zhong; Li, Zhong-Jie; Lai, Sheng-Jie

    2012-01-30

    Spatial scan statistic methods are commonly used for geographical disease surveillance and cluster detection. The standard spatial scan statistic does not model any variability in the underlying risks of subregions belonging to a detected cluster. For a multilevel risk cluster, the isotonic spatial scan statistic could model a centralized high-risk kernel in the cluster. Because variations in disease risks are anisotropic owing to different social, economical, or transport factors, the real high-risk kernel will not necessarily take the central place in a whole cluster area. We propose a spatial scan statistic for a nonisotropic two-level risk cluster, which could be used to detect a whole cluster and a noncentralized high-risk kernel within the cluster simultaneously. The performance of the three methods was evaluated through an intensive simulation study. Our proposed nonisotropic two-level method showed better power and geographical precision with two-level risk cluster scenarios, especially for a noncentralized high-risk kernel. Our proposed method is illustrated using the hand-foot-mouth disease data in Pingdu City, Shandong, China in May 2009, compared with two other methods. In this practical study, the nonisotropic two-level method is the only way to precisely detect a high-risk area in a detected whole cluster.
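
    The proposed method builds on the standard spatial scan statistic, whose core ingredient is a Poisson likelihood ratio evaluated for each candidate zone. The sketch below computes that likelihood ratio for a single hypothetical zone; it is only the one-level building block, not the two-level, nonisotropic extension proposed in the paper.

    ```python
    # Poisson log-likelihood ratio for one candidate zone, the building block
    # of the standard spatial scan statistic (the paper's two-level,
    # nonisotropic extension is not reproduced here).
    import numpy as np

    def poisson_scan_llr(c_in, e_in, c_total, e_total):
        """c_in/e_in: observed/expected cases inside the zone;
        c_total/e_total: observed/expected cases in the whole study area."""
        c_out, e_out = c_total - c_in, e_total - e_in
        if c_in / e_in <= c_out / e_out:          # only high-risk zones are of interest
            return 0.0
        return c_in * np.log(c_in / e_in) + c_out * np.log(c_out / e_out)

    # Hypothetical zone: 30 cases where 15 were expected, out of 200 cases overall.
    print(round(poisson_scan_llr(30, 15.0, 200, 200.0), 2))
    ```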

  15. Mathematical and statistical analysis

    NASA Technical Reports Server (NTRS)

    Houston, A. Glen

    1988-01-01

    The goal of the mathematical and statistical analysis component of RICIS is to research, develop, and evaluate mathematical and statistical techniques for aerospace technology applications. Specific research areas of interest include modeling, simulation, experiment design, reliability assessment, and numerical analysis.

  16. Uterine Cancer Statistics

    MedlinePlus


  17. Experiment in Elementary Statistics

    ERIC Educational Resources Information Center

    Fernando, P. C. B.

    1976-01-01

    Presents an undergraduate laboratory exercise in elementary statistics in which students verify empirically the various aspects of the Gaussian distribution. Sampling techniques and other commonly used statistical procedures are introduced. (CP)

  18. Ethics in Statistics

    ERIC Educational Resources Information Center

    Lenard, Christopher; McCarthy, Sally; Mills, Terence

    2014-01-01

    There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…

  19. Teaching Statistics Using SAS.

    ERIC Educational Resources Information Center

    Mandeville, Garrett K.

    The Statistical Analysis System (SAS) is presented as the single most appropriate statistical package to use as an aid in teaching statistics. A brief review of literature in which SAS is compared to SPSS, BMDP, and other packages is followed by six examples which demonstrate features unique to SAS which have pedagogical utility. Of particular…

  20. Minnesota Health Statistics 1988.

    ERIC Educational Resources Information Center

    Minnesota State Dept. of Health, St. Paul.

    This document comprises the 1988 annual statistical report of the Minnesota Center for Health Statistics. After introductory technical notes on changes in format, sources of data, and geographic allocation of vital events, an overview is provided of vital health statistics in all areas. Thereafter, separate sections of the report provide tables…

  1. Statistical Methods for Astronomy

    NASA Astrophysics Data System (ADS)

    Feigelson, Eric D.; Babu, G. Jogesh

    Statistical methodology, with deep roots in probability theory, provides quantitative procedures for extracting scientific knowledge from astronomical data and for testing astrophysical theory. In recent decades, statistics has enormously increased in scope and sophistication. After a historical perspective, this review outlines concepts of mathematical statistics, elements of probability theory, hypothesis tests, and point estimation. Least squares, maximum likelihood, and Bayesian approaches to statistical inference are outlined. Resampling methods, particularly the bootstrap, provide valuable procedures when distribution functions of statistics are not known. Several approaches to model selection and goodness of fit are considered.
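
    Among the methods surveyed, the bootstrap is easy to illustrate concretely. The sketch below computes a percentile-bootstrap confidence interval for a sample median on synthetic, skewed data; the sample and the number of resamples are arbitrary choices.

    ```python
    # Minimal percentile-bootstrap confidence interval for a sample median,
    # illustrating the resampling idea highlighted in the review.
    import numpy as np

    rng = np.random.default_rng(42)
    data = rng.lognormal(mean=1.0, sigma=0.6, size=80)   # skewed synthetic sample

    boot_medians = np.array([
        np.median(rng.choice(data, size=data.size, replace=True))
        for _ in range(5000)
    ])
    lo, hi = np.percentile(boot_medians, [2.5, 97.5])
    print(f"median = {np.median(data):.2f}, 95% bootstrap CI = ({lo:.2f}, {hi:.2f})")
    ```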

  2. Statistical Conclusion Validity: Some Common Threats and Simple Remedies

    PubMed Central

    García-Pérez, Miguel A.

    2012-01-01

    The ultimate goal of research is to produce dependable knowledge or to provide the evidence that may guide practical decisions. Statistical conclusion validity (SCV) holds when the conclusions of a research study are founded on an adequate analysis of the data, generally meaning that adequate statistical methods are used whose small-sample behavior is accurate, besides being logically capable of providing an answer to the research question. Compared to the three other traditional aspects of research validity (external validity, internal validity, and construct validity), interest in SCV has recently grown on evidence that inadequate data analyses are sometimes carried out which yield conclusions that a proper analysis of the data would not have supported. This paper discusses evidence of three common threats to SCV that arise from widespread recommendations or practices in data analysis, namely, the use of repeated testing and optional stopping without control of Type-I error rates, the recommendation to check the assumptions of statistical tests, and the use of regression whenever a bivariate relation or the equivalence between two variables is studied. For each of these threats, examples are presented and alternative practices that safeguard SCV are discussed. Educational and editorial changes that may improve the SCV of published research are also discussed. PMID:22952465
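
    One of the threats discussed is repeated testing with optional stopping. The hedged simulation below, with arbitrary simulation settings, shows how "peeking" at a two-sample t test every few observations and stopping at the first p < 0.05 inflates the Type I error well above the nominal 5%, even when the null hypothesis is true.

    ```python
    # Monte Carlo illustration: testing repeatedly as data accumulate and
    # stopping at the first p < .05 inflates the Type I error well beyond 5%.
    import numpy as np
    from scipy.stats import ttest_ind

    rng = np.random.default_rng(1)
    n_sim, n_max, peek_every = 2000, 100, 10
    false_positives = 0

    for _ in range(n_sim):
        a, b = rng.normal(size=n_max), rng.normal(size=n_max)  # no true difference
        for n in range(peek_every, n_max + 1, peek_every):     # peek every 10 subjects
            if ttest_ind(a[:n], b[:n]).pvalue < 0.05:
                false_positives += 1
                break

    print("empirical Type I error:", false_positives / n_sim)  # noticeably above 0.05
    ```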

  3. Statistical analysis of single-trial Granger causality spectra.

    PubMed

    Brovelli, Andrea

    2012-01-01

    Granger causality analysis is becoming central for the analysis of interactions between neural populations and oscillatory networks. However, it is currently unclear whether single-trial estimates of Granger causality spectra can be used reliably to assess directional influence. We addressed this issue by combining single-trial Granger causality spectra with statistical inference based on general linear models. The approach was assessed on synthetic and neurophysiological data. Synthetic bivariate data was generated using two autoregressive processes with unidirectional coupling. We simulated two hypothetical experimental conditions: the first mimicked a constant and unidirectional coupling, whereas the second modelled a linear increase in coupling across trials. The statistical analysis of single-trial Granger causality spectra, based on t-tests and linear regression, successfully recovered the underlying pattern of directional influence. In addition, we characterised the minimum number of trials and coupling strengths required for significant detection of directionality. Finally, we demonstrated the relevance for neurophysiology by analysing two local field potentials (LFPs) simultaneously recorded from the prefrontal and premotor cortices of a macaque monkey performing a conditional visuomotor task. Our results suggest that the combination of single-trial Granger causality spectra and statistical inference provides a valuable tool for the analysis of large-scale cortical networks and brain connectivity.
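
    As a simplified, hedged companion to the spectral single-trial analysis described above, the sketch below simulates two unidirectionally coupled autoregressive processes and estimates time-domain Granger causality by comparing residual variances of restricted and full regressions; the coupling strength and series length are arbitrary.

    ```python
    # Time-domain Granger causality on synthetic, unidirectionally coupled
    # AR processes (x drives y): compare residual variances of the restricted
    # model (past of y only) and the full model (past of y and x).
    import numpy as np

    rng = np.random.default_rng(7)
    T = 2000
    x = np.zeros(T); y = np.zeros(T)
    for t in range(1, T):
        x[t] = 0.5 * x[t - 1] + rng.normal()
        y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.normal()

    def resid_var(target, regressors):
        beta, *_ = np.linalg.lstsq(regressors, target, rcond=None)
        return np.var(target - regressors @ beta)

    Y, Y1, X1 = y[1:], y[:-1, None], x[:-1, None]
    restricted = resid_var(Y, np.hstack([np.ones_like(Y1), Y1]))
    full = resid_var(Y, np.hstack([np.ones_like(Y1), Y1, X1]))
    print("GC x->y:", np.log(restricted / full))   # clearly > 0 for this coupling
    ```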

  4. Predicting radiotherapy outcomes using statistical learning techniques.

    PubMed

    El Naqa, Issam; Bradley, Jeffrey D; Lindsay, Patricia E; Hope, Andrew J; Deasy, Joseph O

    2009-09-21

    Radiotherapy outcomes are determined by complex interactions between treatment, anatomical and patient-related variables. A common obstacle to building maximally predictive outcome models for clinical practice is the failure to capture the potential complexity of heterogeneous variable interactions and applicability beyond institutional data. We describe a statistical learning methodology that can automatically screen for nonlinear relations among prognostic variables and generalize to previously unseen data. In this work, several types of linear and nonlinear kernels to generate interaction terms and approximate the treatment-response function are evaluated. Examples of institutional datasets of esophagitis, pneumonitis and xerostomia endpoints were used. Furthermore, an independent RTOG dataset was used for 'generalizability' validation. We formulated the discrimination between risk groups as a supervised learning problem. The distribution of patient groups was initially analyzed using principal component analysis (PCA) to uncover potential nonlinear behavior. The performance of the different methods was evaluated using bivariate correlations and actuarial analysis. Over-fitting was controlled via cross-validation resampling. Our results suggest that a modified support vector machine (SVM) kernel method provided superior performance on leave-one-out testing compared to logistic regression and neural networks in cases where the data exhibited nonlinear behavior on PCA. For instance, in prediction of esophagitis and pneumonitis endpoints, which exhibited nonlinear behavior on PCA, the method provided 21% and 60% improvements, respectively. Furthermore, evaluation on the independent pneumonitis RTOG dataset demonstrated good generalizability beyond institutional data, in contrast with other models. This indicates that the prediction of treatment response can be improved by utilizing nonlinear kernel methods for discovering important nonlinear interactions among model

  5. Predicting radiotherapy outcomes using statistical learning techniques

    NASA Astrophysics Data System (ADS)

    El Naqa, Issam; Bradley, Jeffrey D.; Lindsay, Patricia E.; Hope, Andrew J.; Deasy, Joseph O.

    2009-09-01

    Radiotherapy outcomes are determined by complex interactions between treatment, anatomical and patient-related variables. A common obstacle to building maximally predictive outcome models for clinical practice is the failure to capture the potential complexity of heterogeneous variable interactions and applicability beyond institutional data. We describe a statistical learning methodology that can automatically screen for nonlinear relations among prognostic variables and generalize to previously unseen data. In this work, several types of linear and nonlinear kernels to generate interaction terms and approximate the treatment-response function are evaluated. Examples of institutional datasets of esophagitis, pneumonitis and xerostomia endpoints were used. Furthermore, an independent RTOG dataset was used for 'generalizability' validation. We formulated the discrimination between risk groups as a supervised learning problem. The distribution of patient groups was initially analyzed using principal component analysis (PCA) to uncover potential nonlinear behavior. The performance of the different methods was evaluated using bivariate correlations and actuarial analysis. Over-fitting was controlled via cross-validation resampling. Our results suggest that a modified support vector machine (SVM) kernel method provided superior performance on leave-one-out testing compared to logistic regression and neural networks in cases where the data exhibited nonlinear behavior on PCA. For instance, in prediction of esophagitis and pneumonitis endpoints, which exhibited nonlinear behavior on PCA, the method provided 21% and 60% improvements, respectively. Furthermore, evaluation on the independent pneumonitis RTOG dataset demonstrated good generalizability beyond institutional data, in contrast with other models. This indicates that the prediction of treatment response can be improved by utilizing nonlinear kernel methods for discovering important nonlinear interactions among model
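
    The sketch below reproduces the general shape of the workflow described above (PCA inspection, a nonlinear-kernel SVM, cross-validated evaluation) on a synthetic two-class dataset with scikit-learn; it uses none of the authors' clinical data, kernels, or tuning choices.

    ```python
    # Generic sketch of the workflow described above -- PCA inspection, a
    # nonlinear-kernel SVM, and cross-validated evaluation -- on synthetic data.
    import numpy as np
    from sklearn.datasets import make_moons
    from sklearn.decomposition import PCA
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    X, y = make_moons(n_samples=300, noise=0.25, random_state=0)  # nonlinear classes

    print("explained variance:", PCA(n_components=2).fit(X).explained_variance_ratio_)

    model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
    scores = cross_val_score(model, X, y, cv=5)        # resampling to limit over-fitting
    print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
    ```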

  6. Heroin: Statistics and Trends

    MedlinePlus


  7. Statistical distribution sampling

    NASA Technical Reports Server (NTRS)

    Johnson, E. S.

    1975-01-01

    Determining the distribution of statistics by sampling was investigated. Characteristic functions, the quadratic regression problem, and the differential equations for the characteristic functions are analyzed.

  8. Increasing genotype-phenotype model determinism: application to bivariate reading/language traits and epistatic interactions in language-impaired families.

    PubMed

    Simmons, Tabatha R; Flax, Judy F; Azaro, Marco A; Hayter, Jared E; Justice, Laura M; Petrill, Stephen A; Bassett, Anne S; Tallal, Paula; Brzustowicz, Linda M; Bartlett, Christopher W

    2010-01-01

    While advances in network and pathway analysis have flourished in the era of genome-wide association analysis, understanding the genetic mechanism of individual loci on phenotypes is still readily accomplished using genetic modeling approaches. Here, we demonstrate two novel genotype-phenotype models implemented in a flexible genetic modeling platform. The examples come from analysis of families with specific language impairment (SLI), a failure to develop normal language without explanatory factors such as low IQ or inadequate environment. In previous genome-wide studies, we observed strong evidence for linkage to 13q21 with a reading phenotype in language-impaired families. First, we elucidate the genetic architecture of reading impairment and quantitative language variation in our samples using a bivariate analysis of reading impairment in affected individuals jointly with language quantitative phenotypes in unaffected individuals. This analysis largely recapitulates the baseline analysis using the categorical trait data (posterior probability of linkage (PPL) = 80%), indicating that our reading impairment phenotype captured poor readers who also have low language ability. Second, we performed epistasis analysis using a functional coding variant in the brain-derived neurotrophic factor (BDNF) gene previously associated with reduced performance on working memory tasks. Modeling epistasis doubled the evidence on 13q21 and raised the PPL to 99.9%, indicating that BDNF and 13q21 susceptibility alleles are jointly part of the genetic architecture of SLI. These analyses provide possible mechanistic insights for further cognitive neuroscience studies based on the models developed herein.

  9. Explorations in Statistics: Power

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2010-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This fifth installment of "Explorations in Statistics" revisits power, a concept fundamental to the test of a null hypothesis. Power is the probability that we reject the null hypothesis when it is false. Four…

  10. Teaching Statistics without Sadistics.

    ERIC Educational Resources Information Center

    Forte, James A.

    1995-01-01

    Five steps designed to take anxiety out of statistics for social work students are outlined. First, statistics anxiety is identified as an educational problem. Second, instructional objectives and procedures to achieve them are presented and methods and tools for evaluating the course are explored. Strategies for, and obstacles to, making…

  11. STATSIM: Exercises in Statistics.

    ERIC Educational Resources Information Center

    Thomas, David B.; And Others

    A computer-based learning simulation was developed at Florida State University which allows for high interactive responding via a time-sharing terminal for the purpose of demonstrating descriptive and inferential statistics. The statistical simulation (STATSIM) is comprised of four modules--chi square, t, z, and F distribution--and elucidates the…

  12. Understanding Undergraduate Statistical Anxiety

    ERIC Educational Resources Information Center

    McKim, Courtney

    2014-01-01

    The purpose of this study was to understand undergraduate students' views of statistics. Results reveal that students with less anxiety have a higher interest in statistics and also believe in their ability to perform well in the course. Also students who have a more positive attitude about the class tend to have a higher belief in their…

  13. Water Quality Statistics

    ERIC Educational Resources Information Center

    Hodgson, Ted; Andersen, Lyle; Robison-Cox, Jim; Jones, Clain

    2004-01-01

    Water quality experiments, especially the use of macroinvertebrates as indicators of water quality, offer an ideal context for connecting statistics and science. In the STAR program for secondary students and teachers, water quality experiments were also used as a context for teaching statistics. In this article, we trace one activity that uses…

  14. Towards Statistically Undetectable Steganography

    DTIC Science & Technology

    2011-06-30

    Report documentation fragment: Towards Statistically Undetectable Steganography; contract number FA9550-08-1-0084; author Prof. Jessica ...; approved for public release, distribution unlimited. Abstract fragment: Fundamental asymptotic laws for imperfect steganography ... formats. Subject terms: steganography, covert communication, statistical detectability, asymptotic performance, secure payload, minimum ...

  15. Explorations in Statistics: Regression

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2011-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This seventh installment of "Explorations in Statistics" explores regression, a technique that estimates the nature of the relationship between two things for which we may only surmise a mechanistic or predictive…

  16. Option Y, Statistics.

    ERIC Educational Resources Information Center

    Singer, Arlene

    This guide outlines a one semester Option Y course, which has seven learner objectives. The course is designed to provide students with an introduction to the concerns and methods of statistics, and to equip them to deal with the many statistical matters of importance to society. Topics covered include graphs and charts, collection and…

  17. On Statistical Testing.

    ERIC Educational Resources Information Center

    Huberty, Carl J.

    An approach to statistical testing, which combines Neyman-Pearson hypothesis testing and Fisher significance testing, is recommended. The use of P-values in this approach is discussed in some detail. The author also discusses some problems which are often found in introductory statistics textbooks. The problems involve the definitions of…

  18. Statistics and Measurements

    PubMed Central

    Croarkin, M. Carroll

    2001-01-01

    For more than 50 years, the Statistical Engineering Division (SED) has been instrumental in the success of a broad spectrum of metrology projects at NBS/NIST. This paper highlights fundamental contributions of NBS/NIST statisticians to statistics and to measurement science and technology. Published methods developed by SED staff, especially during the early years, endure as cornerstones of statistics not only in metrology and standards applications, but as data-analytic resources used across all disciplines. The history of statistics at NBS/NIST began with the formation of what is now the SED. Examples from the first five decades of the SED illustrate the critical role of the division in the successful resolution of a few of the highly visible, and sometimes controversial, statistical studies of national importance. A review of the history of major early publications of the division on statistical methods, design of experiments, and error analysis and uncertainty is followed by a survey of several thematic areas. The accompanying examples illustrate the importance of SED in the history of statistics, measurements and standards: calibration and measurement assurance, interlaboratory tests, development of measurement methods, Standard Reference Materials, statistical computing, and dissemination of measurement technology. A brief look forward sketches the expanding opportunity and demand for SED statisticians created by current trends in research and development at NIST. PMID:27500023

  19. [Statistics quantum satis].

    PubMed

    Pestana, Dinis

    2013-01-01

    Statistics is a privileged tool in building knowledge from information, since the purpose is to extend conclusions from the limited information in a sample to the whole population. The pervasive use of statistical software (which always provides an answer, whether or not the question is adequate), and the absence of statistics to confer a scientific flavour on so much bad science, have had a pernicious effect, fostering some disbelief in statistical research. Were Lord Rutherford alive today, it is almost certain that he would not condemn the use of statistics in research, as he did at the dawn of the 20th century. But he would indeed urge everyone to use statistics quantum satis, since using bad data, too many data, and statistics to enquire into irrelevant questions is a source of bad science, namely because with too many data we can establish the statistical significance of irrelevant results. This is an important point that addicts of evidence-based medicine should be aware of, since the meta-analysis of too many data will inevitably establish senseless results.

  20. Reform in Statistical Education

    ERIC Educational Resources Information Center

    Huck, Schuyler W.

    2007-01-01

    Two questions are considered in this article: (a) What should professionals in school psychology do in an effort to stay current with developments in applied statistics? (b) What should they do with their existing knowledge to move from surface understanding of statistics to deep understanding? Written for school psychologists who have completed…

  1. Deconstructing Statistical Analysis

    ERIC Educational Resources Information Center

    Snell, Joel

    2014-01-01

    Using a very complex statistical analysis and research method for the sake of enhancing the prestige of an article, or to make a new product or service appear legitimate, needs to be monitored and questioned for accuracy. 1) The more complicated the statistical analysis and research, the fewer learned readers can understand it. This adds a…

  2. Statistics 101 for Radiologists.

    PubMed

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced.
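
    As a small worked companion to the test-performance measures reviewed above, the sketch below computes sensitivity, specificity, and the ROC area under the curve for a hypothetical diagnostic score; the data, threshold, and effect size are invented.

    ```python
    # Sensitivity, specificity, and ROC AUC for a hypothetical diagnostic score,
    # illustrating the test-performance measures reviewed above.
    import numpy as np
    from sklearn.metrics import confusion_matrix, roc_auc_score

    rng = np.random.default_rng(3)
    disease = rng.integers(0, 2, size=200)                # 0 = healthy, 1 = diseased
    score = disease * 1.2 + rng.normal(size=200)          # test tends to be higher in disease
    predicted = (score > 0.6).astype(int)                 # arbitrary decision threshold

    tn, fp, fn, tp = confusion_matrix(disease, predicted).ravel()
    print("sensitivity =", tp / (tp + fn))
    print("specificity =", tn / (tn + fp))
    print("ROC AUC     =", roc_auc_score(disease, score))
    ```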

  3. Applied Statistics with SPSS

    ERIC Educational Resources Information Center

    Huizingh, Eelko K. R. E.

    2007-01-01

    Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…

  4. Overhead Image Statistics

    SciTech Connect

    Vijayaraj, Veeraraghavan; Cheriyadat, Anil M; Bhaduri, Budhendra L; Vatsavai, Raju; Bright, Eddie A

    2008-01-01

    Statistical properties of high-resolution overhead images representing different land use categories are analyzed using various local and global statistical image properties based on the shape of the power spectrum, image gradient distributions, edge co-occurrence, and inter-scale wavelet coefficient distributions. The analysis was performed on a database of high-resolution (1 meter) overhead images representing a multitude of different downtown, suburban, commercial, agricultural and wooded exemplars. Various statistical properties relating to these image categories and their relationships are discussed. The categorical variations in power spectrum contour shapes, the unique gradient distribution characteristics of wooded categories, the similarity in edge co-occurrence statistics for overhead and natural images, and the unique edge co-occurrence statistics of downtown categories are presented in this work. Though previous work on natural image statistics has shown some of the unique characteristics for different categories, the relationships for overhead images are not well understood. The statistical properties of natural images were used in previous studies to develop prior image models, to predict and index objects in a scene and to improve computer vision models. The results from our research findings can be used to augment and adapt computer vision algorithms that rely on prior image statistics to process overhead images, calibrate the performance of overhead image analysis algorithms, and derive features for better discrimination of overhead image categories.
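
    Two of the statistics discussed (the power spectrum and the gradient-magnitude distribution) can be computed in a few lines. The sketch below does so for a synthetic image; it is only a stand-in for the 1 m overhead imagery and feature set analyzed in the paper.

    ```python
    # Power spectrum and gradient-magnitude distribution for a synthetic image,
    # two of the image statistics discussed above, computed with numpy.
    import numpy as np

    rng = np.random.default_rng(0)
    img = rng.normal(size=(256, 256))
    img = np.cumsum(np.cumsum(img, axis=0), axis=1)      # crude correlated "terrain"

    # Power spectrum (squared magnitude of the 2-D FFT).
    ps = np.abs(np.fft.fftshift(np.fft.fft2(img)))**2

    # Gradient-magnitude distribution.
    gy, gx = np.gradient(img)
    grad_mag = np.hypot(gx, gy)
    hist, edges = np.histogram(grad_mag, bins=50, density=True)

    print("total spectral power:", ps.sum())
    print("median gradient magnitude:", np.median(grad_mag))
    ```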

  5. The statistical analysis of multivariate serological frequency data.

    PubMed

    Reyment, Richard A

    2005-11-01

    Data occurring in the form of frequencies are common in genetics, for example in serology. Examples are provided by the ABO group, the Rhesus group, and also DNA data. The statistical analysis of tables of frequencies is carried out using the available methods of multivariate analysis with usually three principal aims. One of these is to seek meaningful relationships between the components of a data set, the second is to examine relationships between populations from which the data have been obtained, the third is to bring about a reduction in dimensionality. This latter aim is usually realized by means of bivariate scatter diagrams using scores computed from a multivariate analysis. The multivariate statistical analysis of tables of frequencies cannot safely be carried out by standard multivariate procedures because they represent compositions and are therefore embedded in simplex space, a subspace of full space. Appropriate procedures for simplex space are compared and contrasted with simple standard methods of multivariate analysis ("raw" principal component analysis). The study shows that the differences between a log-ratio model and a simple logarithmic transformation of proportions may not be very great, particularly as regards graphical ordinations, but important discrepancies do occur. The divergencies between logarithmically based analyses and raw data are, however, great. Published data on Rhesus alleles observed for Italian populations are used to exemplify the subject.
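
    The contrast between "raw" analysis and a log-ratio treatment can be illustrated with a centered log-ratio (CLR) transform followed by principal component analysis. The sketch below does this for made-up three-part compositions; it is a generic illustration, not the serological data or exact procedure of the paper.

    ```python
    # PCA on raw proportions versus PCA after a centered log-ratio (CLR)
    # transform, on made-up 3-part compositions (rows sum to 1).
    import numpy as np

    rng = np.random.default_rng(5)
    raw = rng.dirichlet(alpha=[4.0, 3.0, 2.0], size=100)   # compositional data

    def pca_var_ratio(X):
        Xc = X - X.mean(axis=0)
        _, s, _ = np.linalg.svd(Xc, full_matrices=False)
        return (s**2) / np.sum(s**2)

    clr = np.log(raw) - np.log(raw).mean(axis=1, keepdims=True)   # centered log-ratio

    print("raw PCA variance ratios:", np.round(pca_var_ratio(raw), 3))
    print("CLR PCA variance ratios:", np.round(pca_var_ratio(clr), 3))
    ```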

  6. Statistics at a glance.

    PubMed

    Ector, Hugo

    2010-12-01

    I still remember my first book on statistics: "Elementary statistics with applications in medicine and the biological sciences" by Frederick E. Croxton. For me, it was the start of a pursuit of understanding statistics in daily life and in medical practice. It was the first volume in a long series of books. In his introduction, Croxton claims that "nearly everyone involved in any aspect of medicine needs to have some knowledge of statistics". The reality is that for many clinicians, statistics are limited to "P < 0.05 = ok". I do not blame my colleagues who omit the paragraph on statistical methods. They have never had the opportunity to learn concise and clear descriptions of the key features. I have experienced how some authors can describe difficult methods in clearly understandable language. Others fail completely. As a teacher, I tell my students that life is impossible without a basic knowledge of statistics. This feeling has resulted in an annual seminar of 90 minutes. This tutorial is the summary of that seminar; it is a summary and a transcription of the best pages I have found.

  7. Statistical Association Criteria in Forensic Psychiatry–A criminological evaluation of casuistry

    PubMed Central

    Gheorghiu, V; Buda, O; Popescu, I; Trandafir, MS

    2011-01-01

    Purpose. The identification of potential shared primary psychoprophylaxis and crime prevention is assessed by analyzing the rate of commitments for patients subject to forensic examination. Material and method. The study is a retrospective, document-based statistical analysis. The sample consists of 770 initial examination reports performed and completed during the whole of 2007, primarily analyzed in order to summarize the data within the National Institute of Forensic Medicine, Bucharest, Romania (INML), with one of the group variables being 'particularities of the psychiatric patient history', containing the items 'forensic onset', 'commitments within the last year prior to the examination' and 'absence of commitments within the last year prior to the examination'. The method used was the Kendall bivariate correlation. For this study, the authors separately analyze only the two items regarding commitments, using other correlation alternatives and more elaborate statistical analyses, i.e. recording of the standard case study variables, Kendall bivariate correlation, cross tabulation, factor analysis and hierarchical cluster analysis. Results. The results are varied, ranging from theoretically presumed clinical nosography (such as schizophrenia or manic depression) to non-presumed (conduct disorders) or unexpected behavioral acts, and are therefore difficult to interpret. Conclusions. The features of the sample were taken into consideration, as well as the results of the previous standard correlation of the whole statistical lot. The authors emphasize the role of medical security measures that are actually applied in therapeutic management in general and in risk and second-offence management in particular, as well as the role of forensic psychiatric examinations in the detection of certain aspects related to the monitoring of mental patients. PMID:21505571
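
    The main association measure used in the study is the Kendall bivariate correlation, which the hedged sketch below computes on synthetic ordinal-style variables with SciPy; the variable names and data are invented and do not correspond to the study's items.

    ```python
    # Kendall bivariate correlation on synthetic ordinal-style data with SciPy,
    # illustrating the association measure used in the study above.
    import numpy as np
    from scipy.stats import kendalltau

    rng = np.random.default_rng(11)
    prior_commitments = rng.integers(0, 4, size=120)                    # hypothetical counts
    forensic_onset = (prior_commitments + rng.integers(0, 3, size=120) > 2).astype(int)

    tau, p_value = kendalltau(prior_commitments, forensic_onset)
    print(f"Kendall tau = {tau:.2f}, p = {p_value:.3g}")
    ```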

  8. Informal Statistics Help Desk

    NASA Technical Reports Server (NTRS)

    Young, M.; Koslovsky, M.; Schaefer, Caroline M.; Feiveson, A. H.

    2017-01-01

    Back by popular demand, the JSC Biostatistics Laboratory and LSAH statisticians are offering an opportunity to discuss your statistical challenges and needs. Take the opportunity to meet the individuals offering expert statistical support to the JSC community. Join us for an informal conversation about any questions you may have encountered with issues of experimental design, analysis, or data visualization. Get answers to common questions about sample size, repeated measures, statistical assumptions, missing data, multiple testing, time-to-event data, and when to trust the results of your analyses.

  9. Commentary: statistics for biomarkers.

    PubMed

    Lovell, David P

    2012-05-01

    This short commentary discusses Biomarkers' requirements for the reporting of statistical analyses in submitted papers. It is expected that submitters will follow the general instructions of the journal, the more detailed guidance given by the International Committee of Medical Journal Editors, the specific guidelines developed by the EQUATOR network, and those of various specialist groups. Biomarkers expects that the study design and subsequent statistical analyses are clearly reported and that the data reported can be made available for independent assessment. The journal recognizes that there is continuing debate about different approaches to statistical science. Biomarkers appreciates that the field continues to develop rapidly and encourages the use of new methodologies.

  10. LED champing: statistically blessed?

    PubMed

    Wang, Zhuo

    2015-06-10

    LED champing (smart mixing of individual LEDs to match the desired color and lumens) and color mixing strategies have been widely used to maintain the color consistency of light engines. Light engines with champed LEDs can easily achieve color consistency within a couple of MacAdam steps, even when starting from widely distributed LEDs. From a statistical point of view, the distributions of the color coordinates and the flux after champing are studied. The related statistical parameters are derived, which facilitate process improvements such as Six Sigma and are instrumental to statistical quality control for mass production.

  11. Exaggerated natriuretic response to isotonic volume expansion in hypertensive renal transplant recipients: evaluation of proximal and distal tubular reabsorption by simultaneous determination of renal plasma clearance of lithium and 51Cr-EDTA.

    PubMed

    Nielsen, A H; Knudsen, F; Danielsen, H; Pedersen, E B; Fjeldborg, P; Madsen, M; Brøchner-Mortensen, J; Kornerup, H J

    1987-02-01

    In fourteen hypertensive and fourteen normotensive renal transplant recipients, and in a group of thirteen healthy controls, changes in natriuresis, glomerular filtration rate (GFR), and tubular reabsorption of sodium were determined in relation to intravenous infusion of 2 mmol isotonic sodium chloride per kg body weight. An exaggerated natriuresis was demonstrated in the hypertensive renal transplant recipients. This new finding indicates that the augmented natriuresis following plasma volume expansion, which is a characteristic finding in subjects with arterial hypertension, is not mediated by the renal nerves. Investigation of the tubular reabsorption rates of sodium by simultaneous determination of the renal clearance of 51Cr-EDTA and lithium showed that in the hypertensives the changes in tubular handling of sodium were different from those registered in the normotensive subjects. The increased sodium excretion in the hypertensive renal transplant recipients was caused by an increased output of sodium from the proximal tubules which was not fully compensated for by an increased distal reabsorption. Whether this increased delivery of sodium to the distal segments was caused by changes in GFR or in the proximal tubular reabsorption of sodium could not be clarified in the present study and warrants further investigations.

  12. Playing at Statistical Mechanics

    ERIC Educational Resources Information Center

    Clark, Paul M.; And Others

    1974-01-01

    Discussed are the applications of counting techniques of a sorting game to distributions and concepts in statistical mechanics. Included are the following distributions: Fermi-Dirac, Bose-Einstein, and most probable. (RH)

  13. Hemophilia Data and Statistics

    MedlinePlus

    … at a very young age. Based on CDC data, the median age at diagnosis is 36 months …

  14. Cooperative Learning in Statistics.

    ERIC Educational Resources Information Center

    Keeler, Carolyn M.; And Others

    1994-01-01

    Formal use of cooperative learning techniques proved effective in improving student performance and retention in a freshman level statistics course. Lectures interspersed with group activities proved effective in increasing conceptual understanding and overall class performance. (11 references) (Author)

  15. Statistics of the sagas

    NASA Astrophysics Data System (ADS)

    Richfield, Jon; bookfeller

    2016-07-01

    In reply to Ralph Kenna and Pádraig Mac Carron's feature article “Maths meets myths” in which they describe how they are using techniques from statistical physics to characterize the societies depicted in ancient Icelandic sagas.

  16. Elements of Statistics

    NASA Astrophysics Data System (ADS)

    Grégoire, G.

    2016-05-01

    This chapter is devoted to two objectives. The first one is to answer the request expressed by attendees of the first Astrostatistics School (Annecy, October 2013) to be provided with an elementary vademecum of statistics that would facilitate understanding of the given courses. In this spirit we recall very basic notions, that is definitions and properties that we think sufficient to benefit from courses given in the Astrostatistical School. Thus we give briefly definitions and elementary properties on random variables and vectors, distributions, estimation and tests, maximum likelihood methodology. We intend to present basic ideas in a hopefully comprehensible way. We do not try to give a rigorous presentation, and due to the place devoted to this chapter, can cover only a rather limited field of statistics. The second aim is to focus on some statistical tools that are useful in classification: basic introduction to Bayesian statistics, maximum likelihood methodology, Gaussian vectors and Gaussian mixture models.

  17. Plague Maps and Statistics

    MedlinePlus

    … Plague in the United States: Plague was first introduced … per year in the United States: 1900-2012. Plague Worldwide: Plague epidemics have occurred in Africa, Asia, …

  18. Understanding Solar Flare Statistics

    NASA Astrophysics Data System (ADS)

    Wheatland, M. S.

    2005-12-01

    A review is presented of work aimed at understanding solar flare statistics, with emphasis on the well known flare power-law size distribution. Although avalanche models are perhaps the favoured model to describe flare statistics, their physical basis is unclear, and they are divorced from developing ideas in large-scale reconnection theory. An alternative model, aimed at reconciling large-scale reconnection models with solar flare statistics, is revisited. The solar flare waiting-time distribution has also attracted recent attention. Observed waiting-time distributions are described, together with what they might tell us about the flare phenomenon. Finally, a practical application of flare statistics to flare prediction is described in detail, including the results of a year of automated (web-based) predictions from the method.
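
    Assuming the flare sizes above some threshold follow the power-law size distribution mentioned here, the maximum-likelihood index can be estimated in a few lines; the synthetic data below are illustrative only:

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic "flare sizes" drawn from p(x) ~ x**(-alpha) for x >= xmin.
        alpha_true, xmin, n = 1.8, 1.0, 5000
        sizes = xmin * (1 - rng.random(n)) ** (-1.0 / (alpha_true - 1.0))

        # Maximum-likelihood (Hill-type) estimate of the power-law index.
        alpha_hat = 1.0 + n / np.sum(np.log(sizes / xmin))
        print(f"estimated power-law index: {alpha_hat:.2f}")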

  19. Titanic: A Statistical Exploration.

    ERIC Educational Resources Information Center

    Takis, Sandra L.

    1999-01-01

    Uses the available data about the Titanic's passengers to interest students in exploring categorical data and the chi-square distribution. Describes activities incorporated into a statistics class and gives additional resources for collecting information about the Titanic. (ASK)
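
    A chi-square test of the kind described can be run on a contingency table with SciPy; the counts below are placeholders, not the actual Titanic figures:

        import numpy as np
        from scipy.stats import chi2_contingency

        # Hypothetical table: rows = passenger class, columns = (survived, died).
        table = np.array([[200, 120],
                          [120, 160],
                          [180, 530]])

        chi2, p, dof, expected = chi2_contingency(table)
        print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.3g}")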

  20. Purposeful Statistical Investigations

    ERIC Educational Resources Information Center

    Day, Lorraine

    2014-01-01

    Lorraine Day provides us with a great range of statistical investigations using various resources such as maths300 and TinkerPlots. Each of the investigations link mathematics to students' lives and provide engaging and meaningful contexts for mathematical inquiry.

  1. Boosted Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Testa, Massimo

    2015-08-01

    Starting with the basic principles of Relativistic Quantum Mechanics, we give a rigorous, but completely elementary proof of the relation between fundamental observables of a statistical system, when measured within two inertial reference frames, related by a Lorentz transformation.

  2. How Statistics "Excel" Online.

    ERIC Educational Resources Information Center

    Chao, Faith; Davis, James

    2000-01-01

    Discusses the use of Microsoft Excel software and provides examples of its use in an online statistics course at Golden Gate University in the areas of randomness and probability, sampling distributions, confidence intervals, and regression analysis. (LRW)

  3. Inference for the Bivariate and Multivariate Hidden Truncated Pareto(type II) and Pareto(type IV) Distribution and Some Measures of Divergence Related to Incompatibility of Probability Distribution

    ERIC Educational Resources Information Center

    Ghosh, Indranil

    2011-01-01

    Consider a discrete bivariate random variable (X, Y) with possible values x_1, x_2, ..., x_I for X and y_1, y_2, ..., y_J for Y. Further suppose that the corresponding families of conditional distributions, for X given values of Y and of Y for given values of X, are available. We…

  4. Bayesian bivariate meta-analysis of correlated effects: Impact of the prior distributions on the between-study correlation, borrowing of strength, and joint inferences.

    PubMed

    Burke, Danielle L; Bujkiewicz, Sylwia; Riley, Richard D

    2016-03-17

    Multivariate random-effects meta-analysis allows the joint synthesis of correlated results from multiple studies, for example, for multiple outcomes or multiple treatment groups. In a Bayesian univariate meta-analysis of one endpoint, the importance of specifying a sensible prior distribution for the between-study variance is well understood. However, in multivariate meta-analysis, there is little guidance about the choice of prior distributions for the variances or, crucially, the between-study correlation, ρB; for the latter, researchers often use a Uniform(-1,1) distribution assuming it is vague. In this paper, an extensive simulation study and a real illustrative example are used to examine the impact of various (realistically) vague prior distributions for ρB and the between-study variances within a Bayesian bivariate random-effects meta-analysis of two correlated treatment effects. A range of diverse scenarios are considered, including complete and missing data, to examine the impact of the prior distributions on posterior results (for treatment effect and between-study correlation), amount of borrowing of strength, and joint predictive distributions of treatment effectiveness in new studies. Two key recommendations are identified to improve the robustness of multivariate meta-analysis results. First, the routine use of a Uniform(-1,1) prior distribution for ρB should be avoided, if possible, as it is not necessarily vague. Instead, researchers should identify a sensible prior distribution, for example, by restricting values to be positive or negative as indicated by prior knowledge. Second, it remains critical to use sensible (e.g. empirically based) prior distributions for the between-study variances, as an inappropriate choice can adversely impact the posterior distribution for ρB, which may then adversely affect inferences such as joint predictive probabilities. These recommendations are especially important with a small number of studies and missing
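
    A generic formulation of the bivariate random-effects model discussed above (a textbook form, not necessarily the authors' exact specification) writes the estimated effects y_i = (y_i1, y_i2) of study i as

        \[
          y_i \mid \theta_i \sim N_2(\theta_i, S_i), \qquad
          \theta_i \sim N_2\!\left(\mu,\;
          \begin{pmatrix} \tau_1^2 & \rho_B \tau_1 \tau_2 \\ \rho_B \tau_1 \tau_2 & \tau_2^2 \end{pmatrix}\right),
        \]

    where S_i is the (known) within-study covariance matrix, and priors are placed on \mu, \tau_1, \tau_2, and \rho_B.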

  5. Diagnosis of pneumocystis pneumonia using serum (1-3)-β-D-Glucan: a bivariate meta-analysis and systematic review

    PubMed Central

    Li, Wei-Jie; Guo, Ya-Ling; Liu, Tang-Juan

    2015-01-01

    Background The (1-3)-β-D-Glucan (BG) assay has been approved for making a diagnosis of invasive fungal disease. However, the role of the serum-BG assay for the diagnosis of pneumocystis pneumonia (PCP) is controversial, especially between patients with human immunodeficiency virus (HIV) and non-HIV. We conducted a meta-analysis to determine the difference in the overall accuracy of the serum-BG assay for the diagnosis of PCP in immunocompromised patients with and without HIV. Methods After a systematic review of English-language studies and manual searching, sensitivity (Se), specificity (Sp), and other measures of accuracy of serum-BG for the diagnosis of PCP were pooled using random-effects models for bivariate meta-analysis. A summary receiver operating characteristic (SROC) curve was used to summarize overall test performance. Subgroup analyses were performed to explore the heterogeneity in Se and Sp. Results Thirteen studies met our inclusion criteria. The summary estimates for the serum-BG assay for definite PCP were as follows: Se, 0.91 [95% confidence interval (CI), 0.88–0.93]; Sp, 0.75 (95% CI, 0.68–0.81). For the patients with and without HIV, the Se and Sp were 0.92 and 0.78, and 0.85 and 0.73, respectively. Significant heterogeneity in Se was present (P=0.04). Conclusions Contrary to the results of the previous meta-analysis, a negative result of serum-BG determination is sufficient for ruling out PCP only in HIV cases. For non-HIV patients, the results should be interpreted in parallel with clinical and radiological findings. In addition, further prospective studies with larger sample sizes are needed to confirm the diagnostic strategy of BG detection. PMID:26793343

  6. Statistical Physics of Fracture

    SciTech Connect

    Alava, Mikko; Nukala, Phani K; Zapperi, Stefano

    2006-05-01

    Disorder and long-range interactions are two of the key components that make material failure an interesting playfield for the application of statistical mechanics. The cornerstone in this respect has been lattice models of fracture, in which a network of elastic beams, bonds, or electrical fuses with random failure thresholds is subject to an increasing external load. These models describe on a qualitative level the failure processes of real, brittle, or quasi-brittle materials. This has been particularly important in solving the classical engineering problems of material strength: the size dependence of maximum stress and its sample-to-sample statistical fluctuations. At the same time, lattice models pose many new fundamental questions in statistical physics, such as the relation between fracture and phase transitions. Experimental results point to the existence of an intriguing crackling noise in the acoustic emission and of self-affine fractals in the crack surface morphology. Recent advances in computer power have enabled considerable progress in the understanding of such models. Among these partly still controversial issues are the scaling and size effects in material strength and accumulated damage, the statistics of avalanches or bursts of microfailures, and the morphology of the crack surface. Here we present an overview of the results obtained with lattice models for fracture, highlighting the relations with statistical physics theories and more conventional fracture mechanics approaches.

  7. Suite versus composite statistics

    USGS Publications Warehouse

    Balsillie, J.H.; Tanner, W.F.

    1999-01-01

    Suite and composite methodologies, two statistically valid approaches for producing statistical descriptive measures, are investigated for sample groups representing a probability distribution where, in addition, each sample is a probability distribution. Suite and composite means (first moment measures) are always equivalent. Composite standard deviations (second moment measures) are always larger than suite standard deviations. Suite and composite values for higher moment measures have more complex relationships. Very seldom, however, are they equivalent, and they normally yield statistically significant but different results. Multiple samples are preferable to single samples (including composites) because they permit the investigator to examine sample-to-sample variability. These and other relationships for suite and composite probability distribution analyses are investigated and reported using granulometric data.
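
    The equality of the means and the inequality of the standard deviations can be checked numerically; the sketch below assumes equal-sized, hypothetical samples so that the suite mean is a simple average of the sample means:

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical equal-sized samples whose means differ from sample to sample.
        samples = [rng.normal(loc=m, scale=1.0, size=200) for m in (2.0, 2.5, 3.0)]

        suite_mean = np.mean([s.mean() for s in samples])
        suite_sd = np.mean([s.std(ddof=1) for s in samples])
        composite = np.concatenate(samples)

        print("suite mean:    ", suite_mean)
        print("composite mean:", composite.mean())           # equals the suite mean
        print("suite s.d.:    ", suite_sd)
        print("composite s.d.:", composite.std(ddof=1))      # larger: includes between-sample spread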

  8. Perception in statistical graphics

    NASA Astrophysics Data System (ADS)

    VanderPlas, Susan Ruth

    There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.

  9. Statistical origin of gravity

    SciTech Connect

    Banerjee, Rabin; Majhi, Bibhas Ranjan

    2010-06-15

    Starting from the definition of entropy used in statistical mechanics we show that it is proportional to the gravity action. For a stationary black hole this entropy is expressed as S=E/2T, where T is the Hawking temperature and E is shown to be the Komar energy. This relation is also compatible with the generalized Smarr formula for mass.

  10. Statistical Reasoning over Lunch

    ERIC Educational Resources Information Center

    Selmer, Sarah J.; Bolyard, Johnna J.; Rye, James A.

    2011-01-01

    Students in the 21st century are exposed daily to a staggering amount of numerically infused media. In this era of abundant numeric data, students must be able to engage in sound statistical reasoning when making life decisions after exposure to varied information. The context of nutrition can be used to engage upper elementary and middle school…

  11. Learning Statistical Concepts

    ERIC Educational Resources Information Center

    Akram, Muhammad; Siddiqui, Asim Jamal; Yasmeen, Farah

    2004-01-01

    In order to learn the concept of statistical techniques one needs to run real experiments that generate reliable data. In practice, the data from some well-defined process or system is very costly and time consuming. It is difficult to run real experiments during the teaching period in the university. To overcome these difficulties, statisticians…

  12. Analogies for Understanding Statistics

    ERIC Educational Resources Information Center

    Hocquette, Jean-Francois

    2004-01-01

    This article describes a simple way to explain the limitations of statistics to scientists and students to avoid the publication of misleading conclusions. Biologists examine their results extremely critically and carefully choose the appropriate analytic methods depending on their scientific objectives. However, no such close attention is usually…

  13. Statistical Significance Testing.

    ERIC Educational Resources Information Center

    McLean, James E., Ed.; Kaufman, Alan S., Ed.

    1998-01-01

    The controversy about the use or misuse of statistical significance testing has become the major methodological issue in educational research. This special issue contains three articles that explore the controversy, three commentaries on these articles, an overall response, and three rejoinders by the first three authors. They are: (1)…

  14. Structurally Sound Statistics Instruction

    ERIC Educational Resources Information Center

    Casey, Stephanie A.; Bostic, Jonathan D.

    2016-01-01

    The Common Core's Standards for Mathematical Practice (SMP) call for all K-grade 12 students to develop expertise in the processes and proficiencies of doing mathematics. However, the Common Core State Standards for Mathematics (CCSSM) (CCSSI 2010) as a whole addresses students' learning of not only mathematics but also statistics. This situation…

  15. General Aviation Avionics Statistics.

    DTIC Science & Technology

    1980-12-01

    [OCR-garbled report documentation page; recoverable details: report no. FAA-MS-80-7, dated December 1980, titled "General Aviation Avionics Statistics". The avionics and equipment items listed include altimeter, fuel gage, compass, tachometer, oil temperature, emergency locator, landing gear, belts, and special equipment for over-water operation.]

  16. NACME Statistical Report 1986.

    ERIC Educational Resources Information Center

    Miranda, Luis A.; Ruiz, Esther

    This statistical report summarizes data on enrollment and graduation of minority students in engineering degree programs from 1974 to 1985. First, an introduction identifies major trends and briefly describes the Incentive Grants Program (IGP), the nation's largest privately supported source of scholarship funds available to minority engineering…

  17. Probability and Statistics.

    ERIC Educational Resources Information Center

    Barnes, Bernis, Ed.; And Others

    This teacher's guide to probability and statistics contains three major sections. The first section on elementary combinatorial principles includes activities, student problems, and suggested teaching procedures for the multiplication principle, permutations, and combinations. Section two develops an intuitive approach to probability through…

  18. Selected Manpower Statistics.

    ERIC Educational Resources Information Center

    Office of the Assistant Secretary of Defense -- Comptroller (DOD), Washington, DC.

    This document contains summaries of basic manpower statistical data for the Department of Defense, with the Army, Navy, Marine Corps, and Air Force totals shown separately and collectively. Included are figures for active duty military personnel, civilian personnel, reserve components, and retired military personnel. Some of the data show…

  19. Statistics of mass production

    NASA Astrophysics Data System (ADS)

    Williams, R. L.; Gateley, Wilson Y.

    1993-05-01

    This paper summarizes the statistical quality control methods and procedures that can be employed in mass producing electronic parts (integrated circuits, buffers, capacitors, connectors) to reduce variability and ensure performance to specified radiation, current, voltage, temperature, shock, and vibration levels. Producing such quality parts reduces uncertainties in performance and will aid materially in validating the survivability of components, subsystems, and systems to specified threats.

  20. Statistics for Learning Genetics

    ERIC Educational Resources Information Center

    Charles, Abigail Sheena

    2012-01-01

    This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and more directly, students' perceptions of, and performance in, doing…

  1. Education Statistics Quarterly, 2003.

    ERIC Educational Resources Information Center

    Marenus, Barbara; Burns, Shelley; Fowler, William; Greene, Wilma; Knepper, Paula; Kolstad, Andrew; McMillen Seastrom, Marilyn; Scott, Leslie

    2003-01-01

    This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications and data products released in a 3-month period. Each issue also contains a message from the NCES on a timely…

  2. Whither Statistics Education Research?

    ERIC Educational Resources Information Center

    Watson, Jane

    2016-01-01

    This year marks the 25th anniversary of the publication of a "National Statement on Mathematics for Australian Schools", which was the first curriculum statement this country had including "Chance and Data" as a significant component. It is hence an opportune time to survey the history of the related statistics education…

  3. Quartiles in Elementary Statistics

    ERIC Educational Resources Information Center

    Langford, Eric

    2006-01-01

    The calculation of the upper and lower quartile values of a data set in an elementary statistics course is done in at least a dozen different ways, depending on the text or computer/calculator package being used (such as SAS, JMP, MINITAB, "Excel," and the TI-83 Plus). In this paper, we examine the various methods and offer a suggestion for a new…

  4. Mental Illness Statistics

    MedlinePlus

    … The National Institute of Mental Health (NIMH) is part of the National Institutes of …

  5. Statistical Energy Analysis Program

    NASA Technical Reports Server (NTRS)

    Ferebee, R. C.; Trudell, R. W.; Yano, L. I.; Nygaard, S. I.

    1985-01-01

    Statistical Energy Analysis (SEA) is a powerful tool for estimating high-frequency vibration spectra of complex structural systems and has been incorporated into a computer program. The basic SEA analysis procedure is divided into three steps: idealization, parameter generation, and problem solution. The SEA computer program is written in FORTRAN V for batch execution.
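
    For a simple two-subsystem model the problem-solution step reduces to a small linear system for the subsystem energies; the loss factors and input power below are hypothetical, and this sketch is not the program described above:

        import numpy as np

        omega = 2 * np.pi * 1000.0    # analysis band centre frequency (rad/s), hypothetical
        eta1, eta2 = 0.01, 0.02       # damping loss factors (hypothetical)
        eta12, eta21 = 0.003, 0.005   # coupling loss factors (hypothetical)
        P = np.array([1.0, 0.0])      # input power to each subsystem (W)

        # Steady-state SEA power balance:  omega * A @ E = P
        A = np.array([[eta1 + eta12, -eta21],
                      [-eta12,       eta2 + eta21]])
        E = np.linalg.solve(omega * A, P)
        print("subsystem energies (J):", E)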

  6. Library Research and Statistics.

    ERIC Educational Resources Information Center

    Lynch, Mary Jo; St. Lifer, Evan; Halstead, Kent; Fox, Bette-Lee; Miller, Marilyn L.; Shontz, Marilyn L.

    2001-01-01

    These nine articles discuss research and statistics on libraries and librarianship, including libraries in the United States, Canada, and Mexico; acquisition expenditures in public, academic, special, and government libraries; price indexes; state rankings of public library data; library buildings; expenditures in school library media centers; and…

  7. Bivariate mixture modeling of transferrin saturation and serum ferritin concentration in Asians, African Americans, Hispanics, and whites in the Hemochromatosis and Iron Overload Screening (HEIRS) Study

    PubMed Central

    Mclaren, Christine E.; Gordeuk, Victor R.; Chen, Wen-Pin; Barton, James C.; Acton, Ronald T.; Speechley, Mark; Castro, Oswaldo; Adams, Paul C.; Snively, Beverly M.; Harris, Emily L.; Reboussin, David M.; Mclachlan, Geoffrey J.; Bean, Richard

    2013-01-01

    Bivariate mixture modeling was used to analyze joint population distributions of transferrin saturation (TS) and serum ferritin concentration (SF) measured in the Hemochromatosis and Iron Overload Screening (HEIRS) Study. Four components (C1, C2, C3, and C4) with successively age-adjusted increasing means for TS and SF were identified in data from 26,832 African Americans, 12,620 Asians, 12,264 Hispanics, and 43,254 whites. The largest component, C2, had normal mean TS (21% to 26% for women, 29% to 30% for men) and SF (43–82 μg/L for women, 165–242 μg/L for men), which consisted of component proportions greater than 0.59 for women and greater than 0.68 for men. C3 and C4 had progressively greater mean values for TS and SF with progressively lesser component proportions. C1 had mean TS values less than 16% for women (<20% for men) and SF values less than 28 μg/L for women (<47 μg/L for men). Compared with C2, adjusted odds of iron deficiency were significantly greater in C1 (14.9–47.5 for women, 60.6–3530 for men), adjusted odds of liver disease were significantly greater in C3 and C4 for African-American women and all men, and adjusted odds of any HFE mutation were increased in C3 (1.4–1.8 for women, 1.2–1.9 for men) and in C4 for Hispanic and white women (1.5 and 5.2, respectively) and men (2.8 and 4.7, respectively). Joint mixture modeling identifies a component with lesser SF and TS at risk for iron deficiency and 2 components with greater SF and TS at risk for liver disease or HFE mutations. This approach can identify populations in which hereditary or acquired factors influence metabolism measurement. PMID:18201677
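
    A bivariate mixture of this general kind can be fitted on simulated data with scikit-learn; the sketch below is generic and is not the HEIRS estimation procedure (which used age adjustment and its own mixture-fitting methodology):

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(2)

        # Simulated (log-ferritin, transferrin-saturation) pairs from two hypothetical groups.
        group_a = rng.multivariate_normal([4.5, 25.0], [[0.3, 0.5], [0.5, 20.0]], size=800)
        group_b = rng.multivariate_normal([6.0, 45.0], [[0.3, 0.5], [0.5, 30.0]], size=200)
        X = np.vstack([group_a, group_b])

        gm = GaussianMixture(n_components=2, covariance_type="full", random_state=0).fit(X)
        print("component means:\n", gm.means_)
        print("component proportions:", gm.weights_)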

  8. Statistics for Learning Genetics

    NASA Astrophysics Data System (ADS)

    Charles, Abigail Sheena

    This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and more directly, students' perceptions of, and performance in, doing statistically-based genetics problems. This issue is at the emerging edge of modern college-level genetics instruction, and this study attempts to identify key theoretical components for creating a specialized biological statistics curriculum. The goal of this curriculum will be to prepare biology students with the skills for assimilating quantitatively-based genetic processes, increasingly at the forefront of modern genetics. To fulfill this, two college level classes at two universities were surveyed. One university was located in the northeastern US and the other in the West Indies. There was a sample size of 42 students and a supplementary interview was administered to a select 9 students. Interviews were also administered to professors in the field in order to gain insight into the teaching of statistics in genetics. Key findings indicated that students had very little to no background in statistics (55%). Although students did perform well on exams with 60% of the population receiving an A or B grade, 77% of them did not offer good explanations on a probability question associated with the normal distribution provided in the survey. The scope and presentation of the applicable statistics/mathematics in some of the most used textbooks in genetics teaching, as well as genetics syllabi used by instructors do not help the issue. It was found that the text books, often times, either did not give effective explanations for students, or completely left out certain topics. The omission of certain statistical/mathematical oriented topics was seen to be also true with the genetics syllabi reviewed for this study. Nonetheless

  9. The Statistical Drake Equation

    NASA Astrophysics Data System (ADS)

    Maccone, Claudio

    2010-12-01

    We provide the statistical generalization of the Drake equation. From a simple product of seven positive numbers, the Drake equation is now turned into the product of seven positive random variables. We call this "the Statistical Drake Equation". The mathematical consequences of this transformation are then derived. The proof of our results is based on the Central Limit Theorem (CLT) of Statistics. In loose terms, the CLT states that the sum of any number of independent random variables, each of which may be ARBITRARILY distributed, approaches a Gaussian (i.e. normal) random variable. This is called the Lyapunov Form of the CLT, or the Lindeberg Form of the CLT, depending on the mathematical constraints assumed on the third moments of the various probability distributions. In conclusion, we show that: The new random variable N, yielding the number of communicating civilizations in the Galaxy, follows the LOGNORMAL distribution. Then, as a consequence, the mean value of this lognormal distribution is the ordinary N in the Drake equation. The standard deviation, mode, and all the moments of this lognormal N are also found. The seven factors in the ordinary Drake equation now become seven positive random variables. The probability distribution of each random variable may be ARBITRARY. The CLT in the so-called Lyapunov or Lindeberg forms (that both do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into our statistical Drake equation by allowing an arbitrary probability distribution for each factor. This is both physically realistic and practically very useful, of course. An application of our statistical Drake equation then follows. The (average) DISTANCE between any two neighboring and communicating civilizations in the Galaxy may be shown to be inversely proportional to the cubic root of N. Then, in our approach, this distance becomes a new random variable. We derive the relevant probability density
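
    The lognormal result can be illustrated with a small Monte Carlo experiment: the product of several independent, arbitrarily distributed positive factors is approximately lognormal, so its logarithm is approximately Gaussian. The factor distributions below are placeholders, not Drake-equation estimates:

        import numpy as np

        rng = np.random.default_rng(3)
        trials = 100_000

        # Seven independent positive factors with deliberately different distributions.
        factors = [
            rng.uniform(1.0, 10.0, trials),
            rng.lognormal(0.0, 0.5, trials),
            rng.gamma(2.0, 1.0, trials),
            rng.uniform(0.1, 1.0, trials),
            rng.exponential(1.0, trials),
            rng.uniform(0.5, 2.0, trials),
            rng.gamma(3.0, 0.5, trials),
        ]
        N = np.prod(factors, axis=0)

        # log N should be close to Gaussian (CLT applied to the sum of the log-factors).
        logN = np.log(N)
        print("skewness of log N:", ((logN - logN.mean()) ** 3).mean() / logN.std() ** 3)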

  10. The Statistical Fermi Paradox

    NASA Astrophysics Data System (ADS)

    Maccone, C.

    This paper provides the statistical generalization of the Fermi paradox. The statistics of habitable planets may be based on a set of ten (and possibly more) astrobiological requirements first pointed out by Stephen H. Dole in his book Habitable planets for man (1964). The statistical generalization of the original and by now too simplistic Dole equation is provided by replacing a product of ten positive numbers by the product of ten positive random variables. This is denoted the SEH, an acronym standing for “Statistical Equation for Habitables”. The proof in this paper is based on the Central Limit Theorem (CLT) of Statistics, stating that the sum of any number of independent random variables, each of which may be ARBITRARILY distributed, approaches a Gaussian (i.e. normal) random variable (Lyapunov form of the CLT). It is then shown that: 1. The new random variable NHab, yielding the number of habitables (i.e. habitable planets) in the Galaxy, follows the log-normal distribution. By construction, the mean value of this log-normal distribution is the total number of habitable planets as given by the statistical Dole equation. 2. The ten (or more) astrobiological factors are now positive random variables. The probability distribution of each random variable may be arbitrary. The CLT in the so-called Lyapunov or Lindeberg forms (that both do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into the SEH by allowing an arbitrary probability distribution for each factor. This is both astrobiologically realistic and useful for any further investigations. 3. By applying the SEH it is shown that the (average) distance between any two nearby habitable planets in the Galaxy is inversely proportional to the cubic root of NHab. This distance is denoted by the new random variable D. The relevant probability density function is derived, which was named the "Maccone distribution" by Paul Davies in

  11. Statistical Inference: The Big Picture.

    PubMed

    Kass, Robert E

    2011-02-01

    Statistics has moved beyond the frequentist-Bayesian controversies of the past. Where does this leave our ability to interpret results? I suggest that a philosophy compatible with statistical practice, labelled here statistical pragmatism, serves as a foundation for inference. Statistical pragmatism is inclusive and emphasizes the assumptions that connect statistical models with observed data. I argue that introductory courses often mis-characterize the process of statistical inference and I propose an alternative "big picture" depiction.

  12. Statistical evaluation of forecasts.

    PubMed

    Mader, Malenka; Mader, Wolfgang; Gluckman, Bruce J; Timmer, Jens; Schelter, Björn

    2014-08-01

    Reliable forecasts of extreme but rare events, such as earthquakes, financial crashes, and epileptic seizures, would render interventions and precautions possible. Therefore, forecasting methods have been developed which intend to raise an alarm if an extreme event is about to occur. In order to statistically validate the performance of a prediction system, it must be compared to the performance of a random predictor, which raises alarms independent of the events. Such a random predictor can be obtained by bootstrapping or analytically. We propose an analytic statistical framework which, in contrast to conventional methods, allows for validating independently the sensitivity and specificity of a forecasting method. Moreover, our method accounts for the periods during which an event has to remain absent or occur after a respective forecast.

  13. Pain: A Statistical Account

    PubMed Central

    Thacker, Michael A.; Moseley, G. Lorimer

    2017-01-01

    Perception is seen as a process that utilises partial and noisy information to construct a coherent understanding of the world. Here we argue that the experience of pain is no different; it is based on incomplete, multimodal information, which is used to estimate potential bodily threat. We outline a Bayesian inference model, incorporating the key components of cue combination, causal inference, and temporal integration, which highlights the statistical problems in everyday perception. It is from this platform that we are able to review the pain literature, providing evidence from experimental, acute, and persistent phenomena to demonstrate the advantages of adopting a statistical account in pain. Our probabilistic conceptualisation suggests a principles-based view of pain, explaining a broad range of experimental and clinical findings and making testable predictions. PMID:28081134
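
    Under the Gaussian assumptions commonly used in such cue-combination models (a generic textbook form, not the authors' full model), two noisy cues x_1 and x_2 with variances \sigma_1^2 and \sigma_2^2 combine as a precision-weighted average:

        \[
          \hat{x} = \frac{x_1/\sigma_1^2 + x_2/\sigma_2^2}{1/\sigma_1^2 + 1/\sigma_2^2},
          \qquad
          \sigma_{\hat{x}}^2 = \left(\frac{1}{\sigma_1^2} + \frac{1}{\sigma_2^2}\right)^{-1}.
        \]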

  14. Statistical evaluation of forecasts

    NASA Astrophysics Data System (ADS)

    Mader, Malenka; Mader, Wolfgang; Gluckman, Bruce J.; Timmer, Jens; Schelter, Björn

    2014-08-01

    Reliable forecasts of extreme but rare events, such as earthquakes, financial crashes, and epileptic seizures, would render interventions and precautions possible. Therefore, forecasting methods have been developed which intend to raise an alarm if an extreme event is about to occur. In order to statistically validate the performance of a prediction system, it must be compared to the performance of a random predictor, which raises alarms independent of the events. Such a random predictor can be obtained by bootstrapping or analytically. We propose an analytic statistical framework which, in contrast to conventional methods, allows for validating independently the sensitivity and specificity of a forecasting method. Moreover, our method accounts for the periods during which an event has to remain absent or occur after a respective forecast.

  15. 1979 DOE statistical symposium

    SciTech Connect

    Gardiner, D.A.; Truett T.

    1980-09-01

    The 1979 DOE Statistical Symposium was the fifth in the series of annual symposia designed to bring together statisticians and other interested parties who are actively engaged in helping to solve the nation's energy problems. The program included presentations of technical papers centered around exploration and disposal of nuclear fuel, general energy-related topics, and health-related issues, and workshops on model evaluation, risk analysis, analysis of large data sets, and resource estimation.

  16. Relativistic statistical arbitrage

    NASA Astrophysics Data System (ADS)

    Wissner-Gross, A. D.; Freer, C. E.

    2010-11-01

    Recent advances in high-frequency financial trading have made light propagation delays between geographically separated exchanges relevant. Here we show that there exist optimal locations from which to coordinate the statistical arbitrage of pairs of spacelike separated securities, and calculate a representative map of such locations on Earth. Furthermore, trading local securities along chains of such intermediate locations results in a novel econophysical effect, in which the relativistic propagation of tradable information is effectively slowed or stopped by arbitrage.

  17. Statistical Methods in Cosmology

    NASA Astrophysics Data System (ADS)

    Verde, L.

    2010-03-01

    The advent of large data-sets in cosmology has meant that in the past 10 or 20 years our knowledge and understanding of the Universe has changed not only quantitatively but also, and most importantly, qualitatively. Cosmologists rely on data where a host of useful information is enclosed, but is encoded in a non-trivial way. The challenges in extracting this information must be overcome to make the most of a large experimental effort. Even after having converged to a standard cosmological model (the LCDM model), we should keep in mind that this model is described by 10 or more physical parameters, and if we want to study deviations from it, the number of parameters is even larger. Dealing with such a high-dimensional parameter space and finding parameter constraints is a challenge in itself. Cosmologists want to be able to compare and combine different data sets, both for testing for possible disagreements (which could indicate new physics) and for improving parameter determinations. Finally, cosmologists in many cases want to find out, before actually doing the experiment, how much one would be able to learn from it. For all these reasons, sophisticated statistical techniques are being employed in cosmology, and it has become crucial to know some statistical background to understand recent literature in the field. I will introduce some statistical tools that any cosmologist should know about in order to be able to understand recently published results from the analysis of cosmological data sets. I will not present a complete and rigorous introduction to statistics as there are several good books which are reported in the references. The reader should refer to those.

  18. Statistical Challenges of Astronomy

    NASA Astrophysics Data System (ADS)

    Feigelson, Eric D.; Babu, G. Jogesh

    Digital sky surveys, data from orbiting telescopes, and advances in computation have increased the quantity and quality of astronomical data by several orders of magnitude in recent years. Making sense of this wealth of data requires sophisticated statistical and data analytic techniques. Fortunately, statistical methodologies have similarly made great strides in recent years. Powerful synergies thus emerge when astronomers and statisticians join in examining astrostatistical problems and approaches. The volume focuses on several themes: · The increasing power of Bayesian approaches to modeling astronomical data · The growth of enormous databases, leading to an emerging federated Virtual Observatory, and their impact on modern astronomical research · Statistical modeling of critical datasets, such as galaxy clustering and fluctuations in the microwave background radiation, leading to a new era of precision cosmology · Methodologies for uncovering clusters and patterns in multivariate data · The characterization of multiscale patterns in imaging and time series data As in earlier volumes in this series, research contributions discussing topics in one field are joined with commentary from scholars in the other. Short contributed papers covering dozens of astrostatistical topics are also included.

  19. Statistics in fusion experiments

    NASA Astrophysics Data System (ADS)

    McNeill, D. H.

    1997-11-01

    Since the reasons for the variability in data from plasma experiments are often unknown or uncontrollable, statistical methods must be applied. Reliable interpretation and public accountability require full data sets. Two examples of data misrepresentation at PPPL are analyzed: (1) Te > 100 eV on the S-1 spheromak (M. Yamada, Nucl. Fusion 25, 1327 (1985); reports to DoE; etc.). The reported high values (statistical artifacts of Thomson scattering measurements) were selected from a mass of data with an average of 40 eV or less. "Correlated" spectroscopic data were meaningless. (2) Extrapolation to Q >= 0.5 for DT in TFTR (D. Meade et al., IAEA Baltimore (1990), V. 1, p. 9; H. P. Furth, Statements to U. S. Congress (1989)). The DD yield used there was the highest through 1990 (>= 50% above average) and the DT to DD power ratio used was about twice any published value. Average DD yields and published yield ratios scale to Q < 0.15 for DT, in accord with the observed performance over the last 3 1/2 years. Press reports of outlier data from TFTR have obscured the fact that the DT behavior follows from trivial scaling of the DD data. Good practice in future fusion research would have confidence intervals and other descriptive statistics accompanying reported numerical values (cf. JAMA).

  20. Statistical Inference at Work: Statistical Process Control as an Example

    ERIC Educational Resources Information Center

    Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia

    2008-01-01

    To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…

  1. Statistical Properties of Photon-Added Two-Mode Squeezed Coherent States

    NASA Astrophysics Data System (ADS)

    Wang, Zhen; Li, Heng-Mei; Yuan, Hong-Chun; Wan, Zhi-Long; Meng, Xiang-Guo

    2017-03-01

    Nonclassical, non-Gaussian quantum states, the photon-added two-mode squeezed coherent states, are introduced theoretically by adding multiple photons to each mode of the two-mode squeezed coherent states. Starting from the new expression of the two-mode squeezing operator in the entangled-state representation, the normalization factor is obtained; it is directly related to bivariate Hermite polynomials. The sub-Poissonian photon statistics, cross-correlation between the two modes, and partially negative Wigner function that are observed fully reflect the nonclassicality of the target states. The negative Wigner function also often displays a non-Gaussian distribution. These investigations may provide experimentalists with useful references in quantum engineering.

  2. Statistical methods for astronomical data with upper limits. II - Correlation and regression

    NASA Technical Reports Server (NTRS)

    Isobe, T.; Feigelson, E. D.; Nelson, P. I.

    1986-01-01

    Statistical methods for calculating correlations and regressions in bivariate censored data, where the dependent variable can have upper or lower limits, are presented. Cox's regression and the generalization of Kendall's rank correlation coefficient provide significance levels for correlations, and the EM algorithm, under the assumption of normally distributed errors, and its nonparametric analog using the Kaplan-Meier estimator, give estimates for the slope of a regression line. Monte Carlo simulations demonstrate that survival analysis is reliable in determining correlations between luminosities at different bands. Survival analysis is applied to CO emission in infrared galaxies, X-ray emission in radio galaxies, H-alpha emission in cooling cluster cores, and radio emission in Seyfert galaxies.
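
    A minimal product-limit (Kaplan-Meier) estimator for right-censored data is sketched below; for astronomical upper limits (left censoring) the usual device is to flip the data about a large constant before applying it, as assumed here. The luminosity values are hypothetical:

        import numpy as np

        def kaplan_meier(values, observed):
            """Product-limit survival estimate for right-censored data.

            values   : measured value, or the censoring value when not observed
            observed : True for a detection, False for a censored point
            """
            values = np.asarray(values, dtype=float)
            observed = np.asarray(observed, dtype=bool)
            surv, curve = 1.0, []
            for t in np.unique(values[observed]):
                at_risk = np.sum(values >= t)
                events = np.sum((values == t) & observed)
                surv *= 1.0 - events / at_risk
                curve.append((t, surv))
            return curve

        # Hypothetical luminosities; False marks an upper limit (left-censored point).
        vals = np.array([1.2, 0.8, 2.5, 1.9, 0.6, 3.1])
        det = np.array([True, False, True, True, False, True])

        flipped = vals.max() + 1.0 - vals   # convert upper limits to right-censored form
        for t, s in kaplan_meier(flipped, det):
            print(f"flipped value {t:.1f}: survival estimate {s:.2f}")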

  3. Prediction of precipitation-induced phlebitis: a statistical validation of an in vitro model.

    PubMed

    Johnson, Jennifer L H; He, Yan; Yalkowsky, Samuel H

    2003-08-01

    To avoid phlebitis, new intravenous (IV) parenterals are often screened by injection into animals. This method is not only expensive and time consuming, it is also detrimental to the animals. An alternate method, focusing on precipitation as the cause, uses an in vitro dynamic injection model that requires less money and time and reduces the need for live models. Validation of the dynamic injection apparatus, for predicting mechanical phlebitis, is established. Twenty-one currently marketed IV products were injected into isotonic Sorenson's phosphate buffer flowing at 5 mL/min. The resulting opacities, produced by precipitation, are measured in an ultraviolet flow cell. These opacity data, coupled with literature reports on phlebitis occurrence, were used to generate a logistic regression that indicates the probability of phlebitis given an opacity value measured by the apparatus. Regression results are supported by a receiver operator characteristic curve that establishes the most ideal cut-off opacity value. This opacity value provides the highest combined sensitivity (statistical power) and specificity while minimizing false-positive and false-negative results. Both analyses show that an opacity value of 0.003 best delineates phlebitic and nonphlebitic products. Measures of sensitivity (0.83), specificity (0.93), positive predictive value (0.93), and negative predictive value (0.78) indicate the model's predictive accuracy and reliability. These results support the use of the dynamic model in place of animals for preliminary phlebitis testing of new IV injectables.
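
    The logistic-regression-plus-ROC workflow described above can be reproduced generically with scikit-learn; the opacity values and phlebitis labels below are invented for illustration and are unrelated to the published 0.003 cut-off:

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score, roc_curve

        # Hypothetical (opacity, phlebitis) observations, for illustration only.
        opacity = np.array([0.000, 0.001, 0.002, 0.004, 0.006, 0.010,
                            0.001, 0.003, 0.005, 0.000, 0.008, 0.012]).reshape(-1, 1)
        phlebitis = np.array([0, 0, 0, 1, 1, 1, 0, 1, 1, 0, 1, 1])

        model = LogisticRegression().fit(opacity, phlebitis)
        prob = model.predict_proba(opacity)[:, 1]

        fpr, tpr, thresholds = roc_curve(phlebitis, prob)
        print("AUC:", roc_auc_score(phlebitis, prob))
        best = np.argmax(tpr - fpr)   # Youden-style choice of operating point
        print("probability cut-off with best sensitivity + specificity:", thresholds[best])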

  4. Diagnostic accuracy of the Berlin questionnaire, STOP-BANG, STOP, and Epworth sleepiness scale in detecting obstructive sleep apnea: A bivariate meta-analysis.

    PubMed

    Chiu, Hsiao-Yean; Chen, Pin-Yuan; Chuang, Li-Pang; Chen, Ning-Hung; Tu, Yu-Kang; Hsieh, Yu-Jung; Wang, Yu-Chi; Guilleminault, Christian

    2016-11-05

    Obstructive sleep apnea (OSA) is a highly prevalent sleep disorder; however, it remains underdiagnosed and undertreated. Although screening tools such as the Berlin questionnaire (BQ), STOP-BANG questionnaire (SBQ), STOP questionnaire (STOP), and Epworth sleepiness scale (ESS) are widely used for OSA, the findings regarding their diagnostic accuracy are controversial. Therefore, this meta-analysis investigated and compared the summary sensitivity, specificity, and diagnostic odds ratio (DOR) among the BQ, SBQ, STOP, and ESS according to the severity of OSA. Electronic databases, namely the Embase, PubMed, PsycINFO, ProQuest Dissertations and Theses A&I databases, and the China Knowledge Resource Integrated Database, were searched from their inception to July 15, 2016. We included studies examining the sensitivity and specificity of the BQ, SBQ, STOP, and ESS against the apnea-hypopnea index (AHI) or respiratory disturbance index (RDI). The revised Quality Assessment of Diagnostic Accuracy Studies tool was used to evaluate the methodological quality of the studies. A random-effects bivariate model was used to estimate the summary sensitivity, specificity, and DOR of the tools. We identified 108 studies including a total of 47 989 participants. The summary estimates were calculated for the BQ, SBQ, STOP, and ESS in detecting mild (AHI/RDI ≥ 5 events/h), moderate (AHI/RDI ≥ 15 events/h), and severe OSA (AHI/RDI ≥ 30 events/h). The performance levels of the BQ, SBQ, STOP, and ESS in detecting OSA of various severity levels are outlined as follows: for mild OSA, the pooled sensitivity levels were 76%, 88%, 87%, and 54%; pooled specificity levels were 59%, 42%, 42%, and 65%; and pooled DORs were 4.30, 5.13, 4.85, and 2.18, respectively. For moderate OSA, the pooled sensitivity levels were 77%, 90%, 89%, and 47%; pooled specificity levels were 44%, 36%, 32%, and 62%; and pooled DORs were 2.68, 5.05, 3.71, and 1.45, respectively. For severe OSA, the pooled sensitivity
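
    The pooled diagnostic odds ratios quoted above combine sensitivity and specificity through the standard relation DOR = [Se/(1-Se)]/[(1-Sp)/Sp]; a small helper illustrating the relation on hypothetical values follows (pooled DORs from a bivariate model are estimated jointly and need not equal a value recomputed from the pooled Se and Sp):

        def diagnostic_odds_ratio(sensitivity, specificity):
            """DOR = (Se / (1 - Se)) / ((1 - Sp) / Sp)."""
            return (sensitivity / (1 - sensitivity)) / ((1 - specificity) / specificity)

        # Hypothetical screening-tool performance, not the pooled values quoted above.
        print(diagnostic_odds_ratio(0.85, 0.60))   # approximately 8.5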

  5. Truth, Damn Truth, and Statistics

    ERIC Educational Resources Information Center

    Velleman, Paul F.

    2008-01-01

    Statisticians and Statistics teachers often have to push back against the popular impression that Statistics teaches how to lie with data. Those who believe incorrectly that Statistics is solely a branch of Mathematics (and thus algorithmic), often see the use of judgment in Statistics as evidence that we do indeed manipulate our results. In the…

  6. Experimental Mathematics and Computational Statistics

    SciTech Connect

    Bailey, David H.; Borwein, Jonathan M.

    2009-04-30

    The field of statistics has long been noted for techniques to detect patterns and regularities in numerical data. In this article we explore connections between statistics and the emerging field of 'experimental mathematics'. These includes both applications of experimental mathematics in statistics, as well as statistical methods applied to computational mathematics.

  7. Who Needs Statistics? | Poster

    Cancer.gov

    You may know the feeling. You have collected a lot of new data on an important experiment. Now you are faced with multiple groups of data, a sea of numbers, and a deadline for submitting your paper to a peer-reviewed journal. And you are not sure which data are relevant, or even the best way to present them. The statisticians at Data Management Services (DMS) know how to help. This small group of experts provides a wide array of statistical and mathematical consulting services to the scientific community at NCI at Frederick and NCI-Bethesda.

  8. International petroleum statistics report

    SciTech Connect

    1995-10-01

    The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report presents data on international oil production, demand, imports, exports and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). Section 2 presents an oil supply/demand balance for the world, in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries.

  9. NASA Pocket Statistics

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Pocket Statistics is published for the use of NASA managers and their staff. Included herein is Administrative and Organizational information, summaries of Space Flight Activity including the NASA Major Launch Record, and NASA Procurement, Financial, and Manpower data. The NASA Major Launch Record includes all launches of Scout class and larger vehicles. Vehicle and spacecraft development flights are also included in the Major Launch Record. Shuttle missions are counted as one launch and one payload, where free flying payloads are not involved. Satellites deployed from the cargo bay of the Shuttle and placed in a separate orbit or trajectory are counted as an additional payload.

  10. NASA Pocket Statistics

    NASA Technical Reports Server (NTRS)

    1995-01-01

    NASA Pocket Statistics is published for the use of NASA managers and their staff. Included herein is Administrative and Organizational information, summaries of Space Flight Activity including the NASA Major Launch Record, and NASA Procurement, Financial, and Manpower data. The NASA Major Launch Record includes all launches of Scout class and larger vehicles. Vehicle and spacecraft development flights are also included in the Major Launch Record. Shuttle missions are counted as one launch and one payload, where free flying payloads are not involved. Satellites deployed from the cargo bay of the Shuttle and placed in a separate orbit or trajectory are counted as an additional payload.

  11. NASA Pocket Statistics

    NASA Technical Reports Server (NTRS)

    1996-01-01

    This booklet of pocket statistics includes the 1996 NASA Major Launch Record, NASA Procurement, Financial, and Workforce data. The NASA Major Launch Record includes all launches of Scout class and larger vehicles. Vehicle and spacecraft development flights are also included in the Major Launch Record. Shuttle missions are counted as one launch and one payload, where free flying payloads are not involved. Satellites deployed from the cargo bay of the Shuttle and placed in a separate orbit or trajectory are counted as an additional payload.

  12. Statistics of superior records

    NASA Astrophysics Data System (ADS)

    Ben-Naim, E.; Krapivsky, P. L.

    2013-08-01

    We study statistics of records in a sequence of random variables. These independent and identically distributed variables are drawn from the parent distribution ρ. The running record equals the maximum of all elements in the sequence up to a given point. We define a superior sequence as one where all running records are above the average record expected for the parent distribution ρ. We find that the fraction of superior sequences S_N decays algebraically with sequence length N, S_N ~ N^(-β) in the limit N → ∞. Interestingly, the decay exponent β is nontrivial, being the root of an integral equation. For example, when ρ is a uniform distribution with compact support, we find β = 0.450265. In general, the tail of the parent distribution governs the exponent β. We also consider the dual problem of inferior sequences, where all records are below average, and find that the fraction of inferior sequences I_N decays algebraically, albeit with a different decay exponent, I_N ~ N^(-α). We use the above statistical measures to analyze earthquake data.
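
    A small Monte Carlo sketch (our illustration, not from the paper) can make the quantity S_N concrete for the uniform case: assuming the parent distribution is Uniform(0, 1), the expected record after n draws is n/(n+1), and a sequence is counted as superior when every running record exceeds that value. The function name and parameter values below are illustrative only.

```python
import numpy as np

def superior_fraction(N, trials=10_000, seed=0):
    """Estimate S_N, the fraction of 'superior' sequences of length N,
    for i.i.d. Uniform(0, 1) variables: every running record must exceed
    the expected record n/(n + 1) of n uniform draws."""
    rng = np.random.default_rng(seed)
    x = rng.random((trials, N))
    records = np.maximum.accumulate(x, axis=1)   # running maxima
    n = np.arange(1, N + 1)
    expected = n / (n + 1.0)                     # E[max of n Uniform(0,1)]
    return np.mean(np.all(records > expected, axis=1))

# The estimate should fall roughly like N**(-0.45) for the uniform parent.
for N in (10, 100, 1000):
    print(N, superior_fraction(N))
```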

  13. Taking a statistical approach

    SciTech Connect

    Wild, M.; Rouhani, S.

    1995-02-01

    A typical site investigation entails extensive sampling and monitoring. In the past, sampling plans have been designed on purely ad hoc bases, leading to significant expenditures and, in some cases, collection of redundant information. In many instances, sampling costs exceed the true worth of the collected data. The US Environmental Protection Agency (EPA) therefore has advocated the use of geostatistics to provide a logical framework for sampling and analysis of environmental data. Geostatistical methodology uses statistical techniques for the spatial analysis of a variety of earth-related data. Geostatistics was developed by the mining industry to estimate ore concentrations. The same procedure is effective in quantifying environmental contaminants in soils for risk assessments. Unlike classical statistical techniques, geostatistics offers procedures to incorporate the underlying spatial structure of the investigated field. Sample points spaced close together tend to be more similar than samples spaced further apart. This can guide sampling strategies and determine complex contaminant distributions. Geostatistical techniques can be used to evaluate site conditions on the basis of regular, irregular, random and even spatially biased samples. In most environmental investigations, it is desirable to concentrate sampling in areas of known or suspected contamination. The rigorous mathematical procedures of geostatistics allow for accurate estimates at unsampled locations, potentially reducing sampling requirements. The use of geostatistics serves as a decision-aiding and planning tool and can significantly reduce short-term site assessment costs and long-term sampling and monitoring needs, as well as lead to more accurate and realistic remedial design criteria.
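
    As a minimal illustration of the spatial-structure idea described above (nearby samples being more similar than distant ones), the sketch below computes an empirical semivariogram from scattered measurements. It is a generic textbook construction, not the specific EPA or mining workflow; the function name, bin choices, and synthetic data are all illustrative.

```python
import numpy as np

def empirical_semivariogram(coords, values, bin_edges):
    """Empirical semivariogram: half the mean squared difference of
    sample values, grouped by separation distance."""
    coords = np.asarray(coords, float)
    values = np.asarray(values, float)
    n = len(values)
    d, g = [], []
    for i in range(n):
        for j in range(i + 1, n):
            d.append(np.linalg.norm(coords[i] - coords[j]))
            g.append(0.5 * (values[i] - values[j]) ** 2)
    d, g = np.array(d), np.array(g)
    gamma = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (d >= lo) & (d < hi)
        gamma.append(g[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

# Hypothetical synthetic soil-contaminant samples on a 100 x 100 site
rng = np.random.default_rng(0)
pts = rng.uniform(0, 100, size=(50, 2))
conc = np.sin(pts[:, 0] / 20.0) + 0.1 * rng.standard_normal(50)
print(empirical_semivariogram(pts, conc, np.linspace(0, 50, 6)))
```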

  14. Fragile entanglement statistics

    NASA Astrophysics Data System (ADS)

    Brody, Dorje C.; Hughston, Lane P.; Meier, David M.

    2015-10-01

    If X and Y are independent, Y and Z are independent, and so are X and Z, one might be tempted to conclude that X, Y, and Z are independent. But it has long been known in classical probability theory that, intuitive as it may seem, this is not true in general. In quantum mechanics one can ask whether analogous statistics can emerge for configurations of particles in certain types of entangled states. The explicit construction of such states, along with the specification of suitable sets of observables that have the purported statistical properties, is not entirely straightforward. We show that an example of such a configuration arises in the case of an N-particle GHZ state, and we are able to identify a family of observables with the property that the associated measurement outcomes are independent for any choice of 2, 3, …, N−1 of the particles, even though the measurement outcomes for all N particles are not independent. Although such states are highly entangled, the entanglement turns out to be ‘fragile’, i.e. the associated density matrix has the property that if one traces out the freedom associated with even a single particle, the resulting reduced density matrix is separable.

  15. International petroleum statistics report

    SciTech Connect

    1997-05-01

    The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report is published for the use of Members of Congress, Federal agencies, State agencies, industry, and the general public. Publication of this report is in keeping with responsibilities given the Energy Information Administration in Public Law 95-91. The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995.

  16. Elements of Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Sachs, Ivo; Sen, Siddhartha; Sexton, James

    2006-05-01

    This textbook provides a concise introduction to the key concepts and tools of modern statistical mechanics. It also covers advanced topics such as non-relativistic quantum field theory and numerical methods. After introducing classical analytical techniques, such as cluster expansion and Landau theory, the authors present important numerical methods with applications to magnetic systems, Lennard-Jones fluids and biophysics. Quantum statistical mechanics is discussed in detail and applied to Bose-Einstein condensation and topics in astrophysics and cosmology. In order to describe emergent phenomena in interacting quantum systems, canonical non-relativistic quantum field theory is introduced and then reformulated in terms of Feynman integrals. Combining the authors' many years' experience of teaching courses in this area, this textbook is ideal for advanced undergraduate and graduate students in physics, chemistry and mathematics. Analytical and numerical techniques in one text, including sample codes and solved problems on the web at www.cambridge.org/0521841984. Covers a wide range of applications including magnetic systems, turbulence, astrophysics, and biology. Contains a concise introduction to Markov processes and molecular dynamics.

  17. Statistical clumped isotope signatures

    PubMed Central

    Röckmann, T.; Popa, M. E.; Krol, M. C.; Hofmann, M. E. G.

    2016-01-01

    High precision measurements of molecules containing more than one heavy isotope may provide novel constraints on element cycles in nature. These so-called clumped isotope signatures are reported relative to the random (stochastic) distribution of heavy isotopes over all available isotopocules of a molecule, which is the conventional reference. When multiple indistinguishable atoms of the same element are present in a molecule, this reference is calculated from the bulk (≈average) isotopic composition of the involved atoms. We show here that this referencing convention leads to apparent negative clumped isotope anomalies (anti-clumping) when the indistinguishable atoms originate from isotopically different populations. Such statistical clumped isotope anomalies must occur in any system where two or more indistinguishable atoms of the same element, but with different isotopic composition, combine in a molecule. The size of the anti-clumping signal is closely related to the difference of the initial isotope ratios of the indistinguishable atoms that have combined. Therefore, a measured statistical clumped isotope anomaly, relative to an expected (e.g. thermodynamical) clumped isotope composition, may allow assessment of the heterogeneity of the isotopic pools of atoms that are the substrate for formation of molecules. PMID:27535168
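
    A toy calculation (our illustration, not taken from the paper) shows how statistical anti-clumping can arise when two indistinguishable atom positions in a molecule are filled from isotopically different pools. With heavy-isotope probabilities p_a and p_b for the two pools, the true doubly substituted fraction is p_a·p_b, while the conventional stochastic reference uses the bulk value (p_a + p_b)/2 for both positions; the relative deviation is never positive. The function and numbers below are hypothetical.

```python
def statistical_clumping_anomaly(p_a, p_b):
    """Toy illustration of 'anti-clumping' from mixing two pools.

    p_a, p_b: probability that an atom drawn from pool A (resp. B) is the
    heavy isotope.  Each molecule pairs one atom from each pool, so the
    doubly substituted fraction is p_a * p_b.  The stochastic reference
    uses the bulk composition (p_a + p_b) / 2 for both (indistinguishable)
    positions; by the AM-GM inequality the deviation is never positive.
    """
    p_bulk = 0.5 * (p_a + p_b)
    return p_a * p_b / p_bulk**2 - 1.0

# Identical pools -> no anomaly; different pools -> apparent anti-clumping.
print(statistical_clumping_anomaly(0.010, 0.010))   # 0.0
print(statistical_clumping_anomaly(0.005, 0.015))   # -0.25
```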

  18. [Comment on] Statistical discrimination

    NASA Astrophysics Data System (ADS)

    Chinn, Douglas

    In the December 8, 1981, issue of Eos, a news item reported the conclusion of a National Research Council study that sexual discrimination against women with Ph.D.'s exists in the field of geophysics. Basically, the item reported that even when allowances are made for motherhood, the percentage of female Ph.D.'s holding high university and corporate positions is significantly lower than the percentage of male Ph.D.'s holding the same types of positions. The sexual discrimination conclusion, based only on these statistics, assumes that there are no basic psychological differences between men and women that might cause different populations in the employment group studied. Therefore, the reasoning goes, after taking into account possible effects from differences related to anatomy, such as women stopping their careers in order to bear and raise children, the statistical distributions of positions held by male and female Ph.D.'s ought to be very similar to one another. Any significant differences between the distributions must be caused primarily by sexual discrimination.

  19. Relationship between Graduate Students' Statistics Self-Efficacy, Statistics Anxiety, Attitude toward Statistics, and Social Support

    ERIC Educational Resources Information Center

    Perepiczka, Michelle; Chandler, Nichelle; Becerra, Michael

    2011-01-01

    Statistics plays an integral role in graduate programs. However, numerous intra- and interpersonal factors may lead to successful completion of needed coursework in this area. The authors examined the extent of the relationship between self-efficacy to learn statistics and statistics anxiety, attitude towards statistics, and social support of 166…

  20. BIG DATA AND STATISTICS

    PubMed Central

    Rossell, David

    2016-01-01

    Big Data brings unprecedented power to address scientific, economic and societal issues, but also amplifies the possibility of certain pitfalls. These include using purely data-driven approaches that disregard understanding the phenomenon under study, aiming at a dynamically moving target, ignoring critical data collection issues, summarizing or preprocessing the data inadequately and mistaking noise for signal. We review some success stories and illustrate how statistical principles can help obtain more reliable information from data. We also touch upon current challenges that require active methodological research, such as strategies for efficient computation, integration of heterogeneous data, extending the underlying theory to increasingly complex questions and, perhaps most importantly, training a new generation of scientists to develop and deploy these strategies. PMID:27722040

  1. Statistical crack mechanics

    SciTech Connect

    Dienes, J.K.

    1983-01-01

    An alternative to the use of plasticity theory to characterize the inelastic behavior of solids is to represent the flaws by statistical methods. We have taken such an approach to study fragmentation because it offers a number of advantages. Foremost among these is that, by considering the effects of flaws, it becomes possible to address the underlying physics directly. For example, we have been able to explain why rocks exhibit large strain-rate effects (a consequence of the finite growth rate of cracks), why a spherical explosive imbedded in oil shale produces a cavity with a nearly square section (opening of bedding cracks) and why propellants may detonate following low-speed impact (a consequence of frictional hot spots).

  2. Statistics of lattice animals

    NASA Astrophysics Data System (ADS)

    Hsu, Hsiao-Ping; Nadler, Walter; Grassberger, Peter

    2005-07-01

    The scaling behavior of randomly branched polymers in a good solvent is studied in two to nine dimensions, modeled by lattice animals on simple hypercubic lattices. For the simulations, we use a biased sequential sampling algorithm with re-sampling, similar to the pruned-enriched Rosenbluth method (PERM) used extensively for linear polymers. We obtain high statistics of animals with up to several thousand sites in all dimensions 2 ⩽ d ⩽ 9. The partition sum (number of different animals) and gyration radii are estimated. In all dimensions we verify the Parisi-Sourlas prediction, and we verify all exactly known critical exponents in dimensions 2, 3, 4, and ⩾8. In addition, we present the hitherto most precise estimates for growth constants in d⩾3. For clusters with one site attached to an attractive surface, we verify the superuniversality of the cross-over exponent at the adsorption transition predicted by Janssen and Lyssy.

  3. Conditional statistical model building

    NASA Astrophysics Data System (ADS)

    Hansen, Mads Fogtmann; Hansen, Michael Sass; Larsen, Rasmus

    2008-03-01

    We present a new statistical deformation model suited for parameterized grids with different resolutions. Our method models the covariances between multiple grid levels explicitly, and allows for very efficient fitting of the model to data on multiple scales. The model is validated on a data set consisting of 62 annotated MR images of Corpus Callosum. One fifth of the data set was used as a training set, whose images were non-rigidly registered to each other without a shape prior. From the non-rigidly registered training set a shape prior was constructed by performing principal component analysis on each grid level and using the results to construct a conditional shape model, conditioning the finer parameters with the coarser grid levels. The remaining shapes were registered with the constructed shape prior. The Dice measures for the registration without a prior and the registration with a prior were 0.875 +/- 0.042 and 0.8615 +/- 0.051, respectively.
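
    The core operation of such a conditional shape model, conditioning fine-scale parameters on coarse-scale ones under a joint Gaussian assumption, can be sketched as below. This is the generic Gaussian-conditioning formula, not the authors' code; the index sets, variable names, and demo numbers are hypothetical.

```python
import numpy as np

def condition_gaussian(mu, cov, idx_fine, idx_coarse, x_coarse):
    """Conditional mean/covariance of the fine parameters given observed
    coarse parameters for a joint Gaussian model:
    mu_f|c = mu_f + S_fc S_cc^{-1} (x_c - mu_c)."""
    mu, cov = np.asarray(mu, float), np.asarray(cov, float)
    S_ff = cov[np.ix_(idx_fine, idx_fine)]
    S_fc = cov[np.ix_(idx_fine, idx_coarse)]
    S_cc = cov[np.ix_(idx_coarse, idx_coarse)]
    resid = np.asarray(x_coarse, float) - mu[idx_coarse]
    mu_cond = mu[idx_fine] + S_fc @ np.linalg.solve(S_cc, resid)
    cov_cond = S_ff - S_fc @ np.linalg.solve(S_cc, S_fc.T)
    return mu_cond, cov_cond

# Hypothetical 3-parameter model: condition parameter 0 on parameters 1 and 2.
mu = np.array([0.0, 1.0, -1.0])
cov = np.array([[2.0, 0.8, 0.3],
                [0.8, 1.0, 0.2],
                [0.3, 0.2, 1.5]])
print(condition_gaussian(mu, cov, [0], [1, 2], x_coarse=[1.5, -0.5]))
```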

  4. Statistical physics "Beyond equilibrium"

    SciTech Connect

    Ecke, Robert E

    2009-01-01

    The scientific challenges of the 21st century will increasingly involve competing interactions, geometric frustration, spatial and temporal intrinsic inhomogeneity, nanoscale structures, and interactions spanning many scales. We will focus on a broad class of emerging problems that will require new tools in non-equilibrium statistical physics and that will find application in new material functionality, in predicting complex spatial dynamics, and in understanding novel states of matter. Our work will encompass materials under extreme conditions involving elastic/plastic deformation, competing interactions, intrinsic inhomogeneity, frustration in condensed matter systems, scaling phenomena in disordered materials from glasses to granular matter, quantum chemistry applied to nano-scale materials, soft-matter materials, and spatio-temporal properties of both ordinary and complex fluids.

  5. Statistical Thermodynamics of Biomembranes

    PubMed Central

    Devireddy, Ram V.

    2010-01-01

    An overview of the major issues involved in the statistical thermodynamic treatment of phospholipid membranes at the atomistic level is summarized: thermodynamic ensembles, initial configuration (or the physical system being modeled), force field representation as well as the representation of long-range interactions. This is followed by a description of the various ways that the simulated ensembles can be analyzed: area of the lipid, mass density profiles, radial distribution functions (RDFs), water orientation profile, deuterium order parameter, free energy profiles and void (pore) formation, with particular focus on the results obtained from our recent molecular dynamics (MD) simulations of phospholipids interacting with dimethylsulfoxide (Me2SO), a commonly used cryoprotective agent (CPA). PMID:19460363

  6. Statistical physics of vaccination

    NASA Astrophysics Data System (ADS)

    Wang, Zhen; Bauch, Chris T.; Bhattacharyya, Samit; d'Onofrio, Alberto; Manfredi, Piero; Perc, Matjaž; Perra, Nicola; Salathé, Marcel; Zhao, Dawei

    2016-12-01

    Historically, infectious diseases caused considerable damage to human societies, and they continue to do so today. To help reduce their impact, mathematical models of disease transmission have been studied to help understand disease dynamics and inform prevention strategies. Vaccination, one of the most important preventive measures of modern times, is of great interest both theoretically and empirically. And in contrast to traditional approaches, recent research increasingly explores the pivotal implications of individual behavior and heterogeneous contact patterns in populations. Our report reviews the developmental arc of theoretical epidemiology with emphasis on vaccination, as it led from classical models assuming homogeneously mixing (mean-field) populations and ignoring human behavior, to recent models that account for behavioral feedback and/or population spatial/social structure. Many of the methods used originated in statistical physics, such as lattice and network models, and their associated analytical frameworks. Similarly, the feedback loop between vaccinating behavior and disease propagation forms a coupled nonlinear system with analogs in physics. We also review the new paradigm of digital epidemiology, wherein sources of digital data such as online social media are mined for high-resolution information on epidemiologically relevant individual behavior. Armed with the tools and concepts of statistical physics, and further assisted by new sources of digital data, models that capture nonlinear interactions between behavior and disease dynamics offer a novel way of modeling real-world phenomena, and can help improve health outcomes. We conclude the review by discussing open problems in the field and promising directions for future research.
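
    For readers who want the classical mean-field baseline that the review starts from, here is a minimal SIR model with a constant vaccination rate. It is a generic textbook sketch, not any specific model from the review; the parameter values and function name are arbitrary.

```python
import numpy as np

def sir_with_vaccination(beta=0.3, gamma=0.1, v=0.002, days=300, dt=0.1,
                         s0=0.99, i0=0.01):
    """Mean-field SIR model with a constant vaccination rate v that moves
    susceptibles directly to the removed class (simple Euler stepping)."""
    s, i, r = s0, i0, 1.0 - s0 - i0
    out = []
    for step in range(int(days / dt)):
        ds = -beta * s * i - v * s
        di = beta * s * i - gamma * i
        dr = gamma * i + v * s
        s, i, r = s + dt * ds, i + dt * di, r + dt * dr
        out.append((step * dt, s, i, r))
    return np.array(out)

traj = sir_with_vaccination()
print("peak infected fraction:", traj[:, 2].max())
```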

  7. Statistical Literacy: Developing a Youth and Adult Education Statistical Project

    ERIC Educational Resources Information Center

    Conti, Keli Cristina; Lucchesi de Carvalho, Dione

    2014-01-01

    This article focuses on the notion of literacy--general and statistical--in the analysis of data from a fieldwork research project carried out as part of a master's degree that investigated the teaching and learning of statistics in adult education mathematics classes. We describe the statistical context of the project that involved the…

  8. Key Statistics for Thyroid Cancer

    MedlinePlus

    How common is thyroid cancer? ... remains very low compared with most other cancers. Statistics on survival rates for thyroid cancer are discussed ...

  9. HPV-Associated Cancers Statistics

    MedlinePlus

  10. Muscular Dystrophy: Data and Statistics

    MedlinePlus

    MD STARnet Data and Statistics. The following data and ... research. For more information on MD STARnet, see Research and Tracking.

  11. International petroleum statistics report

    SciTech Connect

    1996-05-01

    The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1984 through 1994.

  12. Topics in statistical mechanics

    SciTech Connect

    Elser, V.

    1984-05-01

    This thesis deals with four independent topics in statistical mechanics: (1) the dimer problem is solved exactly for a hexagonal lattice with general boundary using a known generating function from the theory of partitions. It is shown that the leading term in the entropy depends on the shape of the boundary; (2) continuum models of percolation and self-avoiding walks are introduced with the property that their series expansions are sums over linear graphs with intrinsic combinatorial weights and explicit dimension dependence; (3) a constrained SOS model is used to describe the edge of a simple cubic crystal. Low and high temperature results are derived as well as the detailed behavior near the crystal facet; (4) the microscopic model of the lambda-transition involving atomic permutation cycles is reexamined. In particular, a new derivation of the two-component field theory model of the critical behavior is presented. Results for a lattice model originally proposed by Kikuchi are extended with a high temperature series expansion and Monte Carlo simulation. 30 references.

  13. Statistics of indistinguishable particles.

    PubMed

    Wittig, Curt

    2009-07-02

    The wave function of a system containing identical particles takes into account the relationship between a particle's intrinsic spin and its statistical property. Specifically, the exchange of two identical particles having odd-half-integer spin results in the wave function changing sign, whereas the exchange of two identical particles having integer spin is accompanied by no such sign change. This is embodied in a term (-1)^(2s), which has the value +1 for integer s (bosons), and -1 for odd-half-integer s (fermions), where s is the particle spin. All of this is well-known. In the nonrelativistic limit, a detailed consideration of the exchange of two identical particles shows that exchange is accompanied by a 2π reorientation that yields the (-1)^(2s) term. The same bookkeeping is applicable to the relativistic case described by the proper orthochronous Lorentz group, because any proper orthochronous Lorentz transformation can be expressed as the product of spatial rotations and a boost along the direction of motion.
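
    The exchange rule summarized above can be written compactly; the display below is only a typeset restatement of the standard relation, added for readability.

```latex
% Requires amsmath.  Swapping two identical particles of spin s multiplies
% the wave function by the factor (-1)^{2s} discussed in the abstract.
\[
  \psi(x_2, x_1) = (-1)^{2s}\,\psi(x_1, x_2),
  \qquad
  (-1)^{2s} =
  \begin{cases}
    +1, & s = 0, 1, 2, \dots \ \text{(bosons)} \\
    -1, & s = \tfrac12, \tfrac32, \dots \ \text{(fermions)}
  \end{cases}
\]
```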

  14. International petroleum statistics report

    SciTech Connect

    1996-10-01

    The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995.

  15. International petroleum statistics report

    SciTech Connect

    1995-11-01

    The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994.

  16. International petroleum statistics report

    SciTech Connect

    1995-07-27

    The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994.

  17. Statistical Mechanics of Zooplankton.

    PubMed

    Hinow, Peter; Nihongi, Ai; Strickler, J Rudi

    2015-01-01

    Statistical mechanics provides the link between microscopic properties of many-particle systems and macroscopic properties such as pressure and temperature. Observations of similar "microscopic" quantities exist for the motion of zooplankton, as well as many species of other social animals. Herein, we propose to take average squared velocities as the definition of the "ecological temperature" of a population under different conditions on nutrients, light, oxygen and others. We test the usefulness of this definition on observations of the crustacean zooplankton Daphnia pulicaria. In one set of experiments, D. pulicaria is infested with the pathogen Vibrio cholerae, the causative agent of cholera. We find that infested D. pulicaria under light exposure have a significantly greater ecological temperature, which puts them at a greater risk of detection by visual predators. In a second set of experiments, we observe D. pulicaria in cold and warm water, and in darkness and under light exposure. Overall, our ecological temperature is a good discriminator of the crustacean's swimming behavior.
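
    The proposed "ecological temperature" is simply a mean squared velocity, which makes it easy to compute from tracking data. The sketch below is our illustration on a synthetic track, not the authors' analysis pipeline; the sampling rate, units, and random-walk data are hypothetical.

```python
import numpy as np

def ecological_temperature(positions, dt):
    """Mean squared speed of a tracked animal, the paper's proposed
    'ecological temperature' (up to a constant factor).

    positions: (T, 2) array of x, y coordinates sampled every dt seconds.
    """
    positions = np.asarray(positions, float)
    v = np.diff(positions, axis=0) / dt        # finite-difference velocities
    return np.sum(v**2, axis=1).mean()

# Hypothetical track: 1 Hz sampling of a Daphnia-like random walk.
rng = np.random.default_rng(1)
track = np.cumsum(rng.normal(0, 0.5, size=(600, 2)), axis=0)
print("ecological temperature:", ecological_temperature(track, dt=1.0))
```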

  18. Statistical Mechanics of Zooplankton

    PubMed Central

    Hinow, Peter; Nihongi, Ai; Strickler, J. Rudi

    2015-01-01

    Statistical mechanics provides the link between microscopic properties of many-particle systems and macroscopic properties such as pressure and temperature. Observations of similar “microscopic” quantities exist for the motion of zooplankton, as well as many species of other social animals. Herein, we propose to take average squared velocities as the definition of the “ecological temperature” of a population under different conditions on nutrients, light, oxygen and others. We test the usefulness of this definition on observations of the crustacean zooplankton Daphnia pulicaria. In one set of experiments, D. pulicaria is infested with the pathogen Vibrio cholerae, the causative agent of cholera. We find that infested D. pulicaria under light exposure have a significantly greater ecological temperature, which puts them at a greater risk of detection by visual predators. In a second set of experiments, we observe D. pulicaria in cold and warm water, and in darkness and under light exposure. Overall, our ecological temperature is a good discriminator of the crustacean’s swimming behavior. PMID:26270537

  19. International petroleum statistics report

    SciTech Connect

    1997-07-01

    The International Petroleum Statistics Report is a monthly publication that provides current international data. The report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent 12 months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1996; OECD stocks from 1973 through 1996; and OECD trade from 1986 through 1996.

  20. Information in statistical physics

    NASA Astrophysics Data System (ADS)

    Balian, Roger

    We review with a tutorial scope the information theory foundations of quantum statistical physics. Only a small proportion of the variables that characterize a system at the microscopic scale can be controlled, for both practical and theoretical reasons, and a probabilistic description involving the observers is required. The criterion of maximum von Neumann entropy is then used for making reasonable inferences. It means that no spurious information is introduced besides the known data. Its outcomes can be given a direct justification based on the principle of indifference of Laplace. We introduce the concept of relevant entropy associated with some set of relevant variables; it characterizes the information that is missing at the microscopic level when only these variables are known. For equilibrium problems, the relevant variables are the conserved ones, and the Second Law is recovered as a second step of the inference process. For non-equilibrium problems, the increase of the relevant entropy expresses an irretrievable loss of information from the relevant variables towards the irrelevant ones. Two examples illustrate the flexibility of the choice of relevant variables and the multiplicity of the associated entropies: the thermodynamic entropy (satisfying the Clausius-Duhem inequality) and the Boltzmann entropy (satisfying the H-theorem). The identification of entropy with missing information is also supported by the paradox of Maxwell's demon. Spin-echo experiments show that irreversibility itself is not an absolute concept: use of hidden information may overcome the arrow of time.
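
    A small numerical sketch of the maximum-entropy inference step described above: given discrete energy levels and a known mean energy, entropy maximization yields a Gibbs distribution whose Lagrange multiplier (inverse temperature) is fixed by the constraint. This is a generic illustration assuming SciPy is available; the function name and energy levels are made up, and the mean energy must lie strictly between the smallest and largest levels.

```python
import numpy as np
from scipy.optimize import brentq

def max_entropy_distribution(energies, mean_energy):
    """Maximum-entropy distribution over discrete levels with a fixed mean
    energy: p_i proportional to exp(-beta * E_i), with beta chosen so that
    the constraint <E> = mean_energy is satisfied."""
    E = np.asarray(energies, float)

    def gibbs(beta):
        logw = -beta * E
        w = np.exp(logw - logw.max())        # numerically stabilized weights
        return w / w.sum()

    # Root-find the Lagrange multiplier beta within a generous bracket.
    beta = brentq(lambda b: gibbs(b) @ E - mean_energy, -50.0, 50.0)
    return beta, gibbs(beta)

beta, p = max_entropy_distribution([0.0, 1.0, 2.0, 3.0], mean_energy=1.0)
print("beta =", beta, "p =", p)
```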

  1. Statistical Mechanics of Money

    NASA Astrophysics Data System (ADS)

    Dragulescu, Adrian; Yakovenko, Victor

    2000-03-01

    We study a network of agents exchanging money between themselves. We find that the stationary probability distribution of money M is the Gibbs distribution exp(-M/T), where T is an effective "temperature" equal to the average amount of money per agent. This is in agreement with the general laws of statistical mechanics, because money is conserved during each transaction and the number of agents is held constant. We have verified the emergence of the Gibbs distribution in computer simulations of various trading rules and models. When the time-reversal symmetry of the trading rules is explicitly broken, deviations from the Gibbs distribution may occur, as follows from the Boltzmann-equation approach to the problem. Money distribution characterizes the purchasing power of a system. A seller would maximize his/her income by setting the price of a product equal to the temperature T of the system. Buying products from a system at temperature T_1 and selling them to a system at temperature T_2 would generate a profit T_2 − T_1 > 0, as in a thermal machine.
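
    The conserved-money exchange process described above is straightforward to simulate. The sketch below uses one simple conservative trading rule (a random pair pools its money and re-splits it at a uniformly random fraction), which is only one of several rules consistent with the abstract's description; the agent count, step count, and the final check are illustrative.

```python
import numpy as np

def simulate_money_exchange(n_agents=1000, steps=200_000, m0=100.0, seed=0):
    """Random pairwise money exchanges that conserve total money.  The
    stationary distribution approaches exp(-m/T) with T = average money."""
    rng = np.random.default_rng(seed)
    money = np.full(n_agents, m0)
    for _ in range(steps):
        i, j = rng.integers(n_agents, size=2)
        if i == j:
            continue
        total = money[i] + money[j]
        share = rng.random()                       # random re-split of the pool
        money[i], money[j] = share * total, (1.0 - share) * total
    return money

money = simulate_money_exchange()
T = money.mean()                                   # effective "temperature"
# For an exponential distribution, the fraction below the mean is ~1 - 1/e ~ 0.63.
print("T =", T, " fraction below T:", np.mean(money < T))
```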

  2. Statistical mechanics of nucleosomes

    NASA Astrophysics Data System (ADS)

    Chereji, Razvan V.

    Eukaryotic cells contain long DNA molecules (about two meters for a human cell) which are tightly packed inside the micrometric nuclei. Nucleosomes are the basic packaging unit of the DNA which allows this millionfold compactification. A longstanding puzzle is to understand the principles which allow cells to both organize their genomes into chromatin fibers in the crowded space of their nuclei, and also to keep the DNA accessible to many factors and enzymes. With the nucleosomes covering about three quarters of the DNA, their positions are essential because these influence which genes can be regulated by the transcription factors and which cannot. We study physical models which predict the genome-wide organization of the nucleosomes and also the relevant energies which dictate this organization. In the last five years, the study of chromatin has seen many important advances. In particular, in the field of nucleosome positioning, new techniques for identifying nucleosomes and the competing DNA-binding factors appeared, such as chemical mapping with hydroxyl radicals and ChIP-exo; the resolution of nucleosome maps increased with paired-end sequencing; and the price of sequencing an entire genome decreased. We present a rigorous statistical mechanics model which is able to explain the recent experimental results by taking into account nucleosome unwrapping, competition between different DNA-binding proteins, and both the interaction between histones and DNA, and between neighboring histones. We show a series of predictions of our new model, all in agreement with the experimental observations.

  3. Statistics: It's in the Numbers!

    ERIC Educational Resources Information Center

    Deal, Mary M.; Deal, Walter F., III

    2007-01-01

    Mathematics and statistics play important roles in people's lives today. A day hardly passes that they are not bombarded with many different kinds of statistics. As consumers they see statistical information as they surf the web, watch television, listen to their satellite radios, or even read the nutrition facts panel on a cereal box in the…

  4. Digest of Education Statistics, 1980.

    ERIC Educational Resources Information Center

    Grant, W. Vance; Eiden, Leo J.

    The primary purpose of this publication is to provide an abstract of statistical information covering the broad field of American education from prekindergarten through graduate school. Statistical information is presented in 14 figures and 200 tables with brief trend analyses. In addition to updating many of the statistics that have appeared in…

  5. Statistical log analysis made practical

    SciTech Connect

    Mitchell, W.K.; Nelson, R.J. )

    1991-06-01

    This paper discusses the advantages of a statistical approach to log analysis. Statistical techniques use inverse methods to calculate formation parameters. The use of statistical techniques has been limited, however, by the complexity of the mathematics and lengthy computer time required to minimize traditionally used nonlinear equations.

  6. Invention Activities Support Statistical Reasoning

    ERIC Educational Resources Information Center

    Smith, Carmen Petrick; Kenlan, Kris

    2016-01-01

    Students' experiences with statistics and data analysis in middle school are often limited to little more than making and interpreting graphs. Although students may develop fluency in statistical procedures and vocabulary, they frequently lack the skills necessary to apply statistical reasoning in situations other than clear-cut textbook examples.…

  7. Teaching Statistics Online Using "Excel"

    ERIC Educational Resources Information Center

    Jerome, Lawrence

    2011-01-01

    As anyone who has taught or taken a statistics course knows, statistical calculations can be tedious and error-prone, with the details of a calculation sometimes distracting students from understanding the larger concepts. Traditional statistics courses typically use scientific calculators, which can relieve some of the tedium and errors but…

  8. Breast cancer statistics, 2011.

    PubMed

    DeSantis, Carol; Siegel, Rebecca; Bandi, Priti; Jemal, Ahmedin

    2011-01-01

    In this article, the American Cancer Society provides an overview of female breast cancer statistics in the United States, including trends in incidence, mortality, survival, and screening. Approximately 230,480 new cases of invasive breast cancer and 39,520 breast cancer deaths are expected to occur among US women in 2011. Breast cancer incidence rates were stable among all racial/ethnic groups from 2004 to 2008. Breast cancer death rates have been declining since the early 1990s for all women except American Indians/Alaska Natives, among whom rates have remained stable. Disparities in breast cancer death rates are evident by state, socioeconomic status, and race/ethnicity. While significant declines in mortality rates were observed for 36 states and the District of Columbia over the past 10 years, rates for 14 states remained level. Analyses by county-level poverty rates showed that the decrease in mortality rates began later and was slower among women residing in poor areas. As a result, the highest breast cancer death rates shifted from the affluent areas to the poor areas in the early 1990s. Screening rates continue to be lower in poor women compared with non-poor women, despite much progress in increasing mammography utilization. In 2008, 51.4% of poor women had undergone a screening mammogram in the past 2 years compared with 72.8% of non-poor women. Encouraging patients aged 40 years and older to have annual mammography and a clinical breast examination is the single most important step that clinicians can take to reduce suffering and death from breast cancer. Clinicians should also ensure that patients at high risk of breast cancer are identified and offered appropriate screening and follow-up. Continued progress in the control of breast cancer will require sustained and increased efforts to provide high-quality screening, diagnosis, and treatment to all segments of the population.

  9. Ideal statistically quasi Cauchy sequences

    NASA Astrophysics Data System (ADS)

    Savas, Ekrem; Cakalli, Huseyin

    2016-08-01

    An ideal I is a family of subsets of N, the set of positive integers, which is closed under taking finite unions and subsets of its elements. A sequence (x_k) of real numbers is said to be S(I)-statistically convergent to a real number L if, for each ε > 0 and for each δ > 0, the set {n ∈ N : (1/n)|{k ≤ n : |x_k − L| ≥ ε}| ≥ δ} belongs to I. We introduce S(I)-statistically ward compactness of a subset of R, the set of real numbers, and S(I)-statistically ward continuity of a real function in the senses that a subset E of R is S(I)-statistically ward compact if any sequence of points in E has an S(I)-statistically quasi-Cauchy subsequence, and a real function is S(I)-statistically ward continuous if it preserves S(I)-statistically quasi-Cauchy sequences, where a sequence (x_k) is said to be S(I)-statistically quasi-Cauchy when (Δx_k) is S(I)-statistically convergent to 0. We obtain results related to S(I)-statistically ward continuity, S(I)-statistically ward compactness, N_θ-ward continuity, and slowly oscillating continuity.
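
    For readability, the central definition above can also be restated in display form. This is only a typeset restatement; reading Δx_k as the forward difference x_{k+1} − x_k is an assumption, since the abstract does not define it.

```latex
% A sequence (x_k) is S(I)-statistically convergent to L when, for all
% eps > 0 and delta > 0,
\[
  \Bigl\{\, n \in \mathbb{N} :
    \tfrac{1}{n}\,\bigl|\{\, k \le n : |x_k - L| \ge \varepsilon \,\}\bigr|
    \ge \delta \,\Bigr\} \in I .
\]
% It is S(I)-statistically quasi-Cauchy when (\Delta x_k), read here as
% (x_{k+1} - x_k), is S(I)-statistically convergent to 0.
```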

  10. Basic statistics in cell biology.

    PubMed

    Vaux, David L

    2014-01-01

    The physicist Ernest Rutherford said, "If your experiment needs statistics, you ought to have done a better experiment." Although this aphorism remains true for much of today's research in cell biology, a basic understanding of statistics can be useful to cell biologists to help in monitoring the conduct of their experiments, in interpreting the results, in presenting them in publications, and when critically evaluating research by others. However, training in statistics is often focused on the sophisticated needs of clinical researchers, psychologists, and epidemiologists, whose conclusions depend wholly on statistics, rather than the practical needs of cell biologists, whose experiments often provide evidence that is not statistical in nature. This review describes some of the basic statistical principles that may be of use to experimental biologists, but it does not cover the sophisticated statistics needed for papers that contain evidence of no other kind.

  11. Cancer Statistics, 2017.

    PubMed

    Siegel, Rebecca L; Miller, Kimberly D; Jemal, Ahmedin

    2017-01-01

    Each year, the American Cancer Society estimates the numbers of new cancer cases and deaths that will occur in the United States in the current year and compiles the most recent data on cancer incidence, mortality, and survival. Incidence data were collected by the Surveillance, Epidemiology, and End Results Program; the National Program of Cancer Registries; and the North American Association of Central Cancer Registries. Mortality data were collected by the National Center for Health Statistics. In 2017, 1,688,780 new cancer cases and 600,920 cancer deaths are projected to occur in the United States. For all sites combined, the cancer incidence rate is 20% higher in men than in women, while the cancer death rate is 40% higher. However, sex disparities vary by cancer type. For example, thyroid cancer incidence rates are 3-fold higher in women than in men (21 vs 7 per 100,000 population), despite equivalent death rates (0.5 per 100,000 population), largely reflecting sex differences in the "epidemic of diagnosis." Over the past decade of available data, the overall cancer incidence rate (2004-2013) was stable in women and declined by approximately 2% annually in men, while the cancer death rate (2005-2014) declined by about 1.5% annually in both men and women. From 1991 to 2014, the overall cancer death rate dropped 25%, translating to approximately 2,143,200 fewer cancer deaths than would have been expected if death rates had remained at their peak. Although the cancer death rate was 15% higher in blacks than in whites in 2014, increasing access to care as a result of the Patient Protection and Affordable Care Act may expedite the narrowing racial gap; from 2010 to 2015, the proportion of blacks who were uninsured halved, from 21% to 11%, as it did for Hispanics (31% to 16%). Gains in coverage for traditionally underserved Americans will facilitate the broader application of existing cancer control knowledge across every segment of the population. CA Cancer J Clin

  12. Chemists, Access, Statistics

    NASA Astrophysics Data System (ADS)

    Holmes, Jon L.

    2000-06-01

    IP-number access. Current subscriptions can be upgraded to IP-number access at little additional cost. We are pleased to be able to offer to institutions and libraries this convenient mode of access to subscriber only resources at JCE Online. JCE Online Usage Statistics We are continually amazed by the activity at JCE Online. So far, the year 2000 has shown a marked increase. Given the phenomenal overall growth of the Internet, perhaps our surprise is not warranted. However, during the months of January and February 2000, over 38,000 visitors requested over 275,000 pages. This is a monthly increase of over 33% from the October-December 1999 levels. It is good to know that people are visiting, but we would very much like to know what you would most like to see at JCE Online. Please send your suggestions to JCEOnline@chem.wisc.edu. For those who are interested, JCE Online year-to-date statistics are available. Biographical Snapshots of Famous Chemists: Mission Statement Feature Editor: Barbara Burke Chemistry Department, California State Polytechnic University-Pomona, Pomona, CA 91768 phone: 909/869-3664 fax: 909/869-4616 email: baburke@csupomona.edu The primary goal of this JCE Internet column is to provide information about chemists who have made important contributions to chemistry. For each chemist, there is a short biographical "snapshot" that provides basic information about the person's chemical work, gender, ethnicity, and cultural background. Each snapshot includes links to related websites and to a biobibliographic database. The database provides references for the individual and can be searched through key words listed at the end of each snapshot. All students, not just science majors, need to understand science as it really is: an exciting, challenging, human, and creative way of learning about our natural world. Investigating the life experiences of chemists can provide a means for students to gain a more realistic view of chemistry. In addition students

  13. "Just Another Statistic"

    PubMed

    Machtay; Glatstein

    1998-01-01

    On returning from a medical meeting, we learned that sadly a patient, "Mr. B.," had passed away. His death was a completely unexpected surprise. He had been doing well nine months after a course of intensive radiotherapy for a locally advanced head and neck cancer; in his most recent follow-up notes, he was described as a "complete remission." Nonetheless, he apparently died peacefully in his sleep from a cardiac arrest one night and was found the next day by a concerned neighbor. In our absence, after Mr. B. expired, his death certificate was filled out by a physician who didn't know him in detail, but did know why he recently was treated in our department. The cause of death was listed as head and neck cancer. It wasn't long after his death before we began to receive those notorious "requests for additional information," letters from the statistical office of a well-known cooperative group. Mr. B., as it turns out, was on a clinical trial, and it was "vital" to know further details of the circumstances of his passing. Perhaps this very large cancer had been controlled and Mr. B. succumbed to old age (helped along by the tobacco industry). On the other hand, maybe the residual "fibrosis" in his neck was actually packed with active tumor and his left carotid artery was finally 100% pinched off, or maybe he suffered a massive pulmonary embolism from cancer-related hypercoagulability. The forms and requests were completed with a succinct "cause of death uncertain," adding, "please have the Study Chairs call to discuss this difficult case." Often clinical reports of outcomes utilize and emphasize the endpoint "disease specific survival" (DSS). Like overall survival (OS), the DSS can be calculated by actuarial methods, with patients who have incomplete follow-up "censored" at the time of last follow-up pending further information. In the DSS, however, deaths unrelated to the index cancer of interest are censored at the time of death; thus, a death from intercurrent

  14. Statistics without Tears: Complex Statistics with Simple Arithmetic

    ERIC Educational Resources Information Center

    Smith, Brian

    2011-01-01

    One of the often overlooked aspects of modern statistics is the analysis of time series data. Modern introductory statistics courses tend to rush to probabilistic applications involving risk and confidence. Rarely does the first level course linger on such useful and fascinating topics as time series decomposition, with its practical applications…

  15. SOCR: Statistics Online Computational Resource

    PubMed Central

    Dinov, Ivo D.

    2011-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build student’s intuition and enhance their learning. PMID:21451741

  16. GIS application on spatial landslide analysis using statistical based models

    NASA Astrophysics Data System (ADS)

    Pradhan, Biswajeet; Lee, Saro; Buchroithner, Manfred F.

    2009-09-01

    This paper presents the assessment results of three spatially based probabilistic models using Geoinformation Techniques (GIT) for landslide susceptibility analysis at Penang Island in Malaysia. Landslide locations within the study area were identified by interpreting aerial photographs and satellite images, supported by field surveys. Maps of the topography, soil type, lineaments and land cover were constructed from the spatial data sets. Ten landslide-related factors were extracted from the spatial database, and the frequency ratio, fuzzy logic, and bivariate logistic regression coefficients of each factor were computed. Finally, landslide susceptibility maps were drawn for the study area using the frequency ratio, fuzzy logic and bivariate logistic regression models. For verification, the results of the analyses were compared with actual landslide locations in the study area. The verification results show that the bivariate logistic regression model provides slightly higher prediction accuracy than the frequency ratio and fuzzy logic models.
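
    Of the three models compared, the frequency ratio is the simplest to state: for each class of a conditioning factor, it is the share of landslide cells falling in that class divided by the share of all cells in that class, with values above 1 indicating a positive association with landsliding. The sketch below is a generic implementation of that ratio on a hypothetical raster, not the authors' GIS workflow; the synthetic slope classes and inventory are illustrative.

```python
import numpy as np

def frequency_ratio(factor_classes, landslide_mask):
    """Frequency ratio per factor class: share of landslide cells in the
    class divided by the share of all cells in the class."""
    factor_classes = np.asarray(factor_classes)
    landslide_mask = np.asarray(landslide_mask, bool)
    n_total = factor_classes.size
    n_slides = landslide_mask.sum()
    fr = {}
    for c in np.unique(factor_classes):
        in_class = factor_classes == c
        pct_slides = (landslide_mask & in_class).sum() / n_slides
        pct_cells = in_class.sum() / n_total
        fr[c] = pct_slides / pct_cells
    return fr

# Hypothetical raster: slope classes 0-3 and a landslide inventory mask,
# constructed so that steeper classes contain more landslides.
rng = np.random.default_rng(2)
slope_class = rng.integers(0, 4, size=10_000)
slides = rng.random(10_000) < 0.02 * (slope_class + 1)
print(frequency_ratio(slope_class, slides))
```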

  17. Education Statistics Quarterly, Fall 2000.

    ERIC Educational Resources Information Center

    Dillow, Sally, Ed.

    2000-01-01

    The "Education Statistics Quarterly" gives a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications and data products released during a 3-month period. Each message also contains a…

  18. Zemstvo Statistics on Public Education.

    ERIC Educational Resources Information Center

    Abramov, V. F.

    1997-01-01

    Surveys the general organizational principles and forms of keeping the zemstvo (regional) statistics on Russian public education. Conveys that they were subdivided into three types: (1) the current statistics that continuously monitored schools; (2) basic surveys that provided a comprehensive characterization of a given territory's public…

  19. Representational Versatility in Learning Statistics

    ERIC Educational Resources Information Center

    Graham, Alan T.; Thomas, Michael O. J.

    2005-01-01

    Statistical data can be represented in a number of qualitatively different ways, the choice depending on the following three conditions: the concepts to be investigated; the nature of the data; and the purpose for which they were collected. This paper begins by setting out frameworks that describe the nature of statistical thinking in schools, and…

  20. Modern Statistical Methods for Astronomy

    NASA Astrophysics Data System (ADS)

    Feigelson, Eric D.; Babu, G. Jogesh

    2012-07-01

    1. Introduction; 2. Probability; 3. Statistical inference; 4. Probability distribution functions; 5. Nonparametric statistics; 6. Density estimation or data smoothing; 7. Regression; 8. Multivariate analysis; 9. Clustering, classification and data mining; 10. Nondetections: censored and truncated data; 11. Time series analysis; 12. Spatial point processes; Appendices; Index.

  1. Digest of Education Statistics, 1998.

    ERIC Educational Resources Information Center

    Snyder, Thomas D.; Hoffman, Charlene M.; Geddes, Claire M.

    This 1998 edition of the "Digest of Education Statistics" is the 34th in a series of publications initiated in 1962. Its primary purpose is to provide a compilation of statistical information covering the broad field of American education from kindergarten through graduate school. The digest includes data from many government and private…

  2. Statistics Anxiety among Postgraduate Students

    ERIC Educational Resources Information Center

    Koh, Denise; Zawi, Mohd Khairi

    2014-01-01

    Most postgraduate programmes that have research components require students to take at least one course in research statistics. Not all postgraduate programmes are science based; a significant number of postgraduate students from the social sciences will be taking statistics courses as they try to complete their…

  3. Explorations in Statistics: Confidence Intervals

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2009-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This third installment of "Explorations in Statistics" investigates confidence intervals. A confidence interval is a range that we expect, with some level of confidence, to include the true value of a population parameter…

  4. Book Trade Research and Statistics.

    ERIC Educational Resources Information Center

    Bosch, Stephen; Ink, Gary; Lofquist, William S.

    1998-01-01

    Provides data on prices of U.S. and foreign materials; book title output and average prices, 1996 final and 1997 preliminary figures; book sales statistics, 1997--AAP preliminary estimates; U.S. trade in books, 1997; international book title output, 1990-95; book review media statistics; and number of book outlets in the U.S. and Canada. (PEN)

  5. Book Trade Research and Statistics.

    ERIC Educational Resources Information Center

    Sullivan, Sharon G.; Ink, Gary; Grabois, Andrew; Barr, Catherine

    2001-01-01

    Includes six articles that discuss research and statistics relating to the book trade. Topics include prices of U.S. and foreign materials; book title output and average prices; book sales statistics; book exports and imports; book outlets in the U.S. and Canada; and books and other media reviewed. (LRW)

  6. Canadian Statistics in the Classroom.

    ERIC Educational Resources Information Center

    School Libraries in Canada, 2002

    2002-01-01

    Includes 22 articles that address the use of Canadian statistics in the classroom. Highlights include the Statistics Canada Web site; other Web resources; original sources; critical thinking; debating with talented and gifted students; teaching marketing; environmental resources; data management; social issues and values; math instruction; reading…

  7. Statistical Factors in Complexation Reactions.

    ERIC Educational Resources Information Center

    Chung, Chung-Sun

    1985-01-01

    Four cases which illustrate statistical factors in complexation reactions (where two of the reactants are monodentate ligands) are presented. Included are tables showing statistical factors for the reactions of: (1) square-planar complexes; (2) tetrahedral complexes; and (3) octahedral complexes. (JN)

  8. SOCR: Statistics Online Computational Resource

    ERIC Educational Resources Information Center

    Dinov, Ivo D.

    2006-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an…

  9. Statistical Methods in Psychology Journals.

    ERIC Educational Resources Information Center

    Wilkinson, Leland

    1999-01-01

    Proposes guidelines for revising the American Psychological Association (APA) publication manual or other APA materials to clarify the application of statistics in research reports. The guidelines are intended to induce authors and editors to recognize the thoughtless application of statistical methods. Contains 54 references. (SLD)

  10. Book Trade Research and Statistics.

    ERIC Educational Resources Information Center

    Alexander, Adrian W.; And Others

    1994-01-01

    The six articles in this section examine prices of U.S. and foreign materials; book title output and average prices; book sales statistics; U.S. book exports and imports; number of book outlets in the United States and Canada; and book review media statistics. (LRW)

  11. Nursing student attitudes toward statistics.

    PubMed

    Mathew, Lizy; Aktan, Nadine M

    2014-04-01

    Nursing is guided by evidence-based practice. To understand and apply research to practice, nurses must be knowledgeable in statistics; therefore, it is crucial to promote a positive attitude toward statistics among nursing students. The purpose of this quantitative cross-sectional study was to assess differences in attitudes toward statistics among undergraduate nursing, graduate nursing, and undergraduate non-nursing students. The Survey of Attitudes Toward Statistics Scale-36 (SATS-36) was used to measure student attitudes, with higher scores denoting more positive attitudes. The convenience sample was composed of 175 students from a public university in the northeastern United States. Statistically significant relationships were found among some of the key demographic variables. Graduate nursing students had a significantly lower score on the SATS-36, compared with baccalaureate nursing and non-nursing students. Therefore, an innovative nursing curriculum that incorporates knowledge of student attitudes and key demographic variables may result in favorable outcomes.

  12. [Big data in official statistics].

    PubMed

    Zwick, Markus

    2015-08-01

    The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and the sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have concluded a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany.

  13. (Errors in statistical tests)3.

    PubMed

    Phillips, Carl V; MacLehose, Richard F; Kaufman, Jay S

    2008-07-14

    In 2004, Garcia-Berthou and Alcaraz published "Incongruence between test statistics and P values in medical papers," a critique of statistical errors that received a tremendous amount of attention. One of their observations was that the final reported digit of p-values in articles published in the journal Nature departed substantially from the uniform distribution that they suggested should be expected. In 2006, Jeng critiqued that critique, observing that the statistical analysis of those terminal digits had been based on comparing the actual distribution to a uniform continuous distribution, when digits obviously are discretely distributed. Jeng corrected the calculation and reported statistics that did not so clearly support the claim of a digit preference. However delightful it may be to read a critique of statistical errors in a critique of statistical errors, we nevertheless found several aspects of the whole exchange to be quite troubling, prompting our own meta-critique of the analysis. The previous discussion emphasized statistical significance testing. But there are various reasons to expect departure from the uniform distribution in terminal digits of p-values, so that simply rejecting the null hypothesis is not terribly informative. Much more importantly, Jeng found that the original p-value of 0.043 should have been 0.086, and suggested this represented an important difference because it was on the other side of 0.05. Among the most widely reiterated (though often ignored) tenets of modern quantitative research methods is that we should not treat statistical significance as a bright line test of whether we have observed a phenomenon. Moreover, it sends the wrong message about the role of statistics to suggest that a result should be dismissed because of limited statistical precision when it is so easy to gather more data. In response to these limitations, we gathered more data to improve the statistical precision, and analyzed the actual pattern of the
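
    The goodness-of-fit correction at issue, comparing terminal-digit counts to a discrete uniform distribution rather than a continuous one, can be sketched as below. The digit counts are invented for illustration and are not the data from any of the papers discussed:

```python
# Chi-square goodness-of-fit test of p-value terminal digits against a
# discrete uniform distribution (the correction at issue in the exchange).
# The digit counts below are invented for illustration only.
from scipy import stats

observed = [18, 25, 21, 19, 23, 20, 24, 17, 22, 21]  # counts of final digits 0-9
expected = [sum(observed) / 10] * 10                  # each digit equally likely

chi2, p = stats.chisquare(observed, f_exp=expected)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")
```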

  14. Characterizations of linear sufficient statistics

    NASA Technical Reports Server (NTRS)

    Peters, B. C., Jr.; Reoner, R.; Decell, H. P., Jr.

    1977-01-01

    A surjective bounded linear operator T from a Banach space X to a Banach space Y must be a sufficient statistic for a dominated family of probability measures defined on the Borel sets of X. These results were applied to characterize linear sufficient statistics for families of the exponential type, including as special cases the Wishart and multivariate normal distributions. The latter result was used to establish precisely which procedures for sampling from a normal population have the property that the sample mean is a sufficient statistic.

  15. The Statistical Basis of Chemical Equilibria.

    ERIC Educational Resources Information Center

    Hauptmann, Siegfried; Menger, Eva

    1978-01-01

    Describes a machine which demonstrates the statistical bases of chemical equilibrium, and in doing so conveys insight into the connections among statistical mechanics, quantum mechanics, Maxwell Boltzmann statistics, statistical thermodynamics, and transition state theory. (GA)

  16. Spina Bifida Data and Statistics

    MedlinePlus

    Data and statistics on spina bifida, including data from 12 state-based birth defects tracking programs and figures for non-Hispanic white and non-Hispanic black women. ...

  17. Statistical ecology comes of age.

    PubMed

    Gimenez, Olivier; Buckland, Stephen T; Morgan, Byron J T; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric

    2014-12-01

    The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1-4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data.

  18. Middle atmosphere general circulation statistics

    NASA Technical Reports Server (NTRS)

    Geller, M. A.

    1985-01-01

    With the increased availability of remote sensing data for the middle atmosphere from satellites, more analyses of the middle atmosphere circulation are being published. Some of these are process studies for limited periods, and some are statistical analyses of middle atmosphere general circulation statistics. Results from the latter class of studies will be reviewed. These include analysis of the zonally averaged middle atmosphere structure, temperature, and zonal winds; analysis of planetary wave structures; analysis of heat and momentum fluxes; and analysis of Eliassen-Palm flux vectors and flux divergences. Emphasis is on the annual march of these quantities; Northern and Southern Hemisphere asymmetries; and interannual variability in these statistics. Statistics involving the global ozone distribution and transports of ozone are also discussed.

  19. Summary statistics in auditory perception.

    PubMed

    McDermott, Josh H; Schemitsch, Michael; Simoncelli, Eero P

    2013-04-01

    Sensory signals are transduced at high resolution, but their structure must be stored in a more compact format. Here we provide evidence that the auditory system summarizes the temporal details of sounds using time-averaged statistics. We measured discrimination of 'sound textures' that were characterized by particular statistical properties, as normally result from the superposition of many acoustic features in auditory scenes. When listeners discriminated examples of different textures, performance improved with excerpt duration. In contrast, when listeners discriminated different examples of the same texture, performance declined with duration, a paradoxical result given that the information available for discrimination grows with duration. These results indicate that once these sounds are of moderate length, the brain's representation is limited to time-averaged statistics, which, for different examples of the same texture, converge to the same values with increasing duration. Such statistical representations produce good categorical discrimination, but limit the ability to discern temporal detail.
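
    A rough sketch of what a "time-averaged statistics" representation might look like, using synthetic white noise and a crude amplitude envelope rather than the authors' texture model, is:

```python
# Toy stand-in for a "time-averaged statistics" summary: reduce a long
# synthetic noise signal to a few moments of its amplitude envelope.
# Sample rate, duration, and the choice of moments are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
fs = 8000                              # sample rate (Hz), illustrative
x = rng.normal(size=fs * 2)            # two seconds of white noise

envelope = np.abs(x)                   # crude amplitude envelope
summary = {
    "mean": envelope.mean(),
    "variance": envelope.var(),
    "skewness": ((envelope - envelope.mean()) ** 3).mean() / envelope.std() ** 3,
}
print(summary)
```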

  20. National Center for Health Statistics

    MedlinePlus

    Describes the data, tools, and population surveys of the National Center for Health Statistics, including the National Health and Nutrition Examination Survey, the National Health Interview Survey, the National Survey of Family Growth, and vital records programs such as the National Vital Statistics System and the National Death ...

  1. Heart Disease and Stroke Statistics

    MedlinePlus

    AHA News: Heart failure projected to increase dramatically; heart failure on the rise; cardiovascular diseases remain leading killer. 2017 Statistics At-a-Glance: Heart Disease and ...

  2. Statistical Theory of Breakup Reactions

    NASA Astrophysics Data System (ADS)

    Bertulani, Carlos A.; Descouvemont, Pierre; Hussein, Mahir S.

    2014-04-01

    We propose an alternative to Coupled-Channels calculations with loosely bound exotic nuclei (CDCC), based on the Random Matrix Model of the statistical theory of nuclear reactions. The coupled-channels equations are divided into two sets: the first set is described by the CDCC, and the other set is treated with RMT. The resulting theory is a Statistical CDCC (CDCCs), able in principle to take into account many pseudo-channels.

  3. Hidden Statistics of Schroedinger Equation

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    2011-01-01

    Work was carried out to determine the mathematical origin of randomness in quantum mechanics and to create a hidden statistics of the Schrödinger equation, i.e., to expose the transitional stochastic process as a "bridge" to the quantum world. The governing equations of hidden statistics would preserve such properties of quantum physics as superposition, entanglement, and direct-product decomposability while allowing one to measure its state variables using classical methods.

  4. Statistics

    NASA Astrophysics Data System (ADS)

    Gorzkowski, Waldemar

    The following sections are included: * NATIONAL PHYSICS OLYMPIADS * DISTRIBUTION OF PRIZES IN TWENTY INTERNATIONAL PHYSICS OLYMPIADS * NUMBERS OF PRIZES IN SUBSEQUENT INTERNATIONAL PHYSICS OLYMPIADS * PROBLEMS AND THEIR MARKING

  5. Statistical inference and Aristotle's Rhetoric.

    PubMed

    Macdonald, Ranald R

    2004-11-01

    Formal logic operates in a closed system where all the information relevant to any conclusion is present, whereas this is not the case when one reasons about events and states of the world. Pollard and Richardson drew attention to the fact that the reasoning behind statistical tests does not lead to logically justifiable conclusions. In this paper statistical inferences are defended not by logic but by the standards of everyday reasoning. Aristotle invented formal logic, but argued that people mostly get at the truth with the aid of enthymemes--incomplete syllogisms which include arguing from examples, analogies and signs. It is proposed that statistical tests work in the same way--in that they are based on examples, invoke the analogy of a model and use the size of the effect under test as a sign that the chance hypothesis is unlikely. Of existing theories of statistical inference only a weak version of Fisher's takes this into account. Aristotle anticipated Fisher by producing an argument of the form that there were too many cases in which an outcome went in a particular direction for that direction to be plausibly attributed to chance. We can therefore conclude that Aristotle would have approved of statistical inference and there is a good reason for calling this form of statistical inference classical.

  6. Fractional statistical potential in graphene

    NASA Astrophysics Data System (ADS)

    Ardenghi, J. S.

    2017-03-01

    In this work fractional statistics is applied to an anyon gas in graphene to obtain the special features that the arbitrary phase interchange of the particle coordinates introduces in the thermodynamic properties. The electron gas consists of N anyons in the long-wavelength approximation obeying fractional exclusion statistics, and the partition function is analyzed in terms of a perturbation expansion up to first order in the dimensionless constant λ/L, where L is the length of the graphene sheet and λ = βℏv_F is the thermal wavelength. By considering the correct permutation expansion of the many-anyon wavefunction, taking into account that the phase changes with the number of inversions in each permutation, the statistical fermionic/bosonic potential is obtained and the intermediate statistical behavior is found. It is shown that "extra" fermionic and bosonic particle states appear and that this "statistical particle" distribution depends on N. Entropy and specific heat are obtained up to first order in λ/L, showing that the results differ from those obtained in other approximations to fractional exclusion statistics.

  7. Calculating statistical distributions from operator relations: The statistical distributions of various intermediate statistics

    SciTech Connect

    Dai, Wu-Sheng Xie, Mi

    2013-05-15

    In this paper, we give a general discussion on the calculation of the statistical distribution from a given operator relation of creation, annihilation, and number operators. Our result shows that as long as the relation between the number operator and the creation and annihilation operators can be expressed as a†b = Λ(N) or N = Λ⁻¹(a†b), where N, a†, and b denote the number, creation, and annihilation operators, i.e., N is a function of a quadratic product of the creation and annihilation operators, the corresponding statistical distribution is the Gentile distribution, a statistical distribution in which the maximum occupation number is an arbitrary integer. As examples, we discuss the statistical distributions corresponding to various operator relations. In particular, besides the Bose-Einstein and Fermi-Dirac cases, we discuss the statistical distributions for various schemes of intermediate statistics, especially various q-deformation schemes. Our result shows that the statistical distributions corresponding to various q-deformation schemes are various Gentile distributions with different maximum occupation numbers, which are determined by the deformation parameter q. This result shows that the results given in much of the literature on the q-deformation distribution are inaccurate or incomplete. Highlights: a general discussion on calculating the statistical distribution from relations of creation, annihilation, and number operators; a systematic study of the statistical distributions corresponding to various q-deformation schemes; an argument that many results on q-deformation distributions in the literature are inaccurate or incomplete.
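
    For orientation, the Gentile distribution referred to here has the standard mean occupation number ⟨n⟩ = 1/(e^x − 1) − (M+1)/(e^((M+1)x) − 1), with x = β(ε − μ) and maximum occupation number M. The sketch below (illustrative only, not code from the paper) evaluates it and checks that M = 1 reproduces Fermi-Dirac and that large M approaches Bose-Einstein:

```python
# Mean occupation number for Gentile statistics with maximum occupation M:
#   <n> = 1/(exp(x) - 1) - (M + 1)/(exp((M + 1) * x) - 1),  x = beta * (eps - mu).
# M = 1 reproduces Fermi-Dirac; large M approaches Bose-Einstein.
import math

def gentile_occupation(x: float, M: int) -> float:
    return 1.0 / (math.exp(x) - 1.0) - (M + 1) / (math.exp((M + 1) * x) - 1.0)

x = 0.7  # arbitrary value of beta * (eps - mu)
print("Fermi-Dirac    :", 1.0 / (math.exp(x) + 1.0))
print("Gentile, M = 1 :", gentile_occupation(x, 1))
print("Gentile, M = 50:", gentile_occupation(x, 50))
print("Bose-Einstein  :", 1.0 / (math.exp(x) - 1.0))
```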

  8. A Statistical Argument for the Weak Twin Primes Conjecture

    ERIC Educational Resources Information Center

    Bruckman, P. S.

    2006-01-01

    Certain definitions introduce appropriate concepts, among which are the definitions of the counting functions of the primes and twin primes, along with definitions of the correlation coefficient in a bivariate sample space. It is argued conjecturally that the characteristic functions of the prime "p" and of the quantity "p"+2 are highly…

  9. On More Sensitive Periodogram Statistics

    NASA Astrophysics Data System (ADS)

    Bélanger, G.

    2016-05-01

    Period searches in event data have traditionally used the Rayleigh statistic, R². For X-ray pulsars, the standard has been the Z² statistic, which sums over more than one harmonic. For γ-rays, the H-test, which optimizes the number of harmonics to sum, is often used. These periodograms all suffer from the same problem, namely artifacts caused by correlations in the Fourier components that arise from testing frequencies with a non-integer number of cycles. This article addresses this problem. The modified Rayleigh statistic is discussed, its generalization to any harmonic, Rₖ², is formulated, and from the latter the modified Z² statistic is constructed. Versions of these statistics for binned data and point measurements are derived, and it is shown that the variance in the uncertainties can have an important influence on the periodogram. It is shown how to combine the information about the signal frequency from the different harmonics to estimate its value with maximum accuracy. The methods are applied to an XMM-Newton observation of the Crab pulsar, for which a decomposition of the pulse profile is presented, showing that most of the power is in the second, third, and fifth harmonics. Statistical detection power of the Rₖ² statistic is superior to the FFT and equivalent to the Lomb-Scargle (LS). Response to gaps in the data is assessed, and it is shown that the LS does not protect against the distortions they cause. The main conclusion of this work is that the classical R² and Z² should be replaced by the modified Rₖ² and modified Z² in all applications with event data, and the LS should be replaced by the modified Rₖ² when the uncertainty varies from one point measurement to another.
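
    For reference, the classical Rayleigh statistic for event times t_i at trial frequency ν is R²(ν) = (2/N)[(Σ cos 2πνt_i)² + (Σ sin 2πνt_i)²]. The sketch below scans this classical statistic over a frequency grid for simulated events; it does not implement the modified statistics developed in the article, and all data and grid choices are illustrative:

```python
# Classical Rayleigh periodogram for event times t:
#   R^2(nu) = (2/N) * [ (sum cos 2*pi*nu*t)^2 + (sum sin 2*pi*nu*t)^2 ].
# Simulated events and the frequency grid are illustrative only; this is the
# classical statistic, not the modified versions developed in the article.
import numpy as np

rng = np.random.default_rng(42)
true_freq = 0.13                                   # Hz, arbitrary
background = rng.uniform(0, 1000, 500)             # unpulsed arrival times
pulsed = (np.arange(200) + 0.2 * rng.uniform(size=200)) / true_freq
t = np.concatenate([background, pulsed])

freqs = np.linspace(0.05, 0.25, 2000)
phases = 2 * np.pi * np.outer(freqs, t)
R2 = 2.0 / t.size * (np.cos(phases).sum(axis=1) ** 2 + np.sin(phases).sum(axis=1) ** 2)
print("periodogram peak at nu =", freqs[np.argmax(R2)])
```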

  10. Statistical methods in translational medicine.

    PubMed

    Chow, Shein-Chung; Tse, Siu-Keung; Lin, Min

    2008-12-01

    This study focuses on strategies and statistical considerations for assessment of translation in language (e.g. translation of case report forms in multinational clinical trials), information (e.g. translation of basic discoveries to the clinic) and technology (e.g. translation of Chinese diagnostic techniques to well-established clinical study endpoints) in pharmaceutical/clinical research and development. However, most of our efforts will be directed to statistical considerations for translation in information. Translational medicine has been defined as bench-to-bedside research, where a basic laboratory discovery becomes applicable to the diagnosis, treatment or prevention of a specific disease, and is brought forth by either a physician-scientist who works at the interface between the research laboratory and patient care, or by a team of basic and clinical science investigators. Statistics plays an important role in translational medicine to ensure that the translational process is accurate and reliable with certain statistical assurance. Statistical inference for the applicability of an animal model to a human model is also discussed. Strategies for selection of clinical study endpoints (e.g. absolute changes, relative changes, or responder-defined, based on either absolute or relative change) are reviewed.

  11. Integrable matrix theory: Level statistics.

    PubMed

    Scaramazza, Jasen A; Shastry, B Sriram; Yuzbashyan, Emil A

    2016-09-01

    We study level statistics in ensembles of integrable N×N matrices linear in a real parameter x. The matrix H(x) is considered integrable if it has a prescribed number n > 1 of linearly independent commuting partners H^i(x) (integrals of motion), with [H(x), H^i(x)] = 0 and [H^i(x), H^j(x)] = 0 for all x. In a recent work [Phys. Rev. E 93, 052114 (2016), doi:10.1103/PhysRevE.93.052114], we developed a basis-independent construction of H(x) for any n from which we derived the probability density function, thereby determining how to choose a typical integrable matrix from the ensemble. Here, we find that typical integrable matrices have Poisson statistics in the N → ∞ limit provided n scales at least as log N; otherwise, they exhibit level repulsion. Exceptions to the Poisson case occur at isolated coupling values x = x₀ or when correlations are introduced between typically independent matrix parameters. However, level statistics cross over to Poisson at O(N^(−1/2)) deviations from these exceptions, indicating that non-Poissonian statistics characterize only subsets of measure zero in the parameter space. Furthermore, we present strong numerical evidence that ensembles of integrable matrices are stationary and ergodic with respect to nearest-neighbor level statistics.
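
    The contrast drawn here between Poisson level statistics and level repulsion can be visualized with nearest-neighbor eigenvalue spacings. The sketch below compares a GOE-like random matrix (repulsion) with independent levels (Poisson); it is a generic illustration, not the integrable-matrix ensembles constructed in the paper:

```python
# Nearest-neighbor level spacings: a GOE random matrix shows level repulsion
# (very few small spacings), while independent levels give Poisson statistics.
# Generic illustration; not the integrable-matrix ensembles of the paper.
import numpy as np

rng = np.random.default_rng(0)
N = 1000

A = rng.normal(size=(N, N))
H = (A + A.T) / 2.0                                   # GOE-like symmetric matrix
goe = np.diff(np.sort(np.linalg.eigvalsh(H))[N // 4 : 3 * N // 4])
goe /= goe.mean()                                     # crude unfolding in the bulk

poisson = np.diff(np.sort(rng.uniform(0, 1, N)))      # independent levels
poisson /= poisson.mean()

print("fraction of spacings < 0.1, GOE    :", (goe < 0.1).mean())
print("fraction of spacings < 0.1, Poisson:", (poisson < 0.1).mean())
```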

  12. Equivalent statistics and data interpretation.

    PubMed

    Francis, Gregory

    2016-10-14

    Recent reform efforts in psychological science have led to a plethora of choices for scientists to analyze their data. A scientist making an inference about their data must now decide whether to report a p value, summarize the data with a standardized effect size and its confidence interval, report a Bayes Factor, or use other model comparison methods. To make good choices among these options, it is necessary for researchers to understand the characteristics of the various statistics used by the different analysis frameworks. Toward that end, this paper makes two contributions. First, it shows that for the case of a two-sample t test with known sample sizes, many different summary statistics are mathematically equivalent in the sense that they are based on the very same information in the data set. When the sample sizes are known, the p value provides as much information about a data set as the confidence interval of Cohen's d or a JZS Bayes factor. Second, this equivalence means that different analysis methods differ only in their interpretation of the empirical data. At first glance, it might seem that mathematical equivalence of the statistics suggests that it does not matter much which statistic is reported, but the opposite is true because the appropriateness of a reported statistic is relative to the inference it promotes. Accordingly, scientists should choose an analysis method appropriate for their scientific investigation. A direct comparison of the different inferential frameworks provides some guidance for scientists to make good choices and improve scientific practice.
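
    The equivalence described here is easy to see in code: for a two-sample t test with known sample sizes, the p value, Cohen's d, and an approximate confidence interval for d are all functions of the same t statistic (and a JZS Bayes factor would likewise be a function of t and the sample sizes). The sketch below uses simulated data and a large-sample approximation for the CI of d; it is illustrative, not the paper's analysis:

```python
# From one two-sample data set with known sample sizes, the t statistic
# determines the p value, Cohen's d, and an approximate CI for d.
# Simulated data; the CI uses a large-sample normal approximation.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
x = rng.normal(0.5, 1.0, 40)
y = rng.normal(0.0, 1.0, 40)
n1, n2 = len(x), len(y)

res = stats.ttest_ind(x, y)                        # pooled-variance two-sample t test
t, p = res.statistic, res.pvalue

sp = np.sqrt(((n1 - 1) * x.var(ddof=1) + (n2 - 1) * y.var(ddof=1)) / (n1 + n2 - 2))
d = (x.mean() - y.mean()) / sp                     # Cohen's d, equivalently t*sqrt(1/n1 + 1/n2)

se_d = np.sqrt((n1 + n2) / (n1 * n2) + d ** 2 / (2 * (n1 + n2)))
ci = (d - 1.96 * se_d, d + 1.96 * se_d)

print(f"t = {t:.3f}, p = {p:.4f}, d = {d:.3f}, 95% CI for d = ({ci[0]:.3f}, {ci[1]:.3f})")
```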

  13. Thermodynamic Limit in Statistical Physics

    NASA Astrophysics Data System (ADS)

    Kuzemsky, A. L.

    2014-03-01

    The thermodynamic limit in statistical thermodynamics of many-particle systems is an important but often overlooked issue in the various applied studies of condensed matter physics. To settle this issue, we review tersely the past and present disposition of thermodynamic limiting procedure in the structure of the contemporary statistical mechanics and our current understanding of this problem. We pick out the ingenious approach by Bogoliubov, who developed a general formalism for establishing the limiting distribution functions in the form of formal series in powers of the density. In that study, he outlined the method of justification of the thermodynamic limit when he derived the generalized Boltzmann equations. To enrich and to weave our discussion, we take this opportunity to give a brief survey of the closely related problems, such as the equipartition of energy and the equivalence and nonequivalence of statistical ensembles. The validity of the equipartition of energy permits one to decide what are the boundaries of applicability of statistical mechanics. The major aim of this work is to provide a better qualitative understanding of the physical significance of the thermodynamic limit in modern statistical physics of the infinite and "small" many-particle systems.

  14. Ethical Statistics and Statistical Ethics: Making an Interdisciplinary Module

    ERIC Educational Resources Information Center

    Lesser, Lawrence M.; Nordenhaug, Erik

    2004-01-01

    This article describes an innovative curriculum module the first author created on the two-way exchange between statistics and applied ethics. The module, having no particular mathematical prerequisites beyond high school algebra, is part of an undergraduate interdisciplinary ethics course which begins with a 3-week introduction to basic applied…

  15. Statistics for People Who (Think They) Hate Statistics. Third Edition

    ERIC Educational Resources Information Center

    Salkind, Neil J.

    2007-01-01

    This text teaches an often intimidating and difficult subject in a way that is informative, personable, and clear. The author takes students through various statistical procedures, beginning with correlation and graphical representation of data and ending with inferential techniques and analysis of variance. In addition, the text covers SPSS, and…

  16. Writing to Learn Statistics in an Advanced Placement Statistics Course

    ERIC Educational Resources Information Center

    Northrup, Christian Glenn

    2012-01-01

    This study investigated the use of writing in a statistics classroom to learn if writing provided a rich description of problem-solving processes of students as they solved problems. Through analysis of 329 written samples provided by students, it was determined that writing provided a rich description of problem-solving processes and enabled…

  17. Statistical validation of stochastic models

    SciTech Connect

    Hunter, N.F.; Barney, P.; Paez, T.L.; Ferregut, C.; Perez, L.

    1996-12-31

    It is common practice in structural dynamics to develop mathematical models for system behavior, and the authors are now capable of developing stochastic models, i.e., models whose parameters are random variables. Such models have random characteristics that are meant to simulate the randomness in characteristics of experimentally observed systems. This paper suggests a formal statistical procedure for the validation of mathematical models of stochastic systems when data taken during operation of the stochastic system are available. The statistical characteristics of the experimental system are obtained using the bootstrap, a technique for the statistical analysis of non-Gaussian data. The authors propose a procedure to determine whether or not a mathematical model is an acceptable model of a stochastic system with regard to user-specified measures of system behavior. A numerical example is presented to demonstrate the application of the technique.
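
    A minimal sketch of the bootstrap step mentioned here, resampling an observed data set to estimate the sampling variability of a statistic that could then be compared with a stochastic model's predictions, might look as follows (the data and resample count are invented for illustration; this is not the authors' procedure):

```python
# Bootstrap estimate of the sampling distribution of a statistic (the mean)
# from observed data; the resulting variability could then be compared with
# what a stochastic model predicts. Data and resample count are illustrative.
import numpy as np

rng = np.random.default_rng(3)
observed = rng.lognormal(mean=0.0, sigma=0.8, size=60)   # stand-in for measured responses

B = 2000
boot_means = np.array([
    rng.choice(observed, size=observed.size, replace=True).mean()
    for _ in range(B)
])

print("observed mean          :", observed.mean())
print("bootstrap SE of mean   :", boot_means.std(ddof=1))
print("bootstrap 95% interval :", np.percentile(boot_means, [2.5, 97.5]))
```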

  18. Statistical modeling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1992-01-01

    This working paper discusses the statistical simulation part of a controlled software development experiment being conducted under the direction of the System Validation Methods Branch, Information Systems Division, NASA Langley Research Center. The experiment uses guidance and control software (GCS) aboard a fictitious planetary landing spacecraft: real-time control software operating on a transient mission. Software execution is simulated to study the statistical aspects of reliability and other failure characteristics of the software during development, testing, and random usage. Quantification of software reliability is a major goal. Various reliability concepts are discussed. Experiments are described for performing simulations and collecting appropriate simulated software performance and failure data. This data is then used to make statistical inferences about the quality of the software development and verification processes as well as inferences about the reliability of software versions and reliability growth under random testing and debugging.
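
    One toy picture of reliability growth under random testing and debugging, not the GCS experiment's simulation, is to give each remaining fault a small chance of triggering a failure on each test and to remove faults as they are found, so the failure rate decays; all parameters below are invented:

```python
# Toy reliability-growth simulation: the software starts with some number of
# faults; each random test exposes each remaining fault with a small
# probability, and exposed faults are fixed, so failures become rarer.
# All parameters are invented for illustration.
import random

random.seed(2)
faults = set(range(25))      # 25 initial faults
p_expose = 0.002             # chance a given fault is hit by one random test
failures_per_block = []

for block in range(20):      # 20 blocks of 1000 random tests each
    failures = 0
    for _ in range(1000):
        hit = {f for f in faults if random.random() < p_expose}
        if hit:
            failures += 1
            faults -= hit    # debugging removes the exposed faults
    failures_per_block.append(failures)

print("failures per 1000-test block:", failures_per_block)
print("remaining faults:", len(faults))
```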

  19. The Relationship between Statistics Self-Efficacy, Statistics Anxiety, and Performance in an Introductory Graduate Statistics Course

    ERIC Educational Resources Information Center

    Schneider, William R.

    2011-01-01

    The purpose of this study was to determine the relationship between statistics self-efficacy, statistics anxiety, and performance in introductory graduate statistics courses. The study design compared two statistics self-efficacy measures developed by Finney and Schraw (2003), a statistics anxiety measure developed by Cruise and Wilkins (1980),…

  20. Illustrating the practice of statistics

    SciTech Connect

    Hamada, Christina A; Hamada, Michael S

    2009-01-01

    The practice of statistics involves analyzing data and planning data collection schemes to answer scientific questions. Issues often arise with the data that must be dealt with and can lead to new procedures. In analyzing data, these issues can sometimes be addressed through the statistical models that are developed. Simulation can also be helpful in evaluating a new procedure. Moreover, simulation coupled with optimization can be used to plan a data collection scheme. The practice of statistics as just described is much more than just using a statistical package. In analyzing the data, it involves understanding the scientific problem and incorporating the scientist's knowledge. In modeling the data, it involves understanding how the data were collected and accounting for limitations of the data where possible. Moreover, the modeling is likely to be iterative by considering a series of models and evaluating the fit of these models. Designing a data collection scheme involves understanding the scientist's goal and staying within his/her budget in terms of time and the available resources. Consequently, a practicing statistician is faced with such tasks and requires skills and tools to do them quickly. We have written this article for students to provide a glimpse of the practice of statistics. To illustrate the practice of statistics, we consider a problem motivated by some precipitation data that our relative, Masaru Hamada, collected some years ago. We describe his rain gauge observational study in Section 2. We describe modeling and an initial analysis of the precipitation data in Section 3. In Section 4, we consider alternative analyses that address potential issues with the precipitation data. In Section 5, we consider the impact of incorporating additional information. We design a data collection scheme to illustrate the use of simulation and optimization in Section 6. We conclude this article in Section 7 with a discussion.