Science.gov

Sample records for isotonic bivariate statistical

  1. Neural systems with numerically matched input-output statistic: isotonic bivariate statistical modeling.

    PubMed

    Fiori, Simone

    2007-01-01

    Bivariate statistical modeling from incomplete data is a useful statistical tool that allows one to discover the model underlying two data sets when the data in the two sets correspond neither in size nor in ordering. Such a situation may occur when the sizes of the two data sets do not match (i.e., there are "holes" in the data) or when the data sets have been acquired independently. Also, statistical modeling is useful when the amount of available data is enough to reveal relevant statistical features of the phenomenon underlying the data. We propose to tackle the problem of statistical modeling via a neural (nonlinear) system that is able to match its input-output statistic to the statistic of the available data sets. A key point of the new implementation proposed here is that it is based on look-up-table (LUT) neural systems, which guarantee a computationally advantageous way of implementing neural systems. A number of numerical experiments, performed on both synthetic and real-world data sets, illustrate the features of the proposed modeling procedure. PMID:18566641
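
    As an illustration of the statistic-matching idea (not the authors' LUT neural implementation), the sketch below builds a monotone look-up table by matching empirical quantiles of an input sample to those of a target sample; the function name, grid size, and toy data are assumptions.

```python
import numpy as np

def isotonic_lut(x_samples, y_samples, n_bins=64):
    """Build a monotone (isotonic) look-up table that maps the empirical
    distribution of x_samples onto that of y_samples by quantile matching."""
    probs = np.linspace(0.0, 1.0, n_bins)
    x_grid = np.quantile(x_samples, probs)   # input break points
    y_grid = np.quantile(y_samples, probs)   # matched output values (non-decreasing)
    def transform(x):
        # Piecewise-linear interpolation between the LUT nodes.
        return np.interp(x, x_grid, y_grid)
    return transform

# Toy example: map Gaussian-distributed data onto an exponential target.
rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = rng.exponential(scale=2.0, size=3000)   # the two sample sizes need not match
f = isotonic_lut(x, y)
print(np.mean(f(x)), np.mean(y))            # the transformed mean tracks the target mean
```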

  3. Bivariate statistical modeling of color and range in natural scenes

    NASA Astrophysics Data System (ADS)

    Su, Che-Chun; Cormack, Lawrence K.; Bovik, Alan C.

    2014-02-01

    The statistical properties embedded in visual stimuli from the surrounding environment guide and affect the evolutionary processes of human vision systems. There are strong statistical relationships between co-located luminance/chrominance and disparity bandpass coefficients in natural scenes. However, these statistical relationships have only been deeply developed to create point-wise statistical models, although there exist spatial dependencies between adjacent pixels in both 2D color images and range maps. Here we study the bivariate statistics of the joint and conditional distributions of spatially adjacent bandpass responses on both luminance/chrominance and range data of naturalistic scenes. We deploy bivariate generalized Gaussian distributions to model the underlying statistics. The analysis and modeling results show that there exist important and useful statistical properties of both joint and conditional distributions, which can be reliably described by the corresponding bivariate generalized Gaussian models. Furthermore, by utilizing these robust bivariate models, we are able to incorporate measurements of bivariate statistics between spatially adjacent luminance/chrominance and range information into various 3D image/video and computer vision applications, e.g., quality assessment, 2D-to-3D conversion, etc.

  4. Bivariate ensemble model output statistics approach for joint forecasting of wind speed and temperature

    NASA Astrophysics Data System (ADS)

    Baran, Sándor; Möller, Annette

    2016-06-01

    Forecast ensembles are typically employed to account for prediction uncertainties in numerical weather prediction models. However, ensembles often exhibit biases and dispersion errors, thus they require statistical post-processing to improve their predictive performance. Two popular univariate post-processing models are the Bayesian model averaging (BMA) and the ensemble model output statistics (EMOS). In the last few years, increased interest has emerged in developing multivariate post-processing models, incorporating dependencies between weather quantities, such as a bivariate distribution for wind vectors or a more general setting that allows different types of weather variables to be combined. In line with a recently proposed approach to model temperature and wind speed jointly by a bivariate BMA model, this paper introduces an EMOS model for these weather quantities based on a bivariate truncated normal distribution. The bivariate EMOS model is applied to temperature and wind speed forecasts of the 8-member University of Washington mesoscale ensemble and the 11-member ALADIN-HUNEPS ensemble of the Hungarian Meteorological Service and its predictive performance is compared to the performance of the bivariate BMA model and a multivariate Gaussian copula approach, post-processing the margins with univariate EMOS. While the predictive skills of the compared methods are similar, the bivariate EMOS model requires considerably lower computation times than the bivariate BMA method.
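
    A minimal sketch of a bivariate EMOS-style fit under the simplifying assumption of an untruncated bivariate normal predictive law (the paper uses a truncated normal for wind speed); the affine link from ensemble mean/variance to the predictive moments follows common EMOS practice, and all data and parameter names here are synthetic assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import multivariate_normal

rng = np.random.default_rng(1)
n, m = 200, 8                                             # forecast cases, ensemble members
ens = rng.normal([10.0, 5.0], [3.0, 2.0], size=(n, m, 2)) # toy temperature/wind ensemble
ens_mean, ens_var = ens.mean(axis=1), ens.var(axis=1)
err_cov = [[1.0, -0.3], [-0.3, 0.8]]                      # correlated "forecast errors"
obs = ens_mean + rng.multivariate_normal([0.0, 0.0], err_cov, size=n)

def neg_log_lik(theta):
    a, b = theta[0:2], theta[2:4]                # affine bias correction of the mean
    c, d = theta[4:6], theta[6:8]                # log-scale variance coefficients
    rho = np.tanh(theta[8])                      # keeps the correlation inside (-1, 1)
    mu = a + b * ens_mean
    var = np.exp(c) + np.exp(d) * ens_var        # positive predictive variances
    nll = 0.0
    for i in range(n):
        s = rho * np.sqrt(var[i, 0] * var[i, 1])
        cov = [[var[i, 0], s], [s, var[i, 1]]]
        nll -= multivariate_normal.logpdf(obs[i], mean=mu[i], cov=cov)
    return nll

x0 = np.array([0, 0, 1, 1, 0, 0, 0, 0, 0], dtype=float)   # start at mu = ensemble mean
res = minimize(neg_log_lik, x0, method="Nelder-Mead", options={"maxiter": 4000})
print("estimated error correlation:", round(float(np.tanh(res.x[8])), 2))
```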

  5. INLAND DISSOLVED SALT CHEMISTRY: STATISTICAL EVALUATION OF BIVARIATE AND TERNARY DIAGRAM MODELS FOR SURFACE AND SUBSURFACE WATERS

    EPA Science Inventory

    We compared the use of ternary and bivariate diagrams to distinguish the effects of atmospheric precipitation, rock weathering, and evaporation on inland surface and subsurface water chemistry. The three processes could not be statistically differentiated using bivariate models e...

  6. Applied Statistics: From Bivariate through Multivariate Techniques [with CD-ROM]

    ERIC Educational Resources Information Center

    Warner, Rebecca M.

    2007-01-01

    This book provides a clear introduction to widely used topics in bivariate and multivariate statistics, including multiple regression, discriminant analysis, MANOVA, factor analysis, and binary logistic regression. The approach is applied and does not require formal mathematics; equations are accompanied by verbal explanations. Students are asked…

  7. Testing independence of bivariate interval-censored data using modified Kendall's tau statistic.

    PubMed

    Kim, Yuneung; Lim, Johan; Park, DoHwan

    2015-11-01

    In this paper, we study a nonparametric procedure to test independence of bivariate interval-censored data, for both current status data (case 1 interval-censored data) and case 2 interval-censored data. To do so, we propose a score-based modification of Kendall's tau statistic for bivariate interval-censored data. Our modification defines Kendall's tau statistic with expected numbers of concordant and discordant pairs of data. The performance of the modified approach is illustrated by simulation studies and application to the AIDS study. We compare our method to alternative approaches such as the two-stage estimation method by Sun et al. (Scandinavian Journal of Statistics, 2006) and the multiple imputation method by Betensky and Finkelstein (Statistics in Medicine, 1999b).
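
    For orientation, the concordant/discordant pair counting that underlies Kendall's tau can be written in a few lines for exactly observed pairs; the paper's interval-censored modification replaces these indicator counts with expected counts, which is not reproduced here.

```python
import numpy as np

def kendalls_tau(x, y):
    """Plain Kendall's tau from concordant/discordant pair counts (no ties handling)."""
    x, y = np.asarray(x), np.asarray(y)
    n = len(x)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = np.sign(x[i] - x[j]) * np.sign(y[i] - y[j])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

rng = np.random.default_rng(2)
u = rng.normal(size=200)
print(kendalls_tau(u, u + rng.normal(size=200)))   # positive association
```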

  8. Source apportionment advances using polar plots of bivariate correlation and regression statistics

    NASA Astrophysics Data System (ADS)

    Grange, Stuart K.; Lewis, Alastair C.; Carslaw, David C.

    2016-11-01

    This paper outlines the development of enhanced bivariate polar plots that allow the concentrations of two pollutants to be compared using pair-wise statistics for exploring the sources of atmospheric pollutants. The new method combines bivariate polar plots, which provide source characteristic information, with pair-wise statistics that provide information on how two pollutants are related to one another. The pair-wise statistics implemented include weighted Pearson correlation and slope from two linear regression methods. The development uses a Gaussian kernel to locally weight the statistical calculations on a wind speed-direction surface together with variable-scaling. Example applications of the enhanced polar plots are presented by using routine air quality data for two monitoring sites in London, United Kingdom for a single year (2013). The London examples demonstrate that the combination of bivariate polar plots, correlation, and regression techniques can offer considerable insight into air pollution source characteristics, which would be missed if only scatter plots and mean polar plots were used for analysis. Specifically, using correlation and slopes as pair-wise statistics, long-range transport processes were isolated and black carbon (BC) contributions to PM2.5 for a kerbside monitoring location were quantified. Wider applications and future advancements are also discussed.
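
    The core computation, a locally weighted pair-wise statistic on a wind speed-direction surface, might be sketched as below; the Gaussian kernel bandwidths, grid, and function names are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def weighted_pearson(x, y, w):
    """Pearson correlation of x and y under non-negative weights w."""
    w = w / w.sum()
    mx, my = np.sum(w * x), np.sum(w * y)
    cov = np.sum(w * (x - mx) * (y - my))
    sx = np.sqrt(np.sum(w * (x - mx) ** 2))
    sy = np.sqrt(np.sum(w * (y - my) ** 2))
    return cov / (sx * sy)

def polar_surface(ws, wd, c1, c2, ws_grid, wd_grid, h_ws=1.0, h_wd=20.0):
    """Correlation between pollutants c1 and c2, locally weighted around each
    (wind speed, wind direction) grid node with a Gaussian kernel."""
    surface = np.full((len(ws_grid), len(wd_grid)), np.nan)
    for i, u in enumerate(ws_grid):
        for j, theta in enumerate(wd_grid):
            d_dir = (wd - theta + 180.0) % 360.0 - 180.0      # circular direction distance
            w = np.exp(-0.5 * ((ws - u) / h_ws) ** 2) * \
                np.exp(-0.5 * (d_dir / h_wd) ** 2)
            if w.sum() > 0:
                surface[i, j] = weighted_pearson(c1, c2, w)
    return surface
```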

  9. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    NASA Astrophysics Data System (ADS)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.

    2014-10-01

    Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for the BSA technique is proposed. Three popular BSA techniques, namely the frequency ratio, weights-of-evidence, and evidential belief function models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and provides a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. The area under the curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
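
    As a sketch of the simplest of the three techniques, the frequency ratio of a factor class is the landslide density in that class divided by the landslide density over the whole map; the arrays below are hypothetical rasters, not output of the BSM tool.

```python
import numpy as np

def frequency_ratio(factor_class, landslide_mask):
    """Frequency ratio per class: (landslide pixels in class / class pixels)
    divided by (landslide pixels in map / map pixels)."""
    factor_class = np.asarray(factor_class).ravel()
    landslide_mask = np.asarray(landslide_mask, dtype=bool).ravel()
    overall = landslide_mask.mean()
    fr = {}
    for cls in np.unique(factor_class):
        in_class = factor_class == cls
        fr[cls] = landslide_mask[in_class].mean() / overall
    return fr

# Hypothetical reclassified slope map (3 classes) and landslide inventory raster.
slope_cls = np.random.default_rng(3).integers(1, 4, size=10000)
slides = np.random.default_rng(4).random(10000) < 0.02 * slope_cls   # steeper => more slides
print(frequency_ratio(slope_cls, slides))
```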

  10. An integrated user-friendly ArcMAP tool for bivariate statistical modelling in geoscience applications

    NASA Astrophysics Data System (ADS)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.

    2015-03-01

    Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, bivariate statistical modeler (BSM), for the BSA technique is proposed. Three popular BSA techniques, namely the frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and provides a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. The area under the curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.

  11. [Bivariate statistical model for calculating phosphorus input loads to the river from point and nonpoint sources].

    PubMed

    Chen, Ding-Jiang; Sun, Si-Yang; Jia, Ying-Na; Chen, Jia-Bo; Lü, Jun

    2013-01-01

    Based on the hydrological difference between the point source (PS) and nonpoint source (NPS) pollution processes and the major influencing mechanism of in-stream retention processes, a bivariate statistical model was developed for relating river phosphorus load to river water flow rate and temperature. Using the four model coefficients calibrated and validated from in-stream monitoring data, monthly phosphorus input loads to the river from PS and NPS can be easily determined by the model. Compared to current hydrological methods, this model takes the in-stream retention process and the upstream inflow term into consideration; thus it improves the knowledge on phosphorus pollution processes and can meet the requirements of both the district-based and watershed-based water quality management patterns. Using this model, the total phosphorus (TP) input load to the Changle River in Zhejiang Province was calculated. Results indicated that the annual total TP input load was (54.6 ± 11.9) t·a⁻¹ in 2004-2009, with upstream water inflow, PS and NPS contributing 5% ± 1%, 12% ± 3% and 83% ± 3%, respectively. The cumulative NPS TP input load during the high flow periods (i.e., June, July, August and September) in summer accounted for 50% ± 9% of the annual amount, increasing the risk of algal blooms in downstream water bodies. The annual in-stream TP retention load was (4.5 ± 0.1) t·a⁻¹ and occupied 9% ± 2% of the total input load. The cumulative in-stream TP retention load during the summer periods (i.e., June-September) accounted for 55% ± 2% of the annual amount, indicating that the in-stream retention function plays an important role in seasonal TP transport and transformation processes. This bivariate statistical model only requires commonly available in-stream monitoring data (i.e., river phosphorus load, water flow rate and temperature) with no requirement of special software knowledge; thus it offers researchers and managers a cost-effective tool for

  12. Meta-analysis for diagnostic accuracy studies: a new statistical model using beta-binomial distributions and bivariate copulas.

    PubMed

    Kuss, Oliver; Hoyer, Annika; Solms, Alexander

    2014-01-15

    There are still challenges when meta-analyzing data from studies on diagnostic accuracy. This is mainly due to the bivariate nature of the response where information on sensitivity and specificity must be summarized while accounting for their correlation within a single trial. In this paper, we propose a new statistical model for the meta-analysis of diagnostic accuracy studies. This model uses beta-binomial distributions for the marginal numbers of true positives and true negatives and links these margins by a bivariate copula distribution. The new model comes with all the features of the current standard model, a bivariate logistic regression model with random effects, but has the additional advantages of a closed likelihood function and a larger flexibility for the correlation structure of sensitivity and specificity. In a simulation study, which compares three copula models and two implementations of the standard model, the Plackett and the Gauss copula rarely perform worse, but frequently perform better, than the standard model. We use an example from a meta-analysis to judge the diagnostic accuracy of telomerase (a urinary tumor marker) for the diagnosis of primary bladder cancer for illustration.
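
    To make the two ingredients concrete, beta-binomial margins for the true-positive and true-negative counts linked by a copula, the sketch below simulates such data with a Gaussian copula; all parameter values are arbitrary assumptions and this is not the authors' estimation code.

```python
import numpy as np
from scipy.stats import betabinom, multivariate_normal, norm

rng = np.random.default_rng(5)
n_studies = 30
n_diseased = rng.integers(20, 80, size=n_studies)     # diseased subjects per study
n_healthy = rng.integers(20, 80, size=n_studies)      # healthy subjects per study

# Gaussian copula: correlated uniforms obtained from a bivariate normal.
rho = -0.4                                             # sensitivity/specificity trade-off
z = multivariate_normal.rvs(mean=[0, 0], cov=[[1, rho], [rho, 1]],
                            size=n_studies, random_state=rng)
u = norm.cdf(z)

# Beta-binomial margins for true positives (sensitivity) and true negatives (specificity).
tp = betabinom.ppf(u[:, 0], n_diseased, a=8, b=2).astype(int)
tn = betabinom.ppf(u[:, 1], n_healthy, a=6, b=3).astype(int)
print(np.corrcoef(tp / n_diseased, tn / n_healthy)[0, 1])   # induced negative correlation
```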

  13. The return period analysis of natural disasters with statistical modeling of bivariate joint probability distribution.

    PubMed

    Li, Ning; Liu, Xueqin; Xie, Wei; Wu, Jidong; Zhang, Peng

    2013-01-01

    New features of natural disasters have been observed over the last several years. The factors that influence the disasters' formation mechanisms, regularity of occurrence and main characteristics have been revealed to be more complicated and diverse in nature than previously thought. As the uncertainty involved increases, the variables need to be examined further. This article discusses the importance and the shortage of multivariate analysis of natural disasters and presents a method to estimate the joint probability of the return periods and perform a risk analysis. Severe dust storms from 1990 to 2008 in Inner Mongolia were used as a case study to test this new methodology, as they are normal and recurring climatic phenomena on Earth. Based on the 79 investigated events and according to the bivariate definition of dust storms, the joint probability distribution of severe dust storms was established using the observed data of maximum wind speed and duration. The joint return periods of severe dust storms were calculated, and the relevant risk was analyzed according to the joint probability. The copula function is able to simulate severe dust storm disasters accurately. The joint return periods generated are closer to those observed in reality than the univariate return periods and thus have more value in severe dust storm disaster mitigation, strategy making, program design, and improvement of risk management. This research may prove useful in risk-based decision making. The exploration of multivariate analysis methods can also lay the foundation for further applications in natural disaster risk analysis. PMID:22616629
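
    A compact sketch of the joint ("AND") return-period computation with a Gumbel-Hougaard copula; the marginal distributions, dependence parameter, and mean inter-event time below are illustrative assumptions rather than the values fitted in the paper.

```python
import numpy as np
from scipy.stats import gumbel_r

def gumbel_copula(u, v, theta):
    """Gumbel-Hougaard copula C(u, v), theta >= 1."""
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

# Illustrative margins for maximum wind speed (m/s) and storm duration (h).
F_wind = gumbel_r(loc=18.0, scale=4.0)
F_dur = gumbel_r(loc=6.0, scale=2.5)
theta = 2.0                        # assumed dependence strength
mu = 19.0 / 79.0                   # mean inter-event time in years (79 events in 19 years)

def joint_return_period(wind, dur):
    """Return period of the event {wind speed > wind AND duration > dur}."""
    u, v = F_wind.cdf(wind), F_dur.cdf(dur)
    p_exceed = 1.0 - u - v + gumbel_copula(u, v, theta)
    return mu / p_exceed

print(joint_return_period(30.0, 12.0))   # years
```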

  15. A Statistical Reconstruction of Bivariate Climate from Tree Ring Width Measurements Using Scientifically Motivated Process Models.

    NASA Astrophysics Data System (ADS)

    Tipton, J.; Hooten, M.; Pederson, N.; Tingley, M.; Bishop, D. A.

    2014-12-01

    The ability to reconstruct historical climate is important to understanding how climate has changed in the past. The instrumental record of temperature and precipitation only spans the most recent centuries. Thus, reconstructions of the climate features are typically based on proxy archives. The proxy archives integrate climate information through biological, geological, physical, and chemical processes. Tree ring widths provide one of the most spatially and temporally rich sources of high quality climate proxy data. However, the statistical reconstruction of paleoclimate from tree ring widths is quite challenging because the climate signal is inherently multi-dimensional while tree ring widths are a one-dimensional data source. We propose a Bayesian hierarchical model using non-linear, scientifically motivated tree ring growth models to reconstruct multivariate climate (i.e., temperature and precipitation) in the Hudson Valley region of New York. Our proposed model extends and enhances former methods in a number of ways. We allow for species-specific responses to climate, which further constrains the many-to-one relationship between tree rings and climate. The resulting model allows for prediction of reasonable climate scenarios given tree ring widths. We explore a natural model selection framework that weighs the influence of multiple candidate growth models in terms of their predictive ability. To enable prediction backcasts, the climate variables are modeled with an underlying continuous time latent process. The continuous time process allows for added flexibility in the climate response through time at different temporal scales and enables investigation of differences in climate between the reconstruction period and the instrumental period. Validation of the model's predictive abilities is achieved through a pseudo-proxy simulation experiment where the quality of climate predictions is measured by out-of-sample performance based on a proper local scoring

  16. Determination of statistics for any rotation of axes of a bivariate normal elliptical distribution. [of wind vector components]

    NASA Technical Reports Server (NTRS)

    Falls, L. W.; Crutcher, H. L.

    1976-01-01

    Transformation of statistics from one dimensional set to another involves linear functions of the original set of statistics. Similarly, linear functions will transform statistics within a dimensional set such that the new statistics are relevant to a new set of coordinate axes. A restricted case of the latter is the rotation of axes in a coordinate system involving any two correlated random variables. A special case is the transformation for horizontal wind distributions. Wind statistics are usually provided in terms of wind speed and direction (measured clockwise from north) or in east-west and north-south components. A direct application of this technique allows the determination of appropriate wind statistics parallel and normal to any preselected flight path of a space vehicle. Among the constraints for launching space vehicles are critical values selected from the distribution of the expected winds parallel to and normal to the flight path. These procedures are applied to space vehicle launches at Cape Kennedy, Florida.
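
    The transformation itself is linear: for a rotation matrix R corresponding to the flight azimuth, the rotated mean is Rμ and the rotated covariance is RΣRᵀ. A small numerical sketch with made-up wind-component statistics:

```python
import numpy as np

# Assumed east-west (u) and north-south (v) wind statistics (m/s).
mu = np.array([3.0, -1.5])                    # mean components
sigma = np.array([[9.0, 2.0],
                  [2.0, 4.0]])                # covariance matrix

theta = np.deg2rad(35.0)                      # rotation into flight-path coordinates
R = np.array([[np.cos(theta), np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])

mu_rot = R @ mu                               # means parallel and normal to the path
sigma_rot = R @ sigma @ R.T                   # variances/covariance in rotated axes
print(mu_rot, np.sqrt(np.diag(sigma_rot)))
```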

  17. Landslide susceptibility analysis in central Vietnam based on an incomplete landslide inventory: Comparison of a new method to calculate weighting factors by means of bivariate statistics

    NASA Astrophysics Data System (ADS)

    Meinhardt, Markus; Fink, Manfred; Tünschel, Hannes

    2015-04-01

    Vietnam is regarded as a country strongly impacted by climate change. Population and economic growth result in additional pressures on the ecosystems in the region. In particular, changes in landuse and precipitation extremes lead to a higher landslide susceptibility in the study area (approx. 12,400 km²), located in central Vietnam and impacted by a tropical monsoon climate. Hence, this natural hazard is a serious problem in the study area. A probability assessment of landslides is therefore undertaken through the use of bivariate statistics. However, the landslide inventory based only on field campaigns does not cover the whole area. To avoid a systematic bias due to the limited mapping area, the investigated regions are depicted as the viewshed in the calculations. On this basis, the distribution of the landslides is evaluated in relation to the maps of 13 parameters, showing the strongest correlation to distance to roads and precipitation increase. An additional weighting of the input parameters leads to better results, since some parameters contribute more to landslides than others. The method developed in this work is based on the validation of different parameter sets used within the statistical index method. It is called "omit error" because omitting each parameter in turn leads to the weightings, which describe how strongly every single parameter improves or degrades the objective function. Furthermore, this approach is used to find a better input parameter set by excluding some parameters. After this optimization, nine input parameters are left, and they are weighted by the omit error method, providing the best susceptibility map with a success rate of 92.9% and a prediction rate of 92.3%. This is an improvement of 4.4% and 4.2%, respectively, compared to the basic statistical index method with the 13 input parameters.

  18. Landslide susceptibility assessment in Lianhua County (China): A comparison between a random forest data mining technique and bivariate and multivariate statistical models

    NASA Astrophysics Data System (ADS)

    Hong, Haoyuan; Pourghasemi, Hamid Reza; Pourtaghi, Zohre Sadat

    2016-04-01

    Landslides are an important natural hazard that causes a great amount of damage around the world every year, especially during the rainy season. The Lianhua area is located in the middle of China's southern mountainous area, west of Jiangxi Province, and is known to be an area prone to landslides. The aim of this study was to evaluate and compare landslide susceptibility maps produced using the random forest (RF) data mining technique with those produced by bivariate (evidential belief function and frequency ratio) and multivariate (logistic regression) statistical models for Lianhua County, China. First, a landslide inventory map was prepared using aerial photograph interpretation, satellite images, and extensive field surveys. In total, 163 landslide events were recognized in the study area, with 114 landslides (70%) used for training and 49 landslides (30%) used for validation. Next, the landslide conditioning factors (slope angle, altitude, slope aspect, topographic wetness index (TWI), slope-length (LS), plan curvature, profile curvature, distance to rivers, distance to faults, distance to roads, annual precipitation, land use, normalized difference vegetation index (NDVI), and lithology) were derived from the spatial database. Finally, the landslide susceptibility maps of Lianhua County were generated in ArcGIS 10.1 based on the random forest (RF), evidential belief function (EBF), frequency ratio (FR), and logistic regression (LR) approaches and were validated using a receiver operating characteristic (ROC) curve. The ROC plot assessment results showed that for landslide susceptibility maps produced using the EBF, FR, LR, and RF models, the area under the curve (AUC) values were 0.8122, 0.8134, 0.7751, and 0.7172, respectively. Therefore, we can conclude that all four models have an AUC of more than 0.70 and can be used in landslide susceptibility mapping in the study area; meanwhile, the EBF and FR models had the best performance for Lianhua

  19. Bivariate extreme value distributions

    NASA Technical Reports Server (NTRS)

    Elshamy, M.

    1992-01-01

    In certain engineering applications, such as those occurring in the analyses of ascent structural loads for the Space Transportation System (STS), some of the load variables have a lower bound of zero. Thus, the need for practical models of bivariate extreme value probability distribution functions with lower limits was identified. We discuss the Gumbel models and present practical forms of bivariate extreme probability distributions of Weibull and Fréchet types with two parameters. Bivariate extreme value probability distribution functions can be expressed in terms of the marginal extremal distributions and a 'dependence' function subject to certain analytical conditions. Properties of such bivariate extreme distributions, sums and differences of paired extremals, as well as the corresponding forms of conditional distributions, are discussed. Practical estimation techniques are also given.
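
    For concreteness, one standard "dependence function" representation is the Gumbel (logistic) model, written here in terms of the marginals as general background rather than the report's specific parameterization:

```latex
% Gumbel (logistic) bivariate extreme-value model with dependence parameter r >= 1:
F(x, y) = \exp\!\left\{-\Big[\big(-\ln F_{1}(x)\big)^{r}
          + \big(-\ln F_{2}(y)\big)^{r}\Big]^{1/r}\right\}
% r = 1 gives independence, F(x, y) = F_1(x) F_2(y);
% r -> infinity gives complete dependence, F(x, y) = min(F_1(x), F_2(y)).
```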

  20. Local osmosis and isotonic transport.

    PubMed

    Mathias, R T; Wang, H

    2005-11-01

    Osmotically driven water flow, u (cm/s), between two solutions of identical osmolarity, c_o (300 mM in mammals), has a theoretical isotonic maximum given by u = j/c_o, where j (moles/cm²/s) is the rate of salt transport. In many experimental studies, transport was found to be indistinguishable from isotonic. The purpose of this work is to investigate the conditions for u to approach isotonic. A necessary condition is that the membrane salt/water permeability ratio, ε, must be small: typical physiological values are ε = 10⁻³ to 10⁻⁵, so ε is generally small but this is not sufficient to guarantee near-isotonic transport. If we consider the simplest model of two series membranes, which secrete a tear or drop of sweat (i.e., there are no externally-imposed boundary conditions on the secretion), diffusion is negligible and the predicted osmolarities are: basal = c_o, intracellular ≈ (1 + ε)c_o, secretion ≈ (1 + 2ε)c_o, and u ≈ (1 - 2ε)j/c_o. Note that this model is also appropriate when the transported solution is experimentally collected. Thus, in the absence of external boundary conditions, transport is experimentally indistinguishable from isotonic. However, if external boundary conditions set salt concentrations to c_o on both sides of the epithelium, then fluid transport depends on distributed osmotic gradients in lateral spaces. If lateral spaces are too short and wide, diffusion dominates convection, reduces osmotic gradients and fluid flow is significantly less than isotonic. Moreover, because apical and basolateral membrane water fluxes are linked by the intracellular osmolarity, water flow is maximum when the total water permeability of basolateral membranes equals that of apical membranes. In the context of the renal proximal tubule, data suggest it is transporting at near optimal conditions. Nevertheless, typical physiological values suggest the newly filtered fluid is
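
    Plugging the typical values quoted above into these expressions makes the "indistinguishable from isotonic" point concrete (a worked example, not taken from the paper):

```latex
% With c_o = 300 mM and epsilon = 10^{-3}:
\text{secretion osmolarity} \approx (1 + 2\varepsilon)\,c_o = 1.002 \times 300\ \text{mM} = 300.6\ \text{mM},
\qquad
u \approx (1 - 2\varepsilon)\,\frac{j}{c_o} = 0.998\,\frac{j}{c_o}
% i.e. the secreted fluid is only ~0.2% hypertonic and the flow is ~0.2% below
% the isotonic maximum u = j/c_o, well within experimental resolution.
```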

  1. Isotonic water transport in secretory epithelia.

    PubMed

    Swanson, C H

    1977-01-01

    The model proposed by Diamond and Bossert [1] for isotonic water transport has received wide acceptance in recent years. It assumes that the local driving force for water transport is a standing osmotic gradient produced in the lateral intercellular spaces of the epithelial cell layer by active solute transport. While this model is based on work done in absorptive epithelia where the closed to open direction of the lateral space and the direction of net transport are the same, it has been proposed that the lateral spaces could also serve as the site of the local osmotic gradients for water transport in secretory epithelia, where the closed to open direction of the lateral space and net transport are opposed, by actively transporting solute out of the space rather than into it. Operation in the backward direction, however, requires a lower than ambient hydrostatic pressure within the lateral space which would seem more likely to cause the space to collapse with loss of function. On the other hand, most secretory epithelia are characterized by transport into a restricted ductal system which is similar to the lateral intercellular space in the absorptive epithelia in that its closed to open direction is the same as that of net transport. In vitro micropuncture studies on the exocrine pancreas of the rabbit indicate the presence of a small but statistically significant increase in juice osmolality, 6 mOsm/kg H₂O, at the site of electrolyte and water secretion in the smallest extralobular ducts with secretin stimulation which suggests that the ductal system in the secretory epithelia rather than the lateral intercellular space is the site of the local osmotic gradients responsible for isotonic water transport. PMID:331693

  2. Covariate analysis of bivariate survival data

    SciTech Connect

    Bennett, L.E.

    1992-01-01

    The methods developed are used to analyze the effects of covariates on bivariate survival data when censoring and ties are present. The proposed method provides models for bivariate survival data that include differential covariate effects and censored observations. The proposed models are based on an extension of the univariate Buckley-James estimators which replace censored data points by their expected values, conditional on the censoring time and the covariates. For the bivariate situation, it is necessary to determine the expectation of the failure times for one component conditional on the failure or censoring time of the other component. Two different methods have been developed to estimate these expectations. In the semiparametric approach these expectations are determined from a modification of Burke's estimate of the bivariate empirical survival function. In the parametric approach censored data points are also replaced by their conditional expected values where the expected values are determined from a specified parametric distribution. The model estimation will be based on the revised data set, comprised of uncensored components and expected values for the censored components. The variance-covariance matrix for the estimated covariate parameters has also been derived for both the semiparametric and parametric methods. Data from the Demographic and Health Survey was analyzed by these methods. The two outcome variables are post-partum amenorrhea and breastfeeding; education and parity were used as the covariates. Both the covariate parameter estimates and the variance-covariance estimates for the semiparametric and parametric models will be compared. In addition, a multivariate test statistic was used in the semiparametric model to examine contrasts. The significance of the statistic was determined from a bootstrap distribution of the test statistic.

  3. Assessment of Mass-Transport Deposits occurrence offshore Espírito Santo Basin (SE Brazil) using a bivariate statistical method

    NASA Astrophysics Data System (ADS)

    Piedade, Aldina; Alves, Tiago; Luís Zêzere, José

    2016-04-01

    Mass Transport Deposits (MTDs) are one of the most important processes shaping passive and active margins. They occur frequently, and their characteristics, features, and processes have been well documented through diverse approaches and methodologies. In this work, a methodology for evaluating the occurrence of MTDs is tested in an area offshore the Espírito Santo Basin, SE Brazil. The MTD inventory was made on a three-dimensional (3D) seismic volume by interpreting high-amplitude reflections corresponding to the top and base of the MTDs. The inventory consists of four MTDs, which were integrated into a GIS database. MTD favourability scores are computed using algorithms based on statistical/probabilistic analysis (Information Value Method) over unique-condition terrain units in a raster basis. Terrain attributes derived from the Digital Terrain Model (DTM) are interpreted as proxies of the driving factors of MTDs and are used as predictors in our models, which are based on a set of different MTD inventories. Three models are elaborated independently according to the area of the MTD body (Model 1, Model 2 and Model 3). The final result is prepared by sorting all pixels according to their favourability value in descending order. The robustness and accuracy of the MTD favourability models are evaluated by success-rate curves, which are used for the quantitative interpretation of the models and express the goodness of fit of the MTDs. In addition, a sensitivity analysis was performed and the predisposing factors with the highest prediction performance for MTD occurrence were identified. The obtained results allow us to conclude that the method is valid for application to submarine slopes, as demonstrated by the high goodness of fit obtained (0.862). This work is pioneering: the methodology has not previously been applied to the submarine environment. It is a promising and valid methodology for predicting the failure and instability of submarine slopes for the industry. In

  4. Collective structure of the N=40 isotones

    SciTech Connect

    Gaudefroy, L.; Peru, S.; Pillet, N.; Hilaire, S.; Delaroche, J.-P.; Girod, M.; Obertelli, A.

    2009-12-15

    The structure of even-even N=40 isotones is studied from drip line to drip line through the systematic investigation of their quadrupole modes of excitation. Calculations are performed within the Hartree-Fock-Bogoliubov approach using the Gogny D1S effective interaction. Where relevant, these calculations are extended beyond mean field within a generator-coordinate-based method. An overall good agreement with available experimental data is reported, showing that collectivity increases from the neutron to the proton drip line. Whereas ⁶⁰Ca and ⁶⁸Ni display a calculated spherical shape in their ground states, all other isotones show a prolate-deformed ground-state band and a quasi-γ band. Coexistence features are predicted in the neutron-deficient N=40 isotones above ⁷⁴Se.

  5. Solvable rational extensions of the isotonic oscillator

    SciTech Connect

    Grandati, Yves

    2011-08-15

    Highlights: We obtain in a new way the solvable rational extensions of the isotonic oscillator. The method is systematic, without resorting to any ansatz. We use a generalization of the SUSY quantum partnership to excited states. The extensions are regularized by specific discrete symmetries of the potential. The proof of the shape invariance of the extensions is direct. Abstract: Combining recent results on rational solutions of the Riccati-Schrödinger equations for shape invariant potentials with the finite-difference Bäcklund algorithm and specific symmetries of the isotonic potential, we show that it is possible to generate the three infinite sets (L1, L2 and L3 families) of regular rational solvable extensions of this potential in a very direct and transparent way.
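
    For readers unfamiliar with the terminology, the isotonic (radial) oscillator referred to here is commonly written, in suitable units, as a harmonic well plus a centrifugal-like barrier; conventions vary, so the form below is background rather than the paper's exact notation:

```latex
% Isotonic oscillator potential on the half line (units with 2m = hbar = 1):
V_{\ell}(x) = \frac{1}{4}\,\omega^{2} x^{2} + \frac{\ell(\ell+1)}{x^{2}}, \qquad x > 0
```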

  6. Isotonic Modeling with Non-Differentiable Loss Functions with Application to Lasso Regularization.

    PubMed

    Painsky, Amichai; Rosset, Saharon

    2016-02-01

    In this paper we present an algorithmic approach for fitting isotonic models under convex, yet non-differentiable, loss functions. It is a generalization of the greedy non-regret approach proposed by Luss and Rosset (2014) for differentiable loss functions, taking into account the required subgradient extensions. We prove that our suggested algorithm solves the isotonic modeling problem while maintaining favorable computational and statistical properties. As our suggested algorithm may be used for any non-differentiable loss function, we focus our interest on isotonic modeling for either regression or two-class classification with appropriate log-likelihood loss and lasso penalty on the fitted values. This combination allows us to maintain the non-parametric nature of isotonic modeling, while controlling model complexity through regularization. We demonstrate the efficiency and usefulness of this approach on both synthetic and real world data. An implementation of our suggested solution is publicly available from the first author's website (https://sites.google.com/site/amichaipainsky/software).
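
    The basic least-squares isotonic fit that this work generalizes is the pool-adjacent-violators algorithm (PAVA); a minimal sketch is given below, while the lasso-regularized, non-differentiable-loss extension described in the abstract is not reproduced here.

```python
import numpy as np

def pava(y):
    """Pool Adjacent Violators: least-squares isotonic (non-decreasing) fit to y."""
    blocks = []                                  # each block holds [mean, size]
    for value in np.asarray(y, dtype=float):
        blocks.append([value, 1.0])
        # Merge backwards while the monotonicity constraint is violated.
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2 = blocks.pop()
            m1, w1 = blocks.pop()
            blocks.append([(m1 * w1 + m2 * w2) / (w1 + w2), w1 + w2])
    return np.concatenate([np.full(int(w), m) for m, w in blocks])

rng = np.random.default_rng(7)
y = np.log1p(np.linspace(0, 10, 200)) + rng.normal(scale=0.3, size=200)   # noisy monotone signal
fit = pava(y)
print(np.all(np.diff(fit) >= 0))   # True: the fitted values are non-decreasing
```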

  7. Bivariate Kumaraswamy distribution with an application on earthquake data

    SciTech Connect

    Özel, Gamze

    2015-03-10

    The bivariate Kumaraswamy (BK) distribution, whose marginals are Kumaraswamy distributions, has been recently introduced. However, its statistical properties have not been studied in detail. In this study, statistical properties of the BK distribution are investigated. We suggest that the BK distribution could provide a suitable description of the earthquake characteristics of Turkey. We support this argument using earthquakes that occurred in Turkey between 1900 and 2009. We also find that the BK distribution simulates earthquakes well.

  8. [The influence of an isotonic solution containing benzalkonium chloride and a hypertonic seawater solution on the function of ciliary epithelium from the nasal cavity in vitro].

    PubMed

    Laberko, E L; Bogomil'sky, M R; Soldatsky, Yu L; Pogosova, I E

    2016-01-01

    The objective of the present study was to evaluate the influence of an isotonic saline solution containing benzalkonium chloride and of a hypertonic seawater solution on the function of ciliary epithelium in the nasal cavity in vitro. To this end, we investigated the cytological material obtained from 35 children presenting with adenoid tissue hypertrophy. The tissue samples were taken from the nasal cavity by the standard method. A cellular biopsy obtained from each patient was distributed between three tubes that contained isotonic saline solution supplemented by benzalkonium chloride (0.1 mg/ml), a hypertonic seawater solution, and a standard physiological saline solution. It was shown that the number of the viable cells in both isotonic solutions was statistically comparable and significantly higher than in the hypertonic solution (p<0.05). The ciliary beat frequency of the cells embedded in the two isotonic solutions was not significantly different but considerably exceeded that in the hypertonic seawater solution (p<0.05). Thus, the present study has demonstrated the absence of a ciliotoxic influence of the isotonic saline solution containing benzalkonium chloride at a concentration of 0.1 mg/ml and the strong ciliotoxic effect of the hypertonic seawater solution. This finding gives reason to recommend isotonic solutions for regular application, whereas hypertonic solutions can be prescribed only during infectious and/or inflammatory ENT diseases. PMID:27213656

  10. Nonparametric Analysis of Bivariate Gap Time with Competing Risks

    PubMed Central

    Huang, Chiung-Yu; Wang, Chenguang; Wang, Mei-Cheng

    2016-01-01

    This article considers nonparametric methods for studying recurrent disease and death with competing risks. We first point out that comparisons based on the well-known cumulative incidence function can be confounded by different prevalence rates of the competing events, and that comparisons of the conditional distribution of the survival time given the failure event type are more relevant for investigating the prognosis of different patterns of recurrent disease. We then propose nonparametric estimators for the conditional cumulative incidence function as well as the conditional bivariate cumulative incidence function for the bivariate gap times, that is, the time to disease recurrence and the residual lifetime after recurrence. To quantify the association between the two gap times in the competing risks setting, a modified Kendall's tau statistic is proposed. The proposed estimators for the conditional bivariate cumulative incidence distribution and the association measure account for the induced dependent censoring for the second gap time. Uniform consistency and weak convergence of the proposed estimators are established. Hypothesis testing procedures for two-sample comparisons are discussed. Numerical simulation studies with practical sample sizes are conducted to evaluate the performance of the proposed nonparametric estimators and tests. An application to data from a pancreatic cancer study is presented to illustrate the methods developed in this article. PMID:26990686

  11. A new form of bivariate generalized Poisson regression model

    NASA Astrophysics Data System (ADS)

    Faroughi, Pouya; Ismail, Noriszura

    2014-09-01

    This paper introduces a new form of bivariate generalized Poisson (BGP) regression which can be fitted to bivariate and correlated count data with covariates. The BGP regression suggested in this study can be fitted not only to bivariate count data with positive, zero or negative correlations, but also to underdispersed or overdispersed bivariate count data. Applications of bivariate Poisson (BP) regression and the new BGP regression are illustrated on Malaysian motor insurance data.

  12. Mineral Composition and Nutritive Value of Isotonic and Energy Drinks.

    PubMed

    Leśniewicz, Anna; Grzesiak, Magdalena; Żyrnicki, Wiesław; Borkowska-Burnecka, Jolanta

    2016-04-01

    Several very popular brands of isotonic and energy drinks consumed for fluid and electrolyte supplementation and stimulation of mental or physical alertness were chosen for investigation. Liquid beverages available in polyethylene bottles and aluminum cans as well as products in the form of tablets and powder in sachets were studied. The total concentrations of 21 elements (Ag, Al, B, Ba, Ca, Cd, Co, Cr, Cu, Fe, Mg, Mn, Mo, Na, Ni, P, Pb, Sr, Ti, V, and Zn), both essential and toxic, were simultaneously determined in preconcentrated drink samples by inductively coupled plasma-optical emission spectrometry (ICP-OES) equipped with pneumatic and ultrasonic nebulizers. Differences between the mineral compositions of isotonic and energy drinks were evaluated and discussed. The highest content of Na was found in both isotonic and energy drinks, whereas quite high concentrations of Mg were found in isotonic drinks, and the highest amount of calcium was quantified in energy drinks. The concentrations of B, Co, Cu, Ni, and P were higher in isotonic drinks, but energy drinks contained greater quantities of Ag, Cr, Zn, Mn, and Mo as well as the toxic elements Cd and Pb. A comparison of element contents with micronutrient intake and tolerable levels was performed to evaluate the contribution of the investigated beverages to the daily diet. The consumption of 250 cm³ of an isotonic drink provides from 0.32% (for Mn) up to 14.8% (for Na) of the recommended daily intake. For the energy drinks, the maximum recommended daily intake fulfillment ranged from 0.02% (for V) to 19.4 or 19.8% (for Mg and Na). PMID:26286964

  14. The Effects of Selection Strategies for Bivariate Loglinear Smoothing Models on NEAT Equating Functions

    ERIC Educational Resources Information Center

    Moses, Tim; Holland, Paul W.

    2010-01-01

    In this study, eight statistical strategies were evaluated for selecting the parameterizations of loglinear models for smoothing the bivariate test score distributions used in nonequivalent groups with anchor test (NEAT) equating. Four of the strategies were based on significance tests of chi-square statistics (Likelihood Ratio, Pearson,…

  15. Systematics of α decay of even-even isotones

    SciTech Connect

    Poplavskii, I.V.

    1987-02-01

    On the basis of an analysis of experimental data we have investigated for the first time the α decay of even-even isotones. We have established that the α-decay energy of isotones depends on the number of protons approximately according to a linear law. We have shown that the Geiger-Nuttall law is valid both for isotopes and isobars, and also for isotones. The deviations from the Geiger-Nuttall law are due to the shell structure of the nucleus. The regularities observed in the α decay of isotones have been used to estimate the magnitudes of the α-decay energies, the kinetic energies of the emitted α particles, and the partial half-lives for α decay of the known and unknown neutron-deficient nuclei ²⁰²,²⁰⁴Ra, ²¹⁰Th, ²²⁸,²³⁰Pu, ²³⁴,²³⁶Cm, ²⁴²,²⁴⁴Fm, ²⁵⁰,²⁵⁸No, and ²⁵⁴,²⁵⁶Ku.

  16. An Annotated Bibliography of Isotonic Weight-Training Methods.

    ERIC Educational Resources Information Center

    Wysong, John V.

    This literature study was conducted to compare and evaluate various types and techniques of weight lifting so that a weight lifting program could be selected or devised for a secondary school. Annotations of 32 research reports, journal articles, and monographs on isotonic strength training are presented. The literature in the first part of the…

  17. Survival Analysis using Bivariate Archimedean Copulas

    NASA Astrophysics Data System (ADS)

    Chandra, Krishnendu

    In this dissertation we solve the nonidentifiability problem of Archimedean copula models based on dependent censored data (see [Wang, 2012]). We give a set of identifiability conditions for a special class of bivariate frailty models. Our simulation results show that our proposed model is identifiable under our proposed conditions. We use the EM algorithm to estimate unknown parameters, and the proposed estimation approach can be applied to fit dependent censored data when the dependence is of research interest. The marginal survival functions can be estimated using the copula-graphic estimator (see [Zheng and Klein, 1995] and [Rivest and Wells, 2001]) or the estimator proposed by [Wang, 2014]. We also propose two model selection procedures for Archimedean copula models, one for uncensored data and the other one for right censored bivariate data. Our simulation results are similar to those of [Wang and Wells, 2000] and suggest that both procedures work quite well. The idea of our proposed model selection procedure originates from the model selection procedure for Archimedean copula models proposed by [Wang and Wells, 2000] for right censored bivariate data using the L2 norm corresponding to the Kendall distribution function. A suitable bootstrap procedure is yet to be suggested for our method. We further propose a new parameter estimator and a simple goodness-of-fit test for Archimedean copula models when the bivariate data is under fixed left truncation. Our simulation results suggest that our procedure needs to be improved so that it can be more powerful, reliable and efficient. In our strategy, to obtain estimates for the unknown parameters, we heavily exploit the concept of truncated tau (a measure of association established by [Manatunga and Oakes, 1996] for left truncated data). The idea of our goodness of fit test originates from the goodness-of-fit test for Archimedean copula models proposed by [Wang, 2010] for right censored bivariate data.

  18. Bivariate copula in fitting rainfall data

    NASA Astrophysics Data System (ADS)

    Yee, Kong Ching; Suhaila, Jamaludin; Yusof, Fadhilah; Mean, Foo Hui

    2014-07-01

    The use of copulas to determine the joint distribution between two variables is widespread in various areas. The joint distribution of rainfall characteristics obtained using a copula model is preferable to standard bivariate modelling, as copulas are believed to overcome some of its limitations. Six copula models are applied to obtain the most suitable bivariate distribution between two rain gauge stations. The copula models are Ali-Mikhail-Haq (AMH), Clayton, Frank, Galambos, Gumbel-Hougaard (GH) and Plackett. The rainfall data used in the study are selected from rain gauge stations located in the southern part of Peninsular Malaysia, during the period from 1980 to 2011. The goodness of fit in this study is assessed using the Akaike information criterion (AIC).
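
    A minimal sketch of the AIC-based comparison for one of the listed families (Clayton), fitting the copula parameter by maximum likelihood on rank-based pseudo-observations; the data are synthetic and the other five families are omitted for brevity.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def clayton_log_density(u, v, theta):
    """Log density of the Clayton copula, theta > 0."""
    return (np.log(1 + theta) - (theta + 1) * (np.log(u) + np.log(v))
            - (2 + 1 / theta) * np.log(u ** -theta + v ** -theta - 1))

def fit_clayton_aic(u, v):
    nll = lambda th: -np.sum(clayton_log_density(u, v, th))
    res = minimize_scalar(nll, bounds=(1e-3, 20.0), method="bounded")
    return res.x, 2 * 1 + 2 * res.fun          # (theta_hat, AIC with one parameter)

# Pseudo-observations (ranks scaled into (0, 1)) from two hypothetical rain gauges.
rng = np.random.default_rng(8)
z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=500)
u = (np.argsort(np.argsort(z[:, 0])) + 1) / (len(z) + 1)
v = (np.argsort(np.argsort(z[:, 1])) + 1) / (len(z) + 1)
theta_hat, aic = fit_clayton_aic(u, v)
print(theta_hat, aic)   # compare AIC values across candidate copulas to pick a family
```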

  19. Nonparametric causal inference for bivariate time series.

    PubMed

    McCracken, James M; Weigel, Robert S

    2016-02-01

    We introduce new quantities for exploratory causal inference between bivariate time series. The quantities, called penchants and leanings, are computationally straightforward to apply, follow directly from assumptions of probabilistic causality, do not depend on any assumed models for the time series generating process, and do not rely on any embedding procedures; these features may provide a clearer interpretation of the results than those from existing time series causality tools. The penchant and leaning are computed based on a structured method for computing probabilities.

  20. Nonparametric causal inference for bivariate time series

    NASA Astrophysics Data System (ADS)

    McCracken, James M.; Weigel, Robert S.

    2016-02-01

    We introduce new quantities for exploratory causal inference between bivariate time series. The quantities, called penchants and leanings, are computationally straightforward to apply, follow directly from assumptions of probabilistic causality, do not depend on any assumed models for the time series generating process, and do not rely on any embedding procedures; these features may provide a clearer interpretation of the results than those from existing time series causality tools. The penchant and leaning are computed based on a structured method for computing probabilities.

  1. Predicting Number of Zombies in a DDoS Attacks Using Isotonic Regression

    NASA Astrophysics Data System (ADS)

    Gupta, B. B.; Jamali, Nadeem

    Anomaly-based DDoS detection systems construct a profile of the traffic normally seen in the network and identify anomalies whenever traffic deviates from the normal profile beyond a threshold. This deviation in traffic beyond the threshold has been used in the past for DDoS detection, but not for estimating the number of zombies. This chapter presents an approach that utilizes this deviation to predict the number of zombies using an isotonic regression model. A relationship is established between the number of zombies and the observed deviation in sample entropy. Internet-type topologies used for simulation are generated using the Transit-Stub model of the GT-ITM topology generator. The NS-2 network simulator on a Linux platform is used as the simulation test bed for launching DDoS attacks with varied numbers of zombies. Various statistical performance measures are used to assess the performance of the regression model. The simulation results are promising, as we are able to predict the number of zombies efficiently with a very low error rate using the isotonic regression model.
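
    The regression step alone can be sketched as follows with scikit-learn's isotonic regression; the entropy deviations and zombie counts below are invented placeholders standing in for the NS-2 simulation output, not values from the chapter.

      # Fit a monotone (isotonic) relation between entropy deviation and the
      # number of zombies, then predict for new deviations.
      import numpy as np
      from sklearn.isotonic import IsotonicRegression

      entropy_deviation = np.array([0.02, 0.05, 0.08, 0.12, 0.15, 0.20, 0.26, 0.31])
      num_zombies       = np.array([  5,   12,   18,   30,   33,   47,   60,   72])

      iso = IsotonicRegression(increasing=True, out_of_bounds="clip")
      iso.fit(entropy_deviation, num_zombies)

      new_dev = np.array([0.10, 0.22])
      print(iso.predict(new_dev))   # estimated number of zombies for new deviations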

  2. Application of the bivariate spectrophotometric method for the determination of metronidazole, furazolidone and di-iodohydroxyquinoline in pharmaceutical formulations.

    PubMed

    López-de-Alba, P L; Wróbel, K; López-Martínez, L; Wróbel, K; Yepez-Murrieta, M L; Amador-Hernández, J

    1997-10-01

    The bivariate calibration algorithm was applied to the spectrophotometric determination of metronidazole, furazolidone and di-iodohydroxyquinoline in pharmaceutical dosage forms. The results obtained were compared with the results of derivative spectrophotometry. The statistical evaluation of method bias was carried out, and it was shown that the proposed procedure may be competitive with commonly used first-derivative spectrophotometry. The advantage of the bivariate calibration is its simplicity, and the fact that there is no need to use the derivatization procedures.
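
    As a schematic of the linear-system idea behind bivariate (two-wavelength) calibration, the snippet below solves Beer's law for a two-component mixture; the absorptivity and absorbance values are hypothetical, and in practice they come from calibration lines of the pure standards.

      # Two-wavelength calibration: A(lambda_k) = sum_j e_jk * c_j, solved for c.
      import numpy as np

      # Rows: wavelengths lambda_1, lambda_2; columns: components 1, 2.
      E = np.array([[0.120, 0.045],
                    [0.030, 0.150]])        # absorptivity x path length (hypothetical)

      A = np.array([0.52, 0.61])            # measured absorbances of the mixture

      c = np.linalg.solve(E, A)             # concentrations of the two components
      print(f"c1 = {c[0]:.2f} mg/L, c2 = {c[1]:.2f} mg/L")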

  3. Current misuses of multiple regression for investigating bivariate hypotheses: an example from the organizational domain.

    PubMed

    O'Neill, Thomas A; McLarnon, Matthew J W; Schneider, Travis J; Gardner, Robert C

    2014-09-01

    By definition, multiple regression (MR) considers more than one predictor variable, and each variable's beta will depend on both its correlation with the criterion and its correlation with the other predictor(s). Despite ad nauseam coverage of this characteristic in organizational psychology and statistical texts, researchers' applications of MR in bivariate hypothesis testing have been the subject of recent and renewed interest. Accordingly, we conducted a targeted survey of the literature by coding articles from two top-tier organizational journals, covering a five-year span, that employed MR for testing bivariate relations. The results suggest that MR coefficients, rather than correlation coefficients, were most commonly used for testing hypotheses of bivariate relations, yet supporting theoretical rationales were rarely offered. Regarding the potential impact on scientific advancement, in almost half of the articles reviewed (44%), at least one conclusion of each study (i.e., that the hypothesis was or was not supported) would have been different depending on the authors' use of the correlation or the beta to test the bivariate hypothesis. It follows that inappropriate decisions to interpret the correlation versus the beta will affect the accumulation of consistent and replicable scientific evidence. We conclude with recommendations for improving bivariate hypothesis testing. PMID:24142838
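
    A toy simulation of the article's central point, with made-up data: a predictor can show a clear bivariate correlation with the criterion yet receive a near-zero beta once a correlated predictor enters the multiple regression.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 500
      x1 = rng.normal(size=n)
      x2 = 0.8 * x1 + 0.6 * rng.normal(size=n)        # x2 strongly related to x1
      y  = 1.0 * x1 + 0.0 * x2 + rng.normal(size=n)   # only x1 truly drives y

      r_x2_y = np.corrcoef(x2, y)[0, 1]               # bivariate correlation
      X = np.column_stack([np.ones(n), x1, x2])
      betas, *_ = np.linalg.lstsq(X, y, rcond=None)   # multiple-regression betas

      print(f"corr(x2, y) = {r_x2_y:.2f}")            # clearly nonzero
      print(f"beta for x2 = {betas[2]:.2f}")          # close to zero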

  4. Synchronization Analysis of Nonstationary Bivariate Time Series

    NASA Astrophysics Data System (ADS)

    Kurths, J.

    First the concept of synchronization in coupled complex systems is presented and it is shown that synchronization phenomena are abundant in science, nature, engineering etc. We use this concept to treat the inverse problem and to reveal interactions between oscillating systems from observational data. First it is discussed how time-varying phases and frequencies can be estimated from time series, and second, techniques for the detection and quantification of hidden synchronization are presented. We demonstrate that this technique is effective for the analysis of systems' interrelations from noisy nonstationary bivariate data and provides insights beyond traditional cross-correlation and spectral analysis. For this, model examples and geophysical data are discussed.

  5. Statistics

    Cancer.gov

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  6. Evolution of Collectivity in the N = 80 Isotones

    NASA Astrophysics Data System (ADS)

    Bauer, C.; Stegmann, R.; Rainovski, G.; Pietralla, N.; Blazhev, A.; Bönig, S.; Damyanova, A.; Danchev, M.; Gladnishki, K. A.; Lutter, R.; Marsh, B. A.; Möller, T.; Pakarinen, J.; Radeck, D.; Rapisarda, E.; Reiter, P.; Scheck, M.; Seidlitz, M.; Siebeck, B.; Stahl, C.; Thoele, P.; Thomas, T.; Thürauf, M.; Warr, N.; Werner, V.; de Witte, H.

    2015-11-01

    Recent data on transition strengths, namely the hitherto unknown B(E2) values of radioactive 140Nd and 142Sm in the N=80 isotones, have suggested that the proton 1g7/2 subshell closure at Z=58 has an impact on the properties of low-lying collective states. The unstable, neutron-rich nuclei 140Nd and 142Sm were investigated via projectile Coulomb excitation at the REX-ISOLDE facility at CERN with the high-purity germanium detector array MINIBALL. The measurement of 140Nd and the preliminary result for 142Sm demonstrate that the reduced collectivity of 138Ce is a local effect, possibly due to the Z=58 subshell closure, and call for refined theoretical calculations.

  7. Neutron-hole strength in the N = 81 isotones

    NASA Astrophysics Data System (ADS)

    Howard, A. M.; Freeman, S. J.; Schiffer, J. P.; Bloxham, T.; Clark, J. A.; Deibel, C. M.; Kay, B. P.; Parker, P. D.; Sharp, D. K.; Thomas, J. S.

    2012-09-01

    The distribution of neutron-hole strength has been studied in the N = 81 isotones 137Ba, 139Ce, 141Nd and 143Sm through the single-neutron removing reactions (p,d) and (3He,α), at energies of 23 and 34 MeV, respectively. Systematic cross section measurements were made at angles sensitive to the transferred angular momentum, and spectroscopic factors were extracted through a distorted-wave Born approximation analysis. Application of the MacFarlane-French sum rules indicates an anomalously low summed g7/2 spectroscopic factor, most likely due to extensive fragmentation of the single-particle strength. Single-particle energies, based upon the centroids of observed strength, are presented.

  8. Novel bivariate moment-closure approximations.

    PubMed

    Krishnarajah, Isthrinayagy; Marion, Glenn; Gibson, Gavin

    2007-08-01

    Nonlinear stochastic models are typically intractable to analytic solutions and hence, moment-closure schemes are used to provide approximations to these models. Existing closure approximations are often unable to describe transient aspects caused by extinction behaviour in a stochastic process. Recent work has tackled this problem in the univariate case. In this study, we address this problem by introducing novel bivariate moment-closure methods based on mixture distributions. Novel closure approximations are developed, based on the beta-binomial, zero-modified distributions and the log-Normal, designed to capture the behaviour of the stochastic SIS model with varying population size, around the threshold between persistence and extinction of disease. The idea of conditional dependence between variables of interest underlies these mixture approximations. In the first approximation, we assume that the distribution of infectives (I) conditional on population size (N) is governed by the beta-binomial and for the second form, we assume that I is governed by zero-modified beta-binomial distribution where in either case N follows a log-Normal distribution. We analyse the impact of coupling and inter-dependency between population variables on the behaviour of the approximations developed. Thus, the approximations are applied in two situations in the case of the SIS model where: (1) the death rate is independent of disease status; and (2) the death rate is disease-dependent. Comparison with simulation shows that these mixture approximations are able to predict disease extinction behaviour and describe transient aspects of the process.

  9. The bivariate Rogers-Szegö polynomials

    NASA Astrophysics Data System (ADS)

    Chen, William Y. C.; Saad, Husam L.; Sun, Lisa H.

    2007-06-01

    We present an operator approach to deriving Mehler's formula and the Rogers formula for the bivariate Rogers-Szegö polynomials hn(x, y|q). The proof of Mehler's formula can be considered as a new approach to the nonsymmetric Poisson kernel formula for the continuous big q-Hermite polynomials Hn(x; a|q) due to Askey, Rahman and Suslov. Mehler's formula for hn(x, y|q) involves a 3phi2 sum and the Rogers formula involves a 2phi1 sum. The proofs of these results are based on parameter augmentation with respect to the q-exponential operator and the homogeneous q-shift operator in two variables. By extending recent results on the Rogers-Szegö polynomials hn(x|q) due to Hou, Lascoux and Mu, we obtain another Rogers-type formula for hn(x, y|q). Finally, we give a change of base formula for Hn(x; a|q) which can be used to evaluate some integrals by using the Askey-Wilson integral.

  10. Determining the bivariate brightness distribution of galaxies.

    NASA Astrophysics Data System (ADS)

    Boyce, P. J.; Phillipps, S.

    1995-04-01

    In this paper we describe a set of criteria which we propose a sample of galaxies must satisfy if it is to be useful for determining the bivariate brightness distribution (BBD) of galaxies in luminosity and surface brightness and we consider the prospects for deriving such a sample. First, we note that determinations of the galaxy luminosity function can be seriously in error if surface brightness (visibility) selection effects are ignored. We suggest that a determination of the BBD is a more physically useful aim. A straightforward way to obtain the BBD would be to determine a luminosity function in a set of narrow surface brightness bins. We propose a set of criteria which the sample of galaxies in each surface brightness bin must satisfy if it is to be reliably used in such a determination. Each sample should be restricted to a well defined range in morphological type, the measured isophotal size and magnitude and the surface brightness of each galaxy should be corrected to a common galactic inclination, all galaxies should have measured redshifts and the sample should be complete to a known isophotal size and/or magnitude. We then describe a rigorous method for selecting samples which satisfy these criteria from existing catalogues of galaxies. We apply this method to the ESO-LV catalogue and find that from the initial sample of 11000 galaxies with a disk component we can only find 5 subsamples in half-magnitude wide surface brightness bins which satisfy our proposed criteria. The largest derived subsample contains only 27 galaxies, far too few to determine a luminosity function at its surface brightness. We suggest that had our proposed criteria been applied to the samples used in previous determinations of the BBD or the galaxy luminosity function then sample sizes would have been greatly reduced. For this reason, we suggest that the conclusions of previous work should be treated with caution.

  11. A Vehicle for Bivariate Data Analysis

    ERIC Educational Resources Information Center

    Roscoe, Matt B.

    2016-01-01

    Instead of reserving the study of probability and statistics for special fourth-year high school courses, the Common Core State Standards for Mathematics (CCSSM) takes a "statistics for all" approach. The standards recommend that students in grades 6-8 learn to summarize and describe data distributions, understand probability, draw…

  12. Dose-schedule finding in phase I/II clinical trials using a Bayesian isotonic transformation

    PubMed Central

    Li, Yisheng; Bekele, B. Nebiyou; Ji, Yuan; Cook, John D.

    2015-01-01

    Summary A dose-schedule-finding trial is a new type of oncology trial in which investigators aim to find a combination of dose and treatment schedule that has a large probability of efficacy yet a relatively small probability of toxicity. We demonstrate that a major difference between traditional dose-finding and dose-schedule-finding trials is that while the toxicity probabilities follow a simple nondecreasing order in dose-finding trials, those of dose-schedule-finding trials may adhere to a matrix order. We show that the success of a dose-schedule-finding method requires careful statistical modeling and a sensible dose-schedule allocation scheme. We propose a Bayesian hierarchical model that jointly models the unordered probabilities of toxicity and efficacy, and apply a Bayesian isotonic transformation to the posterior samples of the toxicity probabilities, so that the transformed posterior samples adhere to the matrix order constraints. Based on the joint posterior distribution of the order-constrained toxicity probabilities and the unordered efficacy probabilities, we develop a dose-schedule-finding algorithm that sequentially allocates patients to the best dose-schedule combination under certain criteria. We illustrate our methodology through its application to a clinical trial in leukemia, and compare it to two alternative approaches. PMID:18563789

  13. Simultaneous estimation of parameters in the bivariate Emax model.

    PubMed

    Magnusdottir, Bergrun T; Nyquist, Hans

    2015-12-10

    In this paper, we explore inference in multi-response, nonlinear models. By multi-response, we mean models with m > 1 response variables and accordingly m relations. Each parameter/explanatory variable may appear in one or more of the relations. We study a system estimation approach for simultaneous computation and inference of the model and (co)variance parameters. For illustration, we fit a bivariate Emax model to diabetes dose-response data. Further, the bivariate Emax model is used in a simulation study that compares the system estimation approach to equation-by-equation estimation. We conclude that overall, the system estimation approach performs better for the bivariate Emax model when there are dependencies among relations. The stronger the dependencies, the more we gain in precision by using system estimation rather than equation-by-equation estimation.
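
    For orientation only, the sketch below fits a single Emax relation by nonlinear least squares; the paper's system (bivariate) estimation fits two such relations jointly with a common covariance structure, which is beyond this illustration, and the doses and responses shown are invented.

      # Fit E(d) = E0 + Emax * d / (ED50 + d) to a hypothetical dose-response set.
      import numpy as np
      from scipy.optimize import curve_fit

      def emax(dose, e0, emax_, ed50):
          return e0 + emax_ * dose / (ed50 + dose)

      dose = np.array([0.0, 1.0, 2.5, 5.0, 10.0, 20.0, 40.0])
      resp = np.array([0.2, 1.1, 2.0, 3.1,  4.0,  4.6,  5.0])   # hypothetical

      params, cov = curve_fit(emax, dose, resp, p0=[0.0, 5.0, 5.0])
      e0, emax_hat, ed50 = params
      print(f"E0 = {e0:.2f}, Emax = {emax_hat:.2f}, ED50 = {ed50:.2f}")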

  14. Probabilistic floodplain hazard mapping: managing uncertainty by using a bivariate approach for flood frequency analysis

    NASA Astrophysics Data System (ADS)

    Candela, Angela; Tito Aronica, Giuseppe

    2014-05-01

    Floods are a global problem and are considered the most frequent natural disaster world-wide. Many studies show that the severity and frequency of floods have increased in recent years and underline the difficulty of separating the effects of natural climatic changes from human influences such as land management practices, urbanization, etc. Flood risk analysis and assessment is required to provide information on current or future flood hazard and risks in order to accomplish flood risk mitigation, and to propose, evaluate and select measures to reduce it. Both components of risk can be mapped individually and are affected by multiple uncertainties, as is the joint estimate of flood risk. Major sources of uncertainty include the statistical analysis of extreme events, the definition of the hydrological input, the representation of channel and floodplain topography, and the choice of effective hydraulic roughness coefficients. The classical procedure for estimating the flood discharge for a chosen probability of exceedance is to use a rainfall-runoff model, associating with the risk the same return period as the original rainfall, in accordance with the iso-frequency criterion. Alternatively, a flood frequency analysis is applied to a given record of discharge data, but again the same probability is associated with the flood discharges and the respective risk. Moreover, since flood peaks and corresponding flood volumes are variables of the same phenomenon, they should be directly correlated and, consequently, multivariate statistical analyses must be applied. This study presents an innovative approach to obtain flood hazard maps where the hydrological input (synthetic flood design event) to a 2D hydraulic model has been defined by generating flood peak discharges and volumes from: a) a classical univariate approach, b) a bivariate statistical analysis, through the use of copulas. The univariate approach considers flood hydrograph generation by an indirect approach (rainfall-runoff transformation using input rainfall

  15. Farlie-Gumbel-Morgenstern bivariate densities: Are they applicable in hydrology?

    NASA Astrophysics Data System (ADS)

    Long, D.; Krzysztofowicz, R.

    1992-03-01

    Certain bivariate densities constructed from marginals have recently been suggested as models of hydrologic variates such as rainfall intensity and depth. It is pointed out that (i) these densities belong to the families of the Farlie-Gumbel-Morgenstern densities and the Farlie polynomial densities, which have been extensively studied in the statistical literature, and that (ii) these densities have a limited potential applicability in hydrology since they can model only weakly associated variates, whose product-moment correlation R is within the range |R| ≤ 1/3 under the first family of densities, and |R| ≤ 1/2 under the second family.
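
    For reference, the FGM family referred to above has the standard one-parameter form below (a textbook expression, not quoted from the paper); with uniform marginals the product-moment correlation equals α/3, and since |α| ≤ 1 the attainable correlation cannot exceed 1/3 in magnitude.

      % Farlie-Gumbel-Morgenstern (FGM) copula and its dependence range
      C_{\alpha}(u,v) = u\,v\,\bigl[1 + \alpha\,(1-u)(1-v)\bigr],
      \qquad |\alpha| \le 1,
      \qquad \rho = \frac{\alpha}{3}\ \text{(uniform marginals)} .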

  16. Constructing a bivariate distribution function with given marginals and correlation: application to the galaxy luminosity function

    NASA Astrophysics Data System (ADS)

    Takeuchi, Tsutomu T.

    2010-08-01

    We provide an analytic method to construct a bivariate distribution function (DF) with given marginal distributions and correlation coefficient. We introduce a convenient mathematical tool, called a copula, to connect two DFs with any prescribed dependence structure. If the correlation of two variables is weak (Pearson's correlation coefficient |ρ| < 1/3), the Farlie-Gumbel-Morgenstern (FGM) copula provides an intuitive and natural way to construct such a bivariate DF. When the linear correlation is stronger, the FGM copula cannot work anymore. In this case, we propose using a Gaussian copula, which connects two given marginals and is directly related to the linear correlation coefficient between two variables. Using the copulas, we construct the bivariate luminosity function (BLF) and discuss its statistical properties. We focus especially on the far-ultraviolet-far-infrared (FUV-FIR) BLF, since these two wavelength regions are related to star-formation (SF) activity. Though both the FUV and FIR are related to SF activity, the univariate LFs have a very different functional form: the former is well described by the Schechter function whilst the latter has a much more extended power-law-like luminous end. We construct the FUV-FIR BLFs using the FGM and Gaussian copulas with different strengths of correlation, and examine their statistical properties. We then discuss some further possible applications of the BLF: the problem of a multiband flux-limited sample selection, the construction of the star-formation rate (SFR) function, and the construction of the stellar mass of galaxies (M*)-specific SFR (SFR/M*) relation. The copulas turn out to be a very useful tool to investigate all these issues, especially for including complicated selection effects.
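
    A minimal sketch of the Gaussian-copula construction: draw correlated normals, map them to uniforms, and push them through the inverse CDFs of two chosen marginals. The lognormal and gamma marginals below are placeholders, not the Schechter or FIR luminosity functions of the paper.

      import numpy as np
      from scipy import stats

      rho, n = 0.7, 10_000
      rng = np.random.default_rng(3)

      cov = np.array([[1.0, rho], [rho, 1.0]])
      z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n)

      u = stats.norm.cdf(z)                       # uniforms with Gaussian dependence
      x = stats.lognorm.ppf(u[:, 0], s=1.0)       # marginal 1 (placeholder)
      y = stats.gamma.ppf(u[:, 1], a=2.0)         # marginal 2 (placeholder)

      # The rank correlation of (x, y) reflects the copula, not the marginals.
      print(stats.spearmanr(x, y)[0])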

  17. Evaluating Univariate, Bivariate, and Multivariate Normality Using Graphical Procedures.

    ERIC Educational Resources Information Center

    Burdenski, Thomas K., Jr.

    This paper reviews graphical and nongraphical procedures for evaluating multivariate normality by guiding the reader through univariate and bivariate procedures that are necessary, but insufficient, indications of a multivariate normal distribution. A data set using three dependent variables for two groups provided by D. George and P. Mallery…

  18. Univariate and Bivariate Loglinear Models for Discrete Test Score Distributions.

    ERIC Educational Resources Information Center

    Holland, Paul W.; Thayer, Dorothy T.

    2000-01-01

    Applied the theory of exponential families of distributions to the problem of fitting the univariate histograms and discrete bivariate frequency distributions that often arise in the analysis of test scores. Considers efficient computation of the maximum likelihood estimates of the parameters using Newton's Method and computationally efficient…

  19. A New Measure Of Bivariate Asymmetry And Its Evaluation

    SciTech Connect

    Ferreira, Flavio Henn; Kolev, Nikolai Valtchev

    2008-11-06

    In this paper we propose a new measure of bivariate asymmetry, based on conditional correlation coefficients. A decomposition of the Pearson correlation coefficient in terms of its conditional versions is studied and an example of application of the proposed measure is given.

  20. Multimodal Bivariate Thematic Maps: Auditory and Haptic Display.

    ERIC Educational Resources Information Center

    Jeong, Wooseob; Gluck, Myke

    2002-01-01

    Explores the possibility of multimodal bivariate thematic maps by utilizing auditory and haptic (sense of touch) displays. Measured completion time of tasks and the recall (retention) rate in two experiments, and findings confirmed the possibility of using auditory and haptic displays in geographic information systems (GIS). (Author/LRW)

  1. Modelling of Uncertainty and Bi-Variable Maps

    NASA Astrophysics Data System (ADS)

    Nánásiová, Ol'ga; Pykacz, Jarosław

    2016-05-01

    The paper gives an overview of and compares various bi-variable maps from orthomodular lattices into the unit interval. It focuses mainly on those bi-variable maps that may be used for constructing joint probability distributions for random variables which are not defined on the same Boolean algebra.

  2. ASURV: Astronomical SURVival Statistics

    NASA Astrophysics Data System (ADS)

    Feigelson, E. D.; Nelson, P. I.; Isobe, T.; LaValley, M.

    2014-06-01

    ASURV (Astronomical SURVival Statistics) provides astronomy survival analysis for right- and left-censored data including the maximum-likelihood Kaplan-Meier estimator and several univariate two-sample tests, bivariate correlation measures, and linear regressions. ASURV is written in FORTRAN 77, and is stand-alone and does not call any specialized libraries.
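
    ASURV itself is the FORTRAN 77 package described above; purely to illustrate the Kaplan-Meier product-limit estimator it implements, here is a Python sketch using the third-party lifelines package (assumed installed). Astronomical upper limits are left-censored and are usually handled by flipping the variable first; this sketch only shows the right-censored mechanics, with invented fluxes and detection flags.

      import numpy as np
      from lifelines import KaplanMeierFitter

      flux     = np.array([2.1, 3.4, 1.2, 5.6, 0.9, 4.3, 2.8])
      detected = np.array([1,   1,   0,   1,   0,   1,   1])   # 0 = censored limit

      kmf = KaplanMeierFitter()
      kmf.fit(durations=flux, event_observed=detected)
      print(kmf.survival_function_)       # estimated survival curve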

  3. Work capacity during 30 days of bed rest with isotonic and isokinetic exercise training

    NASA Technical Reports Server (NTRS)

    Greenleaf, J. E.; Bernauer, E. M.; Ertl, A. C.; Trowbridge, T. S.; Wade, C. E.

    1989-01-01

    Results are presented from a study to determine whether or not short-term variable-intensity isotonic and intermittent high-intensity isokinetic short-duration leg exercise is effective for the maintenance of peak oxygen uptake (peak VO2) and muscular strength and endurance, respectively, during 30 days of -6 deg head-down bed rest deconditioning. The results show no significant changes in leg peak torque, leg mean total work, arm total peak torque, or arm mean total work for members of the isotonic, isokinetic, and control groups. Changes are observed, however, in peak VO2 levels. The results suggest that near-peak variable-intensity isotonic leg exercise maintains peak VO2 during 30 days of bed rest, while the peak intermittent isokinetic leg exercise protocol does not.

  4. On limit relations between some families of bivariate hypergeometric orthogonal polynomials

    NASA Astrophysics Data System (ADS)

    Area, I.; Godoy, E.

    2013-01-01

    In this paper we deal with limit relations between bivariate hypergeometric polynomials. We analyze the limit relation from trinomial distribution to bivariate Gaussian distribution, obtaining the limit transition from the second-order partial difference equation satisfied by bivariate hypergeometric Kravchuk polynomials to the second-order partial differential equation verified by bivariate hypergeometric Hermite polynomials. As a consequence the limit relation between both families of orthogonal polynomials is established. A similar analysis between bivariate Hahn and bivariate Appell orthogonal polynomials is also presented.

  5. Effects of isotonic and isometric exercises with mist sauna bathing on cardiovascular, thermoregulatory, and metabolic functions.

    PubMed

    Iwase, Satoshi; Kawahara, Yuko; Nishimura, Naoki; Nishimura, Rumiko; Sugenoya, Junichi; Miwa, Chihiro; Takada, Masumi

    2014-08-01

    To clarify the effects of isometric and isotonic exercise during mist sauna bathing on the cardiovascular function, thermoregulatory function, and metabolism, six healthy young men (22 ± 1 years old, height 173 ± 4 cm, weight 65.0 ± 5.0 kg) were exposed to a mist sauna for 10 min at a temperature of 40 °C, and relative humidity of 100 % while performing or not performing ∼30 W of isometric or isotonic exercise. The effect of the exercise was assessed by measuring tympanic temperature, heart rate, systolic and diastolic blood pressure, chest sweat rate, chest skin blood flow, and plasma catecholamine and cortisol, glucose, lactate, and free fatty acid levels. Repeated measures ANOVA showed no significant differences in blood pressure, skin blood flow, sweat rate, and total amount of sweating. Tympanic temperature increased more during isotonic exercise, and heart rate increase was more marked during isotonic exercise. The changes in lactate indicated that fatigue was not very great during isometric exercise. The glucose level indicated greater energy expenditure during isometric exercise. The free fatty acid and catecholamine levels indicated that isometric exercise did not result in very great energy expenditure and stress, respectively. The results for isotonic exercise of a decrease in lactate level and an increase in plasma free fatty acid level indicated that fatigue and energy expenditure were rather large while the perceived stress was comparatively low. We concluded that isotonic exercise may be a more desirable form of exercise during mist sauna bathing given the changes in glucose and free fatty acid levels.

  6. Effects of isotonic and isometric exercises with mist sauna bathing on cardiovascular, thermoregulatory, and metabolic functions

    NASA Astrophysics Data System (ADS)

    Iwase, Satoshi; Kawahara, Yuko; Nishimura, Naoki; Nishimura, Rumiko; Sugenoya, Junichi; Miwa, Chihiro; Takada, Masumi

    2014-08-01

    To clarify the effects of isometric and isotonic exercise during mist sauna bathing on the cardiovascular function, thermoregulatory function, and metabolism, six healthy young men (22 ± 1 years old, height 173 ± 4 cm, weight 65.0 ± 5.0 kg) were exposed to a mist sauna for 10 min at a temperature of 40 °C, and relative humidity of 100 % while performing or not performing ˜30 W of isometric or isotonic exercise. The effect of the exercise was assessed by measuring tympanic temperature, heart rate, systolic and diastolic blood pressure, chest sweat rate, chest skin blood flow, and plasma catecholamine and cortisol, glucose, lactate, and free fatty acid levels. Repeated measures ANOVA showed no significant differences in blood pressure, skin blood flow, sweat rate, and total amount of sweating. Tympanic temperature increased more during isotonic exercise, and heart rate increase was more marked during isotonic exercise. The changes in lactate indicated that fatigue was not very great during isometric exercise. The glucose level indicated greater energy expenditure during isometric exercise. The free fatty acid and catecholamine levels indicated that isometric exercise did not result in very great energy expenditure and stress, respectively. The results for isotonic exercise of a decrease in lactate level and an increase in plasma free fatty acid level indicated that fatigue and energy expenditure were rather large while the perceived stress was comparatively low. We concluded that isotonic exercise may be a more desirable form of exercise during mist sauna bathing given the changes in glucose and free fatty acid levels.

  7. Microscopic study of deformation systematics in some isotones in the A ≈ 100 mass region

    NASA Astrophysics Data System (ADS)

    Bharti, Arun; Sharma, Chetan; Singh, Suram; Khosa, S. K.

    2012-09-01

    Variation after projection (VAP) calculations in conjunction with the Hartree-Bogoliubov (HB) ansatz have been carried out for the N=60, 62 isotones in the mass region A=100. In this framework, the yrast spectra with Jπ ≥ 10+, B(E2) transition probabilities, the quadrupole deformation parameter and occupation numbers for various shell model orbits have been obtained. The results of the calculations indicate that the simultaneous increase in polarization of the p1/2, p3/2 and f5/2 proton sub-shells is a significant factor in the development of deformation in the neutron-rich isotones in the mass region A=100.

  8. Half-lives of N = 126 Isotones and the r-Process

    SciTech Connect

    Suzuki, Toshio; Yoshida, Takashi; Utsuno, Yutaka

    2010-08-12

    Beta decays of the N = 126 isotones are studied by shell model calculations. Both the Gamow-Teller (GT) and first-forbidden (FF) transitions are taken into account to evaluate the half-lives of the isotones (Z = 64-72), using shell model interactions based on the G-matrix. The FF transitions are found to be important, reducing the half-lives by factors of two to several relative to those obtained from the GT contributions only. Possible implications of the short half-lives of the waiting-point nuclei for r-process nucleosynthesis during supernova explosions are discussed.

  9. Acceptance of isotonic and hypotonic rehydrating beverages by athletes during training.

    PubMed

    Décombaz, J; Gmünder, B; Daget, N; Munoz-Box, R; Howald, H

    1992-01-01

    This study compared the acceptance of two beverages (5% carbohydrate) of distinct osmolarities (hypotonic, 180 mOsm/kg and isotonic, 295 mOsm/kg) during the usual training practice of 97 athletes. A quantitative sensory profile by independent tasters ensured that organoleptic recognition would be unlikely during the tests. Each drink was consumed ad libitum during 3 different training sessions, at home. At each session, a subjective appreciation of hedonic and post-ingestive physiological effects (6 criteria) was obtained by means of a questionnaire. At the end of the experiment, the athletes were asked to express a preference for one of the "six" drinks. More athletes (blindly) chose the isotonic compared to the hypotonic drink (p = 0.03). This difference was not due intrinsically to the drinks, which the subjects were unable to distinguish on any of the criteria, but was related to certain aspects of the consumer's characteristics. Both groups had different drinking practices: the subjects choosing the isotonic beverage drank less before (p = 0.001) and more during (p = 0.013) the exercise. Age, sex, dimensions or type of physical activity (i.e. endurance vs speed/strength disciplines) were unrelated to the preference, except perhaps the duration of habitual exercise (p less than 0.05). We concluded that athletes, although unable to distinguish a hypotonic from an isotonic drink, may have specific habits and/or personal characteristics prompting them to favour one of them.

  10. Nebulized Isotonic Saline versus Water following a Laryngeal Desiccation Challenge in Classically Trained Sopranos

    ERIC Educational Resources Information Center

    Tanner, Kristine; Roy, Nelson; Merrill, Ray M.; Muntz, Faye; Houtz, Daniel R.; Sauder, Cara; Elstad, Mark; Wright-Costa, Julie

    2010-01-01

    Purpose: To examine the effects of nebulized isotonic saline (IS) versus sterile water (SW) on self-perceived phonatory effort (PPE) and phonation threshold pressure (PTP) following a surface laryngeal dehydration challenge in classically trained sopranos. Method: In a double-blind, within-subject crossover design, 34 sopranos breathed dry air…

  11. Monoclinic sphere packings. I. Invariant, univariant and bivariant lattice complexes.

    PubMed

    Sowa, Heidrun; Fischer, Werner

    2016-05-01

    All homogeneous sphere packings were derived that refer to the two invariant, the four univariant and the three bivariant lattice complexes belonging to the monoclinic crystal system. In total, sphere packings of 29 types have been found. Only for five types is the maximal inherent symmetry of their sphere packings monoclinic whereas the inherent symmetry is orthorhombic for nine types, tetragonal for five types, hexagonal for six types and cubic for four types. PMID:27126112

  12. The Novel Properties and Construction of Multi-scale Matrix-valued Bivariate Wavelet wraps

    NASA Astrophysics Data System (ADS)

    Zhang, Hai-mo

    In this paper, we introduce a matrix-valued multi-resolution structure and matrix-valued bivariate wavelet wraps. A constructive method for semi-orthogonal matrix-valued bivariate wavelet wraps is presented. Their properties have been characterized by using the time-frequency analysis method, the unitary extension principle and operator theory. The direct decomposition relation is obtained.

  13. Modeling continuous covariates with a "spike" at zero: Bivariate approaches.

    PubMed

    Jenkner, Carolin; Lorenz, Eva; Becher, Heiko; Sauerbrei, Willi

    2016-07-01

    In epidemiology and clinical research, predictors often take value zero for a large amount of observations while the distribution of the remaining observations is continuous. These predictors are called variables with a spike at zero. Examples include smoking or alcohol consumption. Recently, an extension of the fractional polynomial (FP) procedure, a technique for modeling nonlinear relationships, was proposed to deal with such situations. To indicate whether or not a value is zero, a binary variable is added to the model. In a two stage procedure, called FP-spike, the necessity of the binary variable and/or the continuous FP function for the positive part are assessed for a suitable fit. In univariate analyses, the FP-spike procedure usually leads to functional relationships that are easy to interpret. This paper introduces four approaches for dealing with two variables with a spike at zero (SAZ). The methods depend on the bivariate distribution of zero and nonzero values. Bi-Sep is the simplest of the four bivariate approaches. It uses the univariate FP-spike procedure separately for the two SAZ variables. In Bi-D3, Bi-D1, and Bi-Sub, proportions of zeros in both variables are considered simultaneously in the binary indicators. Therefore, these strategies can account for correlated variables. The methods can be used for arbitrary distributions of the covariates. For illustration and comparison of results, data from a case-control study on laryngeal cancer, with smoking and alcohol intake as two SAZ variables, is considered. In addition, a possible extension to three or more SAZ variables is outlined. A combination of log-linear models for the analysis of the correlation in combination with the bivariate approaches is proposed. PMID:27072783

  14. Computational approach to Thornley's problem by bivariate operational calculus

    NASA Astrophysics Data System (ADS)

    Bazhlekova, E.; Dimovski, I.

    2012-10-01

    Thornley's problem is an initial-boundary value problem with a nonlocal boundary condition for the linear one-dimensional reaction-diffusion equation, used as a mathematical model of spiral phyllotaxis in botany. Applying a bivariate operational calculus, we find an explicit representation of the solution, containing two convolution products of special solutions and the arbitrary initial and boundary functions. We use a non-classical convolution with respect to the space variable, extending in this way the classical Duhamel principle. The special solutions involved are represented in the form of fast convergent series. Numerical examples are considered to show the application of the present technique and to analyze the character of the solution.

  15. Bivariate distribution, correlation, and transformation properties of two-color infrared systems.

    PubMed

    Clow, R; McNolty, F

    1974-05-01

    A two-dimensional (two-color) statistical structure is formulated that is applicable to pattern recognition, discrimination, and detection problems occurring in infrared signal-processing systems. The methodology relates physical quantities such as the temperature T of an object, its projected area A, emissivity ε, range R from the sensor, and noise equivalent flux density (NEFD) to the geometry of a local orthogonal coordinate system where the coordinate axes correspond to the apparent radiant intensity J in each micron bandwidth. The bivariate distribution, correlation, and transformation properties attendant to this framework are discussed in detail. Additional insight into the structure of the problem is achieved by investigating the two-color system in terms of a nonorthogonal local coordinate system. The various results presented in the paper may be extended to three-, four-, or five-color systems by direct analogies.

  16. Quality and microbial safety evaluation of new isotonic beverages upon thermal treatments.

    PubMed

    Gironés-Vilaplana, Amadeo; Huertas, Juan-Pablo; Moreno, Diego A; Periago, Paula M; García-Viguera, Cristina

    2016-03-01

    In the present study, we evaluated how two different thermal treatments (Mild and Severe) affect the anthocyanin content, antioxidant capacity (ABTS(+), DPPH, and FRAP), quality (CIELAB colour parameters), and microbiological safety of a new isotonic drink made of lemon and maqui berry over a simulated commercial storage with a shelf life of 56 days at two storage temperatures (7 °C and 37 °C). Neither heat treatment drastically affected the anthocyanin content or its percentage of retention. The antioxidant capacity, probably because of the short treatment time, was also not affected. The CIELAB colour parameters were affected by the heat, although the isotonic drinks retained an attractive red colour during shelf life. From a microbiological point of view, the Mild heat treatment with storage at 7 °C is ideal for controlling microbial growth, and is therefore useful for keeping the quality and safety of the beverages during their commercial life.

  17. Microscopic investigation of structural evolution in even-even N = 60 isotones

    SciTech Connect

    Oudih, M. R.; Fellah, M.; Allal, N. H.; Benhamouda, N.

    2012-10-20

    The ground state properties of even-even N=60 isotones from the neutron-rich to the proton-rich side are investigated within the self-consistent Skyrme-Hartree-Fock-Bogoliubov theory in the triaxial landscape. Quantities such as binding energies and root-mean-square radii are investigated and compared with available experimental data. The evolution of the potential energy surfaces in the (β,γ) deformation plane is presented and discussed.

  18. Isotonic contraction as a result of cooperation of sarcomeres--a model and simulation outcome.

    PubMed

    Wünsch, Z

    1996-01-01

    The molecular level of the functional structure of the contractile apparatus of cross-striated muscle has been mapped out almost minutely. Most authors accept the basic principles of the theory of sliding filaments and the theory of operation of molecular generators of force, which, of course, are progressively updated by integrating new knowledge. The idea of the model delineated below does not contradict these theories, for it refers to another level of the system's hierarchy. The definition of the system, hereafter referred to as the Ideal Sarcomere (IS), takes into account the fact that, during isotonic contraction, a large number of sarcomeres and molecular generators of force, which do not work wholly independently, is active in a synergistic way. The shortening velocity of the isotonically contracting IS is determined by the relation between quantities describing the different tasks of the active generators of force and the influence of the system parameters. Although the IS is derived from simple axiomatic predicates, it has properties which were not premeditated in defining the system and which, in spite of this, correspond to some properties of the biological original. The equations of the system allow us to calculate the shortening velocity of 'isotonic contraction' and other variables and parameters and show, inter alia, an alternative way to derive and interpret the relations stated in Hill's force-velocity equation. The simulation results indicate that the macroscopic manifestations of isotonic contraction may also be contingent on the properties of the cooperating system of the multitude of sarcomeres, which also constitutes one part of the functional structure of muscle. PMID:8924648

  19. Internal dose assessment for 211At α-emitter in isotonic solution as radiopharmaceutical

    NASA Astrophysics Data System (ADS)

    Yuminov, O. A.; Fotina, O. V.; Priselkova, A. B.; Tultaev, A. V.; Platonov, S. Yu.; Eremenko, D. O.; Drozdov, V. A.

    2003-12-01

    The functional fitness of the α-emitter 211At for radiotherapy of thyroid gland cancer is evaluated. Radiation doses are calculated using the MIRD method and previously obtained pharmacokinetic data for 211At in isotonic solution and for 123I as sodium iodide. Analysis of the 211At radiation dose to the thyroid gland suggests that this radiopharmaceutical may be used predominantly for the treatment of thyroid cancer.

  20. The stability of citrate-capped silver nanoparticles in isotonic glucose solution for intravenous injection.

    PubMed

    Park, Kwangsik; Lee, Yeonjin

    2013-01-01

    Citrate-capped silver nanoparticles (AgNP) are widely used in industry, consumer products, and medical appliances. However, information on their environmental toxicity and effects on human health is not comprehensive. Further, the physicochemical properties of AgNP make toxicity testing difficult, as the size of the nanoparticles may increase by aggregation or agglomeration in some administration vehicles. In this study, the stability of AgNP was investigated in different types of isotonic solutions, which is important for in vitro testing or toxicokinetic studies using intravenous (iv) injection. Size, morphology, zeta potential, and ion formation were investigated in isotonic solutions for the physicochemical characterization of AgNP. Aggregation and precipitation of AgNP were observed in phosphate-buffered saline or 0.9% NaCl, while AgNP were stable, without aggregation or precipitation, in isotonic 5% glucose solution. The average size of AgNP in 5% glucose was approximately 10 nm at temperatures of 10, 25, or 36°C and at concentrations varying from 10 to 1000 ppm. It is noteworthy that this is almost the same size distribution as that in the water-based suspension of AgNP supplied by the manufacturer. The zeta potential ranged from -40 to -60 mV, suggesting that the repulsive forces between AgNP are not disturbed to a degree sufficient to cause aggregation, while the osmolarity is in the isotonic range of 290 ± 10 mOsm/kg in 5% glucose solution. The data suggest that AgNP in a 5% glucose solution may be used in toxicity tests via iv injection without adverse consequences in blood. PMID:24283395

  1. Effect of isotonic and isokinetic exercise on muscle activity and balance of the ankle joint

    PubMed Central

    Kim, Mi-Kyoung; Yoo, Kyung-Tae

    2015-01-01

    [Purpose] This study was performed to examine how the balance of the lower limbs and the muscle activities of the tibialis anterior (TA), the medial gastrocnemius (GCM), and the peroneus longus (PL) are influenced by isotonic and isokinetic exercise of the ankle joint. [Subjects] The subjects of this study were healthy adults (n=20), and they were divided into two groups (isotonic=10, isokinetic=10). [Methods] The isotonic group performed 3 sets of 10 contractions at 50% of MVIC, and the isokinetic group performed 3 sets at 60°/sec. Muscle activity was measured by EMG and balance was measured by the one-leg standing test. [Results] For muscle activity, a main effect of group was found in the non-dominant TA, and in the dominant TA, GCM and PL. For balance, a main effect of time was found in both groups for the sway area measured when support was provided by the non-dominant side. [Conclusion] In terms of muscle activity, the two groups showed a significant difference, and the isokinetic group showed higher muscle activities. In terms of balance, there was a significant difference between the pre-test and the post-test. The results of this study may help in the selection of exercises for physical therapy, because they show that muscle activity and balance vary according to the type of exercise. PMID:25729181

  2. Comparison of complex permittivities of isotonic colloids containing single-wall carbon nanotubes of varying chirality.

    PubMed

    Nair, Tejas; Symanowski, James T; Gach, H Michael

    2012-02-01

    The application of bio-compatible, conductive nanoparticles in combination with radiofrequency (RF) irradiation to raise tissue temperatures between 40 and 60 °C for hyperthermia and ablation spurred interest in the complex permittivities of isotonic nanoparticle-based colloids. Nanoparticles with large aspect ratios and high permittivities increase the bulk permittivity of the colloid and RF losses at the macroscopic scale. The complex permittivities of isotonic colloids with and without single-wall carbon nanotubes (SWCNTs) containing either metallic, semiconducting, or mixed chiralities were measured from 20 MHz to 1 GHz at room temperature. The colloids were made with one of three different isotonic solvents: phosphate buffered saline (PBS), and Dulbecco's modified eagle medium (DMEM) with and without 0.5% weight/volume bovine serum albumin to simulate cytosol and blood, respectively. The concentration of elemental carbon from the SWCNTs in the colloids ranged from 16 to 17 mM. The permittivities were corrected for electrode polarization effects by fitting the data to the Cole-Cole relaxation model with a constant phase angle element. The presence of SWCNTs increased both the real and imaginary components of the permittivities of the colloids. For all three solvents, the direct current (DC) components of the real and imaginary permittivities were greatest for the colloids containing the mixed chirality SWCNTs, followed by the colloids with semiconducting SWCNTs, and then metallic SWCNTs.
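
    For orientation, one common parameterization of the Cole-Cole relaxation model with a dc-conductivity term and a constant-phase-angle (CPA) electrode-polarization element is given below; this is a standard textbook form stated as an assumption, not the paper's exact fitting function.

      % Cole-Cole relaxation with dc conductivity; a constant-phase-angle (CPA)
      % element is often added to model electrode polarization at low frequency.
      \varepsilon^{*}(\omega) = \varepsilon_{\infty}
        + \frac{\varepsilon_{s}-\varepsilon_{\infty}}{1+(i\omega\tau)^{\,1-\alpha}}
        - \frac{i\,\sigma_{\mathrm{dc}}}{\omega\varepsilon_{0}},
      \qquad
      Z_{\mathrm{CPA}}(\omega) = \frac{1}{Q\,(i\omega)^{n}} .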

  3. A bivariate survival model with compound Poisson frailty.

    PubMed

    Wienke, A; Ripatti, S; Palmgren, J; Yashin, A

    2010-01-30

    A correlated frailty model is suggested for analysis of bivariate time-to-event data. The model is an extension of the correlated power variance function (PVF) frailty model (correlated three-parameter frailty model) (J. Epidemiol. Biostat. 1999; 4:53-60). It is based on a bivariate extension of the compound Poisson frailty model in univariate survival analysis (Ann. Appl. Probab. 1992; 4:951-972). It allows for a non-susceptible fraction (of zero frailty) in the population, overcoming the common assumption in survival analysis that all individuals are susceptible to the event under study. The model contains the correlated gamma frailty model and the correlated inverse Gaussian frailty model as special cases. A maximum likelihood estimation procedure for the parameters is presented and its properties are studied in a small simulation study. This model is applied to breast cancer incidence data of Swedish twins. The proportion of women susceptible to breast cancer is estimated to be 15 per cent.

  4. A Bayesian semiparametric model for bivariate sparse longitudinal data.

    PubMed

    Das, Kiranmoy; Li, Runze; Sengupta, Subhajit; Wu, Rongling

    2013-09-30

    Mixed-effects models have recently become popular for analyzing sparse longitudinal data that arise naturally in biological, agricultural and biomedical studies. Traditional approaches assume independent residuals over time and explain the longitudinal dependence by random effects. However, when bivariate or multivariate traits are measured longitudinally, this fundamental assumption is likely to be violated because of intertrait dependence over time. We provide a more general framework where the dependence of the observations from the same subject over time is not assumed to be explained completely by the random effects of the model. We propose a novel, mixed model-based approach and estimate the error-covariance structure nonparametrically under a generalized linear model framework. We use penalized splines to model the general effect of time, and we consider a Dirichlet process mixture of normal prior for the random-effects distribution. We analyze blood pressure data from the Framingham Heart Study where body mass index, gender and time are treated as covariates. We compare our method with traditional methods including parametric modeling of the random effects and independent residual errors over time. We conduct extensive simulation studies to investigate the practical usefulness of the proposed method. The current approach is very helpful in analyzing bivariate irregular longitudinal traits. PMID:23553747

  5. Predicting the Size of Sunspot Cycle 24 on the Basis of Single- and Bi-Variate Geomagnetic Precursor Methods

    NASA Technical Reports Server (NTRS)

    Wilson, Robert M.; Hathaway, David H.

    2009-01-01

    Examined are single- and bi-variate geomagnetic precursors for predicting the maximum amplitude (RM) of a sunspot cycle several years in advance. The best single-variate fit is one based on the average of the ap index 36 mo prior to cycle minimum occurrence (E(Rm)), having a coefficient of correlation (r) equal to 0.97 and a standard error of estimate (se) equal to 9.3. Presuming cycle 24 not to be a statistical outlier and its minimum in March 2008, the fit suggests cycle 24's RM to be about 69 ± 20 (the 90% prediction interval). The weighted mean prediction of 11 statistically important single-variate fits is 116 ± 34. The best bi-variate fit is one based on the maximum and minimum values of the 12-mma of the ap index, i.e., APM# and APm*, where # means the value post-E(RM) for the preceding cycle and * means the value in the vicinity of cycle minimum, having r = 0.98 and se = 8.2. It predicts cycle 24's RM to be about 92 ± 27. The weighted mean prediction of 22 statistically important bi-variate fits is 112 ± 32. Thus, cycle 24's RM is expected to lie somewhere within the range of about 82 to 144. Also examined are the late-cycle 23 behaviors of geomagnetic indices and solar wind velocity in comparison to the mean behaviors of cycles 20-23 and the geomagnetic indices of cycle 14 (RM = 64.2), the weakest sunspot cycle of the modern era.
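
    A generic sketch of this kind of bi-variate precursor fit, with fabricated numbers rather than the paper's APM#/APm* values: ordinary least squares of cycle maximum amplitude on two precursor indices, followed by a 90% prediction interval for a new cycle.

      import numpy as np
      import statsmodels.api as sm

      rm  = np.array([105, 165, 190, 110, 160, 155, 120])     # past cycle maxima (fake)
      ap1 = np.array([ 12,  20,  24,  13,  19,  18,  14])     # precursor 1 (fake)
      ap2 = np.array([  6,  10,  13,   7,  10,   9,   8])     # precursor 2 (fake)

      X = sm.add_constant(np.column_stack([ap1, ap2]))
      fit = sm.OLS(rm, X).fit()

      x_new = sm.add_constant(np.array([[15.0, 8.0]]), has_constant="add")
      pred = fit.get_prediction(x_new)
      print(pred.summary_frame(alpha=0.10))    # mean, 90% CI and prediction interval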

  6. Spectrum-based estimators of the bivariate Hurst exponent

    NASA Astrophysics Data System (ADS)

    Kristoufek, Ladislav

    2014-12-01

    We discuss two alternate spectrum-based estimators of the bivariate Hurst exponent in the power-law cross-correlations setting, the cross-periodogram and local X-Whittle estimators, as generalizations of their univariate counterparts. As the spectrum-based estimators depend on the part of the spectrum taken into consideration during estimation, a simulation study showing the performance of the estimators under a varying bandwidth parameter, as well as under varying correlation between the processes and their specification, is provided as well. These estimators are less biased than the already existing averaged periodogram estimator, which, however, has slightly lower variance. The spectrum-based estimators can serve as a good complement to the popular time domain estimators.
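
    A rough sketch of the cross-periodogram idea, under the assumption that the cross-spectrum scales as |S_xy(f)| ~ f^(1-2H_xy) at low frequencies: compute the cross-periodogram via the FFT and regress its log on log-frequency over the lowest frequencies. The bandwidth choice and the simulated short-memory series (for which H_xy should come out near 0.5) are arbitrary placeholders, not the estimators studied in the paper.

      import numpy as np

      rng = np.random.default_rng(7)
      n = 4096
      z = rng.normal(size=n)                      # shared component
      x = z + rng.normal(size=n)                  # two cross-correlated,
      y = z + rng.normal(size=n)                  # short-memory series (Hxy ~ 0.5)

      freqs = np.fft.rfftfreq(n)[1:]              # drop the zero frequency
      fx, fy = np.fft.rfft(x)[1:], np.fft.rfft(y)[1:]
      cross_periodogram = np.abs(fx * np.conj(fy)) / (2.0 * np.pi * n)

      m = int(n ** 0.6)                           # low-frequency bandwidth (arbitrary)
      slope, _ = np.polyfit(np.log(freqs[:m]), np.log(cross_periodogram[:m]), 1)
      hxy = (1.0 - slope) / 2.0                   # from |S_xy(f)| ~ f^(1 - 2*Hxy)
      print(f"estimated bivariate Hurst exponent: {hxy:.2f}")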

  7. α-decay studies of the exotic N=125, 126, and 127 isotones

    SciTech Connect

    Xu Chang; Ren Zhongzhou

    2007-08-15

    The α-decay half-lives of the exotic N=125, 126, and 127 isotones (Po, Rn, Ra, Th, and U) are systematically studied by the density-dependent cluster model (DDCM). The influence of the neutron shell closure N=126 on the α-cluster formation and penetration probabilities is analyzed and discussed in detail. By combining the DDCM and a two-level microscopic model together, the experimental half-lives of α transitions to both the ground state and the excited state in the daughter nuclei are reproduced very well.

  8. Bonn potential and shell-model calculations for N=126 isotones

    SciTech Connect

    Coraggio, L.; Covello, A.; Gargano, A.; Itaco, N.; Kuo, T. T. S.

    1999-12-01

    We have performed shell-model calculations for the N=126 isotones 210Po, 211At, and 212Rn using a realistic effective interaction derived from the Bonn-A nucleon-nucleon potential by means of a G-matrix folded-diagram method. The calculated binding energies, energy spectra, and electromagnetic properties show remarkably good agreement with the experimental data. The results of this paper complement those of our previous study on neutron hole Pb isotopes, confirming that realistic effective interactions are now able to reproduce with quantitative accuracy the spectroscopic properties of complex nuclei. (c) 1999 The American Physical Society.

  9. Selective recruitment of high-threshold human motor units during voluntary isotonic lengthening of active muscles.

    PubMed Central

    Nardone, A; Romanò, C; Schieppati, M

    1989-01-01

    1. We have investigated the possibility that voluntary muscle lengthening contractions can be performed by selective recruitment of fast-twitch motor units, accompanied by derecruitment of slow-twitch motor units. 2. The behaviour of motor units in soleus, gastrocnemius lateralis and gastrocnemius medialis muscles was studied during (a) controlled isotonic plantar flexion against a constant load (shortening contraction, S), maintained plantar flexion, or dorsal flexion resisting the load and gradually yielding to it (lengthening contraction, L), (b) isometric increasing or decreasing plantar torque accomplished by graded contraction or relaxation of the triceps surae muscles, (c) isometric or isotonic ballistic contractions, and (d) periodic, quasi-sinusoidal isotonic contractions at different velocities. The above tasks were performed under visual control of foot position, without activation of antagonist muscles. The motor units discharging during foot rotation were grouped on the basis of the phase(s) during which they were active as S, S + L and L. The units were also characterized according to both the level of isometric ramp plantar torque at which they were first recruited and the amplitude of their action potential. 3. S units were never active during dorsal flexion; some of them were active during the sustained contraction between plantar and dorsal flexion. Most S + L units were active also during the maintenance phase and were slowly derecruited during lengthening; their behaviour during foot rotations was similar to that during isometric contractions or relaxations. L units were never active during either plantar or maintained flexion, but discharged during lengthening contraction in a given range of rotation velocities; the velocity of lengthening consistently influenced the firing frequency of these units. Such dependence on velocity was not observed in S + L units. 4. A correlation was found between the amplitude of the action potential and the

  10. Spectroscopy of 155Yb: Structure evolution in the N =85 isotones

    NASA Astrophysics Data System (ADS)

    Li, X. Q.; Xu, C.; Zhang, S. Q.; Hua, H.; Meng, J.; Bark, R. A.; Chen, Q. B.; Niu, C. Y.; Han, R.; Wyngaardt, S. M.; Wang, S. Y.; Wang, S.; Qi, B.; Liu, L.; Zhu, L. H.; Shi, Z.; Zhang, G. L.; Sun, B. H.; Le, X. Y.; Song, C. Y.; Ye, Y. L.; Jiang, D. X.; Xu, F. R.; Li, Z. H.; Sun, J. J.; Shi, Y.; Zhao, P. W.; Liang, W. Y.; Li, C. G.; Wang, C. G.; Chen, X. C.; Li, Z. H.; Sun, D. P.; Liu, C.; Li, Z. Q.; Jones, P.; Lawrie, E. A.; Lawrie, J. J.; Wiedeking, M.; Bucher, T. D.; Dinoko, T.; Kheswa, B. V.; Makhathini, L.; Majola, S. N. T.; Ndayishimye, J.; Noncolela, S. P.; Shirinda, O.; Gál, J.; Kalinka, G.; Molnár, J.; Nyakó, B. M.; Timár, J.; Juhász, K.; Arogunjo, M.

    2016-08-01

    High-spin states in 155Yb have been studied via the 144Sm(16O,5 n )155Yb reaction at a beam energy of 118 MeV. One negative-parity and one positive-parity cascade built on the ν f7 /2 and ν i13 /2 states, respectively, are established for the first time. The structures observed in 155Yb are compared with those in the neighboring N =85 isotones and with semiempirical shell-model (SESM) calculations. According to adiabatic and configuration-fixed constrained triaxial covariant density functional theory (CDFT) calculations, a coexistence of prolate and oblate shapes is predicted to exist in 155Yb.

  11. Projected quasiparticle calculations for the N =82 odd-proton isotones

    SciTech Connect

    Losano, L.; Dias, H.

    1991-12-01

    The structure of low-lying states in odd-mass {ital N}=82 isotones (135{le}{ital A}{le}145) is investigated in terms of a number-projected one- and three-quasiparticle Tamm-Dancoff approximation. A surface-delta interaction is taken as the residual nucleon-nucleon interaction. Excitation energies, dipole and quadrupole moments, and {ital B}({ital M}1) and {ital B}({ital E}2) values are calculated and compared with the experimental data.

  12. Effect of isotonic solutions and peptide adsorption on zeta potential of porous silicon nanoparticle drug delivery formulations.

    PubMed

    Kaasalainen, Martti; Mäkilä, Ermei; Riikonen, Joakim; Kovalainen, Miia; Järvinen, Kristiina; Herzig, Karl-Heinz; Lehto, Vesa-Pekka; Salonen, Jarno

    2012-07-15

    Recently, highly promising results concerning the use of porous silicon (PSi) nanoparticles as a controlled and targeted drug delivery system have been published. Drugs are typically loaded into PSi nanoparticles by electrostatic interactions, and the drug-loaded nanoparticles are then administered parenterally in isotonic solutions. Zeta potential has an important role in drug adsorption and in the overall physical stability of nanosuspensions. In the present study, we used zeta potential measurements to study the impact of the formulation components on nanosuspension stability. The impact of the medium was studied by measuring isoelectric points (IEP) and zeta potentials in isotonic media. The role of drug adsorption was demonstrated with the gastrointestinal peptides GLP-1(7-37) and PYY (3-36), and the selection of the isotonic additive was demonstrated with peptide-loaded PSi nanoparticles. The results show the notable effect of isotonic solutions and peptide adsorption on the zeta potential of PSi nanosuspensions. As a rule of thumb, the sugars (sucrose, dextrose and mannitol) seem to be good media for negatively charged peptide-loaded particles and weak acids (citric and lactic acid) for positively charged particles. Nevertheless, perhaps the most important rule can be given for isotonic salt solutions, all of which are very poor media as far as nanosuspension stability is concerned.

  13. Plasma expansion does not precipitate the fall in plasma vasopressin in humans drinking isotonic fluids.

    PubMed Central

    Cotter, T P; Gebruers, E M; Hall, W J; O'Sullivan, M F

    1986-01-01

    In a group of healthy humans, plasma vasopressin (AVP) levels fell on drinking either Tyrode or mannitol solutions isosmotic with plasma. Both the timing and magnitude of the fall were appropriate to account for the transient diuresis which followed the drinking. Although plasma expansion follows drinking Tyrode solution it occurred too late to account for the fall in plasma AVP. It was also too small to inhibit AVP secretion. Even though plasma volume tended to contract on drinking isosmotic mannitol solution a fall in plasma AVP and a diuresis occurred, similar to those found after drinking Tyrode solution. These findings appear to eliminate plasma volume expansion as the stimulus for the fall in plasma AVP and the associated diuresis on drinking isotonic fluids. In a further group of human subjects, bypassing the oropharynx by intragastric infusion resulted in a slower onset of diuresis after a water load. We suggest that receptors, as yet undefined, in the upper gastrointestinal tract contribute to the early stages of a water diuresis and account for the apparently inappropriate transient diuresis which follows the drinking of isotonic fluids. PMID:3098967

  14. Insulin and glucose responses during bed rest with isotonic and isometric exercise

    NASA Technical Reports Server (NTRS)

    Dolkas, C. B.; Greenleaf, J. E.

    1977-01-01

    The effects of daily intensive isotonic (68% maximum oxygen uptake) and isometric (21% maximum extension force) leg exercise on plasma insulin and glucose responses to an oral glucose tolerance test (OGTT) during 14-day bed-rest (BR) periods were investigated in seven young healthy men. The OGTT was given during ambulatory control and on day 10 of the no-exercise, isotonic, and isometric exercise BR periods during the 15-wk study. The subjects were placed on a controlled diet starting 10 days before each BR period. During BR, basal plasma glucose concentration remained unchanged with no exercise, but increased (P < 0.05) to 87-89 mg/100 ml with both exercise regimens on day 2, and then fell slightly below control levels on day 13. The fall in glucose content during BR was independent of the exercise regimen and was an adjustment for the loss of plasma volume. The intensity of the responses of insulin and glucose to the OGTT was inversely proportional to the total daily energy expenditure during BR. It was estimated that at least 1020 kcal/day must be provided by supplemental exercise to restore the hyperinsulinemia to control levels.

  15. Search for the Skyrme-Hartree-Fock solutions for chiral rotation in N=75 isotones

    SciTech Connect

    Olbratowski, P.; Dobaczewski, J.; Dudek, J.

    2006-05-15

    A search for self-consistent solutions for the chiral rotational bands in the N=75 isotones {sup 130}Cs, {sup 132}La, {sup 134}Pr, and {sup 136}Pm is performed within the Skyrme-Hartree-Fock cranking approach using SKM* and SLy4 parametrizations. The dependence of the solutions on the time-odd contributions in the energy functional is studied. From among the four isotones considered, self-consistent chiral solutions are obtained only in {sup 132}La. The microscopic calculations are compared with the {sup 132}La experimental data and with results of a classical model that contains all the mechanisms underlying the chirality of the collective rotational motion. Strong similarities between the Hartree-Fock and classical model results are found. The suggestion formulated earlier by the authors that the chiral rotation cannot exist below a certain critical frequency is further illustrated and discussed, together with the microscopic origin of a transition from planar to chiral rotation in nuclei. We also formulate the separability rule by which the tilted-axis-cranking solutions can be inferred from three independent principal-axis-cranking solutions corresponding to three different axes of rotation.

  16. Heat-induced changes in the mechanics of a collagenous tissue: isothermal, isotonic shrinkage.

    PubMed

    Chen, S S; Wright, N T; Humphrey, J D

    1998-06-01

    We present data from isothermal, isotonic-shrinkage tests wherein bovine chordae tendineae were subjected to well-defined constant temperatures (from 65 to 90 degrees C), durations of heating (from 180 to 3600 s), and isotonic uniaxial stresses during heating (from 100 to 650 kPa). Tissue response during heating and "recovery" at 37 degrees C following heating was evaluated in terms of the axial shrinkage, a gross indicator of underlying heat-induced denaturation. There were three key findings. First, scaling the heating time via temperature and load-dependent characteristic times for the denaturation process collapsed all shrinkage data to a single curve, and thereby revealed a time-temperature-load equivalency. Second, the characteristic times exhibited an Arrhenius-type behavior with temperature wherein the slopes were nearly independent of applied load--this suggested that applied loads during heating affect the activation entropy, not energy. Third, all specimens exhibited a time-dependent, partial recovery when returned to 37 degrees C following heating, but the degree of recovery decreased with increases in the load imposed during heating. These new findings on heat-induced changes in tissue behavior will aid in the design of improved clinical heating protocols and provide guidance for the requisite constitutive formulations. PMID:10412406

  17. Landslide susceptibility mapping using a bivariate statistical model in a tropical hilly area of southeastern Brazil

    NASA Astrophysics Data System (ADS)

    Araújo, J. P. C.; DA Silva, L. M.; Dourado, F. A. D.; Fernandes, N.

    2015-12-01

    Landslides are the most damaging natural hazard in the mountainous region of Rio de Janeiro State in Brazil, responsible for thousands of deaths and important financial and environmental losses. However, this region currently has few landslide susceptibility maps implemented at an adequate scale. Identification of landslide-susceptible areas is fundamental for successful land use planning and management practices to reduce risk. This paper applied Bayes' theorem based on weights of evidence (WoE) to 8 landslide-related factors in a geographic information system (GIS) for landslide susceptibility mapping. 378 landslide locations, triggered by the January 2011 rainfall event, were identified and mapped in a selected basin in the city of Nova Friburgo. The landslide scars were divided into two subsets: a training subset and a validation subset. A chi-square test was applied to the 8 WoE-weighted factors to indicate which variables are conditionally independent of each other and can therefore be used in the final map. Finally, the maps of weighted factors were summed to construct the landslide susceptibility map, which was validated against the validation landslide subset. According to the results, slope, aspect and contributing area showed the highest positive spatial correlation with landslides. In the landslide susceptibility map, 21% of the area presented very low and low susceptibility and contained 3% of the validation scars, 41% presented medium susceptibility with 22% of the validation scars, and 38% presented high and very high susceptibility with 75% of the validation scars. The very high susceptibility class accounts for 16% of the basin area and contains 54% of all the scars. The approach used in this study can be considered very useful since 75% of the area affected by landslides was included in the high and very high susceptibility classes.
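    As an illustration of the weight-of-evidence step described above, the sketch below computes the positive and negative weights of a single factor class from pixel counts; the function name and the toy counts are hypothetical and are not values from the Nova Friburgo study. The susceptibility map is then obtained by summing, for every pixel, the weights of the classes it falls in.

        import numpy as np

        def weights_of_evidence(n_class_slide, n_class, n_slide, n_total):
            """Positive/negative WoE weights for one factor class.

            n_class_slide: landslide pixels inside the class
            n_class:       all pixels inside the class
            n_slide:       landslide pixels in the whole basin
            n_total:       all pixels in the whole basin
            """
            p_class_given_slide     = n_class_slide / n_slide
            p_class_given_noslide   = (n_class - n_class_slide) / (n_total - n_slide)
            p_noclass_given_slide   = 1.0 - p_class_given_slide
            p_noclass_given_noslide = 1.0 - p_class_given_noslide
            w_plus  = np.log(p_class_given_slide / p_class_given_noslide)
            w_minus = np.log(p_noclass_given_slide / p_noclass_given_noslide)
            return w_plus, w_minus

        # toy counts: 3 000 landslide pixels in a 1 000 000-pixel basin;
        # 120 000 pixels in a steep-slope class, 1 500 of them with landslides
        print(weights_of_evidence(1500, 120_000, 3000, 1_000_000))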

  18. SEMG analysis of astronaut upper arm during isotonic muscle actions with normal standing posture

    NASA Astrophysics Data System (ADS)

    Qianxiang, Zhou; Chao, Ma; Xiaohui, Zheng

    sEMG analysis of astronaut upper arm during isotonic muscle actions with normal standing posture. Introduction: Research on isotonic muscle actions using surface electromyography (sEMG) is becoming a popular topic in the fields of astronaut life-support training and rehabilitation. Attention has been paid to sEMG signal processing to reduce the influence of noise introduced during monitoring, and to the estimation of fatigue during isotonic muscle actions at different force levels using parameters obtained from the sEMG signal, such as conduction velocity (CV), median frequency (MDF) and mean frequency (MNF). Increasingly, studies of muscle fatigue during isotonic muscle actions combine sEMG analysis with the subjective Borg scale of perceived exertion. In this paper, the relationship between an sEMG-based fatigue variable and the Borg scale during isotonic muscle actions of the upper arm at different contraction levels is investigated. Methods: 13 young male subjects (23.4±2.45 years, 64.7±5.43 kg, 171.7±5.41 cm) with normal standing posture performed isotonic actions of the upper arm at different force levels (10% MVC, 30% MVC and 50% MVC), where MVC denotes the maximal voluntary contraction measured at the start of the experiment. sEMG was recorded during the experiments and the Borg scale was recorded for each contraction level. Using the one-third-octave-band method, a fatigue variable p based on sEMG was defined as p = Σ_i g(f_i)·F(f_i), where the frequency weighting is g(f_i) = 0.42 + 0.5 cos(π f_i/f_0) + 0.08 cos(2π f_i/f_0) for 0 < f_i ≤ f_0 and g(f_i) = 0 for f_i > f_0. From these equations p can be computed and its relationship to the Borg scale investigated. Results: In the research, three kinds of fitted curves between
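    The fatigue variable defined in the abstract is a frequency-weighted sum over one-third-octave bands. The short sketch below implements that reading, p = Σ_i g(f_i)·F(f_i) with the weighting going to zero at the cut-off frequency f_0; the band centres, band powers and cut-off used here are hypothetical.

        import numpy as np

        def fatigue_variable(freqs, band_power, f0):
            """Weighted sum p = sum_i g(f_i) * F(f_i) over one-third-octave bands.

            freqs:      centre frequencies of the bands (Hz)
            band_power: sEMG power F(f_i) in each band
            f0:         cut-off frequency of the weighting function (Hz)
            """
            freqs = np.asarray(freqs, dtype=float)
            band_power = np.asarray(band_power, dtype=float)
            g = np.where(
                (freqs > 0) & (freqs <= f0),
                0.42 + 0.5 * np.cos(np.pi * freqs / f0) + 0.08 * np.cos(2 * np.pi * freqs / f0),
                0.0,
            )
            return float(np.sum(g * band_power))

        # hypothetical one-third-octave band centres and powers
        print(fatigue_variable([20, 40, 80, 160], [1.0, 0.8, 0.5, 0.2], f0=250))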

  20. Asymptotics of bivariate generating functions with algebraic singularities

    NASA Astrophysics Data System (ADS)

    Greenwood, Torin

    Flajolet and Odlyzko (1990) derived asymptotic formulae for the coefficients of a class of univariate generating functions with algebraic singularities. Gao and Richmond (1992) and Hwang (1996, 1998) extended these results to classes of multivariate generating functions, in both cases by reducing to the univariate case. Pemantle and Wilson (2013) outlined new multivariate analytic techniques and used them to analyze the coefficients of rational generating functions. After overviewing these methods, we use them to find asymptotic formulae for the coefficients of a broad class of bivariate generating functions with algebraic singularities. Beginning with the Cauchy integral formula, we explicitly deform the contour of integration so that it hugs a set of critical points. The asymptotic contribution to the integral comes from analyzing the integrand near these points, leading to explicit asymptotic formulae. Next, we use this formula to analyze an example from current research. In the following chapter, we apply multivariate analytic techniques to quantum walks. Bressler and Pemantle (2007) found a (d + 1)-dimensional rational generating function whose coefficients describe the amplitude of a particle at a position in the integer lattice after n steps. Here, the minimal critical points form a curve on the (d + 1)-dimensional unit torus. We find asymptotic formulae for the amplitude of a particle in a given position, normalized by the number of steps n, as n approaches infinity. Each critical point contributes to the asymptotics for a specific normalized position. Using Groebner bases in Maple, we compute the explicit locations of peak amplitudes. In a scaling window of width of order the square root of n near the peaks, each amplitude is asymptotic to an Airy function.
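    For context, the basic univariate transfer result of Flajolet and Odlyzko that the thesis generalizes can be stated, for a single algebraic singularity at z = ρ and α not a non-positive integer, as

        \[
          [z^n]\,\Bigl(1 - \frac{z}{\rho}\Bigr)^{-\alpha} \;\sim\; \frac{\rho^{-n}\, n^{\alpha-1}}{\Gamma(\alpha)},
          \qquad n \to \infty .
        \]

    The bivariate formulae derived in the thesis are considerably more involved and are not reproduced here.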

  1. Uncertain Characterization of Flood Hazard Using Bivariate Analysis Based on Copulas

    NASA Astrophysics Data System (ADS)

    Candela, Angela; Tito Aronica, Giuseppe

    2015-04-01

    This study presents a methodology to derive probabilistic flood hazard maps in flood-prone areas, taking into account uncertainties in the definition of design hydrographs. In particular, we present an innovative approach to obtain probabilistic inundation and flood hazard maps in which the hydrological input (synthetic flood design event) to a 2D hydraulic model is defined by generating flood peak discharges and volumes from a bivariate statistical analysis, through the use of copulas. This study also aims to quantify the contribution of boundary condition uncertainty in order to explore the impact of this uncertainty on probabilistic flood hazard mapping. The uncertainty of extreme flood events is considered in terms of the different possible combinations of peak discharge and flood volume given by the copula. Further, we analyze the role of a multivariate probabilistic hydrological analysis on inundation and flood hazard maps, highlighting the differences between deterministic and probabilistic approaches. The methodology has been applied to a study area located in Sicily that was subject to several flooding events in the past.
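    A minimal sketch of the sampling step, assuming a Gaussian copula and Gumbel marginals purely for illustration (the abstract does not specify the copula family or the marginals): correlated uniforms are drawn from the copula and passed through the inverse marginal CDFs to obtain peak-volume pairs that can feed the 2D hydraulic model.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)

        def sample_peak_volume(n, rho, peak_marginal, volume_marginal):
            """Draw (peak, volume) pairs from a Gaussian copula with given marginals."""
            cov = np.array([[1.0, rho], [rho, 1.0]])
            z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
            u = stats.norm.cdf(z)                    # uniforms with Gaussian dependence
            peaks = peak_marginal.ppf(u[:, 0])       # invert the marginal CDFs
            volumes = volume_marginal.ppf(u[:, 1])
            return peaks, volumes

        # hypothetical Gumbel marginals for peak discharge (m3/s) and volume (hm3)
        peak_dist = stats.gumbel_r(loc=300.0, scale=80.0)
        volume_dist = stats.gumbel_r(loc=25.0, scale=8.0)
        peaks, volumes = sample_peak_volume(1000, 0.7, peak_dist, volume_dist)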

  2. Bivariate spatial analysis of temperature and precipitation from general circulation models and observation proxies

    NASA Astrophysics Data System (ADS)

    Philbin, R.; Jun, M.

    2015-05-01

    This study validates the near-surface temperature and precipitation output from decadal runs of eight atmospheric ocean general circulation models (AOGCMs) against observational proxy data from the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis temperatures and Global Precipitation Climatology Project (GPCP) precipitation data. We model the joint distribution of these two fields with a parsimonious bivariate Matern spatial covariance model, accounting for the two fields' spatial cross-correlation as well as their own smoothnesses. We fit output from each AOGCM (30-year seasonal averages from 1981 to 2010) to a statistical model on each of 21 land regions. Both variance and smoothness values agree for both fields over all latitude bands except southern mid-latitudes. Our results imply that temperature fields have smaller smoothness coefficients than precipitation fields, while both have decreasing smoothness coefficients with increasing latitude. Models predict fields with smaller smoothness coefficients than observational proxy data for the tropics. The estimated spatial cross-correlations of these two fields, however, are quite different for most GCMs in mid-latitudes. Model correlation estimates agree well with those for observational proxy data for Australia, at high northern latitudes across North America, Europe and Asia, as well as across the Sahara, India, and Southeast Asia, but elsewhere, little consistent agreement exists.
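    As a rough sketch of one building block of such a model, the Matern correlation function used for each marginal field (and, with cross parameters, for the cross-covariance) can be computed as below; the smoothness and range values are placeholders, not estimates from the study.

        import numpy as np
        from scipy.special import gamma, kv

        def matern_correlation(d, nu, length_scale):
            """Matern correlation at distance d, smoothness nu, range length_scale."""
            d = np.asarray(d, dtype=float)
            out = np.ones_like(d)                       # correlation 1 at distance 0
            nz = d > 0
            scaled = np.sqrt(2.0 * nu) * d[nz] / length_scale
            out[nz] = (2.0 ** (1.0 - nu) / gamma(nu)) * scaled ** nu * kv(nu, scaled)
            return out

        print(matern_correlation(np.array([0.0, 0.5, 1.0, 2.0]), nu=1.5, length_scale=1.0))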

  3. Estimation of isotonic point of incubation medium for two-cell mouse embryo.

    PubMed

    Pogorelova, M A; Golichenkov, V A; Pogorelova, V N; Kornienko, E V; Panait, A I; Pogorelov, A G

    2011-11-01

    Osmolarity of Dulbecco's medium at which the volume of two-cell mouse embryo remained similar to that of intact embryo was determined. The method is based on comparison of kinetic curves describing the volume of embryonic cell in solutions of different osmolarity. The blastomere volume was measured by quantitative laser microtomography after fixed osmotic stress intervals. It was found that Dulbecco's saline with 125 mM NaCl solution is an isotonic solution for two-cell mouse embryo. This concentration corresponds to 290 mOsm, which is lower than osmolarity (~310 mOsm) of media routinely used for culturing of differentiated cells or biological fluids, e.g. blood plasma. PMID:22803061

  4. Local suppression of collectivity in the N=80 isotones at the Z=58 subshell closure

    NASA Astrophysics Data System (ADS)

    Bauer, C.; Rainovski, G.; Pietralla, N.; Bianco, D.; Blazhev, A.; Bloch, T.; Bönig, S.; Damyanova, A.; Danchev, M.; Gladnishki, K. A.; Kröll, T.; Leske, J.; Lo Iudice, N.; Möller, T.; Moschner, K.; Pakarinen, J.; Reiter, P.; Scheck, M.; Seidlitz, M.; Siebeck, B.; Stahl, C.; Stegmann, R.; Stora, T.; Stoyanov, Ch.; Tarpanov, D.; Vermeulen, M. J.; Voulot, D.; Warr, N.; Wenander, F.; Werner, V.; De Witte, H.

    2013-08-01

    Background: Recent data on N=80 isotones have suggested that the proton π(1g7/2) subshell closure at Z=58 has an impact on the properties of low-lying collective states. Purpose: Knowledge of the B(E2;21+→01+) value of 140Nd is needed in order to test this conjecture. Method: The unstable, neutron-rich nucleus 140Nd was investigated via projectile Coulomb excitation at the REX-ISOLDE facility with the MINIBALL spectrometer. Results: The B(E2) value of 33(2) W.u. expands the N=80 systematics beyond the Z=58 subshell closure. Conclusions: The measurement demonstrates that the reduced collectivity of 138Ce is a local effect possibly due to the Z=58 subshell closure and requests refined theoretical calculations. The latter predict a smoothly increasing trend.

  5. Interplay between pairing and tensor effects in the N = 82 even-even isotone chain

    NASA Astrophysics Data System (ADS)

    Anguiano, M.; Bernard, R. N.; Lallena, A. M.; Co', G.; De Donno, V.

    2016-11-01

    The combined effects of the pairing and tensor terms of the nuclear interaction are investigated by analyzing the ground state properties of the nuclei belonging to the isotonic chain N = 82. The pairing effects have been taken into account by considering both Hartree-Fock-Bogoliubov and Hartree-Fock plus Bardeen-Cooper-Schrieffer approaches using the same finite-range nuclear interaction, specifically a force of Gogny type. Our results reproduce very well the available experimental data of binding energies and charge radii. The study of the particle number fluctuation indicates that the presence of the tensor terms in the interaction reduces the pairing effects and produces new shell closures in some isotopes. The experimental behavior of the energy difference between neutron single particle states up to A = 140 is properly described only if the tensor force is considered.

  6. Optical Coherence Tomography Noise Reduction Using Anisotropic Local Bivariate Gaussian Mixture Prior in 3D Complex Wavelet Domain

    PubMed Central

    Sonka, Milan; Abramoff, Michael D.

    2013-01-01

    In this paper, an MMSE estimator is employed for noise-free 3D OCT data recovery in the 3D complex wavelet domain. Since the distribution assumed for the noise-free data plays a key role in the performance of the MMSE estimator, a prior distribution for the pdf of noise-free 3D complex wavelet coefficients is proposed which is able to model the main statistical properties of the wavelet coefficients. We model the coefficients with a mixture of two bivariate Gaussian pdfs with local parameters, which are able to capture the heavy-tailed property and the inter- and intrascale dependencies of the coefficients. In addition, based on the special structure of OCT images, we use an anisotropic windowing procedure for local parameter estimation that results in visual quality improvement. On this basis, several OCT despeckling algorithms are obtained, based on using a Gaussian/two-sided Rayleigh noise distribution and a homomorphic/nonhomomorphic model. In order to evaluate the performance of the proposed algorithm, we use 156 selected ROIs from a 650 × 512 × 128 OCT dataset in the presence of wet AMD pathology. Our simulations show that the best MMSE estimator using the local bivariate mixture prior is the one for the nonhomomorphic model in the presence of Gaussian noise, which results in an improvement of 7.8 ± 1.7 in CNR. PMID:24222760

  7. Nucleon-pair states of even-even N =82 isotones

    NASA Astrophysics Data System (ADS)

    Cheng, Y. Y.; Zhao, Y. M.; Arima, A.

    2016-08-01

    In this paper we study low-lying states of five N =82 isotones, 134Te, 136Xe, 138Ba, 140Ce and 142Nd, within the framework of the nucleon-pair approximation (NPA). For the low-lying yrast states of 136Xe and 138Ba, we calculate the overlaps between the wave functions obtained in the full shell-model (SM) space and those obtained in the truncated NPA space, and find that most of these overlaps are very close to 1. Very interestingly and surprisingly, for most of these yrast states, the SM wave functions are found to be well approximated by one-dimensional, optimized pair basis states, which indicates a simple picture of "nucleon-pair states". The positive-parity yrast states with spin J >6 in these nuclei, as well as the 82+ state, are found to be well described by breaking one or two S pair(s) of the 61+ or 62+ state (low-lying, seniority-two, spin-maximum, and positive-parity); similarly, negative-parity yrast states with spin J >9 are well represented by breaking one or two S pair(s) of the 91- state (low-lying, seniority-two, spin-maximum, and negative-parity). It is shown that the low-lying negative-parity yrast states of 136Xe and 138Ba are reasonably described to be one-octupole-phonon excited states. The evolution of the 61+ and 62+ states for the five isotones are also systematically investigated.

  8. Muscle gearing during isotonic and isokinetic movements in the ankle plantarflexors.

    PubMed

    Randhawa, Avleen; Jackman, Meghan E; Wakeling, James M

    2013-02-01

    Muscle-tendon gearing is the ratio of the muscle-tendon unit velocity to the fascicle velocity and can be expressed as the product of the gearing within the muscle belly and the gearing due to tendon stretch. Previous studies have shown that gearing is variable and increases at higher velocities. Changes in the muscle activation levels and force development have been suggested to affect tendon gearing and thus muscle-tendon unit gearing. However, the role of belly gearing as a part of muscle-tendon gearing and its associations with structural aspects of muscle and thus movement performance are important facets that need to be studied. The two gastrocnemii of twenty young adults were tested during isokinetic and isotonic contractions on an ankle dynamometer. Ultrasound images of both muscles were collected during contractions and were later digitised. Gearing was also predicted using a 2-dimensional panel model of these muscles. The results from experimental and models tests showed increases in gearing with greater torque levels at slower contraction velocities. However, in the isotonic models there was a substantial increase in gearing at faster contraction velocities. The level of muscle-tendon unit gearing is largely determined by the belly gearing, but its variability is driven by changes in tendon gearing that in turn is a factor of the muscle activation and coordination. The belly thickness of the medial gastrocnemius decreased during contractions, but increased for the lateral gastrocnemius. It is likely that changes to the belly shape and 3-dimensional structure are important to the gearing of the muscle.

  9. Causal networks clarify productivity-richness interrelations, bivariate plots do not

    USGS Publications Warehouse

    Grace, James B.; Adler, Peter B.; Harpole, W. Stanley; Borer, Elizabeth T.; Seabloom, Eric W.

    2014-01-01

    We urge ecologists to consider productivity–richness relationships through the lens of causal networks to advance our understanding beyond bivariate analysis. Further, we emphasize that models based on a causal network conceptualization can also provide more meaningful guidance for conservation management than can a bivariate perspective. Measuring only two variables does not permit the evaluation of complex ideas nor resolve debates about underlying mechanisms.

  10. A Comparative Study on Extreme Precipitation of the Han River Basin using a Bivariate Goodness-of-fit Measure for Regional Frequency Analysis

    NASA Astrophysics Data System (ADS)

    Ahn, Hyunjun; Jung, Younghun; Joo, Kyungwon; Kim, Taereem; Heo, Jun-Haeng

    2016-04-01

    In statistical hydrology, frequency analysis has been widely used for the design of water resource systems. The traditional at-site analysis is recommended when the sample size is larger than twice the target return period (2T). In reality, however, the sample size at the subject site is usually smaller than target return periods such as 100 and 200 years. To overcome this weakness, regional frequency analysis has been suggested and performed since 1960. To estimate robust precipitation quantiles in regional frequency analysis, it is important to select an appropriate probability distribution for a given region. Typically, the goodness-of-fit measure developed by Hosking and Wallis, based on the L-moment ratio diagram, is used to select an appropriate probability distribution. Recently, several studies have been carried out on goodness-of-fit tests for regional frequency analysis, such as a bivariate goodness-of-fit measure for choosing a more appropriate probability distribution. In this study, regional frequency analysis is conducted for 1-hour maximum rainfall data (1961-2015) of the Han River basin in Korea. In this application, appropriate probability distributions are selected using the traditional goodness-of-fit measure and a bivariate goodness-of-fit measure, and the extreme precipitation quantiles from both methods are then compared to suggest the better method. Keywords: regional frequency analysis; goodness-of-fit measure; bivariate goodness-of-fit measure; extreme precipitation events
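    Both goodness-of-fit measures build on sample L-moment ratios. A minimal sketch of how the sample L-moments and the ratios t3 (L-skewness) and t4 (L-kurtosis) can be computed from a single site's record is shown below; the Gumbel toy sample is only for illustration.

        import numpy as np

        def sample_lmoment_ratios(x):
            """Sample L-moments (l1, l2) and L-moment ratios (t3, t4) of a 1-D sample."""
            x = np.sort(np.asarray(x, dtype=float))
            n = x.size
            j = np.arange(1, n + 1)
            # unbiased probability-weighted moments b0..b3
            b0 = x.mean()
            b1 = np.sum((j - 1) * x) / (n * (n - 1))
            b2 = np.sum((j - 1) * (j - 2) * x) / (n * (n - 1) * (n - 2))
            b3 = np.sum((j - 1) * (j - 2) * (j - 3) * x) / (n * (n - 1) * (n - 2) * (n - 3))
            l1 = b0
            l2 = 2 * b1 - b0
            l3 = 6 * b2 - 6 * b1 + b0
            l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
            return l1, l2, l3 / l2, l4 / l2

        rng = np.random.default_rng(0)
        print(sample_lmoment_ratios(rng.gumbel(loc=30.0, scale=10.0, size=200)))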

  11. Interpreting Bivariate Regression Coefficients: Going beyond the Average

    ERIC Educational Resources Information Center

    Halcoussis, Dennis; Phillips, G. Michael

    2010-01-01

    Statistics, econometrics, investment analysis, and data analysis classes often review the calculation of several types of averages, including the arithmetic mean, geometric mean, harmonic mean, and various weighted averages. This note shows how each of these can be computed using a basic regression framework. By recognizing when a regression model…
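    A numerical illustration of the idea (the note's own derivation is truncated above): regressing y, log y and 1/y on a constant and back-transforming the fitted coefficient recovers the arithmetic, geometric and harmonic means, respectively.

        import numpy as np

        y = np.array([2.0, 4.0, 8.0])
        ones = np.ones_like(y)[:, None]

        # OLS of y on a constant returns the arithmetic mean as the coefficient
        arith = np.linalg.lstsq(ones, y, rcond=None)[0][0]
        # the same regression on log(y) and 1/y gives the geometric and harmonic means
        geom = np.exp(np.linalg.lstsq(ones, np.log(y), rcond=None)[0][0])
        harm = 1.0 / np.linalg.lstsq(ones, 1.0 / y, rcond=None)[0][0]

        print(arith, geom, harm)   # 4.666..., 4.0, 3.428...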

  12. Experimental study of β and β -n decay of the neutron-rich N =54 isotone 87As

    NASA Astrophysics Data System (ADS)

    Korgul, A.; Rykaczewski, K. P.; Grzywacz, R.; Bingham, C. R.; Brewer, N. T.; Ciemny, A. A.; Gross, C. J.; Jost, C.; Karny, M.; Madurga, M.; Mazzocchi, C.; Mendez, A. J.; Miernik, K.; Miller, D.; Padgett, S.; Paulauskas, S. V.; Stracener, D. W.; Wolińska-Cichocka, M.

    2015-11-01

    The β-decay properties of neutron-rich 87As produced in the proton-induced fission of 238U were studied at the Holifield Radioactive Ion Beam Facility at Oak Ridge National Laboratory. The low-energy excited states in N =53 87Se and N =52 86Se were identified through the β-γ and β-delayed neutron-γ decay of 87As, respectively. The experimental systematics of the low-energy levels of the N =53 isotones 87Se (Z =34) and 85Ge (Z =32), along with an analysis of shell-model calculations, allow us to discuss the main features of the excited states expected for the next N =53 isotone, 83Zn.

  13. A Model of Peritubular Capillary Control of Isotonic Fluid Reabsorption by the Renal Proximal Tubule

    PubMed Central

    Deen, W. M.; Robertson, C. R.; Brenner, B. M.

    1973-01-01

    A mathematical model of peritubular transcapillary fluid exchange has been developed to investigate the role of the peritubular environment in the regulation of net isotonic fluid transport across the mammalian renal proximal tubule. The model, derived from conservation of mass and the Starling transcapillary driving forces, has been used to examine the quantitative effects on proximal reabsorption of changes in efferent arteriolar protein concentration and plasma flow rate. Under normal physiological conditions, relatively small perturbations in protein concentration are predicted to influence reabsorption more than even large variations in plasma flow, a prediction in close accord with recent experimental observations in the rat and dog. Changes either in protein concentration or plasma flow have their most pronounced effects when the opposing transcapillary hydrostatic and osmotic pressure differences are closest to equilibrium. Comparison of these theoretical results with variations in reabsorption observed in micropuncture studies makes it possible to place upper and lower bounds on the difference between interstitial oncotic and hydrostatic pressures in the renal cortex of the rat. PMID:4696761
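    The Starling driving forces referred to in the abstract take the familiar form below (sign conventions vary; here J_v < 0 denotes net fluid uptake into the peritubular capillary, K_f is the filtration coefficient and σ the reflection coefficient for protein):

        \[
          J_v \;=\; K_f \bigl[\, (P_c - P_i) \;-\; \sigma\,(\pi_c - \pi_i) \,\bigr]
        \]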

  14. A Review of Classification Techniques of EMG Signals during Isotonic and Isometric Contractions

    PubMed Central

    Nazmi, Nurhazimah; Abdul Rahman, Mohd Azizi; Yamamoto, Shin-Ichiroh; Ahmad, Siti Anom; Zamzuri, Hairi; Mazlan, Saiful Amri

    2016-01-01

    In recent years, there has been major interest in the exposure to physical therapy during rehabilitation. Several publications have demonstrated its usefulness in clinical/medical and human machine interface (HMI) applications. An automated system will guide the user to perform the training during rehabilitation independently. Advances in engineering have extended electromyography (EMG) beyond the traditional diagnostic applications to also include applications in diverse areas such as movement analysis. This paper gives an overview of the numerous methods available to recognize motion patterns of EMG signals for both isotonic and isometric contractions. Various signal analysis methods are compared by illustrating their applicability in real-time settings. This paper will be of interest to researchers who would like to select the most appropriate methodology in classifying motion patterns, especially during different types of contractions. For feature extraction, the probability density function (PDF) of EMG signals will be the main interest of this study. Following that, a brief explanation of the different methods for pre-processing, feature extraction and classifying EMG signals will be compared in terms of their performance. The crux of this paper is to review the most recent developments and research studies related to the issues mentioned above. PMID:27548165

  15. Mechanics of myocardial relaxation: application of a model to isometric and isotonic relaxation of rat myocardium.

    PubMed

    Wiegner, A W; Bing, O H

    1982-01-01

    Using a simple model for cardiac muscle relaxation which takes into account muscle length, activation, elasticity and a rate constant for the decay of activation, we are able to use easily measured mechanical parameters to assess the state of the cardiac relaxing system. In isolated trabeculae carneae from the left ventricle of the rat, performing physiologically sequenced contractions, observations have been made (1) at varying preloads and afterloads, (2) with changes in temperature from 23 degrees to 33 degrees C, (3) with changes in bath Ca2+ concentration and (4) with the addition of isoproterenol. During isometric relaxation, the slope (SIM) of the curve relating maximum rate of decline of force (-dF/dtmax) to end-systolic muscle length is load-independent and sensitive to interventions which directly affect the cardiac relaxing system (e.g., temperature, isoproterenol); it is only slightly sensitive to bath calcium concentration. During isotonic relaxation, the maximum velocity of lengthening (+dL/dtmax) is in negative linear proportion to muscle shortening at a given preload, the slope (SIT) of the curve relating +dL/dtmax to end-systolic length is sensitive to the interventions which directly affect the cardiac relaxing system but insensitive to calcium-mediated inotropic interventions. The model provides a theoretical basis for the use of SIM and SIT as measures of the relaxation process. PMID:7161285

  16. Role of atrial natriuretic peptide in systemic responses to acute isotonic volume expansion

    NASA Technical Reports Server (NTRS)

    Watenpaugh, Donald E.; Yancy, Clyde W.; Buckey, Jay C.; Lane, Lynda D.; Hargens, Alan R.; Blomqvist, C. G.

    1992-01-01

    A hypothesis is proposed that a temporal relationship exists between increases in cardiac filling pressure and plasma atrial natriuretic peptide (ANP) concentration, and also between ANP elevation and vasodilation, fluid movement from plasma to interstitium, and increased urine volume (UV). To test the hypothesis, 30 ml/kg isotonic saline were infused in supine male subjects over 24 min and responses were monitored for 3 h postinfusion. Results show that at end infusion, mean right atrial pressure (RAP), heart rate and plasma volume exhibited peak increases of 146, 23, and 27 percent, respectively. Mean plasma ANP and UV peaked (45 and 390 percent, respectively) at 30 min postinfusion. Most cardiovascular variables had returned toward control levels by 1 h postinfusion, and net reabsorption of extravascular fluid ensued. It is concluded that since ANP was not significantly increased until 30 min postinfusion, factors other than ANP initiate responses to intravascular fluid loading. These factors include increased vascular pressures, baroreceptor-mediated vasodilation, and hemodilution of plasma proteins. ANP is suggested to mediate, in part, the renal response to saline infusion.

  17. A Review of Classification Techniques of EMG Signals during Isotonic and Isometric Contractions.

    PubMed

    Nazmi, Nurhazimah; Abdul Rahman, Mohd Azizi; Yamamoto, Shin-Ichiroh; Ahmad, Siti Anom; Zamzuri, Hairi; Mazlan, Saiful Amri

    2016-01-01

    In recent years, there has been major interest in the exposure to physical therapy during rehabilitation. Several publications have demonstrated its usefulness in clinical/medical and human machine interface (HMI) applications. An automated system will guide the user to perform the training during rehabilitation independently. Advances in engineering have extended electromyography (EMG) beyond the traditional diagnostic applications to also include applications in diverse areas such as movement analysis. This paper gives an overview of the numerous methods available to recognize motion patterns of EMG signals for both isotonic and isometric contractions. Various signal analysis methods are compared by illustrating their applicability in real-time settings. This paper will be of interest to researchers who would like to select the most appropriate methodology in classifying motion patterns, especially during different types of contractions. For feature extraction, the probability density function (PDF) of EMG signals will be the main interest of this study. Following that, a brief explanation of the different methods for pre-processing, feature extraction and classifying EMG signals will be compared in terms of their performance. The crux of this paper is to review the most recent developments and research studies related to the issues mentioned above. PMID:27548165

  18. New isotonic drinks with antioxidant and biological capacities from berries (maqui, açaí and blackthorn) and lemon juice.

    PubMed

    Gironés-Vilaplana, Amadeo; Villaño, Débora; Moreno, Diego A; García-Viguera, Cristina

    2013-11-01

    The aim of the study was to design new isotonic drinks with lemon juice and berries: maqui [Aristotelia chilensis (Molina) Stuntz], açaí (Euterpe oleracea Mart.) and blackthorn (Prunus spinosa L.), following on from previous research. Quality parameters - including colour (CIELab parameters), minerals, phytochemical identification and quantification by high-performance liquid chromatography with diode array detector, total phenolic content by the Folin-Ciocalteu reagent, the antioxidant capacity (ABTS(+), DPPH• and [Formula: see text] assays) and biological activities (in vitro alpha-glucosidase and lipase inhibitory effects) - were tested in the samples and compared to commercially available isotonic drinks. The new isotonic blends with lemon and anthocyanins-rich berries showed an attractive colour, especially in maqui samples, which is essential for consumer acceptance. Significantly higher antioxidant and biological effects were determined in the new blends, in comparison with the commercial isotonic beverages.

  19. Measuring mechanical properties, including isotonic fatigue, of fast and slow MLC/mIgf-1 transgenic skeletal muscle.

    PubMed

    Del Prete, Zaccaria; Musarò, Antonio; Rizzuto, Emanuele

    2008-07-01

    Contractile properties of fast-twitch (EDL) and slow-twitch (soleus) skeletal muscles were measured in MLC/mIgf-1 transgenic and wild-type mice. MLC/mIgf-1 mice express the local factor mIgf-1 under the transcriptional control of MLC promoter, selectively activated in fast-twitch muscle fibers. Isolated muscles were studied in vitro in both isometric and isotonic conditions. We used a rapid "ad hoc" testing protocol that measured, in a single procedure, contraction time, tetanic force, Hill's (F-v) curve, power curve and isotonic muscle fatigue. Transgenic soleus muscles did not differ from wild-type with regard to any measured variable. In contrast, transgenic EDL muscles displayed a hypertrophic phenotype, with a mass increase of 29.2% compared to wild-type. Absolute tetanic force increased by 21.5% and absolute maximum power by 34.1%. However, when normalized to muscle cross-sectional area and mass, specific force and normalized power were the same in transgenic and wild-type EDL muscles, revealing that mIgf-1 expression induces a functional hypertrophy without altering fibrotic tissue accumulation. Isotonic fatigue behavior did not differ between transgenic and wild-type muscles, suggesting that the ability of mIgf-1 transgenic muscle to generate a considerable higher absolute power did not affect its resistance to fatigue. PMID:18415017

  20. Isotonic saline in elderly men: an open-labelled controlled infusion study of electrolyte balance, urine flow and kidney function.

    PubMed

    Hahn, R G; Isacson, M Nyberg; Fagerström, T; Rosvall, J; Nyman, C R

    2016-02-01

    Isotonic saline is a widely-used infusion fluid, although the associated chloride load may cause metabolic acidosis and impair kidney function in young, healthy volunteers. We wished to examine whether these effects also occurred in the elderly, and conducted a crossover study in 13 men with a mean age of 73 years (range 66-84), who each received intravenous infusions of 1.5 l of Ringer's acetate and of isotonic saline. Isotonic saline induced mild changes in plasma sodium (mean +1.5 mmol.l(-1) ), plasma chloride (+3 mmol.l(-1) ) and standard bicarbonate (-2 mmol.l(-1) ). Three hours after starting the infusions, 68% of the Ringer's acetate and 30% of the infused saline had been excreted (p < 0.01). The glomerular filtration rate increased in response to both fluids, but more after the Ringer's acetate (p < 0.03). Pre-infusion fluid retention, as evidenced by high urinary osmolality (> 700 mOsmol.kg(-1) ) and/or creatinine (> 7 mmol.l(-1) ), was a strong factor governing the responses to both fluid loads.

  1. Modeling animal-vehicle collisions using diagonal inflated bivariate Poisson regression.

    PubMed

    Lao, Yunteng; Wu, Yao-Jan; Corey, Jonathan; Wang, Yinhai

    2011-01-01

    Two types of animal-vehicle collision (AVC) data are commonly adopted for AVC-related risk analysis research: reported AVC data and carcass removal data. One issue with these two data sets is that they were found to have significant discrepancies by previous studies. In order to model these two types of data together and provide a better understanding of highway AVCs, this study adopts a diagonal inflated bivariate Poisson regression method, an inflated version of bivariate Poisson regression model, to fit the reported AVC and carcass removal data sets collected in Washington State during 2002-2006. The diagonal inflated bivariate Poisson model not only can model paired data with correlation, but also handle under- or over-dispersed data sets as well. Compared with three other types of models, double Poisson, bivariate Poisson, and zero-inflated double Poisson, the diagonal inflated bivariate Poisson model demonstrates its capability of fitting two data sets with remarkable overlapping portions resulting from the same stochastic process. Therefore, the diagonal inflated bivariate Poisson model provides researchers a new approach to investigating AVCs from a different perspective involving the three distribution parameters (λ(1), λ(2) and λ(3)). The modeling results show the impacts of traffic elements, geometric design and geographic characteristics on the occurrences of both reported AVC and carcass removal data. It is found that the increase of some associated factors, such as speed limit, annual average daily traffic, and shoulder width, will increase the numbers of reported AVCs and carcass removals. Conversely, the presence of some geometric factors, such as rolling and mountainous terrain, will decrease the number of reported AVCs.
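    The dependence structure underlying the (diagonal inflated) bivariate Poisson model comes from a shared Poisson component. The short simulation below illustrates that construction only, with arbitrary rates; the diagonal inflation and the regression on roadway covariates are not shown.

        import numpy as np

        rng = np.random.default_rng(7)

        def rbivariate_poisson(n, lam1, lam2, lam3):
            """Bivariate Poisson via trivariate reduction: X1 = Y1 + Y3, X2 = Y2 + Y3
            with independent Poisson Y's, so Cov(X1, X2) = lam3."""
            y1 = rng.poisson(lam1, size=n)
            y2 = rng.poisson(lam2, size=n)
            y3 = rng.poisson(lam3, size=n)
            return y1 + y3, y2 + y3

        x1, x2 = rbivariate_poisson(100_000, lam1=2.0, lam2=1.0, lam3=0.5)
        print(np.cov(x1, x2)[0, 1])   # close to lam3 = 0.5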

  2. Investigating NARCCAP Precipitation Extremes via Bivariate Extreme Value Theory (Invited)

    NASA Astrophysics Data System (ADS)

    Weller, G. B.; Cooley, D. S.; Sain, S. R.; Bukovsky, M. S.; Mearns, L. O.

    2013-12-01

    We introduce methodology from statistical extreme value theory to examine the ability of reanalysis-driven regional climate models to simulate past daily precipitation extremes. Going beyond a comparison of summary statistics such as 20-year return values, we study whether the most extreme precipitation events produced by climate model simulations exhibit correspondence to the most extreme events seen in observational records. The extent of this correspondence is formulated via the statistical concept of tail dependence. We examine several case studies of extreme precipitation events simulated by the six models of the North American Regional Climate Change Assessment Program (NARCCAP) driven by NCEP reanalysis. It is found that the NARCCAP models generally reproduce daily winter precipitation extremes along the Pacific coast quite well; in contrast, simulation of past daily summer precipitation extremes in a central US region is poor. Some differences in the strength of extremal correspondence are seen in the central region between models which employ spectral nudging and those which do not. We demonstrate how these techniques may be used to draw a link between extreme precipitation events and large-scale atmospheric drivers, as well as to downscale extreme precipitation simulated by a future run of a regional climate model. Specifically, we examine potential future changes in the nature of extreme precipitation along the Pacific coast produced by the pineapple express (PE) phenomenon. A link between extreme precipitation events and a "PE Index" derived from North Pacific sea-surface pressure fields is found. This link is used to study PE-influenced extreme precipitation produced by a future-scenario climate model run.
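    Tail dependence between observed and simulated series can be summarized empirically as the probability that one series exceeds its u-quantile given that the other exceeds its own. The sketch below shows that generic diagnostic on synthetic data; it is not the NARCCAP analysis itself.

        import numpy as np

        def empirical_chi(x, y, u=0.95):
            """Empirical tail-dependence measure: P(X > its u-quantile | Y > its u-quantile)."""
            x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
            qx, qy = np.quantile(x, u), np.quantile(y, u)
            joint = np.mean((x > qx) & (y > qy))
            return joint / (1.0 - u)

        rng = np.random.default_rng(1)
        obs = rng.gamma(2.0, 10.0, size=5000)            # stand-in for observed extremes
        model = 0.8 * obs + rng.gamma(2.0, 4.0, 5000)    # a dependent "model" series
        print(empirical_chi(obs, model, u=0.95))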

  3. Cross-Ratio Estimation for Bivariate Failure Times with Left Truncation

    PubMed Central

    Hu, Tianle; Lin, Xihong; Nan, Bin

    2013-01-01

    The cross-ratio is an important local measure that characterizes the dependence between bivariate failure times. To estimate the cross-ratio in follow-up studies where delayed entry is present, estimation procedures need to account for left truncation. Ignoring left truncation yields biased estimates of the cross-ratio. We extend the method of Hu et al. (2011) by modifying the risk sets and relevant indicators to handle left-truncated bivariate failure times, which yields the cross-ratio estimate with desirable asymptotic properties that can be shown by the same techniques used in Hu et al. (2011). Numerical studies are conducted. PMID:23700275
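    For reference, the local cross-ratio of a bivariate survivor function S(t1, t2) that these procedures estimate is commonly defined as

        \[
          \theta(t_1, t_2) \;=\;
          \frac{S(t_1, t_2)\;\partial^2 S(t_1, t_2)/\partial t_1 \partial t_2}
               {\bigl[\partial S(t_1, t_2)/\partial t_1\bigr]\,\bigl[\partial S(t_1, t_2)/\partial t_2\bigr]} ,
        \]

    with θ > 1 indicating positive dependence between the two failure times.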

  4. Fracture phenomena in an isotonic salt solution during freezing and their elimination using glycerol.

    PubMed

    Gao, D Y; Lin, S; Watson, P F; Critser, J K

    1995-06-01

    Thermal stress and consequent fracture in frozen organs or cell suspensions have been proposed to be two causes of cell cryoinjury. A specific device was developed to study the thermal stress and the fracture phenomena during a slow cooling process of isotonic NaCl solutions with different concentrations of glycerol (cryoprotectant) in a cylindrical tube. It was shown from the experimental results that glycerol significantly influenced the solidification process of the ternary solutions and reduced the thermal stress. The higher the initial glycerol concentration, the lower the thermal stress in the frozen solutions. Glycerol concentrations over 0.3 M were sufficient to eliminate the fracture of the frozen solutions under the present experimental conditions. To explain the action of glycerol in reducing the thermal stress and preventing the ice fracture, a further study on ice crystal formation and growth of ice in these solutions was undertaken using cryomicroscopy. It is known from previous studies that an increase of initial glycerol concentration reduced frozen fraction of water in the solution at any given low temperature due to colligative properties of solution, which reduced the total ice volume expansion during water solidification. The present cryomicroscopic investigation showed that under a fixed cooling condition the different initial glycerol concentrations induced the different microstructures of the frozen solutions at not only a given low temperature but also a given frozen fraction of water. It has been known that ice volume expansion during solidification is a major factor causing the thermal stress and the interior microstructure is critical for the mechanical strength of a solid. Therefore, functions of glycerol in reducing the total ice volume expansion during water solidification and in influencing interior microstructure of the ice may contribute to reduce the thermal stress and prevent the fracture in the frozen solutions.

  5. Univariate versus Bivariate analysis and synthesis of floods to assess the risk of overtopping a dam - a case study for Argentina

    NASA Astrophysics Data System (ADS)

    Callau Poduje, Ana Claudia; Haberlandt, Uwe

    2013-04-01

    Considering floods as multivariate events allows a better representation of the process generating them. In this work the relevance of multivariate analysis for designing a dam or assessing its risk of overtopping is discussed. Generally, peak flow and volume are two statistically dependent variables; therefore they are used to characterize flood events. A bivariate statistical frequency analysis is carried out to find a suitable model that adequately represents the data set of flood peak flows and volumes. The dependence between the variables is modeled with a copula. The copula model is used to generate 1000 random pairs of variables characterizing the flood, which are transformed into hydrographs. The shape of the floods is modeled using a Beta distribution function. The synthetic flood events are routed through a reservoir to assess its behavior. The maximum water levels and outflows are computed for each hydrograph and compared to estimations considering peak flow and volume separately. The analysis is carried out using flood peak and volume series observed in the river Agrio basin, with a drainage area of 7300 km2, located in the province of Neuquén, Argentina. The results show that the maximum water levels and outflows obtained with the bivariate approach are higher than in the univariate case. If the risk of an existing dam is to be assessed, the bivariate approach would therefore indicate a greater risk of overtopping for a given dam height and spillway geometry. If a dam is to be designed, considering the joint behavior of both variables would result in a smaller risk for the structure compared to the univariate case.

  6. Analysis of Blood Transfusion Data Using Bivariate Zero-Inflated Poisson Model: A Bayesian Approach

    PubMed Central

    Mohammadi, Tayeb; Sedehi, Morteza

    2016-01-01

    Recognizing the factors affecting the number of blood donations and blood deferrals has a major impact on blood transfusion. There is a positive correlation between the variables "number of blood donation" and "number of blood deferral": as the number of returns for donation increases, so does the number of blood deferrals. On the other hand, because many donors never return to donate, there is an excess zero frequency for both of the above-mentioned variables. In this study, in order to account for the correlation and to explain the excess zero frequency, a bivariate zero-inflated Poisson regression model was used for joint modeling of the number of blood donations and the number of blood deferrals. The data were analyzed using a Bayesian approach, applying noninformative priors in the presence and absence of covariates. Estimation of the model parameters, that is, the correlation, the zero-inflation parameter, and the regression coefficients, was done through MCMC simulation. Eventually, a double-Poisson model, a bivariate Poisson model, and a bivariate zero-inflated Poisson model were fitted to the data and compared using the deviance information criterion (DIC). The results showed that the bivariate zero-inflated Poisson regression model fitted the data better than the other models. PMID:27703493

  7. Representing Topography with Second-Degree Bivariate Polynomial Functions Fitted by Least Squares.

    ERIC Educational Resources Information Center

    Neuman, Arthur Edward

    1987-01-01

    There is a need for abstracting topography other than for mapping purposes. The method employed should be simple and available to non-specialists, thereby ruling out spline representations. Generalizing from univariate first-degree least squares and from multiple regression, this article introduces bivariate polynomial functions fitted by least…

  8. A model for gust amplitude and gust length based on the bivariate gamma probability distribution function

    NASA Technical Reports Server (NTRS)

    Smith, O. E.; Adelfang, S. I.

    1981-01-01

    A model of the largest gust amplitude and gust length is presented which uses the properties of the bivariate gamma distribution. The gust amplitude and length are strongly dependent on the filter function; the amplitude increases with altitude and is larger in winter than in summer.

  9. BIVARIATE MODELLING OF CLUSTERED CONTINUOUS AND ORDERED CATEGORICAL OUTCOMES. (R824757)

    EPA Science Inventory

    Simultaneous observation of continuous and ordered categorical outcomes for each subject is common in biomedical research but multivariate analysis of the data is complicated by the multiple data types. Here we construct a model for the joint distribution of bivariate continuous ...

  10. Contributions to the Underlying Bivariate Normal Method for Factor Analyzing Ordinal Data

    ERIC Educational Resources Information Center

    Xi, Nuo; Browne, Michael W.

    2014-01-01

    A promising "underlying bivariate normal" approach was proposed by Jöreskog and Moustaki for use in the factor analysis of ordinal data. This was a limited information approach that involved the maximization of a composite likelihood function. Its advantage over full-information maximum likelihood was that very much less computation was…

  11. Interval Estimation of Bivariate Correlations with Missing Data on Both Variables: A Bayesian Approach.

    ERIC Educational Resources Information Center

    Gross, Alan L.

    1997-01-01

    An analytic expression is derived for the posterior distribution of the bivariate correlation given a data set that contains missing values on both variables. Interval estimates of the unknown correlation are then computed in terms of the highest posterior density regions. A sampling study illustrates the procedure. (SLD)

  12. Meta-analysis of studies with bivariate binary outcomes: a marginal beta-binomial model approach.

    PubMed

    Chen, Yong; Hong, Chuan; Ning, Yang; Su, Xiao

    2016-01-15

    When conducting a meta-analysis of studies with bivariate binary outcomes, challenges arise when the within-study correlation and between-study heterogeneity should be taken into account. In this paper, we propose a marginal beta-binomial model for the meta-analysis of studies with binary outcomes. This model is based on the composite likelihood approach and has several attractive features compared with the existing models such as bivariate generalized linear mixed model (Chu and Cole, 2006) and Sarmanov beta-binomial model (Chen et al., 2012). The advantages of the proposed marginal model include modeling the probabilities in the original scale, not requiring any transformation of probabilities or any link function, having closed-form expression of likelihood function, and no constraints on the correlation parameter. More importantly, because the marginal beta-binomial model is only based on the marginal distributions, it does not suffer from potential misspecification of the joint distribution of bivariate study-specific probabilities. Such misspecification is difficult to detect and can lead to biased inference using current methods. We compare the performance of the marginal beta-binomial model with the bivariate generalized linear mixed model and the Sarmanov beta-binomial model by simulation studies. Interestingly, the results show that the marginal beta-binomial model performs better than the Sarmanov beta-binomial model, whether or not the true model is Sarmanov beta-binomial, and the marginal beta-binomial model is more robust than the bivariate generalized linear mixed model under model misspecifications. Two meta-analyses of diagnostic accuracy studies and a meta-analysis of case-control studies are conducted for illustration.
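
    As a rough illustration of the closed-form beta-binomial likelihood the abstract refers to, the sketch below fits a plain beta-binomial to a single margin (say, per-study true-positive counts); the composite pairing of the sensitivity and specificity margins and the correlation parameter are omitted, and the counts are hypothetical.

    ```python
    # Minimal sketch: closed-form beta-binomial log-likelihood for one margin
    # (e.g. true positives out of diseased subjects per study). The full model
    # pairs two such margins via a composite likelihood; that part is omitted.
    import numpy as np
    from scipy.special import betaln, gammaln
    from scipy.optimize import minimize

    def betabinom_logpmf(k, n, a, b):
        """log P(K = k) for a beta-binomial(n, a, b)."""
        log_comb = gammaln(n + 1) - gammaln(k + 1) - gammaln(n - k + 1)
        return log_comb + betaln(k + a, n - k + b) - betaln(a, b)

    # Hypothetical per-study counts: true positives k out of n diseased subjects
    k = np.array([18, 25, 40, 9])
    n = np.array([20, 30, 45, 12])

    def neg_loglik(params):
        a, b = np.exp(params)          # keep shape parameters positive
        return -np.sum(betabinom_logpmf(k, n, a, b))

    fit = minimize(neg_loglik, x0=np.log([2.0, 1.0]), method="Nelder-Mead")
    a_hat, b_hat = np.exp(fit.x)
    print(a_hat / (a_hat + b_hat))     # estimated mean sensitivity
    ```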

  13. Exploratory Causal Analysis in Bivariate Time Series Data

    NASA Astrophysics Data System (ADS)

    McCracken, James M.

    Many scientific disciplines rely on observational data of systems for which it is difficult (or impossible) to implement controlled experiments, and data analysis techniques are required for identifying causal information and relationships directly from observational data. This need has led to the development of many different time series causality approaches and tools, including transfer entropy, convergent cross-mapping (CCM), and Granger causality statistics. In this thesis, the existing time series causality method of CCM is extended by introducing a new method called pairwise asymmetric inference (PAI). It is found that CCM may provide counter-intuitive causal inferences for simple dynamics with strong intuitive notions of causality, and the CCM causal inference can be a function of physical parameters that are seemingly unrelated to the existence of a driving relationship in the system. For example, a CCM causal inference might alternate between "voltage drives current" and "current drives voltage" as the frequency of the voltage signal is changed in a series circuit with a single resistor and inductor. PAI is introduced to address both of these limitations. Many of the current approaches in the time series causality literature are not computationally straightforward to apply, do not follow directly from assumptions of probabilistic causality, depend on assumed models for the time series generating process, or rely on embedding procedures. A new approach, called causal leaning, is introduced in this work to avoid these issues. The leaning is found to provide causal inferences that agree with intuition for both simple systems and more complicated empirical examples, including space weather data sets. The leaning may provide a clearer interpretation of the results than those from existing time series causality tools. A practicing analyst can explore the literature to find many proposals for identifying drivers and causal connections in time series data
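
    Of the tools listed above, the classical Granger statistic is the easiest to sketch; the following is a bare lagged-regression F-test on synthetic data in which x drives y, and is not the PAI or leaning method introduced in the thesis.

    ```python
    # Bare-bones Granger-style test: does adding lags of x improve an
    # autoregressive model of y? Illustrates only the classic Granger statistic
    # mentioned above, not the PAI or "leaning" methods of the thesis.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    T, p = 500, 2                         # series length, number of lags
    x = rng.normal(size=T)
    y = np.zeros(T)
    for t in range(1, T):                 # synthetic system where x drives y
        y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.normal(scale=0.5)

    def lagmat(series, lags):
        """Columns of lag-1 .. lag-`lags` values aligned with series[lags:]."""
        return np.column_stack([series[lags - j:len(series) - j] for j in range(1, lags + 1)])

    Y = y[p:]
    restricted = np.column_stack([np.ones(T - p), lagmat(y, p)])
    full = np.column_stack([restricted, lagmat(x, p)])

    def rss(X, Y):
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        resid = Y - X @ beta
        return resid @ resid

    rss_r, rss_f = rss(restricted, Y), rss(full, Y)
    F = ((rss_r - rss_f) / p) / (rss_f / (len(Y) - full.shape[1]))
    p_value = stats.f.sf(F, p, len(Y) - full.shape[1])
    print(F, p_value)                     # small p-value: x "Granger-causes" y
    ```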

  14. Volcano clustering determination: Bivariate Gauss vs. Fisher kernels

    NASA Astrophysics Data System (ADS)

    Cañón-Tapia, Edgardo

    2013-05-01

    Underlying many studies of volcano clustering is the implicit assumption that vent distribution can be studied by using kernels originally devised for distribution in plane surfaces. Nevertheless, an important change in topology in the volcanic context is related to the distortion that is introduced when attempting to represent features found on the surface of a sphere that are being projected into a plane. This work explores the extent to which different topologies of the kernel used to study the spatial distribution of vents can introduce significant changes in the obtained density functions. To this end, a planar (Gauss) and a spherical (Fisher) kernels are mutually compared. The role of the smoothing factor in these two kernels is also explored with some detail. The results indicate that the topology of the kernel is not extremely influential, and that either type of kernel can be used to characterize a plane or a spherical distribution with exactly the same detail (provided that a suitable smoothing factor is selected in each case). It is also shown that there is a limitation on the resolution of the Fisher kernel relative to the typical separation between data that can be accurately described, because data sets with separations lower than 500 km are considered as a single cluster using this method. In contrast, the Gauss kernel can provide adequate resolutions for vent distributions at a wider range of separations. In addition, this study also shows that the numerical value of the smoothing factor (or bandwidth) of both the Gauss and Fisher kernels has no unique nor direct relationship with the relevant separation among data. In order to establish the relevant distance, it is necessary to take into consideration the value of the respective smoothing factor together with a level of statistical significance at which the contributions to the probability density function will be analyzed. Based on such reference level, it is possible to create a hierarchy of

  15. Knee-joint proprioception during 30-day 6 degrees head-down bed rest with isotonic and isokinetic exercise training

    NASA Technical Reports Server (NTRS)

    Bernauer, E. M.; Walby, W. F.; Ertl, A. C.; Dempster, P. T.; Bond, M.; Greenleaf, J. E.

    1994-01-01

    To determine if daily isotonic exercise or isokinetic exercise training coupled with daily leg proprioceptive training would influence leg proprioceptive tracking responses during bed rest (BR), 19 men (36 +/- SD 4 years, 178 +/- 7 cm, 76.8 +/- 7.8 kg) were allocated into a no-exercise (NOE) training control group (n = 5), and isotonic exercise (ITE, n = 7) and isokinetic exercise (IKE, n = 7) training groups. Exercise training was conducted during BR for two 30-min periods per day, 5 days per week. Only the IKE group performed proprioceptive training using a new isokinetic procedure with each lower extremity for 2.5 min before and after the daily exercise training sessions; proprioceptive testing occurred weekly for all groups. There were no significant differences in proprioceptive tracking scores, expressed as a percentage of the perfect score of 100, in the pre-BR ambulatory control period between the three groups. Knee extension and flexion tracking responses were unchanged with NOE during BR, but were significantly greater (*p < 0.05) at the end of BR in both exercise groups when compared with NOE responses (extension: NOE 80.7 +/- 0.7%, ITE 82.9* +/- 0.6%, IKE 86.5* +/- 0.7%; flexion: NOE 77.6 +/- 1.5%, ITE 80.0 +/- 0.8% (NS), IKE 83.6* +/- 0.8%). Although proprioceptive tracking was unchanged during BR with NOE, both isotonic exercise training (without additional proprioceptive training) and especially isokinetic exercise training combined with daily proprioceptive training significantly improved knee proprioceptive tracking responses after 30 d of BR.

  16. [General pharmacological study of iodixanol, a new non-ionic isotonic contrast medium].

    PubMed

    Takasuna, K; Kasai, Y; Kitano, Y; Mori, K; Kobayashi, R; Makino, M; Hagiwara, T; Hirohashi, M; Nomura, M; Algate, D R

    1995-10-01

    The general pharmacological study of iodixanol, a non-ionic isotonic contrast medium, was conducted. 1) Iodixanol administered intravenously over a dose range of 320 to 3,200 mgI/kg had little or no effect on the general behavior, spontaneous locomotor activity, hexobarbital sleeping time, pain response, electroshock- or pentylenetetrazol-induced convulsion (mouse), EEG or body temperature (rabbit), gastrointestinal propulsion (mouse) or skeletal muscle contraction (rabbit). Iodixanol had no specific interaction with acetylcholine, histamine, serotonin, nicotine, BaCl2 (ileum), methacholine (trachea), isoprenaline (atrium) or oxytocin (pregnant uterus), nor had any effect on spontaneous contractility (atrium and uterus), or transmural electrostimulation-induced contractility (vas deferens) at concentrations of ≤ 3.2 × 10⁻³ gI/ml in vitro. Iodixanol had no effect on the cardiovascular system of dog, except that it increased femoral blood flow and respiratory rate at doses of ≥ 1,000 mgI/kg. Iodixanol at 3,200 mgI/kg i.v. reduced urine output with a decrease in Na+ and Cl- excretion, whereas at 320 mgI/kg i.v., it slightly increased urine output (rat). 2) Injections of iodixanol into the cerebroventricular (0.96, 9.6 mgI/mouse and 3.2, 32 mgI/rat), left ventricular (1,920, 6,400 mgI/dog) or coronary artery (640, 1,920 mgI/dog) had no conspicuous effect on the central nervous system or the cardiovascular system, respectively. There was no marked difference among iodixanol, iohexol and iopamidol in this respect. Vascular pain during injection into the femoral artery (300-320 mgI/guinea pig) appeared to be less intense with iodixanol, compared with the other contrast media iohexol and iopamidol. These results suggest that intravenous injection of iodixanol is relatively free from pharmacological activity, and effects of iodixanol on the central nervous system (intracerebroventricular injection) and cardiovascular system (intra-left ventricular and -coronary

  17. Rotational Properties of N = 75 Isotones in the A ≈ 135 Mass Region

    NASA Astrophysics Data System (ADS)

    Ma, Ruimei

    1990-01-01

    High spin states of the N = 75 isotones 131Ba, 133Ce, 137Sm, and 139Gd have been populated via heavy-ion fusion-evaporation reactions. Rotational properties of these nuclei have been studied regarding shape coexistence and enhanced deformation. At low spins, the yrast bands in these nuclei, built on a νh11/2 orbital, show large signature splitting, which indicates a significant degree of triaxiality. The triaxiality is also apparent from the high crossing frequencies of the proton alignments. In order to reproduce the large signature splittings and the high crossing frequencies, cranked shell model calculations require a triaxial shape with γ ≈ -20°. After the proton alignment, the vanishing of the signature splitting is indicative of a prolate shape due to the strong gamma-driving force of the h11/2 protons towards γ = 0°. In the case of 131Ba, competing (νh11/2)² and (πh11/2)² alignments were observed, which drive the nucleus towards near-collective oblate (γ ≈ -60°) and prolate (γ ≈ 0°) shapes, respectively. In addition, ΔI = 1 bands built on the νh11/2 ⊗ πh11/2 ⊗ πg7/2 configuration, indicating a prolate shape, were observed in these nuclei. Decoupled bands, built on a configuration involving the νi13/2 orbital intruding from two harmonic oscillator shells above the normal parity states, were presently observed in 133Ce, 137Sm and 139Gd. All of these bands have enhanced dynamic moments of inertia, which may indicate a larger quadrupole deformation. In particular, the band in 137Sm was observed down to relatively low spin and excitation energy, which implies a simple single-quasineutron structure. The unique feature of this band is that a linking gamma transition connecting the superdeformed (β ≈ 0.4, axis ratio ~ 3:2) and normally deformed (β ≈ 0.2, axis ratio ~ 5:4) bands was established. The β-driving effect of the single νi13/2

  18. High-spin states in the semimagic nucleus 89Y and neutron-core excitations in the N =50 isotones

    NASA Astrophysics Data System (ADS)

    Li, Z. Q.; Wang, S. Y.; Niu, C. Y.; Qi, B.; Wang, S.; Sun, D. P.; Liu, C.; Xu, C. J.; Liu, L.; Zhang, P.; Wu, X. G.; Li, G. S.; He, C. Y.; Zheng, Y.; Li, C. B.; Yu, B. B.; Hu, S. P.; Yao, S. H.; Cao, X. P.; Wang, J. L.

    2016-07-01

    The semimagic nucleus 89Y has been investigated using the 82Se(11B, 4n) reaction at beam energies of 48 and 52 MeV. More than 24 new transitions have been identified, leading to a considerable extension of the level structures of 89Y. The experimental results are compared with large-basis shell model calculations. They show that cross-shell neutron excitations play a pivotal role in the high-spin level structures of 89Y. The systematic features of neutron-core excitations in the N = 50 isotones are also discussed.

  19. A Bivariate Mixture Model for Natural Antibody Levels to Human Papillomavirus Types 16 and 18: Baseline Estimates for Monitoring the Herd Effects of Immunization.

    PubMed

    Vink, Margaretha A; Berkhof, Johannes; van de Kassteele, Jan; van Boven, Michiel; Bogaards, Johannes A

    2016-01-01

    Post-vaccine monitoring programs for human papillomavirus (HPV) have been introduced in many countries, but HPV serology is still an underutilized tool, partly owing to the weak antibody response to HPV infection. Changes in antibody levels among non-vaccinated individuals could be employed to monitor herd effects of immunization against HPV vaccine types 16 and 18, but inference requires an appropriate statistical model. The authors developed a four-component bivariate mixture model for jointly estimating vaccine-type seroprevalence from correlated antibody responses against HPV16 and -18 infections. This model takes account of the correlation between HPV16 and -18 antibody concentrations within subjects, caused e.g. by heterogeneity in exposure level and immune response. The model was fitted to HPV16 and -18 antibody concentrations as measured by a multiplex immunoassay in a large serological survey (3,875 females) carried out in the Netherlands in 2006/2007, before the introduction of mass immunization. Parameters were estimated by Bayesian analysis. We used the deviance information criterion for model selection; performance of the preferred model was assessed through simulation. Our analysis uncovered elevated antibody concentrations in doubly as compared to singly seropositive individuals, and a strong clustering of HPV16 and -18 seropositivity, particularly around the age of sexual debut. The bivariate model resulted in a more reliable classification of singly and doubly seropositive individuals than achieved by a combination of two univariate models, and suggested a higher pre-vaccine HPV16 seroprevalence than previously estimated. The bivariate mixture model provides valuable baseline estimates of vaccine-type seroprevalence and may prove useful in seroepidemiologic assessment of the herd effects of HPV vaccination. PMID:27537200
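
    A generic four-component bivariate Gaussian mixture, fitted here with an off-the-shelf EM routine to simulated log-concentration data, gives the flavor of the approach; the authors' model adds component constraints and Bayesian (MCMC) estimation that this sketch omits.

    ```python
    # Minimal sketch: a generic four-component bivariate Gaussian mixture on
    # (log) HPV16/HPV18 antibody concentrations. The data below are simulated
    # placeholders, not the serological survey.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(7)
    neg = rng.normal([0.0, 0.0], 0.4, size=(3000, 2))         # doubly seronegative
    pos16 = rng.normal([2.0, 0.0], 0.5, size=(300, 2))        # HPV16 only
    pos18 = rng.normal([0.0, 2.0], 0.5, size=(200, 2))        # HPV18 only
    pos_both = rng.normal([2.5, 2.5], 0.6, size=(150, 2))     # doubly seropositive
    X = np.vstack([neg, pos16, pos18, pos_both])

    gmm = GaussianMixture(n_components=4, covariance_type="full", random_state=0)
    gmm.fit(X)

    # Seroprevalence estimates follow from the weights of the components whose
    # means indicate seropositivity on each axis.
    print(gmm.weights_.round(3))
    print(gmm.means_.round(2))
    ```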

  20. A Bivariate Mixture Model for Natural Antibody Levels to Human Papillomavirus Types 16 and 18: Baseline Estimates for Monitoring the Herd Effects of Immunization

    PubMed Central

    Vink, Margaretha A.; Berkhof, Johannes; van de Kassteele, Jan; van Boven, Michiel; Bogaards, Johannes A.

    2016-01-01

    Post-vaccine monitoring programs for human papillomavirus (HPV) have been introduced in many countries, but HPV serology is still an underutilized tool, partly owing to the weak antibody response to HPV infection. Changes in antibody levels among non-vaccinated individuals could be employed to monitor herd effects of immunization against HPV vaccine types 16 and 18, but inference requires an appropriate statistical model. The authors developed a four-component bivariate mixture model for jointly estimating vaccine-type seroprevalence from correlated antibody responses against HPV16 and -18 infections. This model takes account of the correlation between HPV16 and -18 antibody concentrations within subjects, caused e.g. by heterogeneity in exposure level and immune response. The model was fitted to HPV16 and -18 antibody concentrations as measured by a multiplex immunoassay in a large serological survey (3,875 females) carried out in the Netherlands in 2006/2007, before the introduction of mass immunization. Parameters were estimated by Bayesian analysis. We used the deviance information criterion for model selection; performance of the preferred model was assessed through simulation. Our analysis uncovered elevated antibody concentrations in doubly as compared to singly seropositive individuals, and a strong clustering of HPV16 and -18 seropositivity, particularly around the age of sexual debut. The bivariate model resulted in a more reliable classification of singly and doubly seropositive individuals than achieved by a combination of two univariate models, and suggested a higher pre-vaccine HPV16 seroprevalence than previously estimated. The bivariate mixture model provides valuable baseline estimates of vaccine-type seroprevalence and may prove useful in seroepidemiologic assessment of the herd effects of HPV vaccination. PMID:27537200

  1. Lower Extremity Muscle Thickness During 30-Day 6 degrees Head-Down Bed Rest with Isotonic and Isokinetic Exercise Training

    NASA Technical Reports Server (NTRS)

    Ellis, S.; Kirby, L. C.; Greenleaf, J. E.

    1993-01-01

    Muscle thickness was measured in 19 Bed-Rested (BR) men (32-42 years) subjected to IsoTonic (ITE, cycle ergometer) and IsoKinetic (IKE, torque ergometer) lower extremity exercise training, and NO Exercise (NOE) training. Thickness was measured with ultrasonography in the anterior thigh-Rectus Femoris (RF) and Vastus Intermedius (VI), and combined posterior leg-soleus, flexor hallucis longus, and tibialis posterior (S + FHL + TP) muscles. Compared with ambulatory control values, thickness of the (S + FHL + TP) decreased by 9%-12% (p less than 0.05) in all three test groups. The (RF) thickness was unchanged in the two exercise groups, but decreased by 10% (p less than 0.05) in the NOE. The (VI) thickness was unchanged in the ITE group, but decreased by 12%-16% (p less than 0.05) in the IKE and NOE groups. Thus, intensive, alternating, isotonic cycle ergometer exercise training is as effective as intensive, intermittent, isokinetic exercise training for maintaining thicknesses of the rectus femoris and vastus intermedius anterior thigh muscles, but not the posterior leg muscles, during prolonged BR deconditioning.

  2. A Case Study of Bivariate Rainfall Frequency Analysis Using Copula in South Korea

    NASA Astrophysics Data System (ADS)

    Joo, K.; Shin, J.; Kim, W.; Heo, J.

    2011-12-01

    A given rainfall event can be characterized by properties such as rainfall depth (amount), duration, and intensity. By considering these factors simultaneously, the actual rainfall phenomenon can be described better than with a univariate model. Using a bivariate model, rainfall quantiles can be obtained for a given return period without being restricted to a specific rainfall duration. For the bivariate (depth and duration) frequency analysis, a copula model was used in this study. Copula models have recently been studied widely in hydrology, and they are more flexible with respect to the marginal distributions than other conventional bivariate models. In this study, five weather stations of the Korea Meteorological Administration (KMA) are analyzed: Seoul, Chuncheon, Gangneung, Wonju, and Chungju. These sites have 38-50 years of hourly precipitation data. An inter-event time definition is used to identify rainfall events, and three copula models (Gumbel-Hougaard, Frank, and Joe) are applied. The maximum pseudo-likelihood method is used to estimate the copula parameter (θ). The normal, generalized extreme value, Gumbel, 3-parameter gamma, and generalized logistic distributions are examined as marginal distributions. As a result, rainfall quantiles can be obtained for any rainfall duration at a given return period by calculating the conditional probability. In addition, rainfall quantiles from the copula models are compared with those from the univariate model.
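
    The study estimates θ by maximum pseudo-likelihood; as a quick cross-check, the Gumbel-Hougaard parameter can also be obtained by inverting Kendall's tau, θ = 1/(1 − τ), as sketched below on placeholder depth-duration pairs.

    ```python
    # Quick check of the Gumbel-Hougaard copula parameter via Kendall's-tau
    # inversion, theta = 1 / (1 - tau). The study itself uses maximum
    # pseudo-likelihood; the depth/duration pairs below are synthetic.
    import numpy as np
    from scipy.stats import kendalltau

    rng = np.random.default_rng(23)
    duration = rng.gamma(2.0, 6.0, 200)                    # hours (synthetic)
    depth = 3.0 * duration + rng.gamma(2.0, 5.0, 200)      # mm (synthetic, dependent)

    tau, _ = kendalltau(depth, duration)
    theta = 1.0 / (1.0 - tau)
    print(round(tau, 2), round(theta, 2))
    ```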

  3. Simultaneous determination of nifuroxazide and drotaverine hydrochloride in pharmaceutical preparations by bivariate and multivariate spectral analysis.

    PubMed

    Metwally, Fadia H

    2008-02-01

    The quantitative predictive abilities of the new and simple bivariate spectrophotometric method are compared with the results obtained by the use of multivariate calibration methods [the classical least squares (CLS), principal component regression (PCR) and partial least squares (PLS)], using the information contained in the absorption spectra of the appropriate solutions. Mixtures of the two drugs Nifuroxazide (NIF) and Drotaverine hydrochloride (DRO) were resolved by application of the bivariate method. The different chemometric approaches were applied also with previous optimization of the calibration matrix, as they are useful in simultaneous inclusion of many spectral wavelengths. The results found by application of the bivariate, CLS, PCR and PLS methods for the simultaneous determination of mixtures of both components containing 2-12 μg ml-1 of NIF and 2-8 μg ml-1 of DRO are reported. Both approaches were satisfactorily applied to the simultaneous determination of NIF and DRO in pure form and in pharmaceutical formulation. The results were in accordance with those given by the EVA Pharma reference spectrophotometric method. PMID:17631041
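
    A minimal sketch of the PCR and PLS calibration steps the paper benchmarks against, using a hypothetical absorbance matrix; the spectra, concentration ranges and component counts are placeholders, not the reported calibration set.

    ```python
    # Minimal sketch of PCR and PLS calibration on a hypothetical absorbance
    # matrix (rows = mixtures, columns = wavelengths). Spectra and
    # concentrations are simulated placeholders, not the paper's data.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(5)
    wavelengths = 60
    pure_nif, pure_dro = rng.random(wavelengths), rng.random(wavelengths)
    C = np.column_stack([rng.uniform(2, 12, 25), rng.uniform(2, 8, 25)])  # µg/mL
    A = C @ np.vstack([pure_nif, pure_dro]) + rng.normal(0, 0.01, (25, wavelengths))

    pcr = make_pipeline(PCA(n_components=3), LinearRegression()).fit(A, C)
    pls = PLSRegression(n_components=3).fit(A, C)

    test = np.array([[6.0, 4.0]]) @ np.vstack([pure_nif, pure_dro])
    print(pcr.predict(test).round(2), pls.predict(test).round(2))
    ```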

  5. Probability-based differential normalized fluorescence bivariate analysis for the classification of tissue autofluorescence spectra.

    PubMed

    Wang, Gufeng; Platz, Charles P; Geng, M Lei

    2006-05-01

    Differential normalized fluorescence (DNF) is an efficient and effective method for the differentiation of normal and cancerous tissue fluorescence spectra. The diagnostic features are extracted from the difference between the averaged cancerous and averaged normal tissue spectra and used as indices in tissue classification. In this paper, a new method, probability-based DNF bivariate analysis, is introduced based on the univariate DNF method. Two differentiation features are used concurrently in the new method to achieve better classification accuracy. The probability of each sample belonging to a disease state is determined with Bayes decision theory. This probability approach classifies the tissue spectra according to disease states and provides uncertainty information on classification. With a data set of 57 colonic tissue sites, probability-based DNF bivariate analysis is demonstrated to improve the accuracy of cancer diagnosis. The bivariate DNF analysis only requires the collection of a few data points across the entire emission spectrum and has the potential of improving data acquisition speed in tissue imaging.
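
    A minimal sketch of the Bayes decision step with two differentiation features modeled as class-conditional bivariate Gaussians; the means, covariances, priors and test vector are placeholders, not the DNF features or the 57-site data set.

    ```python
    # Minimal sketch: Bayes decision rule with two diagnostic features modeled
    # by class-conditional bivariate Gaussians. Means, covariances, priors and
    # the test feature vector are illustrative placeholders.
    import numpy as np
    from scipy.stats import multivariate_normal

    prior = {"normal": 0.6, "cancer": 0.4}
    cls = {
        "normal": multivariate_normal([0.10, 0.05], [[0.02, 0.005], [0.005, 0.01]]),
        "cancer": multivariate_normal([0.25, 0.20], [[0.03, 0.010], [0.010, 0.02]]),
    }

    def posterior(features):
        """Posterior probability of each disease state for a 2-feature vector."""
        joint = {k: prior[k] * cls[k].pdf(features) for k in prior}
        total = sum(joint.values())
        return {k: v / total for k, v in joint.items()}

    print(posterior([0.22, 0.18]))   # uncertainty-aware classification
    ```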

  6. Simultaneous determination of Nifuroxazide and Drotaverine hydrochloride in pharmaceutical preparations by bivariate and multivariate spectral analysis

    NASA Astrophysics Data System (ADS)

    Metwally, Fadia H.

    2008-02-01

    The quantitative predictive abilities of the new and simple bivariate spectrophotometric method are compared with the results obtained by the use of multivariate calibration methods [the classical least squares (CLS), principal component regression (PCR) and partial least squares (PLS)], using the information contained in the absorption spectra of the appropriate solutions. Mixtures of the two drugs Nifuroxazide (NIF) and Drotaverine hydrochloride (DRO) were resolved by application of the bivariate method. The different chemometric approaches were applied also with previous optimization of the calibration matrix, as they are useful in simultaneous inclusion of many spectral wavelengths. The results found by application of the bivariate, CLS, PCR and PLS methods for the simultaneous determination of mixtures of both components containing 2-12 μg ml-1 of NIF and 2-8 μg ml-1 of DRO are reported. Both approaches were satisfactorily applied to the simultaneous determination of NIF and DRO in pure form and in pharmaceutical formulation. The results were in accordance with those given by the EVA Pharma reference spectrophotometric method.

  7. Technology-enhanced Interactive Teaching of Marginal, Joint and Conditional Probabilities: The Special Case of Bivariate Normal Distribution.

    PubMed

    Dinov, Ivo D; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas

    2013-01-01

    Data analysis requires subtle probability reasoning to answer questions like "What is the chance of event A occurring, given that event B was observed?" This generic question arises in discussions of many intriguing scientific questions such as "What is the probability that an adolescent weighs between 120 and 140 pounds given that they are of average height?" and "What is the probability of (monetary) inflation exceeding 4% and a housing price index below 110?" To address such problems, learning some applied, theoretical or cross-disciplinary probability concepts is necessary. Teaching such courses can be improved by utilizing modern information technology resources. Students' understanding of multivariate distributions, conditional probabilities, correlation and causation can be significantly strengthened by employing interactive web-based science educational resources. Independent of the type of probability course (e.g. majors, minors or service probability course; rigorous measure-theoretic, applied or statistics course), student motivation, learning experiences and knowledge retention may be enhanced by blending modern technological tools within the classical conceptual pedagogical models. We have designed, implemented and disseminated a portable open-source web-application for teaching multivariate distributions, marginal, joint and conditional probabilities using the special case of the bivariate Normal distribution. A real adolescent height and weight dataset is used to demonstrate the classroom utilization of the new web-application to address problems of parameter estimation, univariate and multivariate inference.
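
    The height/weight question above reduces to a conditional distribution of the bivariate normal; a minimal sketch with placeholder population parameters (not the adolescent data set used by the web application):

    ```python
    # Minimal sketch: P(120 <= weight <= 140 | height = average) under a
    # bivariate normal model. The population means, SDs and correlation below
    # are placeholders.
    from scipy.stats import norm

    mu_h, sd_h = 65.0, 3.5        # height (inches)
    mu_w, sd_w = 130.0, 20.0      # weight (pounds)
    rho = 0.5

    h = mu_h                      # condition on average height
    cond_mean = mu_w + rho * sd_w / sd_h * (h - mu_h)
    cond_sd = sd_w * (1 - rho**2) ** 0.5

    prob = norm.cdf(140, cond_mean, cond_sd) - norm.cdf(120, cond_mean, cond_sd)
    print(round(prob, 3))
    ```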

  8. A mixed effect model for bivariate meta-analysis of diagnostic test accuracy studies using a copula representation of the random effects distribution.

    PubMed

    Nikoloulopoulos, Aristidis K

    2015-12-20

    Diagnostic test accuracy studies typically report the number of true positives, false positives, true negatives and false negatives. There usually exists a negative association between the number of true positives and true negatives, because studies that adopt less stringent criterion for declaring a test positive invoke higher sensitivities and lower specificities. A generalized linear mixed model (GLMM) is currently recommended to synthesize diagnostic test accuracy studies. We propose a copula mixed model for bivariate meta-analysis of diagnostic test accuracy studies. Our general model includes the GLMM as a special case and can also operate on the original scale of sensitivity and specificity. Summary receiver operating characteristic curves are deduced for the proposed model through quantile regression techniques and different characterizations of the bivariate random effects distribution. Our general methodology is demonstrated with an extensive simulation study and illustrated by re-analysing the data of two published meta-analyses. Our study suggests that there can be an improvement on GLMM in fit to data and makes the argument for moving to copula random effects models. Our modelling framework is implemented in the package CopulaREMADA within the open source statistical environment R.

  9. Bivariate zero-inflated regression for count data: a Bayesian approach with application to plant counts.

    PubMed

    Majumdar, Anandamayee; Gries, Corinna

    2010-01-01

    Lately, bivariate zero-inflated (BZI) regression models have been used in many instances in the medical sciences to model excess zeros. Examples include the BZI Poisson (BZIP), BZI negative binomial (BZINB) models, etc. Such formulations vary in the basic modeling aspect and use the EM algorithm (Dempster, Laird and Rubin, 1977) for parameter estimation. A different modeling formulation in the Bayesian context is given by Dagne (2004). We extend the modeling to a more general setting for multivariate ZIP models for count data with excess zeros as proposed by Li, Lu, Park, Kim, Brinkley and Peterson (1999), focusing on a particular bivariate regression formulation. For the basic formulation in the case of bivariate data, we assume that Xi, i = 0, 1, 2, are (latent) independent Poisson random variables with parameters λi. A bivariate count vector (Y1, Y2) response follows a mixture of four distributions; p0 stands for the mixing probability of a point mass distribution at (0, 0); p1, the mixing probability that Y2 = 0, while Y1 = X0 + X1; p2, the mixing probability that Y1 = 0 while Y2 = X0 + X2; and finally (1 - p0 - p1 - p2), the mixing probability that Yi = Xi + X0, i = 1, 2. The choice of the parameters {pi, λi, i = 0, 1, 2} ensures that the marginal distributions of Yi are zero-inflated Poisson(λ0 + λi). All the parameters thus introduced are allowed to depend on covariates through canonical link generalized linear models (McCullagh and Nelder, 1989). This flexibility allows for a range of real-life applications, especially in the medical and biological fields, where the counts are bivariate in nature (with strong association between the processes) and where there is an excess of zeros in one or both processes. Our contribution in this paper is to employ a fully Bayesian approach consolidating the work of Dagne (2004) and Li et al. (1999) generalizing the modeling and sampling-based methods described by Ghosh, Mukhopadhyay and Lu (2006) to estimate the
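
    A direct simulation of the four-component construction just described; all mixing probabilities and Poisson rates are placeholders.

    ```python
    # Direct simulation of the four-component construction described above:
    # (Y1, Y2) is (0,0) w.p. p0, (X0+X1, 0) w.p. p1, (0, X0+X2) w.p. p2,
    # and (X0+X1, X0+X2) otherwise. Parameter values are placeholders.
    import numpy as np

    rng = np.random.default_rng(11)
    n = 5000
    p0, p1, p2 = 0.3, 0.1, 0.1
    lam0, lam1, lam2 = 0.5, 2.0, 1.5

    comp = rng.choice(4, size=n, p=[p0, p1, p2, 1 - p0 - p1 - p2])
    x0 = rng.poisson(lam0, n)
    x1 = rng.poisson(lam1, n)
    x2 = rng.poisson(lam2, n)

    y1 = np.where(np.isin(comp, [1, 3]), x0 + x1, 0)
    y2 = np.where(np.isin(comp, [2, 3]), x0 + x2, 0)

    print((y1 == 0).mean(), (y2 == 0).mean())   # excess zeros in each margin
    print(np.corrcoef(y1, y2)[0, 1])            # association via the shared X0
    ```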

  10. Statistical analysis of multivariate atmospheric variables. [cloud cover

    NASA Technical Reports Server (NTRS)

    Tubbs, J. D.

    1979-01-01

    Topics covered include: (1) estimation in discrete multivariate distributions; (2) a procedure to predict cloud cover frequencies in the bivariate case; (3) a program to compute conditional bivariate normal parameters; (4) the transformation of nonnormal multivariate data to near-normal; (5) test of fit for the extreme value distribution based upon the generalized minimum chi-square; (6) test of fit for continuous distributions based upon the generalized minimum chi-square; (7) effect of correlated observations on confidence sets based upon chi-square statistics; and (8) generation of random variates from specified distributions.

  11. Randomised clinical study comparing the effectiveness and physiological effects of hypertonic and isotonic polyethylene glycol solutions for bowel cleansing

    PubMed Central

    Yamano, Hiro-o; Matsushita, Hiro-o; Yoshikawa, Kenjiro; Takagi, Ryo; Harada, Eiji; Tanaka, Yoshihito; Nakaoka, Michiko; Himori, Ryogo; Yoshida, Yuko; Satou, Kentarou; Imai, Yasushi

    2016-01-01

    Objectives Bowel cleansing is necessary before colonoscopy, but is a burden to patients because of the long cleansing time and large dose volume. A low-volume (2 L) hypertonic polyethylene glycol-ascorbic acid solution (PEG-Asc) has been introduced, but its possible dehydration effects have not been quantitatively studied. We compared the efficacy and safety including the dehydration risk between hypertonic PEG-Asc and isotonic PEG regimens. Design This was an observer-blinded randomised study. Participants (n=310) were allocated to receive 1 of 3 regimens on the day of colonoscopy: PEG-Asc (1.5 L) and water (0.75 L) dosed with 1 split (PEG-Asc-S) or 4 splits (PEG-Asc-M), or PEG-electrolyte solution (PEG-ES; 2.25 L) dosed with no split. Dehydration was analysed by measuring haematocrit (Ht). Results The cleansing time using the hypertonic PEG-Asc-S (3.33±0.48 hours) was significantly longer than that with isotonic PEG-ES (3.05±0.56 hours; p<0.001). PEG-Asc-M (3.00±0.53 hours) did not have this same disadvantage. Successful cleansing was achieved in more than 94% of participants using each of the 3 regimens. The percentage changes in Ht from baseline (before dosing) to the end of dosing with PEG-Asc-S (3.53±3.32%) and PEG-Asc-M (4.11±3.07%) were significantly greater than that with PEG-ES (1.31±3.01%). Conclusions These 3 lower volume regimens were efficacious and had no serious adverse effects. Even patients cleansed with isotonic PEG-ES showed significant physiological dehydration at the end of dosing. The four-split PEG-Asc-M regimen is recommended because of its shorter cleansing time without causing serious nausea. Trial registration number UMIN000013103; Results. PMID:27547443

  12. Long-lead station-scale prediction of hydrological droughts in South Korea based on bivariate pattern-based downscaling

    NASA Astrophysics Data System (ADS)

    Sohn, Soo-Jin; Tam, Chi-Yung

    2016-05-01

    Capturing climatic variations in boreal winter to spring (December-May) is essential for properly predicting droughts in South Korea. This study investigates the variability and predictability of the South Korean climate during this extended season, based on observations from 60 station locations and multi-model ensemble (MME) hindcast experiments (1983/1984-2005/2006) archived at the APEC Climate Center (APCC). Multivariate empirical orthogonal function (EOF) analysis results based on observations show that the first two leading modes of winter-to-spring precipitation and temperature variability, which together account for ~80 % of the total variance, are characterized by regional-scale anomalies covering the whole South Korean territory. These modes were also closely related to some of the recurrent large-scale circulation changes in the northern hemisphere during the same season. Consistent with the above, examination of the standardized precipitation evapotranspiration index (SPEI) indicates that drought conditions in South Korea tend to be accompanied by regional-to-continental-scale circulation anomalies over East Asia to the western north Pacific. Motivated by the aforementioned findings on the spatial-temporal coherence among station-scale precipitation and temperature anomalies, a new bivariate and pattern-based downscaling method was developed. The novelty of this method is that precipitation and temperature data were first filtered using multivariate EOFs to enhance their spatial-temporal coherence, before being linked to large-scale circulation variables using canonical correlation analysis (CCA). To test its applicability and to investigate its related potential predictability, a perfect empirical model was first constructed with observed datasets as predictors. Next, a model output statistics (MOS)-type hybrid dynamical-statistical model was developed, using products from nine one-tier climate models as inputs. It was found that, with model sea
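
    A minimal sketch of the pattern-based idea (EOF-filter the station anomalies, then link the leading components to large-scale predictors with CCA), on synthetic arrays; the array shapes and component counts are placeholders, and the MOS hybrid step is omitted.

    ```python
    # Minimal sketch of the bivariate pattern-based downscaling idea: filter
    # station precipitation/temperature anomalies with EOFs (PCA here), then
    # link the leading components to large-scale predictors with CCA.
    # All arrays are synthetic placeholders; the MOS hybrid step is omitted.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cross_decomposition import CCA

    rng = np.random.default_rng(13)
    years, stations, grid = 23, 60, 200
    station_anom = rng.normal(size=(years, 2 * stations))   # precip + temp stacked
    circulation = rng.normal(size=(years, grid))            # large-scale predictors

    eof = PCA(n_components=2)                 # keep the two leading station modes
    pc = eof.fit_transform(station_anom)
    circ_pc = PCA(n_components=5).fit_transform(circulation)

    cca = CCA(n_components=2).fit(circ_pc, pc)
    pc_hat = cca.predict(circ_pc)             # predicted leading modes
    station_hat = eof.inverse_transform(pc_hat)  # back to station space
    print(station_hat.shape)
    ```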

  13. Effect of spaceflight on the isotonic contractile properties of single skeletal muscle fibers in the rhesus monkey

    NASA Technical Reports Server (NTRS)

    Fitts, R. H.; Romatowski, J. G.; Blaser, C.; De La Cruz, L.; Gettelman, G. J.; Widrick, J. J.

    2000-01-01

    Experiments from both Cosmos and Space Shuttle missions have shown weightlessness to result in a rapid decline in the mass and force of rat hindlimb extensor muscles. Additionally, despite an increased maximal shortening velocity, peak power was reduced in rat soleus muscle post-flight. In humans, declines in voluntary peak isometric ankle extensor torque ranging from 15-40% have been reported following long- and short-term spaceflight and prolonged bed rest. Complete understanding of the cellular events responsible for the fiber atrophy and the decline in force, as well as the development of effective countermeasures, will require detailed knowledge of how the physiological and biochemical processes of muscle function are altered by spaceflight. The specific purpose of this investigation was to determine the extent to which the isotonic contractile properties of the slow- and fast-twitch fiber types of the soleus and gastrocnemius muscles of rhesus monkeys (Macaca mulatta) were altered by a 14-day spaceflight.

  14. Isotonic sodium bicarbonate-triggered emodin release from borate stabilized emodin nanoparticles-loaded polymeric microgel films.

    PubMed

    Wang, Lin; Wang, Xiaohan; Li, Xiaozhou

    2014-07-20

    Hydrosoluble emodin-borate (EmB) nanoparticles (NPs) were fabricated by a simple solvent exchange method to address emodin's poor water solubility. As a result, negative charges were introduced on the surface of the EmB NPs. In addition, layer-by-layer assembled multilayer films containing cation-rich polymeric microgels (named PAHD) and sodium carboxymethyl cellulose (NaCMC) were employed as the drug carrier. Anionic EmB can be loaded into the PAHD/CMC multilayer films. The influences of various experimental parameters on the cargo capacity of the PAHD/CMC film were studied in detail. The loaded EmB can be released in the form of emodin molecules in the presence of isotonic sodium bicarbonate (ISB) solution. Notably, almost no EmB was released in the presence of water, PBS buffer solution, 0.9% normal saline, or 5% glucose solution. PMID:24755249

  15. Bivariate Genome-Wide Association Analysis of the Growth and Intake Components of Feed Efficiency

    PubMed Central

    Beever, Jonathan E.; Bollero, Germán A.; Southey, Bruce R.; Faulkner, Daniel B.; Rodriguez-Zas, Sandra L.

    2013-01-01

    Single nucleotide polymorphisms (SNPs) associated with average daily gain (ADG) and dry matter intake (DMI), two major components of feed efficiency in cattle, were identified in a genome-wide association study (GWAS). Uni- and multi-SNP models were used to describe feed efficiency in a training data set and the results were confirmed in a validation data set. Results from the univariate and bivariate analyses of ADG and DMI, adjusted by the feedlot beef steer maintenance requirements, were compared. The bivariate uni-SNP analysis identified (P-value <0.0001) 11 SNPs, whereas the univariate analyses of ADG and DMI identified 8 and 9 SNPs, respectively. Among the six SNPs confirmed in the validation data set, five SNPs were mapped to KDELC2, PHOX2A, and TMEM40. Findings from the uni-SNP models were used to develop highly accurate predictive multi-SNP models in the training data set. Despite the substantially smaller size of the validation data set, the training multi-SNP models had slightly lower predictive ability when applied to the validation data set. Six Gene Ontology molecular functions related to ion transport activity were enriched (P-value <0.001) among the genes associated with the detected SNPs. The findings from this study demonstrate the complementary value of the uni- and multi-SNP models, and univariate and bivariate GWAS analyses. The identified SNPs can be used for genome-enabled improvement of feed efficiency in feedlot beef cattle, and can aid in the design of empirical studies to further confirm the associations. PMID:24205251

  16. Bivariate spline solution of time dependent nonlinear PDE for a population density over irregular domains.

    PubMed

    Gutierrez, Juan B; Lai, Ming-Jun; Slavov, George

    2015-12-01

    We study a time dependent partial differential equation (PDE) which arises from classic models in ecology involving logistic growth with Allee effect by introducing a discrete weak solution. Existence, uniqueness and stability of the discrete weak solutions are discussed. We use bivariate splines to approximate the discrete weak solution of the nonlinear PDE. A computational algorithm is designed to solve this PDE. A convergence analysis of the algorithm is presented. We present some simulations of population development over some irregular domains. Finally, we discuss applications in epidemiology and other ecological problems.

  17. A compressed primal-dual method for generating bivariate cubic L1 splines

    NASA Astrophysics Data System (ADS)

    Wang, Yong; Fang, Shu-Cherng; Lavery, John E.

    2007-04-01

    In this paper, we develop a compressed version of the primal-dual interior point method for generating bivariate cubic L1 splines. Discretization of the underlying optimization model, which is a nonsmooth convex programming problem, leads to an overdetermined linear system that can be handled by interior point methods. Taking advantage of the special matrix structure of the cubic L1 spline problem, we design a compressed primal-dual interior point algorithm. Computational experiments indicate that this compressed primal-dual method is robust and is much faster than the ordinary (uncompressed) primal-dual interior point algorithm.

  18. A semiparametric approach to simultaneous covariance estimation for bivariate sparse longitudinal data.

    PubMed

    Das, Kiranmoy; Daniels, Michael J

    2014-03-01

    Estimation of the covariance structure for irregular sparse longitudinal data has been studied by many authors in recent years but typically using fully parametric specifications. In addition, when data are collected from several groups over time, it is known that assuming the same or completely different covariance matrices over groups can lead to loss of efficiency and/or bias. Nonparametric approaches have been proposed for estimating the covariance matrix for regular univariate longitudinal data by sharing information across the groups under study. For the irregular case, with longitudinal measurements that are bivariate or multivariate, modeling becomes more difficult. In this article, to model bivariate sparse longitudinal data from several groups, we propose a flexible covariance structure via a novel matrix stick-breaking process for the residual covariance structure and a Dirichlet process mixture of normals for the random effects. Simulation studies are performed to investigate the effectiveness of the proposed approach over more traditional approaches. We also analyze a subset of Framingham Heart Study data to examine how the blood pressure trajectories and covariance structures differ for the patients from different BMI groups (high, medium, and low) at baseline. PMID:24400941

  19. Application of a Bivariate Gamma Distribution for a Chemically Reacting Plume in the Atmosphere

    NASA Astrophysics Data System (ADS)

    Ferrero, Enrico; Mortarini, Luca; Alessandrini, Stefano; Lacagnina, Carlo

    2013-04-01

    The joint concentration probability density function of two reactive chemical species is modelled using a bivariate Gamma distribution coupled with a three-dimensional fluctuating plume model able to simulate the diffusion and mixing of turbulent plumes. A wind-tunnel experiment (Brown and Bilger, J Fluid Mech 312:373-407, 1996), carried out in homogeneous unbounded turbulence, in which nitrogen oxide is released from a point source in an ozone doped background and the chemical reactions take place in non-equilibrium conditions, is considered as a test case. The model is based on a stochastic Langevin equation reproducing the barycentre position distribution through a proper low-pass filter for the turbulence length scales. While the meandering large-scale motion of the plume is directly simulated, the internal mixing relative to the centroid is reproduced using a bivariate Gamma density function. The effect of turbulence on the chemical reaction (segregation), which in this case has not yet attained equilibrium, is directly evaluated through the covariance of the tracer concentration fields. The computed mean concentrations and the O3-NO concentration covariance are also compared with those obtained by the Alessandrini and Ferrero Lagrangian single particle model (Alessandrini and Ferrero, Physica A 388:1375-1387, 2009) that entails an ad hoc parametrization for the segregation coefficient.

  1. Xp21 contiguous gene syndromes: Deletion quantitation with bivariate flow karyotyping allows mapping of patient breakpoints

    SciTech Connect

    McCabe, E.R.B.; Towbin, J.A. ); Engh, G. van den; Trask, B.J. )

    1992-12-01

    Bivariate flow karyotyping was used to estimate the deletion sizes for a series of patients with Xp21 contiguous gene syndromes. The deletion estimates were used to develop an approximate scale for the genomic map in Xp21. The bivariate flow karyotype results were compared with clinical and molecular genetic information on the extent of the patients' deletions, and these various types of data were consistent. The resulting map spans >15 Mb, from the telomeric interval between DXS41 (99-6) and DXS68 (1-4) to a position centromeric to the ornithine transcarbamylase locus. The deletion sizing was considered to be accurate to ±1 Mb. The map provides information on the relative localization of genes and markers within this region. For example, the map suggests that the adrenal hypoplasia congenita and glycerol kinase genes are physically close to each other, are within 1-2 Mb of the telomeric end of the Duchenne muscular dystrophy (DMD) gene, and are nearer to the DMD locus than to the more distal marker DXS28 (C7). Information of this type is useful in developing genomic strategies for positional cloning in Xp21. These investigations demonstrate that the DNA from patients with Xp21 contiguous gene syndromes can be valuable reagents, not only for ordering loci and markers but also for providing an approximate scale to the map of the Xp21 region surrounding DMD. 44 refs., 3 figs.

  2. A Statistical Analysis of Cotton Fiber Properties

    NASA Astrophysics Data System (ADS)

    Ghosh, Anindya; Das, Subhasis; Majumder, Asha

    2016-04-01

    This paper reports a statistical analysis of different cotton fiber properties, such as strength, breaking elongation, upper half mean length, length uniformity index, short fiber index, micronaire, reflectance and yellowness, measured from 1200 cotton bales. Univariate, bivariate and multivariate statistical analyses were carried out to elicit the interrelationships among the above-mentioned properties, considering them singly, pairwise and jointly, respectively. In the multivariate analysis, all cotton fiber properties are considered simultaneously using the multi-dimensional technique of principal factor analysis.

  3. Effects of salt rich diet in the obese Zucker rats: studies on renal function during isotonic volume expansion.

    PubMed

    Pamidimukkala, Jaya; Jandhyala, Bhagavan S

    2004-01-01

    Obese Zucker rats (OZR) are hyperinsulinemic, hyperglycemic and dyslipidemic and develop salt dependent hypertension. Since salt sensitivity is considered to be due to impaired handling of renal sodium excretion, these studies were conducted in the obese and lean Zucker rats (LZR) anesthetized with Inactin to evaluate renal function under basal conditions and during acute isotonic fluid volume expansion (VE). Mean arterial blood pressure (MBP), heart rate (HR), renal blood flow (RBF) and glomerular filtration rate (GFR) were not significantly different between the lean Zucker rats fed normal diet or those fed salt rich diet (8% NaCl). However, basal UV and UNaV were significantly greater in the LZR fed high salt. During VE essentially identical increases occurred in GFR, UV and UNaV in both the lean groups. In the OZR fed salt rich diet also, there were no significant changes in the heart rate, RBF and GFR. However, arterial blood pressure of the OZR fed salt rich diet was significantly greater than that of the OZR on the normal diet as well as that of both the lean groups. Also, as in the LZR, basal UV and UNaV were significantly greater in the salt fed obese rats. During volume expansion there were no impairments in the ability of the obese groups fed normal or salt rich diet to eliminate sodium and water during volume load. In fact, the net sodium and water excretions during and 60 min after VE in both the obese groups were significantly greater than that of corresponding lean groups. Furthermore, these values in the OZR fed salt rich diet were significantly greater than that of the obese rats on normal salt diet, perhaps due to the contribution of pressure natriuretic mechanisms. These data demonstrate that although OZR are salt sensitive, the renal mechanisms that would collectively respond to acute isotonic VE were fully functional. An unexpected and a novel finding in these studies is that the salt rich diet, in addition to increasing arterial blood pressure

  5. Bayesian neural networks for bivariate binary data: an application to prostate cancer study.

    PubMed

    Chakraborty, Sounak; Ghosh, Malay; Maiti, Tapabrata; Tewari, Ashutosh

    2005-12-15

    Prostate cancer is one of the most common cancers in American men. The cancer could either be locally confined, or it could spread outside the organ. When locally confined, there are several options for treating and curing this disease. Otherwise, surgery is the only option, and in extreme cases of outside spread, it could very easily recur within a short time even after surgery and subsequent radiation therapy. Hence, it is important to know, based on pre-surgery biopsy results, how likely the cancer is to be organ-confined or not. The paper considers a hierarchical Bayesian neural network approach for posterior prediction probabilities of certain features indicative of non-organ-confined prostate cancer. In particular, we find such probabilities for margin positivity (MP) and seminal vesicle (SV) positivity jointly. The available training set consists of bivariate binary outcomes indicating the presence or absence of the two. In addition, we have certain covariates such as prostate specific antigen (PSA), Gleason score and the indicator for the cancer to be unilateral or bilateral (i.e. spread on one or both sides) in one data set and gene expression microarrays in another data set. We take a hierarchical Bayesian neural network approach to find the posterior prediction probabilities for a test and validation set, and compare these with the actual outcomes for the first data set. In case of the microarray data we use leave-one-out cross-validation to assess the accuracy of our method. We also demonstrate the superiority of our method to the other competing methods through a simulation study. The Bayesian procedure is implemented by an application of the Markov chain Monte Carlo numerical integration technique. For the problem at hand, our Bayesian bivariate neural network procedure is shown to be superior to the classical neural network, Radford Neal's Bayesian neural network as well as bivariate logistic models to predict jointly the MP and SV in a patient in both the
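
    A minimal baseline in the spirit of the logistic comparison mentioned above, fitted here as two independent logistic regressions on simulated covariates; it ignores the MP-SV correlation that the Bayesian bivariate neural network is designed to exploit, and the covariate values are placeholders.

    ```python
    # Minimal baseline: two independent logistic regressions for MP and SV.
    # This ignores the MP-SV correlation exploited by the joint Bayesian model;
    # covariates and outcomes below are simulated placeholders.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(17)
    n = 300
    X = np.column_stack([
        rng.lognormal(2.0, 0.6, n),      # "PSA"-like covariate
        rng.integers(4, 10, n),          # "Gleason score"-like covariate
        rng.integers(0, 2, n),           # unilateral (0) vs bilateral (1)
    ])
    logit = -6 + 0.05 * X[:, 0] + 0.6 * X[:, 1] + 0.8 * X[:, 2]
    mp = rng.random(n) < 1 / (1 + np.exp(-logit))
    sv = rng.random(n) < 1 / (1 + np.exp(-(logit - 1.0)))

    models = {name: LogisticRegression(max_iter=1000).fit(X, y)
              for name, y in [("MP", mp), ("SV", sv)]}
    x_new = np.array([[12.0, 7, 1]])
    print({name: float(m.predict_proba(x_new)[0, 1]) for name, m in models.items()})
    ```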

  6. A composite likelihood method for bivariate meta-analysis in diagnostic systematic reviews

    PubMed Central

    Liu, Yulun; Ning, Jing; Nie, Lei; Zhu, Hongjian; Chu, Haitao

    2014-01-01

    Diagnostic systematic review is a vital step in the evaluation of diagnostic technologies. In many applications, it involves pooling pairs of sensitivity and specificity of a dichotomized diagnostic test from multiple studies. We propose a composite likelihood method for bivariate meta-analysis in diagnostic systematic reviews. This method provides an alternative way to make inference on diagnostic measures such as sensitivity, specificity, likelihood ratios and the diagnostic odds ratio. Its main advantages over the standard likelihood method are the avoidance of the non-convergence problem, which is non-trivial when the number of studies is relatively small, its computational simplicity and some robustness to model mis-specification. Simulation studies show that the composite likelihood method maintains high relative efficiency compared to the standard likelihood method. We illustrate our method in a diagnostic review of the performance of contemporary diagnostic imaging technologies for detecting metastases in patients with melanoma. PMID:25512146

  7. A Bivariate Chebyshev Spectral Collocation Quasilinearization Method for Nonlinear Evolution Parabolic Equations

    PubMed Central

    Motsa, S. S.; Magagula, V. M.; Sibanda, P.

    2014-01-01

    This paper presents a new method for solving higher order nonlinear evolution partial differential equations (NPDEs). The method combines quasilinearisation, the Chebyshev spectral collocation method, and bivariate Lagrange interpolation. In this paper, we use the method to solve several nonlinear evolution equations, such as the modified KdV-Burgers equation, the highly nonlinear modified KdV equation, Fisher's equation, the Burgers-Fisher equation, the Burgers-Huxley equation, and the Fitzhugh-Nagumo equation. The results are compared with known exact analytical solutions from the literature to confirm the accuracy, convergence, and effectiveness of the method. There is congruence between the numerical results and the exact solutions to a high order of accuracy. Tables are presented to show the order of accuracy of the method, convergence graphs to verify its convergence, and error graphs to show the excellent agreement between the results of this study and the known results from the literature. PMID:25254252
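    As a rough illustration of the core ingredient shared by this and the related spectral collocation records below, the sketch that follows builds the standard Chebyshev differentiation matrix on Gauss-Lobatto points (the classical construction popularized by Trefethen). It is not the authors' bivariate quasilinearization code; the function and variable names are our own.

```python
# Sketch: Chebyshev differentiation matrix on Gauss-Lobatto points
# (standard construction; cf. Trefethen, "Spectral Methods in MATLAB").
import numpy as np

def cheb(N):
    """Return Chebyshev points x and the (N+1)x(N+1) differentiation matrix D."""
    if N == 0:
        return np.array([1.0]), np.zeros((1, 1))
    x = np.cos(np.pi * np.arange(N + 1) / N)            # Gauss-Lobatto points on [-1, 1]
    c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** np.arange(N + 1)
    X = np.tile(x, (N + 1, 1)).T                         # X[i, j] = x[i]
    dX = X - X.T                                         # x[i] - x[j]
    D = np.outer(c, 1.0 / c) / (dX + np.eye(N + 1))      # off-diagonal entries
    D -= np.diag(D.sum(axis=1))                          # diagonal = negative row sums
    return x, D

# Spot check: differentiate u(x) = sin(pi*x) and compare with the exact derivative.
x, D = cheb(16)
err = np.max(np.abs(D @ np.sin(np.pi * x) - np.pi * np.cos(np.pi * x)))
print(f"max derivative error with 17 nodes: {err:.2e}")
```

    In a bivariate scheme, one such matrix per independent variable is used to discretize the derivatives of the quasilinearized equations.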

  8. Estimating the Correlation in Bivariate Normal Data with Known Variances and Small Sample Sizes

    PubMed Central

    Fosdick, Bailey K.; Raftery, Adrian E.

    2013-01-01

    We consider the problem of estimating the correlation in bivariate normal data when the means and variances are assumed known, with emphasis on the small sample case. We consider eight different estimators, several of them considered here for the first time in the literature. In a simulation study, we found that Bayesian estimators using the uniform and arc-sine priors outperformed several empirical and exact or approximate maximum likelihood estimators in small samples. The arc-sine prior did better for large values of the correlation. For testing whether the correlation is zero, we found that Bayesian hypothesis tests outperformed significance tests based on the empirical and exact or approximate maximum likelihood estimators considered in small samples, but that all tests performed similarly for sample size 50. These results lead us to suggest using the posterior mean with the arc-sine prior to estimate the correlation in small samples when the variances are assumed known. PMID:23378667
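    A minimal sketch of the kind of estimator recommended above: the posterior mean of the correlation under the arc-sine prior, with means and variances treated as known, computed here by a simple grid approximation rather than the authors' exact machinery. The simulation settings are illustrative.

```python
# Hedged sketch: posterior mean of rho in bivariate normal data with known
# means (0) and variances (1), under the arc-sine prior p(rho) ~ 1/sqrt(1-rho^2).
import numpy as np

def posterior_mean_rho(x, y, n_grid=2001):
    x = np.asarray(x, dtype=float)   # data assumed already standardized using the
    y = np.asarray(y, dtype=float)   # known means and variances
    n = len(x)
    rho = np.linspace(-0.999, 0.999, n_grid)
    sxx, syy, sxy = np.sum(x * x), np.sum(y * y), np.sum(x * y)
    log_like = (-0.5 * n * np.log(1.0 - rho**2)
                - (sxx - 2.0 * rho * sxy + syy) / (2.0 * (1.0 - rho**2)))
    log_prior = -0.5 * np.log(1.0 - rho**2)        # arc-sine prior (unnormalized)
    log_post = log_like + log_prior
    w = np.exp(log_post - log_post.max())          # normalize on the grid for stability
    w /= w.sum()
    return np.sum(w * rho)

rng = np.random.default_rng(0)
true_rho = 0.6
z = rng.multivariate_normal([0, 0], [[1, true_rho], [true_rho, 1]], size=10)
print(posterior_mean_rho(z[:, 0], z[:, 1]))        # small-sample estimate of rho
```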

  9. A bivariate Chebyshev spectral collocation quasilinearization method for nonlinear evolution parabolic equations.

    PubMed

    Motsa, S S; Magagula, V M; Sibanda, P

    2014-01-01

    This paper presents a new method for solving higher order nonlinear evolution partial differential equations (NPDEs). The method combines quasilinearisation, the Chebyshev spectral collocation method, and bivariate Lagrange interpolation. In this paper, we use the method to solve several nonlinear evolution equations, such as the modified KdV-Burgers equation, the highly nonlinear modified KdV equation, Fisher's equation, the Burgers-Fisher equation, the Burgers-Huxley equation, and the Fitzhugh-Nagumo equation. The results are compared with known exact analytical solutions from the literature to confirm the accuracy, convergence, and effectiveness of the method. There is congruence between the numerical results and the exact solutions to a high order of accuracy. Tables are presented to show the order of accuracy of the method, convergence graphs to verify its convergence, and error graphs to show the excellent agreement between the results of this study and the known results from the literature.

  10. Bayesian bivariate generalized Lindley model for survival data with a cure fraction.

    PubMed

    Martinez, Edson Z; Achcar, Jorge A

    2014-11-01

    The cure fraction models have been widely used to analyze survival data in which a proportion of the individuals is not susceptible to the event of interest. In this article, we introduce a bivariate model for survival data with a cure fraction based on the three-parameter generalized Lindley distribution. The joint distribution of the survival times is obtained by using copula functions. We consider three types of copula function models, the Farlie-Gumbel-Morgenstern (FGM), Clayton and Gumbel-Barnett copulas. The model is implemented under a Bayesian framework, where the parameter estimation is based on Markov Chain Monte Carlo (MCMC) techniques. To illustrate the utility of the model, we consider an application to a real data set related to an invasive cervical cancer study.
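    A small sketch of one ingredient of the model, assuming nothing beyond what the abstract states: sampling dependent uniform pairs from the FGM copula (one of the three copulas considered) by the conditional-distribution method. The full cure-fraction model with generalized Lindley margins and MCMC estimation is not reproduced.

```python
# Sketch: draw (u, v) pairs from the FGM copula C(u, v) = u*v*[1 + theta*(1-u)*(1-v)]
# by inverting the conditional cdf C(v | u), which is quadratic in v.
import numpy as np
from scipy.stats import kendalltau

def sample_fgm(theta, n, rng=None):
    assert -1.0 <= theta <= 1.0, "FGM parameter must lie in [-1, 1]"
    rng = rng or np.random.default_rng()
    u = rng.uniform(size=n)
    w = rng.uniform(size=n)                          # target value of C(v | u)
    a = theta * (1.0 - 2.0 * u)
    a_safe = np.where(np.abs(a) < 1e-12, 1.0, a)     # avoid division by ~0
    v_quad = ((1.0 + a_safe) - np.sqrt((1.0 + a_safe) ** 2 - 4.0 * a_safe * w)) / (2.0 * a_safe)
    v = np.where(np.abs(a) < 1e-12, w, v_quad)       # independence limit when a ~ 0
    return u, v

u, v = sample_fgm(theta=0.8, n=5000, rng=np.random.default_rng(1))
tau, _ = kendalltau(u, v)
print(tau)   # Kendall's tau of the FGM copula is 2*theta/9, so about 0.178 here
```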

  11. A bivariate Chebyshev spectral collocation quasilinearization method for nonlinear evolution parabolic equations.

    PubMed

    Motsa, S S; Magagula, V M; Sibanda, P

    2014-01-01

    This paper presents a new method for solving higher order nonlinear evolution partial differential equations (NPDEs). The method combines quasilinearisation, the Chebyshev spectral collocation method, and bivariate Lagrange interpolation. In this paper, we use the method to solve several nonlinear evolution equations, such as the modified KdV-Burgers equation, the highly nonlinear modified KdV equation, Fisher's equation, the Burgers-Fisher equation, the Burgers-Huxley equation, and the Fitzhugh-Nagumo equation. The results are compared with known exact analytical solutions from the literature to confirm the accuracy, convergence, and effectiveness of the method. There is congruence between the numerical results and the exact solutions to a high order of accuracy. Tables are presented to show the order of accuracy of the method, convergence graphs to verify its convergence, and error graphs to show the excellent agreement between the results of this study and the known results from the literature. PMID:25254252

  12. On the joint spectral density of bivariate random sequences. Thesis Technical Report No. 21

    NASA Technical Reports Server (NTRS)

    Aalfs, David D.

    1995-01-01

    For univariate random sequences, the power spectral density acts like a probability density function of the frequencies present in the sequence. This dissertation extends that concept to bivariate random sequences. For this purpose, a function called the joint spectral density is defined that represents a joint probability weighting of the frequency content of pairs of random sequences. Given a pair of random sequences, the joint spectral density is not uniquely determined in the absence of any constraints. Two approaches to constraining the sequences are suggested: (1) assume the sequences are the margins of some stationary random field, (2) assume the sequences conform to a particular model that is linked to the joint spectral density. For both approaches, the properties of the resulting sequences are investigated in some detail, and simulation is used to corroborate theoretical results. It is concluded that under either of these two constraints, the joint spectral density can be computed from the non-stationary cross-correlation.

  13. The Bivariate Brightness Distribution of Galaxies as a Function of Spectral Type.

    NASA Astrophysics Data System (ADS)

    Cross, N. J. G.; Driver, S. P.; Lemon, D. J.; Liske, J.; Couch, W. J.; 2dFGRS Team

    2002-05-01

    The Bivariate Brightness Distribution (BBD) is the space density of galaxies as a function of absolute magnitude and effective surface brightness. We have measured the BBD for 4 different spectral types identified in the 2dFGRS. These spectral types range from strong absorption (type 1) to strong emission (type 4). We find that type 1 galaxies have a bounded distribution with little or no correlation between luminosity and surface brightness. Types 2 to 4 have unbounded distributions (i.e. the space density is still increasing at the limits of the survey) and have strong luminosity-surface brightness correlations. The gradient β, where M = β μe + C, and scatter σ of this correlation appear to be constant for the 3 spectral types, with β = 0.23+/-0.09 and σ = 0.56+/-0.01. This work was supported by the UK Particle Physics and Astronomy Research Council.

  14. Laser capillary spectrophotometric acquisition of bivariate drop size and concentration data for liquid-liquid dispersion

    DOEpatents

    Tavlarides, L.L.; Bae, J.H.

    1991-12-24

    A laser capillary spectrophotometric technique measures real time or near real time bivariate drop size and concentration distribution for a reactive liquid-liquid dispersion system. The dispersion is drawn into a precision-bore glass capillary and an appropriate light source is used to distinguish the aqueous phase from slugs of the organic phase at two points along the capillary whose separation is precisely known. The suction velocity is measured, as is the length of each slug from which the drop free diameter is calculated. For each drop, the absorptivity at a given wavelength is related to the molar concentration of a solute of interest, and the concentration of given drops of the organic phase is derived from pulse heights of the detected light. This technique permits on-line monitoring and control of liquid-liquid dispersion processes. 17 figures.

  15. Laser capillary spectrophotometric acquisition of bivariate drop size and concentration data for liquid-liquid dispersion

    DOEpatents

    Tavlarides, Lawrence L.; Bae, Jae-Heum

    1991-01-01

    A laser capillary spectrophotometric technique measures real time or near real time bivariate drop size and concentration distribution for a reactive liquid-liquid dispersion system. The dispersion is drawn into a precision-bore glass capillary and an appropriate light source is used to distinguish the aqueous phase from slugs of the organic phase at two points along the capillary whose separation is precisely known. The suction velocity is measured, as is the length of each slug from which the drop free diameter is calculated. For each drop, the absorptivity at a given wavelength is related to the molar concentration of a solute of interest, and the concentration of given drops of the organic phase is derived from pulse heights of the detected light. This technique permits on-line monitoring and control of liquid-liquid dispersion processes.

  16. Detecting flood event trends assigned to changes in urbanisation levels using a bivariate copula model

    NASA Astrophysics Data System (ADS)

    Requena, Ana; Prosdocimi, Ilaria; Kjeldsen, Thomas R.; Mediero, Luis

    2014-05-01

    Flood frequency analyses based on stationary assumptions are usually employed for estimating design floods. However, more complex non-stationary approaches are being incorporated with the aim of improving such estimates. In this study, the effect of changing urbanisation on maximum flood peak (Q) and volume (V) series is analysed. The potential changes in an urbanised catchment and in a nearby hydrologically similar rural catchment in northwest England are investigated. The urbanised catchment is characterised by a noticeable increase of the urbanisation level in time, while the rural catchment has not been altered by anthropogenic actions. Winter, summer and annual maximum flood events are studied. With the aim of analysing changes in time, two non-superimposed time-windows are defined covering the periods 1976-1992 and 1993-2008, respectively. A preliminary analysis of temporal trends in Q, V and Kendall's tau was done visually and then formally tested by a resampling procedure. Differences were found among winter, summer and annual maximum flood events. As annual maximum flood events are commonly used for design purposes, the corresponding bivariate distribution (margins and copula) was obtained for the different time-windows. Trends regarding both time-windows were analysed by comparing bivariate return period curves in the Q-V space. Different behaviours were found depending on the catchment. As a result, the application of the proposed methodology provides useful information for describing changes in flood events, regarding different flood variables and their relationship. In addition, the methodology can inform practitioners on the potential changes connected with urbanisation for appropriate design flood estimation.

  17. Pleiotropic locus for emotion recognition and amygdala volume identified using univariate and bivariate linkage

    PubMed Central

    Knowles, Emma E. M.; McKay, D. Reese; Kent, Jack W.; Sprooten, Emma; Carless, Melanie A.; Curran, Joanne E.; de Almeida, Marcio A. A.; Dyer, Thomas D.; Göring, Harald H. H.; Olvera, Rene; Duggirala, Ravi; Fox, Peter; Almasy, Laura; Blangero, John; Glahn, David. C.

    2014-01-01

    The role of the amygdala in emotion recognition is well established and separately each trait has been shown to be highly heritable, but the potential role of common genetic influences on both traits has not been explored. Here we present an investigation of the pleiotropic influences of amygdala and emotion recognition in a sample of randomly selected, extended pedigrees (N = 858). Using a combination of univariate and bivariate linkage we found a pleiotropic region for amygdala and emotion recognition on 4q26 (LOD = 4.34). Association analysis conducted in the region underlying the bivariate linkage peak revealed a variant meeting the corrected significance level (pBonferroni = 5.01×10−05) within an intron of PDE5A (rs2622497, Χ2 =16.67, p = 4.4×10−05) as being jointly influential on both traits. PDE5A has been implicated previously in recognition-memory deficits and is expressed in subcortical structures that are thought to underlie memory ability including the amygdala. The present paper extends our understanding of the shared etiology between amygdala and emotion recognition by showing that the overlap between the two traits is due, at least in part, to common genetic influences. Moreover, the present paper identifies a pleiotropic locus for the two traits and an associated variant, which localizes the genetic signal even more precisely. These results, when taken in the context of previous research, highlight the potential utility of PDE5-inhibitors for ameliorating emotion-recognition deficits in populations including, but not exclusively, those individuals suffering from mental or neurodegenerative illness. PMID:25322361

  18. Isokinetic Strength and Endurance During 30-day 6 deg Head-Down Bed Rest with Isotonic and Isokinetic Exercise Training

    NASA Technical Reports Server (NTRS)

    Greenleaf, J. E.; Bernauer, E. M.; Ertl, A. C.; Bond, M.; Bulbulian, R.

    1994-01-01

    The purpose of our study was to determine if an intensive, intermittent, isokinetic, lower extremity exercise training program would attenuate or eliminate the decrease of muscular strength and endurance during prolonged bed rest (BR). The 19 male subjects (36 +/- 1 yr, 178 +/- 2 cm, 76.5 +/- 1.7 kg) were allocated into a no exercise (NOE) training group (N = 5), an isotonic (lower extremity cycle ergometer) exercise (ITE) training group (N = 7), and an isokinetic (isokinetic knee flexion-extension) exercise (IKE) training group (N = 7). Peak knee (flexion and extension) and shoulder (abduction-adduction) functions were measured weekly in all groups with one 5-repetition set. After BR, average knee extension total work decreased by 16% with NOE, increased by 27% with IKE, and was unchanged with ITE. Average knee flexion total work and peak torque (strength) responses were unchanged in all groups. Force production increased by 20% with IKE and was unchanged with NOE and ITE. Shoulder total work was unchanged in all groups, while gross average peak torque increased by 27% with ITE and by 22% with IKE, and was unchanged with NOE. Thus, while ITE training can maintain some isokinetic functions during BR, maximal intermittent IKE training can increase other functions above pre-BR control levels.

  19. Handgrip and general muscular strength and endurance during prolonged bedrest with isometric and isotonic leg exercise training

    NASA Technical Reports Server (NTRS)

    Greenleaf, J. E.; Starr, J. C.; Van Beaumont, W.; Convertino, V. A.

    1983-01-01

    Measurements of maximal grip strength and endurance at 40 percent max strength were obtained for 7 men 19-21 years of age, 1-2 days before and on the first recovery day during three 2-week bedrest (BR) periods, each separated by a 3-week ambulatory recovery period. The subjects performed isometric exercise (IME) for 1 hr/day, isotonic exercise (ITE) for 1 hr/day, and no exercise (NOE) in the three BR periods. It was found that the mean maximal grip strength was unchanged after all three BR periods. Mean grip endurance was found to be unchanged after IME and ITE training, but was significantly reduced after NOE. These results indicate that IME and ITE training during BR do not increase or decrease maximal grip strength, although they prevent loss of grip endurance, while the maximal strength of all other major muscle groups decreases in proportion to the length of BR to 70 days. The maximal strength reduction of the large muscle groups was found to be about twice that of the small muscle groups during BR. In addition, it is shown that changes in maximal strength after spaceflight, BR, or water immersion deconditioning cannot be predicted from changes in submaximal or maximal oxygen uptake values.

  20. Modeling, design and validation of a novel microfluidic sensor for in-vitro isotonic measurement of microvessel contraction/dilation.

    PubMed

    Izzo, Ivano; Dario, Paolo

    2007-02-01

    The paper presents an innovative fluidic microdevice for in-vitro isotonic measurements of microvessel contraction/dilation. The proposed microdevice is based on the well-known fluid dynamic principle that correlates a geometrical change of a solid body immersed in a constant flow with the differential pressure change induced in the flow itself. Indeed, a biochemically induced change of microvessel diameter leads to a change of its hydraulic resistance, which can be measured in vitro by a differential pressure sensor. The novel microfluidic sensor has been modeled, designed and fabricated in order to properly implement this working principle. The fluidic scheme of the sensor consists of two pressure-sensorized chambers connected by a calibrated channel in which the microvessel is placed. Experimental tests have been performed using three passive microvessel simulators with different inner diameters (300, 600 and 800 microm) in order to validate the expected sensor behaviour, characterized by two sensitivities: the pressure-drop sensitivity S(PL) and the overall sensitivity S(2-3). Their measured values (S(PL,m) = 2.132 mV/V/Pa, S(2-3,m) = 0.787 mV/V/(mm3/s)/mm) confirm the validity of the proposed working principle and support a future application of the sensor in a clinical context.

  1. Isokinetic strength and endurance during 30-day 6 degrees head-down bed rest with isotonic and isokinetic exercise training

    NASA Technical Reports Server (NTRS)

    Greenleaf, J. E.; Bernauer, E. M.; Ertl, A. C.; Bulbulian, R.; Bond, M.

    1994-01-01

    The purpose of our study was to determine if an intensive, intermittent, isokinetic, lower extremity exercise training program would attenuate or eliminate the decrease of muscular strength and endurance during prolonged bed rest (BR). The 19 male subjects (36 +/- 1 yr, 178 +/- 2 cm, 76.5 +/- 1.7 kg) were allocated into a no exercise (NOE) training group (N = 5), an isotonic (lower extremity cycle ergometer) exercise (ITE) training group (N = 7), and an isokinetic (isokinetic knee flexion-extension) exercise (IKE) training group (N = 7). Peak knee (flexion and extension) and shoulder (abduction-adduction) functions were measured weekly in all groups with one 5-repetition set. After BR, average knee extension total work decreased by 16% with NOE, increased by 27% with IKE, and was unchanged with ITE. Average knee flexion total work and peak torque (strength) responses were unchanged in all groups. Force production increased by 20% with IKE and was unchanged with NOE and ITE. Shoulder total work was unchanged in all groups, while gross average peak torque increased by 27% with ITE and by 22% with IKE, and was unchanged with NOE. Thus, while ITE training can maintain some isokinetic functions during BR, maximal intermittent IKE training can increase other functions above pre-BR control levels.

  2. THE RELATIONSHIP BETWEEN ISOTONIC PLANTAR FLEXOR ENDURANCE, NAVICULAR DROP, AND EXERCISE-RELATED LEG PAIN IN A COHORT OF COLLEGIATE CROSS-COUNTRY RUNNERS

    PubMed Central

    Reinking, Mark F.; Rauh, Mitchell J.

    2012-01-01

    Purpose: The purpose of this study was to examine the relationships between isotonic ankle plantar flexor endurance (PFE), foot pronation as measured by navicular drop, and exercise-related leg pain (ERLP). Background: Exercise-related leg pain is a common occurrence in competitive and recreational runners. The identification of factors contributing to the development of ERLP may help guide methods for the prevention and management of overuse injuries. Methods: Seventy-seven (44 males, 33 females) competitive runners from five collegiate cross-country (XC) teams consented to participate in the study. Isotonic ankle PFE and foot pronation were measured using the standing heel-rise and navicular drop (ND) tests, respectively. Demographic information, anthropometric measurements, and ERLP history were also recorded. Subjects were then prospectively tracked for occurrence of ERLP during the 2009 intercollegiate cross-country season. Multivariate logistic regression analysis was used to examine the relationships between isotonic ankle joint PFE and ND and the occurrence of ERLP. Results: While no significant differences were identified for isotonic ankle PFE between groups of collegiate XC runners with and without ERLP, runners with a ND >10 mm were almost 7 times (OR=6.6, 95% CI=1.2–38.0) more likely to incur medial ERLP than runners with ND <10 mm. Runners with a history of ERLP in the month previous to the start of the XC season were 12 times (OR=12.3, 95% CI=3.1–48.9) more likely to develop an in-season occurrence of ERLP. Conclusion: While PFE did not appear to be a risk factor in the development of ERLP in this group of collegiate XC runners, those with a ND greater than 10 mm may be at greater odds of incurring medial ERLP. Level of Evidence: 2b. PMID:22666641

  3. Calculation of the Residual Blood Volume after Acute, Non-Ongoing Hemorrhage Using Serial Hematocrit Measurements and the Volume of Isotonic Fluid Infused: Theoretical Hypothesis Generating Study.

    PubMed

    Oh, Won Sup; Chon, Sung-Bin

    2016-05-01

    Fluid resuscitation, hemostasis, and transfusion are essential in the care of hemorrhagic shock. Although estimation of the residual blood volume is crucial, the standard measuring methods are impractical or unsafe. Vital signs and central venous or pulmonary artery pressures are inaccurate. We hypothesized that the residual blood volume for acute, non-ongoing hemorrhage was calculable using serial hematocrit measurements and the volume of isotonic solution infused. Blood volume is the sum of the volumes of red blood cells and plasma. For acute, non-ongoing hemorrhage, red blood cell volume would not change. A certain portion of the isotonic fluid would increase plasma volume. Mathematically, we suggest that the residual blood volume after acute, non-ongoing hemorrhage might be calculated as 0.25N/[(Hct1/Hct2)-1], where Hct1 and Hct2 are the initial and subsequent hematocrits, respectively, and N is the volume of isotonic solution infused. In vivo validation and modification is needed before clinical application of this model.
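    The closed-form estimate proposed in the abstract is easy to compute directly; the sketch below implements it as stated, with an illustrative function name and example values.

```python
# The abstract's closed-form estimate of residual blood volume after acute,
# non-ongoing hemorrhage: BV = 0.25 * N / (Hct1/Hct2 - 1), where Hct1 and Hct2
# are the initial and subsequent hematocrits and N is the infused isotonic volume.
def residual_blood_volume(hct1, hct2, n_infused_ml):
    """Estimate residual blood volume (mL); assumes Hct1 > Hct2 > 0 after dilution."""
    if not (0 < hct2 < hct1):
        raise ValueError("expected Hct1 > Hct2 > 0 for a diluted, non-ongoing bleed")
    return 0.25 * n_infused_ml / (hct1 / hct2 - 1.0)

# Example (illustrative values): hematocrit falls from 0.40 to 0.35 after 1000 mL of isotonic fluid.
print(f"{residual_blood_volume(0.40, 0.35, 1000):.0f} mL")   # ~1750 mL
```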

  4. Calculation of the Residual Blood Volume after Acute, Non-Ongoing Hemorrhage Using Serial Hematocrit Measurements and the Volume of Isotonic Fluid Infused: Theoretical Hypothesis Generating Study

    PubMed Central

    2016-01-01

    Fluid resuscitation, hemostasis, and transfusion are essential in the care of hemorrhagic shock. Although estimation of the residual blood volume is crucial, the standard measuring methods are impractical or unsafe. Vital signs and central venous or pulmonary artery pressures are inaccurate. We hypothesized that the residual blood volume for acute, non-ongoing hemorrhage was calculable using serial hematocrit measurements and the volume of isotonic solution infused. Blood volume is the sum of the volumes of red blood cells and plasma. For acute, non-ongoing hemorrhage, red blood cell volume would not change. A certain portion of the isotonic fluid would increase plasma volume. Mathematically, we suggest that the residual blood volume after acute, non-ongoing hemorrhage might be calculated as 0.25N/[(Hct1/Hct2)-1], where Hct1 and Hct2 are the initial and subsequent hematocrits, respectively, and N is the volume of isotonic solution infused. In vivo validation and modification is needed before clinical application of this model. PMID:27134507

  5. Antioxidant activity, total phenolics content, anthocyanin, and color stability of isotonic model beverages colored with Andes berry (Rubus glaucus Benth) anthocyanin powder.

    PubMed

    Estupiñan, D C; Schwartz, S J; Garzón, G A

    2011-01-01

    The stability of anthocyanin (ACN) freeze-dried powders from Andes berry (Rubus glaucus Benth), as affected by storage, the addition of maltodextrin as a carrier agent, and illumination, was evaluated in isotonic model beverages. The ethanolic ACN extract was freeze dried with and without maltodextrin DE 20. Isotonic model beverages were colored with freeze-dried ACN powder (FDA), freeze-dried ACN powder with maltodextrin (MFDA), and red no. 40. Beverages were stored in the dark and under illumination. The half-life of the ACNs, changes in color, total phenolics content (TPC), and antioxidant activity were analyzed over 71 d. Addition of maltodextrin and the absence of light stabilized the color of the beverages and improved ACN and TPC stability during storage. The antioxidant activity of the beverages was higher when they were colored with MFDA and was highly correlated with ACN content. There was no correlation between antioxidant activity and TPC. It is concluded that the addition of maltodextrin DE 20 as a carrier agent during freeze-drying improves the color and the stability of the nutraceutical antioxidants present in Andes berry extract. This suggests a protective enclosing of ACNs within a maltodextrin matrix, with a resulting powder that could serve as a supplement or additive to naturally color isotonic beverages and to enhance their antioxidant capacity.

  6. Antioxidant Activity, Total Phenolics Content, Anthocyanin, and Color Stability of Isotonic Model Beverages Colored with Andes Berry (Rubus glaucus Benth) Anthocyanin Powder

    PubMed Central

    Estupiñan, D.C.; Schwartz, S.J.; Garzón, G.A.

    2013-01-01

    The stability of anthocyanin (ACN) freeze-dried powders from Andes berry (Rubus glaucus Benth), as affected by storage, the addition of maltodextrin as a carrier agent, and illumination, was evaluated in isotonic model beverages. The ethanolic ACN extract was freeze dried with and without maltodextrin DE 20. Isotonic model beverages were colored with freeze-dried ACN powder (FDA), freeze-dried ACN powder with maltodextrin (MFDA), and red no. 40. Beverages were stored in the dark and under illumination. The half-life of the ACNs, changes in color, total phenolics content (TPC), and antioxidant activity were analyzed over 71 d. Addition of maltodextrin and the absence of light stabilized the color of the beverages and improved ACN and TPC stability during storage. The antioxidant activity of the beverages was higher when they were colored with MFDA and was highly correlated with ACN content. There was no correlation between antioxidant activity and TPC. It is concluded that the addition of maltodextrin DE 20 as a carrier agent during freeze-drying improves the color and the stability of the nutraceutical antioxidants present in Andes berry extract. This suggests a protective enclosing of ACNs within a maltodextrin matrix, with a resulting powder that could serve as a supplement or additive to naturally color isotonic beverages and to enhance their antioxidant capacity. PMID:21535712

  7. Cosmic statistics of statistics

    NASA Astrophysics Data System (ADS)

    Szapudi, István; Colombi, Stéphane; Bernardeau, Francis

    1999-12-01

    The errors on statistics measured in finite galaxy catalogues are exhaustively investigated. The theory of errors on factorial moments by Szapudi & Colombi is applied to cumulants via a series expansion method. All results are subsequently extended to the weakly non-linear regime. Together with previous investigations this yields an analytic theory of the errors for moments and connected moments of counts in cells from highly non-linear to weakly non-linear scales. For non-linear functions of unbiased estimators, such as the cumulants, the phenomenon of cosmic bias is identified and computed. Since it is subdued by the cosmic errors in the range of applicability of the theory, correction for it is inconsequential. In addition, the method of Colombi, Szapudi & Szalay concerning sampling effects is generalized, adapting the theory for inhomogeneous galaxy catalogues. While previous work focused on the variance only, the present article calculates the cross-correlations between moments and connected moments as well for a statistically complete description. The final analytic formulae representing the full theory are explicit but somewhat complicated. Therefore we have made available a fortran program capable of calculating the described quantities numerically (for further details e-mail SC at colombi@iap.fr). An important special case is the evaluation of the errors on the two-point correlation function, for which this should be more accurate than any method put forward previously. This tool will be immensely useful in the future for assessing the precision of measurements from existing catalogues, as well as aiding the design of new galaxy surveys. To illustrate the applicability of the results and to explore the numerical aspects of the theory qualitatively and quantitatively, the errors and cross-correlations are predicted under a wide range of assumptions for the future Sloan Digital Sky Survey. The principal results concerning the cumulants ξ, Q3 and Q4 is that

  8. Leg muscle volume during 30-day 6-degree head-down bed rest with isotonic and isokinetic exercise training

    NASA Technical Reports Server (NTRS)

    Greenleaf, J. E.; Lee, P. L.; Ellis, S.; Selzer, R. H.; Ortendahl, D. A.

    1994-01-01

    Magnetic resonance imaging (MRI) was used to compare the effect of two modes of lower-extremity exercise training on the mass (volume) of posterior leg group (PLG) muscles (soleus, flexor hallucis longus, tibialis posterior, lateral and medial gastrocnemius, and flexor digitorum longus) in 19 men (ages 32-42 years) subjected to intense dynamic-isotonic (ITE, cycle ergometer, number of subjects (N) = 7), isokinetic (IKE, torque ergometer, N = 7), and no exercise (NOE, N = 5) training for 60 min/day during head-down bed rest (HDBR). Total volume of the PLG muscles decreased (p less than 0.05) similarly: ITE = 4.3 +/- SE 1.6%, IKE = 7.7 +/- 1.6%, and NOE = 6.3 +/- 0.8%; the combined volume (N = 19) loss was 6.1 +/- 0.9%. Ranges of volume changes were 2.6% to -9.0% (ITE), -2.1% to -14.9% (IKE), and -3.4% to -8.1% (NOE). Correlation coefficients (r) of muscle volume versus thickness measured with ultrasonography were: ITE r = 0.79 (p less than 0.05), IKE r = 0.27 (not significant (NS)), and NOE r = 0.63 (NS). Leg-muscle volume and thickness were highly correlated (r = 0.79) when plasma volume was maintained during HDBR with ITE. Thus, neither intensive lower extremity ITE nor IKE training influences the normal non-exercised posterior leg muscle atrophy during HDBR. The relationship of muscle volume and thickness may depend on the mode of exercise training associated with the maintenance of plasma volume.

  9. Alterations in Skeletal Muscle Function with Microgravity, and the Protective Effects of High Resistance Isometric and Isotonic Exercise

    NASA Technical Reports Server (NTRS)

    Fitts, R. H.; Hurst, J. E.; Norenberg, K. M.; Widrick, J. J.; Riley, D. A.; Bain, J. L. W.; Trappe, S. W.; Trappe, T. A.; Costill, D. L.

    1999-01-01

    Exposure to microgravity or models designed to mimic the unloaded condition, such as bed rest in humans and hindlimb unloading (HU) in rats, leads to skeletal muscle atrophy, a loss in peak force and power, and an increased susceptibility to fatigue. The posterior compartment muscles of the lower leg (calf muscle group) appear to be particularly susceptible. Following only 1 wk in space or HU, rat soleus muscle showed a 30 to 40% loss in wet weight. After 3 wk of HU, almost all of the atrophied soleus fibers showed a significant increase in maximal shortening velocity (V(sub 0)), while only 25 to 30% actually transitioned to fast fibers. The increased V(sub 0) was protective in that it reduced the decline in peak power associated with the reduced peak force. When the soleus is stimulated in situ following HU or zero-g, one observes an increased rate and extent of fatigue, and in the former the increased fatigue is associated with a more rapid depletion of muscle glycogen and lactate production. Our working hypothesis is that following HU or spaceflight in rats and bed rest or spaceflight in humans, limb skeletal muscles during contractile activity depend more on carbohydrates and less on fatty acids for their substrate supply. Baldwin et al. found 9 days of spaceflight to reduce by 37% the ability of both the high and low oxidative regions of the vastus muscle to oxidize long-chain fatty acids. This decline was not associated with any change in the enzymes of the tricarboxylic acid cycle or oxidation pathway. The purpose of the current research was to establish the extent of functional change in the slow type I and fast type II fibers of the human calf muscle following 17 days of spaceflight, and to determine the cellular mechanisms of the observed changes. A second goal was to study the effectiveness of high resistance isotonic and isometric exercise in preventing the deleterious functional changes associated with unloading.

  10. Knee-Joint Proprioception During 30-Day 6 deg Head-Down Bed Rest with Isotonic and Isokinetic Exercise Training

    NASA Technical Reports Server (NTRS)

    Bernauer, E. M.; Walby, W. F.; Ertl, A. C.; Dempster, P. T.; Bond, M.; Greenleaf, J. E.

    1994-01-01

    To determine if daily isotonic exercise or isokinetic exercise training, coupled with daily leg proprioceptive training, would influence leg proprioceptive tracking responses during bed rest (BR), 19 men (36 +/- SD 4 years, 178 +/- 7 cm, 76.8 +/- 7.8 kg) were allocated into a no-exercise (NOE) training control group (n = 5), and isotonic exercise (ITE, n = 7) and isokinetic exercise (IKE, n = 7) training groups. Exercise training was conducted during BR for two 30-min periods/d, 5 d/week. Only the IKE group performed proprioceptive training, using a new isokinetic procedure with each lower extremity for 2.5 min before and after the daily exercise training sessions; proprioceptive testing occurred weekly for all groups. There were no significant differences in proprioceptive tracking scores, expressed as a percentage of the perfect score of 100, in the pre-BR ambulatory control period between the three groups. Knee extension and flexion tracking responses were unchanged with NOE during BR, but were significantly greater (*p less than 0.05) at the end of BR in both exercise groups when compared with NOE responses (extension: NOE 80.7 +/- 0.7%, ITE 82.9 +/- 0.6%, IKE 86.5* +/- 0.7%; flexion: NOE 77.6 +/- 1.5%, ITE 80.0 +/- 0.8% (NS), IKE 83.6* +/- 0.8%). Although proprioceptive tracking was unchanged during BR with NOE, both isotonic exercise training (without additional proprioceptive training) and especially isokinetic exercise training combined with daily proprioceptive training significantly improved knee proprioceptive tracking responses after 30 d of BR.

  11. Analysis of meteorological droughts for the Saskatchewan River Basin using univariate and bivariate approaches

    NASA Astrophysics Data System (ADS)

    Masud, M. B.; Khaliq, M. N.; Wheater, H. S.

    2015-03-01

    This study is focused on the Saskatchewan River Basin (SRB) that spans southern parts of Alberta, Saskatchewan and Manitoba, the three Prairie Provinces of Canada, where most of the country's agricultural activities are concentrated. The SRB is confronted with immense water-related challenges and is now one of the ten GEWEX (Global Energy and Water Exchanges) Regional Hydroclimate Projects in the world. In the past, various multi-year droughts have been observed in this part of Canada that impacted agriculture, energy and socio-economic sectors. Therefore, proper understanding of the spatial and temporal characteristics of historical droughts is important for many water resources planning and management related activities across the basin. In the study, observed gridded data of daily precipitation and temperature and conventional univariate and copula-based bivariate frequency analyses are used to characterize drought events in terms of drought severity and duration on the basis of two drought indices, the Standardized Precipitation Index (SPI) and the Standardized Precipitation Evapotranspiration Index (SPEI). Within the framework of univariate and bivariate analyses, drought risk indicators are developed and mapped across the SRB to delineate the most vulnerable parts of the basin. Based on the results obtained, southern parts of the SRB (i.e., western part of the South Saskatchewan River, Seven Persons Creek and Bigstick Lake watersheds) are associated with a higher drought risk, while moderate risk is noted for the North Saskatchewan River (except its eastern parts), Red Deer River, Oldman River, Bow River, Sounding Creek, Carrot River and Battle River watersheds. Lower drought risk is found for the areas surrounding the Saskatchewan-Manitoba border (particularly, the Saskatchewan River watershed). It is also found that the areas characterized with higher drought severity are also associated with higher drought duration. A comparison of SPI- and SPEI

  12. Investigation of asymmetrical spatial dependence of the regional climate model precipitation using empirical bivariate copulas

    NASA Astrophysics Data System (ADS)

    Suroso, S.; Bardossy, A.

    2015-12-01

    Spatial precipitation models capable of deriving climate change scenarios at a finer spatial resolution play a crucial role in many hydrological applications. Regional climate models (RCMs) are promising tools that can provide projected precipitation data with high spatial and temporal resolution. This study investigates the asymmetrical spatial dependence of precipitation obtained from historical and future RCM simulations on the basis of empirical bivariate copulas. The study regions are located in the southern part of Germany, namely the states of Bavaria, Baden Württemberg, and Rhine Pfalz, using 890 observation stations. RCM grid points are selected as the grid point nearest to each observation site. Empirical bivariate copulas are constructed by adopting the concept of regionalized variables in a spatial random process, assuming that for every selected time interval precipitation over the region of interest is a realization of a spatial random process. To make this assumption reasonable, the investigation regions are divided into several sub-regions selected as homogeneous areas with little topographic variation. In order to study the behavior of the precipitation fields at different time scales, the data are aggregated to higher time scales, for instance 5, 10 and 15 days, monthly, and quarterly, for the different seasons. The asymmetry of the dependence is calculated as the deviation between the joint probability of exceeding a quantile 1-u and that of not exceeding the quantile u, for different values of u (0.1, 0.2, 0.3, 0.4). Positive asymmetry indicates that the high values have a stronger dependence than the low values, and vice versa. Gaussian-simulation-based testing is then applied to quantify the degree of uncertainty. Empirical evidence shows that both observations and RCM simulations exhibit an interesting systematic pattern relating to the domination of positive non-symmetrical dependence in short
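    A hedged sketch of the asymmetry measure described above, applied to rank-transformed (empirical-copula-scale) series: the difference between the probability that both series exceed the quantile 1-u and the probability that both fall at or below the quantile u. The toy data and variable names are placeholders, not the RCM or station series.

```python
# Sketch: empirical-copula asymmetry between two series at quantile level u:
# P(U > 1-u, V > 1-u) - P(U <= u, V <= u) on rank-transformed data.
import numpy as np

def rank_transform(x):
    """Map a sample to (0, 1) by its empirical ranks."""
    x = np.asarray(x, dtype=float)
    return (np.argsort(np.argsort(x)) + 1) / (len(x) + 1)

def copula_asymmetry(x, y, u):
    uu, vv = rank_transform(x), rank_transform(y)
    upper = np.mean((uu > 1.0 - u) & (vv > 1.0 - u))   # both series in the upper tail
    lower = np.mean((uu <= u) & (vv <= u))             # both series in the lower tail
    return upper - lower

rng = np.random.default_rng(2)
a = rng.gamma(2.0, size=2000)            # stand-ins for precipitation at two sites
b = 0.7 * a + rng.gamma(2.0, size=2000)
for u in (0.1, 0.2, 0.3, 0.4):           # the quantile levels used in the study
    print(u, round(copula_asymmetry(a, b, u), 3))
```

    A positive value indicates that the high values are more strongly dependent than the low values, matching the sign convention stated in the abstract.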

  13. Dynamics of intracranial electroencephalographic recordings from epilepsy patients using univariate and bivariate recurrence networks

    NASA Astrophysics Data System (ADS)

    Subramaniyam, Narayan Puthanmadam; Hyttinen, Jari

    2015-02-01

    Recently Andrzejak et al. combined the randomness and nonlinear independence test with iterative amplitude adjusted Fourier transform (iAAFT) surrogates to distinguish between the dynamics of seizure-free intracranial electroencephalographic (EEG) signals recorded from epileptogenic (focal) and nonepileptogenic (nonfocal) brain areas of epileptic patients. However, stationarity is part of the null hypothesis for iAAFT surrogates and thus nonstationarity can violate the null hypothesis. In this work we first propose the application of the randomness and nonlinear independence test based on recurrence network measures to distinguish between the dynamics of focal and nonfocal EEG signals. Furthermore, we combine these tests with both iAAFT and truncated Fourier transform (TFT) surrogate methods, the latter of which also preserves the nonstationarity of the original data in the surrogates along with its linear structure. Our results indicate that focal EEG signals exhibit an increased degree of structural complexity and interdependency compared to nonfocal EEG signals. In general, we find higher rejections for randomness and nonlinear independence tests for focal EEG signals compared to nonfocal EEG signals. In particular, the univariate recurrence network measures, the average clustering coefficient C and assortativity R, and the bivariate recurrence network measure, the average cross-clustering coefficient Ccross, can successfully distinguish between the focal and nonfocal EEG signals, even when the analysis is restricted to nonstationary signals, irrespective of the type of surrogates used. On the other hand, we find that the univariate recurrence network measures, the average path length L and the average betweenness centrality BC, fail to distinguish between the focal and nonfocal EEG signals when iAAFT surrogates are used. However, these two measures can distinguish between focal and nonfocal EEG signals when TFT surrogates are used for nonstationary signals. We also

  14. Application of continuous normal-lognormal bivariate density functions in a sensitivity analysis of municipal solid waste landfill.

    PubMed

    Petrovic, Igor; Hip, Ivan; Fredlund, Murray D

    2016-09-01

    The variability of untreated municipal solid waste (MSW) shear strength parameters, namely cohesion and shear friction angle, with respect to waste stability problems, is of primary concern due to the strong heterogeneity of MSW. A large number of MSW shear strength parameters (friction angle and cohesion) were collected from the published literature and analyzed. The basic statistical analysis showed that the central tendency of both shear strength parameters fits reasonably well within the ranges of recommended values proposed by different authors. In addition, it was established that the correlation between shear friction angle and cohesion is not strong but still significant. Through use of a distribution-fitting method it was found that the shear friction angle can be described by a normal probability density function while cohesion follows a log-normal density function. The continuous normal-lognormal bivariate density function was therefore selected as an adequate model to ascertain rational boundary values ("confidence interval") for MSW shear strength parameters. It was concluded that a curve with a 70% confidence level generates a "confidence interval" within reasonable limits. With respect to the decomposition stage of the waste material, three different ranges of appropriate shear strength parameters were indicated. The defined parameters were then used as input parameters for an Alternative Point Estimated Method (APEM) stability analysis of a real case scenario, the Jakusevec landfill. The Jakusevec landfill is the disposal site of the capital of Croatia, Zagreb. The analysis shows that in the case of a dry landfill the most significant factor influencing the safety factor was the shear friction angle of old, decomposed waste material, while in the case of a landfill with a significant leachate level the most significant factor influencing the safety factor was the cohesion of old, decomposed waste material. The
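    The distribution-fitting step described above is straightforward to sketch with scipy.stats: a normal fit for the friction angle and a log-normal fit for cohesion. The sample values below are placeholders for the literature-collected MSW parameters, and only the marginal fits (not the bivariate density or the APEM analysis) are shown.

```python
# Sketch of the marginal-fitting step: normal pdf for friction angle,
# log-normal pdf for cohesion, plus a central 70% interval for each marginal.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
friction_deg = rng.normal(30.0, 7.0, size=120)          # illustrative friction angles (deg)
cohesion_kpa = rng.lognormal(np.log(15.0), 0.6, 120)    # illustrative cohesion values (kPa)

mu, sigma = stats.norm.fit(friction_deg)                        # normal fit
shape, loc, scale = stats.lognorm.fit(cohesion_kpa, floc=0.0)   # log-normal fit (loc fixed at 0)

print(f"friction angle ~ N({mu:.1f}, {sigma:.1f}^2)")
print(f"cohesion ~ LogNormal(mu={np.log(scale):.2f}, sigma={shape:.2f})")
print(stats.norm.interval(0.70, loc=mu, scale=sigma))           # central 70% region, friction
print(stats.lognorm.interval(0.70, shape, loc=0.0, scale=scale))  # central 70% region, cohesion
```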

  15. Application of continuous normal-lognormal bivariate density functions in a sensitivity analysis of municipal solid waste landfill.

    PubMed

    Petrovic, Igor; Hip, Ivan; Fredlund, Murray D

    2016-09-01

    The variability of untreated municipal solid waste (MSW) shear strength parameters, namely cohesion and shear friction angle, with respect to waste stability problems, is of primary concern due to the strong heterogeneity of MSW. A large number of MSW shear strength parameters (friction angle and cohesion) were collected from the published literature and analyzed. The basic statistical analysis showed that the central tendency of both shear strength parameters fits reasonably well within the ranges of recommended values proposed by different authors. In addition, it was established that the correlation between shear friction angle and cohesion is not strong but still significant. Through use of a distribution-fitting method it was found that the shear friction angle can be described by a normal probability density function while cohesion follows a log-normal density function. The continuous normal-lognormal bivariate density function was therefore selected as an adequate model to ascertain rational boundary values ("confidence interval") for MSW shear strength parameters. It was concluded that a curve with a 70% confidence level generates a "confidence interval" within reasonable limits. With respect to the decomposition stage of the waste material, three different ranges of appropriate shear strength parameters were indicated. The defined parameters were then used as input parameters for an Alternative Point Estimated Method (APEM) stability analysis of a real case scenario, the Jakusevec landfill. The Jakusevec landfill is the disposal site of the capital of Croatia, Zagreb. The analysis shows that in the case of a dry landfill the most significant factor influencing the safety factor was the shear friction angle of old, decomposed waste material, while in the case of a landfill with a significant leachate level the most significant factor influencing the safety factor was the cohesion of old, decomposed waste material. The

  16. Improved deadzone modeling for bivariate wavelet shrinkage-based image denoising

    NASA Astrophysics Data System (ADS)

    DelMarco, Stephen

    2016-05-01

    Modern image processing performed on-board low Size, Weight, and Power (SWaP) platforms must provide high performance while simultaneously reducing memory footprint, power consumption, and computational complexity. Image preprocessing, along with downstream image exploitation algorithms such as object detection and recognition, and georegistration, places a heavy burden on power and processing resources. Image preprocessing often includes image denoising to improve data quality for downstream exploitation algorithms. High-performance image denoising is typically performed in the wavelet domain, where noise generally spreads and the wavelet transform compactly captures high information-bearing image characteristics. In this paper, we improve the modeling fidelity of a previously developed, computationally efficient wavelet-based denoising algorithm. The modeling improvements enhance denoising performance without significantly increasing computational cost, thus making the approach suitable for low-SWaP platforms. Specifically, this paper presents modeling improvements to the Sendur-Selesnick model (SSM), which implements a bivariate wavelet shrinkage denoising algorithm that exploits interscale dependency between wavelet coefficients. We formulate optimization problems for the parameters controlling deadzone size, which leads to improved denoising performance. Two formulations are provided: one with a simple, closed-form solution which we use for numerical result generation, and a second integral equation formulation involving elliptic integrals. We generate image denoising performance results over different image sets drawn from public domain imagery, and investigate the effect of wavelet filter tap length on denoising performance. We demonstrate denoising performance improvement when using the enhanced modeling over the performance obtained with the baseline SSM model.
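    For context, the baseline Sendur-Selesnick bivariate shrinkage rule that the paper refines can be sketched in a few lines: each child wavelet coefficient is shrunk jointly with its parent, with a data-dependent deadzone of size sqrt(3)*sigma_n^2/sigma. This is the standard published rule, not the improved deadzone model introduced in the paper; the toy coefficients are illustrative.

```python
# Sketch: baseline bivariate shrinkage (Sendur-Selesnick) on child/parent
# wavelet coefficient pairs w1 (child) and w2 (parent).
import numpy as np

def bivariate_shrink(w1, w2, sigma_n, sigma_signal, eps=1e-12):
    """Return shrunk child coefficients; w1 and w2 must have the same shape."""
    r = np.sqrt(w1**2 + w2**2)                             # joint child/parent magnitude
    threshold = np.sqrt(3.0) * sigma_n**2 / max(sigma_signal, eps)
    return w1 * np.maximum(r - threshold, 0.0) / np.maximum(r, eps)

# Toy check on synthetic sparse coefficients (not real wavelet subbands).
rng = np.random.default_rng(4)
clean = rng.laplace(scale=2.0, size=1000)
child = clean + rng.normal(scale=0.5, size=1000)
parent = 0.8 * clean + rng.normal(scale=0.5, size=1000)
sigma_signal = np.sqrt(max(np.var(child) - 0.25, 1e-12))   # crude signal-variance estimate
denoised = bivariate_shrink(child, parent, sigma_n=0.5, sigma_signal=sigma_signal)
print("MSE noisy:", np.mean((child - clean) ** 2), "MSE shrunk:", np.mean((denoised - clean) ** 2))
```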

  17. Semiparametric bivariate zero-inflated Poisson models with application to studies of abundance for multiple species

    USGS Publications Warehouse

    Arab, Ali; Holan, Scott H.; Wikle, Christopher K.; Wildhaber, Mark L.

    2012-01-01

    Ecological studies involving counts of abundance, presence–absence or occupancy rates often produce data having a substantial proportion of zeros. Furthermore, these types of processes are typically multivariate and only adequately described by complex nonlinear relationships involving externally measured covariates. Ignoring these aspects of the data and implementing standard approaches can lead to models that fail to provide adequate scientific understanding of the underlying ecological processes, possibly resulting in a loss of inferential power. One method of dealing with data having excess zeros is to consider the class of univariate zero-inflated generalized linear models. However, this class of models fails to address the multivariate and nonlinear aspects associated with the data usually encountered in practice. Therefore, we propose a semiparametric bivariate zero-inflated Poisson model that takes into account both of these data attributes. The general modeling framework is hierarchical Bayes and is suitable for a broad range of applications. We demonstrate the effectiveness of our model through a motivating example on modeling catch per unit area for multiple species using data from the Missouri River Benthic Fishes Study, implemented by the United States Geological Survey.

  18. Perceived social support and academic achievement: cross-lagged panel and bivariate growth curve analyses.

    PubMed

    Mackinnon, Sean P

    2012-04-01

    As students transition to post-secondary education, they experience considerable stress and declines in academic performance. Perceived social support is thought to improve academic achievement by reducing stress. Longitudinal designs with three or more waves are needed in this area because they permit stronger causal inferences and help disentangle the direction of relationships. This study uses a cross-lagged panel and a bivariate growth curve analysis with a three-wave longitudinal design. Participants include 10,445 students (56% female; 12.6% born outside of Canada) transitioning to post-secondary education from ages 15-19. Self-report measures of academic achievement and a generalized measure of perceived social support were used. An increase in average relative standing in academic achievement predicted an increase in average relative standing on perceived social support 2 years later, but the reverse was not true. High levels of perceived social support at age 15 did not protect against declines in academic achievement over time. In sum, perceived social support appears to have no bearing on adolescents' future academic performance, despite commonly held assumptions of its importance.

  19. Bivariate Frequency Analysis with Nonstationary Gumbel/GEV Marginal Distributions for Rainfall Event

    NASA Astrophysics Data System (ADS)

    Joo, Kyungwon; Kim, Sunghun; Kim, Hanbeen; Ahn, Hyunjun; Heo, Jun-Haeng

    2016-04-01

    Multivariate frequency analysis of hydrological data has been developing rapidly in recent years. In particular, the copula model has been used as an effective method because it places no restriction on the choice of marginal distributions. Time-series rainfall data can be separated into rainfall events by an inter-event time definition, and each rainfall event has a rainfall depth and duration. In addition, changes in rainfall depth have been studied recently due to climate change. Nonstationary (time-varying) Gumbel and Generalized Extreme Value (GEV) distributions have been developed and their performance has been investigated in many studies. In the current study, bivariate frequency analysis is performed for rainfall depth and duration using an Archimedean copula on stationary and nonstationary hourly rainfall data to consider the effect of climate change. The parameters of the copula model are estimated by the inference function for margins (IFM) method, and stationary/nonstationary Gumbel and GEV distributions are used for the marginal distributions. As a result, level curves of the copula model are obtained and a goodness-of-fit test is performed to choose the appropriate marginal distribution among the applied stationary and nonstationary Gumbel and GEV distributions.
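    One building block of the analysis, sketched under simplifying assumptions: a nonstationary GEV fit in which the location parameter varies linearly with time, estimated by maximum likelihood with scipy (whose genextreme shape parameter c equals minus the usual GEV shape). The synthetic annual-maximum depths and the linear trend form are illustrative; the copula/IFM step is not shown.

```python
# Sketch: maximum-likelihood fit of a nonstationary GEV with mu(t) = mu0 + mu1 * t.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(7)
t = np.arange(50, dtype=float)                      # years 0..49 (synthetic record)
depth = stats.genextreme.rvs(c=-0.1, loc=60 + 0.4 * t, scale=15,
                             size=t.size, random_state=rng)

def neg_log_lik(params):
    mu0, mu1, log_scale, c = params
    return -np.sum(stats.genextreme.logpdf(depth, c=c,
                                           loc=mu0 + mu1 * t,
                                           scale=np.exp(log_scale)))

res = optimize.minimize(neg_log_lik,
                        x0=[np.mean(depth), 0.0, np.log(np.std(depth)), 0.0],
                        method="Nelder-Mead")
mu0, mu1, log_scale, c = res.x
print(f"mu(t) = {mu0:.1f} + {mu1:.2f} t, scale = {np.exp(log_scale):.1f}, shape c = {c:.2f}")
```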

  20. Modeling both of the number of pausibacillary and multibacillary leprosy patients by using bivariate poisson regression

    NASA Astrophysics Data System (ADS)

    Winahju, W. S.; Mukarromah, A.; Putri, S.

    2015-03-01

    Leprosy is a chronic infectious disease caused by the leprosy bacterium (Mycobacterium leprae). Leprosy has become an important issue in Indonesia because its morbidity is quite high. Based on WHO data from 2014, in 2012 Indonesia had the highest number of new leprosy patients after India and Brazil, with a contribution of 18,994 people (8.7% of the world). This makes Indonesia the country with the highest leprosy morbidity among ASEAN countries. The province that contributes most to the number of leprosy patients in Indonesia is East Java. There are two kinds of leprosy: pausibacillary and multibacillary. The morbidity of multibacillary leprosy is higher than that of pausibacillary leprosy. This paper discusses modeling both the number of multibacillary and pausibacillary leprosy patients as response variables. These responses are count variables, so the modeling is conducted using the bivariate Poisson regression method. The experimental units are located in East Java, and the predictors involved are environment, demography, and poverty. The model uses data from 2012, and the results indicate that all predictors have a significant influence.
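    A hedged sketch of the bivariate Poisson construction that underlies such regressions (trivariate reduction): Y1 = X1 + X0 and Y2 = X2 + X0 with independent Poisson components, so the shared rate lambda0 equals the covariance between the two counts. In the regression setting, log-links would relate lambda1 and lambda2 to the environment, demography and poverty covariates; the simulation below only checks the moment structure.

```python
# Sketch: simulate bivariate Poisson counts via trivariate reduction and check moments.
import numpy as np

def rbivariate_poisson(lam1, lam2, lam0, size, rng=None):
    rng = rng or np.random.default_rng()
    x0 = rng.poisson(lam0, size)          # shared component -> positive dependence
    y1 = rng.poisson(lam1, size) + x0     # e.g. pausibacillary counts
    y2 = rng.poisson(lam2, size) + x0     # e.g. multibacillary counts
    return y1, y2

y1, y2 = rbivariate_poisson(lam1=3.0, lam2=6.0, lam0=2.0, size=50_000,
                            rng=np.random.default_rng(5))
print(np.mean(y1), np.mean(y2))           # ~ lam1 + lam0 = 5 and lam2 + lam0 = 8
print(np.cov(y1, y2)[0, 1])               # ~ lam0 = 2
```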

  1. Copula-based regression modeling of bivariate severity of temporary disability and permanent motor injuries.

    PubMed

    Ayuso, Mercedes; Bermúdez, Lluís; Santolino, Miguel

    2016-04-01

    The analysis of factors influencing the severity of the personal injuries suffered by victims of motor accidents is an issue of major interest. Yet, most of the extant literature has tended to address this question by focusing on either the severity of temporary disability or the severity of permanent injury. In this paper, a bivariate copula-based regression model for temporary disability and permanent injury severities is introduced for the joint analysis of their relationship with the set of factors that might influence both categories of injury. Using a motor insurance database with 21,361 observations, the copula-based regression model is shown to give a better performance than that of a model based on the assumption of independence. The inclusion of the dependence structure in the analysis has a higher impact on the variance estimates of the injury severities than it does on the point estimates. By taking into account the dependence between temporary and permanent severities, a more extensive factor analysis can be conducted. We illustrate that the conditional distribution functions of injury severities may be estimated, thus providing decision makers with valuable information.

  2. IDF relationships using bivariate copula for storm events in Peninsular Malaysia

    NASA Astrophysics Data System (ADS)

    Ariff, N. M.; Jemain, A. A.; Ibrahim, K.; Wan Zin, W. Z.

    2012-11-01

    Intensity-duration-frequency (IDF) curves are used in many hydrologic designs for water management and flood prevention. The IDF curves available in Malaysia are those obtained from a univariate analysis approach, which only considers the intensity of rainfall at fixed time intervals. Since several rainfall variables, such as intensity and duration, are correlated with each other, this paper aims to derive IDF points for storm events in Peninsular Malaysia by means of bivariate frequency analysis. This is achieved by modeling the relationship between storm intensity and duration using the copula method. Four types of copulas, namely the Ali-Mikhail-Haq (AMH), Frank, Gaussian and Farlie-Gumbel-Morgenstern (FGM) copulas, are considered because the correlation between storm intensity, I, and duration, D, is negative and these copulas are appropriate when the relationship between the variables is negative. The correlations are estimated by means of Kendall's τ. The analysis was performed on twenty rainfall stations with hourly data across Peninsular Malaysia. Using Akaike's information criterion (AIC) for testing goodness of fit, both the Frank and Gaussian copulas are found to be suitable for representing the relationship between I and D. The IDF points found by the copula method are compared to the IDF curves based on the typical empirical IDF formula of the univariate approach. This study indicates that the storm intensities obtained from both methods are in agreement for any given storm duration and for various return periods.
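
    A small sketch of the Kendall's τ route to the copula parameter: estimate τ for a synthetic, negatively related intensity-duration sample and invert the Frank copula τ(θ) relation numerically (data and numbers are illustrative, not from the Malaysian stations):

```python
# Kendall's tau inversion for the Frank copula parameter (synthetic data).
import numpy as np
from scipy import stats, integrate, optimize

rng = np.random.default_rng(2)
duration = rng.gamma(shape=2.0, scale=3.0, size=300)
intensity = 50.0 / (duration + 1) * rng.lognormal(0, 0.3, 300)  # negatively related

tau_hat, _ = stats.kendalltau(intensity, duration)

def frank_tau(theta):
    # Kendall's tau of the Frank copula for theta > 0:
    # tau = 1 - 4/theta * (1 - D1(theta)), with the Debye function
    # D1(theta) = (1/theta) * integral_0^theta t / (e^t - 1) dt.
    d1 = integrate.quad(lambda t: t / np.expm1(t), 1e-12, theta)[0] / theta
    return 1 - 4.0 / theta * (1 - d1)

# tau is an odd function of theta, so solve for |theta| and restore the sign.
theta_abs = optimize.brentq(lambda th: frank_tau(th) - abs(tau_hat), 1e-6, 50)
theta_hat = np.sign(tau_hat) * theta_abs
print(f"tau = {tau_hat:.3f}, Frank theta = {theta_hat:.3f}")
```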

  3. On a bivariate spectral relaxation method for unsteady magneto-hydrodynamic flow in porous media.

    PubMed

    Magagula, Vusi M; Motsa, Sandile S; Sibanda, Precious; Dlamini, Phumlani G

    2016-01-01

    The paper presents a significant improvement to the implementation of the spectral relaxation method (SRM) for solving nonlinear partial differential equations that arise in the modelling of fluid flow problems. Previously the SRM utilized the spectral method to discretize derivatives in space and finite differences to discretize in time. In this work we seek to improve the performance of the SRM by applying the spectral method to discretize derivatives in both the space and time variables. The new approach combines the relaxation scheme of the SRM, bivariate Lagrange interpolation and the Chebyshev spectral collocation method. The technique is tested on a system of four nonlinear partial differential equations that model unsteady three-dimensional magneto-hydrodynamic flow and mass transfer in a porous medium. Computed solutions are compared with previously published results obtained using the SRM, the spectral quasilinearization method and the Keller-box method. There is clear evidence that the new approach produces results that are as good as, if not better than, published results determined using the other methods. The main advantage of the new approach is that it offers better accuracy on coarser grids, which significantly improves the computational speed of the method. The technique also leads to faster convergence to the required solution.
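
    The building block of spectral collocation in both space and time is the Chebyshev differentiation matrix; a short sketch following the standard construction (not the authors' full bivariate SRM code) is:

```python
# Chebyshev differentiation matrix (standard "cheb" construction) applied to a test
# function; spectral collocation discretizes derivatives by multiplying with D.
import numpy as np

def cheb(n):
    """Return the (n+1)x(n+1) differentiation matrix and points x_j = cos(j*pi/n)."""
    if n == 0:
        return np.zeros((1, 1)), np.array([1.0])
    x = np.cos(np.pi * np.arange(n + 1) / n)
    c = np.hstack([2.0, np.ones(n - 1), 2.0]) * (-1.0) ** np.arange(n + 1)
    X = np.tile(x, (n + 1, 1)).T
    dX = X - X.T
    D = np.outer(c, 1.0 / c) / (dX + np.eye(n + 1))  # off-diagonal entries
    D -= np.diag(D.sum(axis=1))                      # diagonal = negative row sums
    return D, x

D, x = cheb(16)
err = np.max(np.abs(D @ np.sin(x) - np.cos(x)))      # spectral accuracy on [-1, 1]
print(f"max derivative error: {err:.2e}")
```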

  4. The Bivariate Luminosity--HI Mass Distribution Function of Galaxies based on the NIBLES Survey

    NASA Astrophysics Data System (ADS)

    Butcher, Zhon; Schneider, Stephen E.; van Driel, Wim; Lehnert, Matt

    2016-01-01

    We use 21cm HI line observations of 2610 galaxies from the Nançay Interstellar Baryons Legacy Extragalactic Survey (NIBLES) to derive a bivariate luminosity--HI mass distribution function. Our HI survey was selected to randomly probe the local (900 < cz < 12,000 km/s) galaxy population in each 0.5 mag wide bin over the absolute z-band magnitude range from Mz = -13.5 to -24, without regard to morphology or color. This targeted survey allowed more on-source integration time for weak and non-detected sources, enabling us to probe lower HI mass fractions and place more stringent upper limits on non-detections than would be possible with the larger blind HI surveys. Additionally, we obtained follow-up observations at Arecibo, a factor of four more sensitive, of 90 galaxies from our non-detected and marginally detected categories to quantify the underlying HI distribution of sources not detected at Nançay. Using the optical luminosity function and our higher-sensitivity follow-up observations as priors, we use a 2D stepwise maximum likelihood technique to derive the two-dimensional volume density distribution of luminosity and HI mass in each SDSS band.

  5. Fluid deprivation increases isotonic NaCl intake, but not hypertonic salt intake, under normal and heated conditions in obese Zucker rats.

    PubMed

    Omouessi, S T; Lemamy, G J; Kiki-Mvouaka, S; Fernette, B; Falconetti, C; Ndeboko, B; Mouecoucou, J; Thornton, S N

    2016-02-01

    During exposure to fluid deprivation and a heated environment, mammals regulate their hydromineral balance and body temperature by a number of mechanisms, including sweating and water and salt intake. Here we challenged obese Zucker rats, known to have a predisposition to hypertension, with 0.9% NaCl alone or with 2% NaCl solution plus water to drink under fluid deprivation and heated conditions. Food and fluid intakes, body weight, diuresis and natriuresis were measured daily throughout. Serum aldosterone levels and Na(+) concentration were also analyzed. Obese and lean rats presented similar baseline measurements of food, 0.9% NaCl and fluid intakes, diuresis and fluid balance, whereas hypertonic 2% NaCl consumption was almost absent. Before and during fluid deprivation, animals increased isotonic but not hypertonic NaCl intake; the obese rats showed significant increases in diuresis and Na(+) excretion, whereas total fluid intake was similar between groups. Heat increased isotonic NaCl intake and doubled natriuresis in obese rats, which had wet fur and displayed a paradoxical increase in fluid gain. Fluid deprivation plus heat produced a similar negative fluid balance in all groups. Body weight losses and reductions in food intake and diuresis were amplified under the combined conditions. Animals exposed to 2% NaCl showed higher circulating levels of aldosterone, with lower levels in obese than in lean rats. Among animals that drank 0.9% NaCl, obese rats showed higher serum Na(+) levels than lean rats. We conclude that, in spite of their higher sensitivity to high salt and heat, obese Zucker rats can control hydromineral balance in response to fluid deprivation and heat by adjusting isotonic NaCl preference in line with sodium balance and circulating aldosterone levels. This suggests a key hormonal role in the mechanisms underlying thermoregulation, body fluid homeostasis and sodium intake. PMID:26621332

  6. The relationship between arginine vasopressin levels and hyponatremia following a percutaneous renal biopsy in children receiving hypotonic or isotonic intravenous fluids.

    PubMed

    Kanda, Kyoko; Nozu, Kandai; Kaito, Hiroshi; Iijima, Kazumoto; Nakanishi, Koichi; Yoshikawa, Norishige; Ninchoji, Takeshi; Hashimura, Yuya; Matsuo, Masafumi; Moritz, Michael L

    2011-01-01

    Post-operative hyponatremia is a common complication in children which results from hypotonic fluid administration in the presence of arginine vasopressin (AVP) excess. We evaluated the relationship between the change in serum sodium and AVP levels following percutaneous renal biopsy in children receiving either hypotonic or isotonic fluids. This study was prompted after we encountered a patient who developed near-fatal hyponatremic encephalopathy following a renal biopsy while receiving hypotonic fluids. The relationship between the change in serum sodium and AVP levels was evaluated prior to (T0) and at 5 h (T5) following a percutaneous renal biopsy in 60 children receiving either hypotonic (0.6% NaCl, 90 mEq/L) or isotonic fluids (0.9% NaCl, 154 mEq/L). The proportion of patients with elevated AVP levels post-procedure was similar between those receiving 0.6 or 0.9% NaCl (30 vs. 26%). Patients receiving 0.6% NaCl with elevated AVP levels experienced a fall in serum sodium of 1.9 ± 1.5 mEq/L, whereas those receiving 0.9% NaCl had a rise in serum sodium of 0.85 ± 0.34 mEq/L with no patients developing hyponatremia. There were no significant changes in serum sodium levels in patients with normal AVP concentrations post-procedure in either group. In conclusion, elevated AVP levels were common among our patients following a percutaneous renal biopsy. Isotonic fluids prevented a fall in serum sodium and hyponatremia, while hypotonic fluids did not.

  8. Solubility of NH3 and apparent pK of NH4+ in human plasma, isotonic salt solutions and water at 37 degrees C.

    PubMed

    Lang, W; Blöck, T M; Zander, R

    1998-05-01

    The solubility of ammonia, αNH3 (mM/mmHg), was determined at 37 degrees C and low ammonia partial pressure (0.02-1 mmHg) in pure water (n = 24) as 46.70 ± 0.40; in aqueous isotonic salt solutions (n = 7) as 46.8 ± 0.81; and in human plasma (n = 5) as 42.0 ± 0.66. The last figure increases to 45.3 ± 0.63 if expressed in molal units (mmol/(kg plasma water × mmHg)) instead of molarity, with respect to the water content of the plasma (mean from four healthy, fasting donors: 0.908 ± 0.005 kg H2O/kg plasma; mean density at 37 degrees C: 1.020 ± 0.002 kg/l). In pure water, the solubility value is the mean of three different methods: (a) extrapolation of the salting-out effect of ammonia in aqueous NaOH to zero concentration; (b) the slope of Henry-Dalton's law; and (c) direct measurement in pure water and 0.001 M aqueous NaOH. Based on the Henderson-Hasselbalch equation for the NH4+/NH3 system in isotonic salt solutions and human plasma, both constants, apparent pK and solubility, can be derived from the total ammonia concentration and pH at equilibrium with a defined ammonia gas phase, provided the concentration of NH4+ or NH3 is also known. This was verified, in the first case, by fixing the concentration of NH4+ through the experimental conditions, and in the second, by two measurements of total ammonia concentration at two different pH values. Total ammonia concentration was measured by a specific enzymatic standard test and pH with a glass electrode. The mean apparent pK was 8.968 ± 0.013 in isotonic salt solutions (n = 7) and 9.014 ± 0.033 in human plasma (n = 10).
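
    As a worked illustration of the Henderson-Hasselbalch relation pH = pK + log10([NH3]/[NH4+]) with the plasma pK reported above, assuming an illustrative physiological plasma pH of 7.40 (not stated in the record):

```python
# Fraction of total ammonia present as NH3 at an assumed plasma pH of 7.40.
pK = 9.014          # apparent pK of NH4+ in human plasma at 37 degrees C (from the study)
pH = 7.40           # assumed physiological plasma pH (illustrative)
ratio = 10 ** (pH - pK)          # [NH3] / [NH4+]
frac_nh3 = ratio / (1 + ratio)   # fraction of total ammonia as NH3
print(f"[NH3]/[NH4+] = {ratio:.4f}, NH3 fraction = {frac_nh3:.3%}")
```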

  9. On the level of skill in predicting maximum sunspot number - A comparative study of single variate and bivariate precursor techniques

    NASA Technical Reports Server (NTRS)

    Wilson, Robert M.

    1990-01-01

    The level of skill in predicting the size of the sunspot cycle is investigated for the two types of precursor techniques, single variate and bivariate fits, both applied to cycle 22. The present level of growth in solar activity is compared to the mean level of growth (cycles 10-21) and to the predictions based on the precursor techniques. It is shown that, for cycle 22, both single variate methods (based on geomagnetic data) and bivariate methods suggest a maximum amplitude smaller than that observed for cycle 19, and possibly for cycle 21. Compared to the mean cycle, cycle 22 is presently behaving as if it were a +2.6 sigma cycle (maximum amplitude of about 225), which means that either it will be the first cycle not to be reliably predicted by the combined precursor techniques or its deviation relative to the mean cycle will substantially decrease over the next 18 months.

  10. New Colors for Histology: Optimized Bivariate Color Maps Increase Perceptual Contrast in Histological Images

    PubMed Central

    Kather, Jakob Nikolas; Weis, Cleo-Aron; Marx, Alexander; Schuster, Alexander K.; Schad, Lothar R.; Zöllner, Frank Gerrit

    2015-01-01

    Background: Accurate evaluation of immunostained histological images is required for reproducible research in many different areas and forms the basis of many clinical decisions. The quality and efficiency of histopathological evaluation is limited by the information content of a histological image, which is primarily encoded as perceivable contrast differences between objects in the image. However, the colors of chromogen and counterstain used for histological samples are not always optimally distinguishable, even under optimal conditions. Methods and Results: In this study, we present a method to extract the bivariate color map inherent in a given histological image and to retrospectively optimize this color map. We use a novel, unsupervised approach based on color deconvolution and principal component analysis to show that the commonly used blue and brown color hues in Hematoxylin-3,3'-Diaminobenzidine (DAB) images are poorly suited for human observers. We then demonstrate that it is possible to construct improved color maps according to objective criteria and that these color maps can be used to digitally re-stain histological images. Validation: To validate whether this procedure improves distinguishability of objects and background in histological images, we re-stain phantom images and N = 596 large histological images of immunostained samples of human solid tumors. We show that perceptual contrast is improved by a factor of 2.56 in phantom images and up to a factor of 2.17 in sets of histological tumor images. Context: Thus, we provide an objective and reliable approach to measure object distinguishability in a given histological image and to maximize visual information available to a human observer. This method could easily be incorporated in digital pathology image viewing systems to improve accuracy and efficiency in research and diagnostics. PMID:26717571
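
    A hedged sketch of the deconvolve-and-re-stain idea using scikit-image's colour deconvolution (this is not the authors' perceptually optimized map; the sample image and the re-mapping below are illustrative):

```python
# Color-deconvolve an H-DAB image and re-display the two channels with a different
# bivariate color assignment (hematoxylin -> blue/green axis, DAB -> red axis).
import numpy as np
from skimage import data, color

ihc = data.immunohistochemistry()          # sample H-DAB image shipped with scikit-image
hed = color.rgb2hed(ihc)                   # channels: hematoxylin, eosin, DAB
h = (hed[..., 0] - hed[..., 0].min()) / np.ptp(hed[..., 0])
d = (hed[..., 2] - hed[..., 2].min()) / np.ptp(hed[..., 2])

restained = np.stack([d, 0.5 * h, h], axis=-1)   # simple illustrative re-staining
print(restained.shape, restained.min(), restained.max())
```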

  11. Bivariate Genome-Wide Linkage Analysis of Femoral Bone Traits and Leg Lean Mass: Framingham Study

    PubMed Central

    Karasik, David; Zhou, Yanhua; Cupples, L Adrienne; Hannan, Marian T; Kiel, Douglas P; Demissie, Serkalem

    2009-01-01

    The risk of osteoporotic fracture is a function of both applied muscle mass and bone tissue distribution. Leg lean mass (LLM) and femoral bone geometry are both known to have substantial genetic components. Therefore, we estimated shared heritability (h2) and performed linkage analysis to identify chromosomal regions governing both LLM and bone geometry. A genome-wide scan (using 636 microsatellite markers) for linkage analyses was performed on 1346 adults from 327 extended families of the Framingham study. DXA measures were LLM, femoral neck length, neck-shaft angle (NSA), subperiosteal width, cross-sectional area (CSA), and section modulus (Z) at the femoral narrow neck and shaft (S) regions. Variance component linkage analysis was performed on normalized residuals (adjusted for age, height, BMI, and estrogen status in women). The results indicated substantial h2 for LLM (0.42 ± 0.07) that was comparable to bone geometry traits. Phenotypic correlations between LLM and bone geometry phenotypes ranged from 0.033 with NSA (p > 0.05) to 0.251 with S_Z (p < 0.001); genetic correlations ranged from 0.087 (NSA, p > 0.05) to 0.454 (S_Z, p < 0.001). Univariate linkage analysis of covariate-adjusted LLM identified no chromosomal regions with LOD scores ≥2.0; however, bivariate analysis identified two loci with LOD scores >3.0, shared by LLM with S_CSA on chromosome 12p12.3–12p13.2, and with NSA, on 14q21.3–22.1. In conclusion, we identified chromosomal regions potentially linked to both LLM and femoral bone geometry. Identification and subsequent characterization of these shared loci may further elucidate the genetic contributions to both osteoporosis and sarcopenia. PMID:19063671

  12. Using bivariate signal analysis to characterize the epileptic focus: The benefit of surrogates

    NASA Astrophysics Data System (ADS)

    Andrzejak, R. G.; Chicharro, D.; Lehnertz, K.; Mormann, F.

    2011-04-01

    The disease epilepsy is related to hypersynchronous activity of networks of neurons. While acute epileptic seizures are the most extreme manifestation of this hypersynchronous activity, an elevated level of interdependence of neuronal dynamics is thought to persist also during the seizure-free interval. In multichannel recordings from brain areas involved in the epileptic process, this interdependence can be reflected in an increased linear cross correlation but also in signal properties of higher order. Bivariate time series analysis comprises a variety of approaches, each with different degrees of sensitivity and specificity for interdependencies reflected in lower- or higher-order properties of pairs of simultaneously recorded signals. Here we investigate which approach is best suited to detect putatively elevated interdependence levels in signals recorded from brain areas involved in the epileptic process. For this purpose, we use the linear cross correlation that is sensitive to lower-order signatures of interdependence, a nonlinear interdependence measure that integrates both lower- and higher-order properties, and a surrogate-corrected nonlinear interdependence measure that aims to specifically characterize higher-order properties. We analyze intracranial electroencephalographic recordings of the seizure-free interval from 29 patients with an epileptic focus located in the medial temporal lobe. Our results show that all three approaches detect higher levels of interdependence for signals recorded from the brain hemisphere containing the epileptic focus as compared to signals recorded from the opposite hemisphere. For the linear cross correlation, however, these differences are not significant. For the nonlinear interdependence measure, results are significant but only of moderate accuracy with regard to the discriminative power for the focal and nonfocal hemispheres. The highest significance and accuracy are obtained for the surrogate-corrected nonlinear interdependence measure.
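
    A simplified illustration of surrogate testing applied to the linear cross correlation (not the nonlinear interdependence measures used in the study): the observed statistic is compared with 19 phase-randomized surrogate pairs, all on synthetic signals:

```python
# Maximum cross correlation of two coupled channels versus phase-randomized surrogates.
import numpy as np

rng = np.random.default_rng(3)
n = 1024
common = np.sin(2 * np.pi * 0.01 * np.arange(n))
x = common + 0.5 * rng.standard_normal(n)
y = np.roll(common, 5) + 0.5 * rng.standard_normal(n)

def max_xcorr(a, b):
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return np.max(np.abs(np.correlate(a, b, mode="full"))) / len(a)

def phase_surrogate(sig, rng):
    # Randomize Fourier phases, keep the amplitude spectrum (preserves linear properties).
    spec = np.fft.rfft(sig)
    phases = rng.uniform(0, 2 * np.pi, len(spec))
    phases[0] = 0.0
    phases[-1] = 0.0                       # keep DC and Nyquist bins real
    return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n=len(sig))

observed = max_xcorr(x, y)
surrogates = [max_xcorr(phase_surrogate(x, rng), phase_surrogate(y, rng)) for _ in range(19)]
print(f"observed C = {observed:.3f}, surrogate max = {max(surrogates):.3f}")
```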

  13. Diagnostic performance of lung ultrasound in the diagnosis of pneumonia: a bivariate meta-analysis

    PubMed Central

    Hu, Qian-Jing; Shen, Yong-Chun; Jia, Liu-Qun; Guo, Shu-Jin; Long, Hong-Yu; Pang, Cai-Shuang; Yang, Ting; Wen, Fu-Qiang

    2014-01-01

    Background and Objective: Pneumonia is a common disease with high morbidity and mortality, yet its diagnosis remains a clinical challenge. Many studies have been conducted to assess the usefulness of lung ultrasound for the diagnosis of pneumonia, but with inconsistent and inconclusive results. The present study aimed to establish the overall diagnostic accuracy of lung ultrasound in diagnosing pneumonia. Methods: Based on a comprehensive search of the Pubmed, Embase, and Cochrane databases, we identified outcome data from all articles estimating diagnostic accuracy with lung ultrasound for pneumonia. Quality was assessed with the Quality Assessment for Diagnostic Accuracy Studies. Results from different studies were pooled using a bivariate meta-analysis. A summary receiver operating characteristic curve was used to assess the overall performance of lung ultrasound-based assays. Results: Nine studies containing 1080 subjects were included in this meta-analysis. The summary estimates for lung ultrasound in the diagnosis of pneumonia in the included studies were as follows: sensitivity, 0.97 (95% CI: 0.93-0.99); specificity, 0.94 (95% CI: 0.85-0.98); DOR, 507.99 (95% CI: 128.11-2014.34); positive likelihood ratio, 15.62 (95% CI: 6.31-38.68); negative likelihood ratio, 0.03 (95% CI: 0.01-0.08). The area under the summary receiver operating characteristic curve was 0.99 (95% CI: 0.98-1.00). Conclusion: Lung ultrasound is capable of diagnosing pneumonia with high accuracy and is a promising alternative to chest radiography and thoracic CT scanning. PMID:24482696
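
    A simplified sketch of pooling diagnostic accuracy: logit-transformed, inverse-variance (fixed-effect) pooling of per-study sensitivity and specificity (the paper fits a full bivariate random-effects model; the 2x2 counts below are made up):

```python
# Logit pooling of sensitivity and specificity from per-study 2x2 tables.
import numpy as np

# columns: TP, FP, FN, TN for each study (illustrative numbers, not from the paper)
studies = np.array([
    [45,  3,  2, 50],
    [30,  5,  1, 64],
    [80,  8,  4, 90],
])
tp, fp, fn, tn = studies.T.astype(float)

def pool_logit(events, totals):
    p = (events + 0.5) / (totals + 1.0)           # continuity-corrected proportions
    logit = np.log(p / (1 - p))
    var = 1.0 / (events + 0.5) + 1.0 / (totals - events + 0.5)
    w = 1.0 / var
    pooled = np.sum(w * logit) / np.sum(w)        # fixed-effect pooled logit
    return 1.0 / (1.0 + np.exp(-pooled))

print("pooled sensitivity:", round(pool_logit(tp, tp + fn), 3))
print("pooled specificity:", round(pool_logit(tn, tn + fp), 3))
```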

  14. A view on coupled cluster perturbation theory using a bivariational Lagrangian formulation.

    PubMed

    Kristensen, Kasper; Eriksen, Janus J; Matthews, Devin A; Olsen, Jeppe; Jørgensen, Poul

    2016-02-14

    We consider two distinct coupled cluster (CC) perturbation series that both expand the difference between the energies of the CCSD (CC with single and double excitations) and CCSDT (CC with single, double, and triple excitations) models in orders of the Møller-Plesset fluctuation potential. We initially introduce the E-CCSD(T-n) series, in which the CCSD amplitude equations are satisfied at the expansion point, and compare it to the recently developed CCSD(T-n) series [J. J. Eriksen et al., J. Chem. Phys. 140, 064108 (2014)], in which not only the CCSD amplitude, but also the CCSD multiplier equations are satisfied at the expansion point. The computational scaling is similar for the two series, and both are term-wise size extensive with a formal convergence towards the CCSDT target energy. However, the two series are different, and the CCSD(T-n) series is found to exhibit a more rapid convergence up through the series, which we trace back to the fact that more information at the expansion point is utilized than for the E-CCSD(T-n) series. The present analysis can be generalized to any perturbation expansion representing the difference between a parent CC model and a higher-level target CC model. In general, we demonstrate that, whenever the parent parameters depend upon the perturbation operator, a perturbation expansion of the CC energy (where only parent amplitudes are used) differs from a perturbation expansion of the CC Lagrangian (where both parent amplitudes and parent multipliers are used). For the latter case, the bivariational Lagrangian formulation becomes more than a convenient mathematical tool, since it facilitates a different and faster convergent perturbation series than the simpler energy-based expansion.

  15. Bivariate genome-wide association study suggests that the DARC gene influences lean body mass and age at menarche.

    PubMed

    Hai, Rong; Zhang, Lei; Pei, Yufang; Zhao, Lanjuan; Ran, Shu; Han, Yingying; Zhu, Xuezhen; Shen, Hui; Tian, Qing; Deng, Hongwen

    2012-06-01

    Lean body mass (LBM) and age at menarche (AAM) are two important complex traits for human health. The aim of this study was to identify pleiotropic genes for both traits using a powerful bivariate genome-wide association study (GWAS). Two studies, a discovery study and a replication study, were performed. In the discovery study, 909,622 single nucleotide polymorphisms (SNPs) were genotyped in 801 unrelated female Han Chinese subjects using the Affymetrix human genome-wide SNP array 6.0 platform. A bivariate GWAS was then performed to identify SNPs that may be important for LBM and AAM. In the replication study, significant findings from the discovery study were validated in 1692 unrelated Caucasian female subjects. One SNP, rs3027009, was identified that was bivariately associated with left arm lean mass and AAM in both the discovery samples (P=7.26×10(-6)) and the replication samples (P=0.005). The SNP is located upstream of the DARC (Duffy antigen receptor for chemokines) gene, suggesting that DARC may play an important role in regulating both LBM and AAM.

  16. Bivariate segmentation of SNP-array data for allele-specific copy number analysis in tumour samples

    PubMed Central

    2013-01-01

    Background: SNP arrays output two signals that reflect the total genomic copy number (LRR) and the allelic ratio (BAF), which in combination allow the characterisation of allele-specific copy numbers (ASCNs). While methods based on hidden Markov models (HMMs) have been extended from array comparative genomic hybridisation (aCGH) to jointly handle the two signals, only one method based on change-point detection, ASCAT, performs bivariate segmentation. Results: In the present work, we introduce a generic framework for bivariate segmentation of SNP array data for ASCN analysis. To this end, we discuss the characteristics of the typically applied BAF transformation and how they affect segmentation, introduce concepts of multivariate time series analysis that are of concern in this field and discuss the appropriate formulation of the problem. The framework is implemented in a method named CnaStruct, the bivariate form of the structural change model (SCM), which has been successfully applied to transcriptome mapping and aCGH. Conclusions: On a comprehensive synthetic dataset, we show that CnaStruct outperforms the segmentation of existing ASCN analysis methods. Furthermore, CnaStruct can be integrated into the workflows of several ASCN analysis tools in order to improve their performance, especially on tumour samples highly contaminated by normal cells. PMID:23497144
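
    To illustrate what joint (bivariate) segmentation of the two tracks means, a hedged sketch using the generic `ruptures` change-point package on a synthetic (LRR, BAF)-like signal (this is not CnaStruct itself):

```python
# Joint change-point detection on a two-track synthetic signal with ruptures.
import numpy as np
import ruptures as rpt

rng = np.random.default_rng(4)
lrr = np.concatenate([rng.normal(0.0, 0.1, 200),    # normal copy number
                      rng.normal(0.3, 0.1, 200),    # gain
                      rng.normal(0.0, 0.1, 200)])
baf = np.concatenate([rng.normal(0.5, 0.05, 200),
                      rng.normal(0.67, 0.05, 200),  # allelic imbalance in the gained segment
                      rng.normal(0.5, 0.05, 200)])
signal = np.column_stack([lrr, baf])                # both tracks segmented jointly

breakpoints = rpt.Pelt(model="l2", min_size=20).fit(signal).predict(pen=5)
print("estimated breakpoints:", breakpoints)        # expect roughly [200, 400, 600]
```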

  17. Nonlinear bivariate dependency of price-volume relationships in agricultural commodity futures markets: A perspective from Multifractal Detrended Cross-Correlation Analysis

    NASA Astrophysics Data System (ADS)

    He, Ling-Yun; Chen, Shu-Peng

    2011-01-01

    Nonlinear dependency between characteristic financial and commodity market quantities (variables) is crucially important, especially between trading volume and market price. Studies of the nonlinear dependency between price and volume can provide practical insights into market trading characteristics, as well as a theoretical understanding of market dynamics. Nonlinear dependency and its underlying dynamical mechanisms can help researchers and technical analysts understand market dynamics by integrating the market variables, instead of investigating them separately, as is done in the current literature. Therefore, to investigate the nonlinear dependency of price-volume relationships in agricultural commodity futures markets in China and the US, we perform a new statistical test to detect cross-correlations and apply a new methodology called Multifractal Detrended Cross-Correlation Analysis (MF-DCCA), which is an efficient algorithm for analyzing two spatially or temporally correlated time series. We discuss theoretically the relationship between the bivariate cross-correlation exponent and the generalized Hurst exponents of the respective time series. We also perform an empirical study and find that there exists a power-law cross-correlation between price and volume, and that multifractal features are significant in all the analyzed agricultural commodity futures markets.
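
    A monofractal special case of this machinery is the DCCA cross-correlation coefficient ρ_DCCA(s); a short sketch on synthetic price/volume increments (names and parameters are illustrative) is:

```python
# DCCA cross-correlation coefficient: detrended covariance over the geometric mean of
# the detrended variances, computed in non-overlapping windows of a given scale.
import numpy as np

def dcca_coefficient(x, y, scale):
    """rho_DCCA at window size `scale`, using non-overlapping windows and linear detrending."""
    X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())
    n_win = len(x) // scale
    f2x = f2y = f2xy = 0.0
    t = np.arange(scale)
    for i in range(n_win):
        xs = X[i * scale:(i + 1) * scale]
        ys = Y[i * scale:(i + 1) * scale]
        rx = xs - np.polyval(np.polyfit(t, xs, 1), t)   # residuals of the local linear trend
        ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
        f2x += np.mean(rx ** 2)
        f2y += np.mean(ry ** 2)
        f2xy += np.mean(rx * ry)
    return f2xy / np.sqrt(f2x * f2y)

rng = np.random.default_rng(5)
common = rng.standard_normal(4000)
price_ret = common + 0.8 * rng.standard_normal(4000)     # illustrative "price" increments
volume_chg = common + 0.8 * rng.standard_normal(4000)    # illustrative "volume" increments
for s in (16, 64, 256):
    print(s, round(dcca_coefficient(price_ret, volume_chg, s), 3))
```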

  18. Reducing uncertainty in the selection of bi-variate distributions of flood peaks and volumes using copulas and hydrological process-based model selection

    NASA Astrophysics Data System (ADS)

    Szolgay, Jan; Gaál, Ladislav; Bacigál, Tomáš; Kohnová, Silvia; Blöschl, Günter

    2016-04-01

    Bi-variate distributions of flood peaks and flood event volumes are needed for a range of practical purposes, including, e.g., retention basin design and identifying the extent and duration of flooding in flood hazard zones. However, the selection of the types of bi-variate distributions and the estimation of their parameters from observed peak-volume pairs are associated with far larger uncertainties compared to uni-variate distributions, since observed flood records of the required length are rarely available. This poses a serious problem for reliable flood risk estimation in bi-variate design cases. The aim of this contribution was to shed light on the possibility of reducing uncertainties in the estimation of the dependence models/parameters from a regional perspective. The peak-volume relationships were modeled in terms of copulas. Flood events were classified according to their origin. In order to reduce the uncertainty in estimating flood risk, pooling and analyzing catchments of similar behavior according to flood process types was attempted. Most of the work reported in the literature so far did not direct the multivariate analysis toward discriminating certain types of models regionally according to specific runoff generation processes. Specifically, the contribution addresses these problems: - Are the peak-volume relationships of different flood types for a given catchment similar? - Are the peak-volume dependence structures between catchments in a larger region similar for given flood types? - Are some copula types more suitable for given flood process types, and does this have consequences for reliable risk estimation? The target region is located in the northern parts of Austria and consists of 72 small and mid-sized catchments. Instead of the traditional approach that deals with annual maximum floods, the current analysis includes all independent flood events in the region. 24,872 flood events from the period 1976-2007 were identified, and classified as synoptic, flash

  19. Comparison of acute responses to isotonic or isokinetic eccentric muscle action: differential outcomes in skeletal muscle damage and implications for rehabilitation.

    PubMed

    Alemany, J A; Delgado-Díaz, D C; Mathews, H; Davis, J M; Kostek, M C

    2014-01-01

    Both isotonic and isokinetic eccentric muscle contractions are commonly used in muscle research laboratories to induce muscle damage, yet the muscle damage outcomes of these 2 modes of eccentric contraction have not been compared. The purpose of this study was to compare the modes of contraction for differences in muscle damage. 16 men were placed in the isotonic (IT: 110% of maximal isometric torque) or the isokinetic (IK: 120°/s) group, with each group performing 200 eccentric muscle actions of the knee extensors. Isometric peak torque, perceived soreness and CK activity were measured immediately pre- and post-exercise, and 48 h post-exercise. Mean total work (~1700 J) and peak torque per set (~265 Nm) decreased over the 200 repetitions (p<0.01) and were not different between groups. Damage markers changed 48 h post-exercise (p<0.05): peak isometric torque (-13%), creatine kinase activity (+200%) and self-perceived muscular soreness (+4 unit change). Significant group×time interactions (p<0.01) indicated that, 48 h post-exercise, peak isometric torque was 22% lower, and creatine kinase and self-perceived muscular soreness were 330% and 3 units higher, respectively, in the IT compared with the IK group. When equating for total work, skeletal muscle damage markers are higher with the IT than the IK mode. This reflects differences inherent in contraction type and suggests that this should be taken into account during physical rehabilitation.

  20. K^π=8^- isomers and K^π=2^- octupole vibrations in N=150 shell-stabilized isotones

    SciTech Connect

    Robinson, A. P.; Khoo, T. L.; Ahmad, I.; Kondev, F. G.; Seweryniak, D.; Back, B. B.; Carpenter, M. P.; Davids, C. N.; Greene, J. P.; Gros, S.; Janssens, R. V. F.; Lauritsen, T.; Lister, C. J.; Peterson, D.; Zhu, S.; Tandel, S. K.; Chowdhury, P.; Tandel, U. S.; Nakatsukasa, T.; Asai, M.

    2008-09-15

    Isomers have been populated in ^246Cm and ^252No with quantum numbers K^π=8^-, which decay through K^π=2^- rotational bands built on octupole vibrational states. For N=150 isotones with (even) atomic number Z=94-102, the K^π=8^- and 2^- states have remarkably stable energies, indicating neutron excitations. An exception is a singular minimum in the 2^- energy at Z=98, due to the additional role of proton configurations. The nearly constant energies, in isotones spanning an 18% increase in Coulomb energy near the Coulomb limit, provide a test for theory. The two-quasiparticle K^π=8^- energies are described with single-particle energies given by the Woods-Saxon potential and the K^π=2^- vibrational energies by quasiparticle random-phase approximation calculations. Ramifications for self-consistent mean-field theory are discussed.

  1. The systematic study of the electroporation and electrofusion of B16-F1 and CHO cells in isotonic and hypotonic buffer.

    PubMed

    Usaj, Marko; Kanduser, Masa

    2012-09-01

    The fusogenic state of the cell membrane can be induced by an external electric field. When two fusogenic membranes are in close contact, cell fusion takes place. An appropriate hypotonic treatment of cells before the application of electric pulses significantly improves electrofusion efficiency. How hypotonic treatment improves electrofusion is still not known in detail. Our results indicate that, at a given induced transmembrane potential, electroporation was not affected by buffer osmolarity. In contrast to electroporation, the cells' response to hypotonic treatment significantly affects their electrofusion. High fusion yield was observed when B16-F1 cells were used; this cell line gave a 41 ± 9 % yield in hypotonic buffer and a 32 ± 11 % yield in isotonic buffer. To our knowledge, these fusion yields, determined in situ by dual-color fluorescence microscopy, are among the highest in the electrofusion research field. The use of hypotonic buffer was more crucial for the electrofusion of CHO cells; the fusion yield increased from below 1 % in isotonic buffer to 10 ± 4 % in hypotonic buffer. Since the same degree of cell permeabilization was achieved in both buffers, these results indicate that hypotonic treatment significantly improves fusion yield. The effect could be attributed to improved physical contact of cell membranes or to an enhanced fusogenic state of the cell membrane itself.

  2. Low-energy dipole excitations in neon isotopes and N=16 isotones within the quasiparticle random-phase approximation and the Gogny force

    SciTech Connect

    Martini, M.; Peru, S.; Dupuis, M.

    2011-03-15

    Low-energy dipole excitations in neon isotopes and N=16 isotones are calculated with a fully consistent axially-symmetric-deformed quasiparticle random phase approximation (QRPA) approach based on Hartree-Fock-Bogolyubov (HFB) states. The same Gogny D1S effective force has been used both in HFB and QRPA calculations. The microscopical structure of these low-lying resonances, as well as the behavior of proton and neutron transition densities, are investigated in order to determine the isoscalar or isovector nature of the excitations. It is found that the N=16 isotones ^24O, ^26Ne, ^28Mg, and ^30Si are characterized by a similar behavior. The occupation of the 2s_{1/2} neutron orbit turns out to be crucial, leading to nontrivial transition densities and to small but finite collectivity. Some low-lying dipole excitations of ^28Ne and ^30Ne, characterized by transitions involving the ν1d_{3/2} state, present a more collective behavior and isoscalar transition densities. A collective proton low-lying excitation is identified in the ^18Ne nucleus.

  3. Morbidity statistics

    PubMed Central

    Smith, Alwyn

    1969-01-01

    This paper is based on an analysis of questionnaires sent to the health ministries of Member States of WHO asking for information about the extent, nature, and scope of morbidity statistical information. It is clear that most countries collect some statistics of morbidity and many countries collect extensive data. However, few countries relate their collection to the needs of health administrators for information, and many countries collect statistics principally for publication in annual volumes which may appear anything up to 3 years after the year to which they refer. The desiderata of morbidity statistics may be summarized as reliability, representativeness, and relevance to current health problems. PMID:5306722

  4. Statistical Diversions

    ERIC Educational Resources Information Center

    Petocz, Peter; Sowey, Eric

    2012-01-01

    The term "data snooping" refers to the practice of choosing which statistical analyses to apply to a set of data after having first looked at those data. Data snooping contradicts a fundamental precept of applied statistics, that the scheme of analysis is to be planned in advance. In this column, the authors shall elucidate the statistical…

  5. Bivariate hydrologic risk analysis based on a coupled entropy-copula method for the Xiangxi River in the Three Gorges Reservoir area, China

    NASA Astrophysics Data System (ADS)

    Fan, Y. R.; Huang, W. W.; Huang, G. H.; Huang, K.; Li, Y. P.; Kong, X. M.

    2016-07-01

    In this study, a bivariate hydrologic risk framework is proposed based on a coupled entropy-copula method. In the proposed risk analysis framework, bivariate flood frequency is analyzed for different flood variable pairs (i.e., flood peak-volume, flood peak-duration, flood volume-duration). The marginal distributions of flood peak, volume, and duration are quantified through both parametric (i.e., gamma, generalized extreme value (GEV), and lognormal distributions) and nonparametric (i.e., entropy) approaches. The joint probabilities of flood peak-volume, peak-duration, and volume-duration are established through copulas. The bivariate hydrologic risk is then derived based on the joint return period to reflect the interactive effects of flood variables on the final hydrologic risk values. The proposed method is applied to the risk analysis for the Xiangxi River in the Three Gorges Reservoir area, China. The results indicate that the entropy method performs best in quantifying the distribution of flood duration. The bivariate hydrologic risk is then generated to characterize the impacts of flood volume and duration on the occurrence of a flood. The results suggest that the bivariate risk for flood peak-volume does not decrease significantly for flood volumes less than 1000 m3/s. Moreover, a flood in the Xiangxi River may last at least 5 days without a significant decrease in the bivariate risk for flood peak-duration.
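
    A minimal sketch of a joint "AND" return period computed from a fitted copula, T = μ / (1 − F_X − F_Y + C(F_X, F_Y)), here with an illustrative Gumbel-Hougaard copula and made-up probabilities (not the fitted Xiangxi values):

```python
# Joint "AND" return period: both variables exceed their design quantiles.
import numpy as np

def gumbel_copula(u, v, theta):
    # Gumbel-Hougaard copula C(u, v) for theta >= 1.
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

mu = 1.0                   # mean interarrival time of flood events in years (assumed)
theta = 2.0                # illustrative copula parameter
Fx, Fy = 0.99, 0.95        # marginal non-exceedance probabilities of peak and volume
p_and = 1 - Fx - Fy + gumbel_copula(Fx, Fy, theta)
print(f"joint AND return period: {mu / p_and:.1f} years")
```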

  6. Statistics Clinic

    NASA Technical Reports Server (NTRS)

    Feiveson, Alan H.; Foy, Millennia; Ploutz-Snyder, Robert; Fiedler, James

    2014-01-01

    Do you have elevated p-values? Is the data analysis process getting you down? Do you experience anxiety when you need to respond to criticism of statistical methods in your manuscript? You may be suffering from Insufficient Statistical Support Syndrome (ISSS). For symptomatic relief of ISSS, come for a free consultation with JSC biostatisticians at our help desk during the poster sessions at the HRP Investigators Workshop. Get answers to common questions about sample size, missing data, multiple testing, when to trust the results of your analyses and more. Side effects may include sudden loss of statistics anxiety, improved interpretation of your data, and increased confidence in your results.

  7. SEER Statistics

    Cancer.gov

    The Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute works to provide information on cancer statistics in an effort to reduce the burden of cancer among the U.S. population.

  8. Cancer Statistics

    MedlinePlus

    ... cancer statistics across the world. U.S. Cancer Mortality Trends The best indicator of progress against cancer is ... the number of cancer survivors has increased. These trends show that progress is being made against the ...

  9. Statistical Physics

    NASA Astrophysics Data System (ADS)

    Hermann, Claudine

    Statistical Physics bridges the properties of a macroscopic system and the microscopic behavior of its constituting particles, otherwise impossible due to the giant magnitude of Avogadro's number. Numerous systems of today's key technologies - such as semiconductors or lasers - are macroscopic quantum objects; only statistical physics allows for understanding their fundamentals. Therefore, this graduate text also focuses on particular applications such as the properties of electrons in solids with applications, and radiation thermodynamics and the greenhouse effect.

  10. Evolution of quadrupole collectivity in N=80 isotones toward the Z=64 subshell gap: The B(E2; 2_1^+ → 0_1^+) value of ^142Sm

    NASA Astrophysics Data System (ADS)

    Stegmann, R.; Bauer, C.; Rainovski, G.; Pietralla, N.; Stahl, C.; Bönig, S.; Ilieva, S.; Blazhev, A.; Damyanova, A.; Danchev, M.; Gladnishki, K.; Jolie, J.; Lutter, R.; Pakarinen, J.; Radeck, D.; Rapisarda, E.; Reiter, P.; Scheck, M.; Siebeck, B.; Stora, T.; Thöle, P.; Thomas, T.; Thürauf, M.; Vermeulen, M. J.; Voulot, D.; Warr, N.; Wenander, F.; Werner, V.; De Witte, H.

    2015-05-01

    It was shown that the evolution of the B(E2; 2_1^+ → 0_1^+) values in N=80 isotones from Te to Nd is affected by the underlying subshell structure. This manifests itself in the observation of the local suppression of the B(E2) value at Z=58 with respect to the neighboring nuclei ^136Ba and ^140Nd. To investigate this shell sensitivity toward the Z=64 subshell gap, the B(E2; 2_1^+ → 0_1^+) value of the unstable nucleus ^142Sm was measured utilizing the projectile Coulomb excitation technique. The radioactive ion beam (RIB) experiment was performed at the REX-ISOLDE facility at CERN. The B(E2) value of 32(4) W.u. reflects the impact of the π(1g_{7/2} 2d_{5/2}) subshell closure at Z=64 with respect to a linear scaling of collectivity with valence proton number.

  11. Mechanisms Underlying Activation of α1-Adrenergic Receptor-Induced Trafficking of AQP5 in Rat Parotid Acinar Cells under Isotonic or Hypotonic Conditions

    PubMed Central

    Bragiel, Aneta M.; Wang, Di; Pieczonka, Tomasz D.; Shono, Masayuki; Ishikawa, Yasuko

    2016-01-01

    Defective cellular trafficking of aquaporin-5 (AQP5) to the apical plasma membrane (APM) in salivary glands is associated with the loss of salivary fluid secretion. To examine mechanisms of α1-adrenoceptor (AR)-induced trafficking of AQP5, immunoconfocal microscopy and Western blot analysis were used to analyze AQP5 localization in parotid tissues stimulated with phenylephrine under different osmolalities. Phenylephrine-induced trafficking of AQP5 to the APM and lateral plasma membrane (LPM) was mediated via the α1A-AR subtype, but not the α1B- and α1D-AR subtypes. Phenylephrine-induced trafficking of AQP5 was inhibited by ODQ and KT5823, inhibitors of nitric oxide (NO)-stimulated guanylyl cyclase (GC) and protein kinase (PK) G, respectively, indicating the involvement of the NO/soluble GC/PKG signaling pathway. Under isotonic conditions, phenylephrine-induced trafficking was inhibited by La3+, implying the participation of store-operated Ca2+ channels. Under hypotonic conditions, phenylephrine-induced trafficking of AQP5 to the APM was higher than that under isotonic conditions. Under non-stimulated conditions, hypotonicity-induced trafficking of AQP5 to the APM was inhibited by ruthenium red and La3+, suggesting the involvement of extracellular Ca2+ entry. Thus, α1A-AR activation induced the trafficking of AQP5 to the APM and LPM via the Ca2+/cyclic guanosine monophosphate (cGMP)/PKG signaling pathway, which is associated with store-operated Ca2+ entry. PMID:27367668

  12. Towards an accurate model of redshift-space distortions: a bivariate Gaussian description for the galaxy pairwise velocity distributions

    NASA Astrophysics Data System (ADS)

    Bianchi, Davide; Chiesa, Matteo; Guzzo, Luigi

    2016-10-01

    As a step towards a more accurate modelling of redshift-space distortions (RSD) in galaxy surveys, we develop a general description of the probability distribution function of galaxy pairwise velocities within the framework of the so-called streaming model. For a given galaxy separation, this function can be described as a superposition of virtually infinite local distributions. We characterize these in terms of their moments and then consider the specific case in which they are Gaussian functions, each with its own mean μ and variance σ2. Based on physical considerations, we make the further crucial assumption that these two parameters are in turn distributed according to a bivariate Gaussian, with its own mean and covariance matrix. Tests using numerical simulations explicitly show that with this compact description one can correctly model redshift-space distortions on all scales, fully capturing the overall linear and nonlinear dynamics of the galaxy flow at different separations. In particular, we naturally obtain Gaussian/exponential, skewed/unskewed distribution functions, depending on separation, as observed in simulations and data. Also, the recently proposed single-Gaussian description of redshift-space distortions is included in this model as a limiting case, when the bivariate Gaussian is collapsed to a two-dimensional Dirac delta function. More work is needed, but these results indicate a very promising path to make definitive progress in our program to improve RSD estimators.
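
    A quick numerical illustration of the key idea: drawing (μ, σ) from a bivariate Gaussian and then a velocity from N(μ, σ²) produces skewed, heavy-tailed pairwise velocity distributions (all numbers below are invented, not fitted to simulations):

```python
# Superposition of Gaussians with (mu, sigma) jointly Gaussian-distributed.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
mean_mu, mean_sigma = -1.0, 3.0            # illustrative means of (mu, sigma)
cov = np.array([[4.0, 1.5],                # illustrative covariance of (mu, sigma)
                [1.5, 1.0]])
mu, sigma = rng.multivariate_normal([mean_mu, mean_sigma], cov, size=200000).T
sigma = np.clip(sigma, 0.1, None)          # dispersions must stay positive
v = rng.normal(mu, sigma)                  # pairwise velocity: Gaussian with random (mu, sigma)

print("skewness:", round(stats.skew(v), 3), "excess kurtosis:", round(stats.kurtosis(v), 3))
```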

  13. Bivariate mass-size relation as a function of morphology as determined by Galaxy Zoo 2 crowdsourced visual classifications

    NASA Astrophysics Data System (ADS)

    Beck, Melanie; Scarlata, Claudia; Fortson, Lucy; Willett, Kyle; Galloway, Melanie

    2016-01-01

    It is well known that the mass-size distribution evolves as a function of cosmic time and that this evolution differs between passive and star-forming galaxy populations. However, the devil is in the details and the precise evolution is still a matter of debate, since this requires careful comparison between similar galaxy populations over cosmic time while simultaneously taking into account changes in image resolution, rest-frame wavelength, and surface brightness dimming, in addition to properly selecting representative morphological samples. Here we present the first step in an ambitious undertaking to calculate the bivariate mass-size distribution as a function of time and morphology. We begin with a large sample (~3 × 10^5) of SDSS galaxies at z ~ 0.1. Morphologies for this sample have been determined by Galaxy Zoo crowdsourced visual classifications, and we split the sample not only into disk- and bulge-dominated galaxies but also into finer morphology bins such as bulge strength. Bivariate distribution functions are the only way to properly account for biases and selection effects. In particular, we quantify the mass-size distribution with a version of the parametric maximum likelihood estimator that has been modified to account for measurement errors as well as upper limits on galaxy sizes.

  14. Operator identities involving the bivariate Rogers-Szegö polynomials and their applications to the multiple q-series identities

    NASA Astrophysics Data System (ADS)

    Zhang, Zhizheng; Wang, Tianze

    2008-07-01

    In this paper, we first give several operator identities involving the bivariate Rogers-Szegö polynomials. By applying the technique of parameter augmentation to the multiple q-binomial theorems given by Milne [S.C. Milne, Balanced summation theorems for U(n) basic hypergeometric series, Adv. Math. 131 (1997) 93-187], we obtain several new multiple q-series identities involving the bivariate Rogers-Szegö polynomials. These include multiple extensions of Mehler's formula and Rogers's formula. Our U(n+1) generalizations are quite natural as they are also a direct and immediate consequence of their (often classical) known one-variable cases and Milne's fundamental theorem for An or U(n+1) basic hypergeometric series in Theorem 1.49 of [S.C. Milne, An elementary proof of the Macdonald identities for , Adv. Math. 57 (1985) 34-70], as rewritten in Lemma 7.3 on p. 163 of [S.C. Milne, Balanced summation theorems for U(n) basic hypergeometric series, Adv. Math. 131 (1997) 93-187] or Corollary 4.4 on pp. 768-769 of [S.C. Milne, M. Schlosser, A new An extension of Ramanujan's summation with applications to multilateral An series, Rocky Mountain J. Math. 32 (2002) 759-792].

  15. Non-Linear Wavelet Regression and Branch & Bound Optimization for the Full Identification of Bivariate Operator Fractional Brownian Motion

    NASA Astrophysics Data System (ADS)

    Frecon, Jordan; Didier, Gustavo; Pustelnik, Nelly; Abry, Patrice

    2016-08-01

    Self-similarity is widely considered the reference framework for modeling the scaling properties of real-world data. However, most theoretical studies and their practical use have remained univariate. Operator Fractional Brownian Motion (OfBm) was recently proposed as a multivariate model for self-similarity. Yet it has remained seldom used in applications because of serious issues that appear in the joint estimation of its numerous parameters. While the univariate fractional Brownian motion requires the estimation of two parameters only, its mere bivariate extension already involves 7 parameters which are very different in nature. The present contribution proposes a method for the full identification of bivariate OfBm (i.e., the joint estimation of all parameters) through an original formulation as a non-linear wavelet regression coupled with a custom-made Branch & Bound numerical scheme. The estimation performance (consistency and asymptotic normality) is mathematically established and numerically assessed by means of Monte Carlo experiments. The impact of the parameters defining OfBm on the estimation performance as well as the associated computational costs are also thoroughly investigated.

  16. Diagnostic performance of des-γ-carboxy prothrombin (DCP) for hepatocellular carcinoma: a bivariate meta-analysis.

    PubMed

    Gao, P; Li, M; Tian, Q B; Liu, Dian-Wu

    2012-01-01

    Serum markers need to be developed to specifically diagnose hepatocellular carcinoma (HCC). Des-γ-carboxy prothrombin (DCP) is a promising tool with limited expense and wide accessibility, but the reported results have been controversial. In order to review the performance of DCP for the diagnosis of HCC, a meta-analysis was performed. After a systematic review of relevant studies, the sensitivity, specificity, and positive and negative likelihood ratios (PLR and NLR, respectively) were pooled using a bivariate meta-analysis. Potential between-study heterogeneity was explored with a meta-regression model. The post-test probability and the likelihood ratio scattergram were calculated to evaluate clinical usefulness. Based on a literature review of 20 publications, the overall sensitivity, specificity, PLR and NLR of DCP for the detection of HCC were 67% (95%CI, 58%-74%), 92% (95%CI, 88%-94%), 7.9 (95%CI, 5.6-11.2) and 0.36 (95%CI, 0.29-0.46), respectively. The area under the bivariate summary receiver operating characteristic curve was 0.89 (95%CI, 0.85-0.92). Significant heterogeneity was present. In conclusion, the major role of DCP is the moderate confirmation of HCC. More prospective studies of DCP are needed in the future.

  17. Critical Evaluation of Internet Resources for Teaching Trend and Variability in Bivariate Data

    ERIC Educational Resources Information Center

    Forster, Pat

    2007-01-01

    A search on the Internet for resources for teaching statistics yields multiple sites with data sets, projects, worksheets, applets, and software. Often these are made available without information on how they might benefit learning. This paper addresses potential benefits from resources that target trend and variability relationships in bivariate…

  18. A Bivariate Mixed Distribution with a Heavy-tailed Component and its Application to Single-site Daily Rainfall Simulation

    SciTech Connect

    Li, Chao; Singh, Vijay P.; Mishra, Ashok K.

    2013-02-06

    This paper presents an improved bivariate mixed distribution, which is capable of modeling the dependence of daily rainfall from two distinct sources (e.g., rainfall from two stations, two consecutive days, or two instruments such as satellite and rain gauge). The distribution couples an existing framework for building a bivariate mixed distribution, the theory of copulae, and a hybrid marginal distribution. Contributions of the improved distribution are twofold. One is the appropriate selection of the bivariate dependence structure from a wider admissible choice (10 candidate copula families). The other is the introduction of a marginal distribution capable of better representing low to moderate values as well as extremes of daily rainfall. Among several applications of the improved distribution, particularly presented here is its utility for single-site daily rainfall simulation. Rather than simulating rainfall occurrences and amounts separately, the developed generator unifies the two processes by generalizing daily rainfall as a Markov process with autocorrelation described by the improved bivariate mixed distribution. The generator is first tested on a sample station in Texas. Results reveal that the simulated and observed sequences are in good agreement with respect to essential characteristics. Then, extensive simulation experiments are carried out to compare the developed generator with three alternative models: the conventional two-state Markov chain generator, the transition probability matrix model, and the semi-parametric Markov chain model with kernel density estimation for rainfall amounts. Analyses establish that, overall, the developed generator is capable of reproducing the characteristics of historical extreme rainfall events and is adept at extrapolating rare values beyond the upper range of available observed data. Moreover, it automatically captures the persistence of rainfall amounts on consecutive wet days in a relatively natural and easy way.
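
    For reference, a minimal sketch of the conventional baseline mentioned above: a two-state Markov chain for wet/dry occurrence with exponential wet-day amounts (parameters are assumed, not fitted to the Texas station used in the paper):

```python
# Two-state Markov chain rainfall occurrence with exponential wet-day amounts.
import numpy as np

rng = np.random.default_rng(8)
p_wet_given_dry, p_wet_given_wet = 0.25, 0.55   # transition probabilities (assumed)
mean_wet_amount = 8.0                           # mean wet-day rainfall in mm (assumed)

n_days, wet = 365 * 30, False
rain = np.zeros(n_days)
for t in range(n_days):
    p = p_wet_given_wet if wet else p_wet_given_dry
    wet = rng.random() < p
    if wet:
        rain[t] = rng.exponential(mean_wet_amount)

print("wet-day fraction:", round(np.mean(rain > 0), 3),
      "mean wet amount:", round(rain[rain > 0].mean(), 2))
```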

  19. Statistical Optics

    NASA Astrophysics Data System (ADS)

    Goodman, Joseph W.

    2000-07-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research

  20. Causal inference for bivariate longitudinal quality of life data in presence of death by using global odds ratios.

    PubMed

    Lee, Keunbaik; Daniels, Michael J

    2013-10-30

    In longitudinal clinical trials, if a subject drops out due to death, certain responses, such as those measuring quality of life (QoL), will not be defined after the time of death. Thus, standard missing data analyses, e.g., under ignorable dropout, are problematic because these approaches implicitly 'impute' values of the response after death. In this paper we define a new survivor average causal effect for a bivariate response in a longitudinal quality of life study that had a high dropout rate with the dropout often due to death (or tumor progression). We show how principal stratification, with a few sensitivity parameters, can be used to draw causal inferences about the joint distribution of these two ordinal quality of life measures.

  1. [Statistical materials].

    PubMed

    1986-01-01

    Official population data for the USSR are presented for 1985 and 1986. Part 1 (pp. 65-72) contains data on capitals of union republics and cities with over one million inhabitants, including population estimates for 1986 and vital statistics for 1985. Part 2 (p. 72) presents population estimates by sex and union republic, 1986. Part 3 (pp. 73-6) presents data on population growth, including birth, death, and natural increase rates, 1984-1985; seasonal distribution of births and deaths; birth order; age-specific birth rates in urban and rural areas and by union republic; marriages; age at marriage; and divorces. PMID:12178831

  2. Order-Constrained Reference Priors with Implications for Bayesian Isotonic Regression, Analysis of Covariance and Spatial Models

    NASA Astrophysics Data System (ADS)

    Gong, Maozhen

    Selecting an appropriate prior distribution is a fundamental issue in Bayesian Statistics. In this dissertation, under the framework provided by Berger and Bernardo, I derive the reference priors for several models which include: Analysis of Variance (ANOVA)/Analysis of Covariance (ANCOVA) models with a categorical variable under common ordering constraints, the conditionally autoregressive (CAR) models and the simultaneous autoregressive (SAR) models with a spatial autoregression parameter rho considered. The performances of reference priors for ANOVA/ANCOVA models are evaluated by simulation studies with comparisons to Jeffreys' prior and Least Squares Estimation (LSE). The priors are then illustrated in a Bayesian model of the "Risk of Type 2 Diabetes in New Mexico" data, where the relationship between the type 2 diabetes risk (through Hemoglobin A1c) and different smoking levels is investigated. In both simulation studies and real data set modeling, the reference priors that incorporate internal order information show good performances and can be used as default priors. The reference priors for the CAR and SAR models are also illustrated in the "1999 SAT State Average Verbal Scores" data with a comparison to a Uniform prior distribution. Due to the complexity of the reference priors for both CAR and SAR models, only a portion (12 states in the Midwest) of the original data set is considered. The reference priors can give a different marginal posterior distribution compared to a Uniform prior, which provides an alternative for prior specifications for areal data in Spatial statistics.
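
    For readers unfamiliar with the order constraint at the heart of isotonic regression, the following hedged Python sketch shows a classical (non-Bayesian) isotonic fit via scikit-learn's pool-adjacent-violators implementation; it illustrates only the monotonicity constraint, not the reference-prior machinery of the dissertation. The data are invented.

        import numpy as np
        from sklearn.isotonic import IsotonicRegression

        rng = np.random.default_rng(1)

        # Noisy observations of a response assumed non-decreasing in x
        # (e.g., risk increasing with ordered smoking levels).
        x = np.arange(10)
        y_true = np.array([0.0, 0.1, 0.1, 0.3, 0.3, 0.5, 0.6, 0.6, 0.8, 1.0])
        y_obs = y_true + rng.normal(scale=0.15, size=x.size)

        iso = IsotonicRegression(increasing=True)
        y_fit = iso.fit_transform(x, y_obs)        # pool-adjacent-violators solution

        print(np.round(y_obs, 2))
        print(np.round(y_fit, 2))                  # fitted values are non-decreasing
        assert np.all(np.diff(y_fit) >= -1e-12)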

  3. Comparison of two techniques for applying disjunctive kriging: the Gaussian anamorphosis model versus the direct statistical inference of the bivariate distributions

    SciTech Connect

    Carr, J.R.; Deng, E.D.

    1987-01-01

    Indicator cokriging is an alternative to disjunctive kriging for estimation of spatial distributions. One way to determine which of these techniques is more accurate for estimation of spatial distributions is to apply each to a particular type of data. A procedure is developed for evaluation of disjunctive kriging and indicator cokriging for such an application. Application of this procedure to earthquake ground motion data found disjunctive kriging to be at least as accurate as indicator cokriging for estimation of spatial distributions for peak horizontal acceleration. Indicator cokriging was superior for all other types of earthquake ground motion data.

  4. Aquaculture in artificially developed wetlands in urban areas: an application of the bivariate relationship between soil and surface water in landscape ecology.

    PubMed

    Paul, Abhijit

    2011-01-01

    Wetlands show a strong bivariate relationship between soil and surface water. Artificially developed wetlands help to build landscape ecology and make built environments sustainable. The bheries, wetlands of eastern Calcutta (India), utilize the city sewage to develop urban aquaculture that supports the local fish industries and opens a new frontier in sustainable environmental planning research.

  5. Sequential Temporal Dependencies in Associations between Symptoms of Depression and Posttraumatic Stress Disorder: An Application of Bivariate Latent Difference Score Structural Equation Modeling

    ERIC Educational Resources Information Center

    King, Daniel W.; King, Lynda A.; McArdle, John J.; Shalev, Arieh Y.; Doron-LaMarca, Susan

    2009-01-01

    Depression and posttraumatic stress disorder (PTSD) are highly comorbid conditions that may arise following exposure to psychological trauma. This study examined their temporal sequencing and mutual influence using bivariate latent difference score structural equation modeling. Longitudinal data from 182 emergency room patients revealed level of…

  6. Yrast excitations around {sup 132}Sn: The two and three valence-proton N = 82 isotones {sup 134}Te and {sup 135}I

    SciTech Connect

    Daly, P.J.; Zhang, C.T.; Bhattacharyya, P.

    1996-11-01

    Large multidetector {gamma}-ray arrays, which can separate the prompt {gamma}-ray cascades within a single fission product nucleus (of moderate yield) from the bulk of prompt {gamma}-rays, have now opened new prospects for studies of yrast excitations in {sup 132}Sn and the few valence particle nuclei around it. Measurements were performed at Eurogam II using a {sup 248}Cm source. This paper features the results for the two and three valence proton N=82 isotones {sup 134}Te and {sup 135}I which exhibit simple clearcut excitation modes, resembling {sup 210}Po and {sup 211}At, their well studied N=126 counterparts in the {sup 208}Pb region. A search was made for new {sup 135}I transitions by setting a single coincidence gate on 1134 keV {gamma}-rays; strong 288, 572, 690, 725, 1661, 1695, and 2247 keV coincident {gamma}-rays were identified as {sup 135}I {gamma}-rays. In summary, yrast excitations to above 5.5 MeV excitation energy in the 2- and 3-proton nuclei {sup 134}Te and {sup 135}I have been established and interpreted with the help of shell model calculations using empirical nucleon-nucleon interactions. This opens possibilities for exploring simple excitation modes in the {sup 132}Sn region under conditions comparable with but not identical to those in the well-studied {sup 208}Pb region.

  7. Submaximal Exercise VO2 and Q During 30-Day 6 degree Head-Down Bed Rest with Isotonic and Isokinetic Exercise Training

    NASA Technical Reports Server (NTRS)

    Greenleaf, J. E.; Bernauer, E. M.; Ertl, A. C.

    1995-01-01

    Submaximal exercise (61±3% peak VO2) metabolism was measured during ambulation before bed rest (AMB day-2) and on bed rest days 4, 11, and 25 in 19 healthy men (32-42 yr) allocated into no exercise (NOE, N=5) control, isotonic exercise (ITE, N=7), and isokinetic exercise (IKE, N=7) training groups. Training was conducted supine for two 30-min periods/d for 6 d/wk: ITE was at 60-90% peak VO2; IKE was peak knee flexion-extension at 100 deg/s. Supine submaximal exercise VO2 decreased significantly (*p<0.05) by 10.3%* with ITE and by 7.3%* with IKE; similar to the submaximal cardiac output (Q) change of -14.5%* (ITE) and -20.3%* (IKE), but different from the change in peak VO2 (+1.4% with ITE and -10.2%* with IKE) and plasma volume (-3.7% with ITE and -18.0%* with IKE). Thus, the reduction of submaximal VO2 during prolonged bed rest appears to track the decrease in submaximal Q but is not related to the change in peak VO2 or plasma volume.

  8. THE GALEX ARECIBO SDSS SURVEY. VII. THE BIVARIATE NEUTRAL HYDROGEN-STELLAR MASS FUNCTION FOR MASSIVE GALAXIES

    SciTech Connect

    Lemonias, Jenna J.; Schiminovich, David; Catinella, Barbara; Heckman, Timothy M.; Moran, Sean M.

    2013-10-20

    We present the bivariate neutral atomic hydrogen (H I)-stellar mass function (HISMF) φ(M_HI, M_*) for massive (log M_*/M_☉ > 10) galaxies derived from a sample of 480 local (0.025 < z < 0.050) galaxies observed in H I at Arecibo as part of the GALEX Arecibo SDSS Survey. We fit six different models to the HISMF and find that a Schechter function that extends down to a 1% H I gas fraction, with an additional fractional contribution below that limit, is the best parameterization of the HISMF. We calculate Ω_HI for M_* > 10^10 and find that massive galaxies contribute 41% of the H I density in the local universe. In addition to the binned HISMF, we derive a continuous bivariate fit, which reveals that the Schechter parameters only vary weakly with stellar mass: M*_HI, the characteristic H I mass, scales as M_*^0.39; α, the slope of the HISMF at moderate H I masses, scales as M_*^0.07; and f, the fraction of galaxies with H I gas fraction greater than 1%, scales as M_*^-0.24. The variation of f with stellar mass should be a strong constraint for numerical simulations. To understand the physical mechanisms that produce the shape of the HISMF, we redefine the parameters of the Schechter function as explicit functions of stellar mass and star formation rate (SFR) to produce a trivariate fit. This analysis reveals strong trends with SFR. While M*_HI varies weakly with stellar mass and SFR (M*_HI ∝ M_*^0.22, M*_HI ∝ SFR^-0.03), α is a stronger function of both stellar mass and especially SFR (α ∝ M_*^0.47, α ∝ SFR^0.95). The HISMF is a crucial tool that can be used to constrain cosmological galaxy simulations, test observational predictions of the H I content of populations of galaxies, and identify galaxies whose properties deviate from average trends.
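
    As a hedged illustration of the Schechter parameterization (not the survey's pipeline or data), the following Python sketch fits a Schechter function in logarithmic form to a synthetic binned mass function with SciPy; all numbers are invented.

        import numpy as np
        from scipy.optimize import curve_fit

        def log_schechter(log_m, log_phi_star, log_m_star, alpha):
            """log10 of a Schechter function phi(M) per dex, with x = M / M*."""
            x = 10.0 ** (log_m - log_m_star)
            phi = np.log(10) * 10.0 ** log_phi_star * x ** (alpha + 1) * np.exp(-x)
            return np.log10(phi)

        # Synthetic binned mass function (illustrative numbers only).
        rng = np.random.default_rng(2)
        log_m = np.linspace(8.5, 10.5, 12)
        true_params = (-2.3, 9.9, -1.3)
        log_phi = log_schechter(log_m, *true_params) + rng.normal(scale=0.05, size=log_m.size)

        popt, pcov = curve_fit(log_schechter, log_m, log_phi, p0=(-2.0, 10.0, -1.0))
        print("fitted (log phi*, log M*_HI, alpha):", np.round(popt, 2))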

  9. A bivariate genome-wide association study identifies ADAM12 as a novel susceptibility gene for Kashin-Beck disease

    PubMed Central

    Hao, Jingcan; Wang, Wenyu; Wen, Yan; Xiao, Xiao; He, Awen; Guo, Xiong; Yang, Tielin; Liu, Xiaogang; Shen, Hui; Chen, Xiangding; Tian, Qing; Deng, Hong-Wen; Zhang, Feng

    2016-01-01

    Kashin-Beck disease (KBD) is a chronic osteoarthropathy that manifests as joint deformities and growth retardation. Only a few genetic studies of growth retardation associated with KBD have been carried out to date. In this study, we conducted a two-stage bivariate genome-wide association study (BGWAS) of KBD using joint deformities and body height as study phenotypes, involving a total of 2,417 subjects. Articular cartilage specimens from 8 subjects were collected for immunohistochemistry. In the BGWAS, the ADAM12 gene achieved the most significant association with KBD (rs1278300 p-value = 9.25 × 10⁻⁹). The replication study observed significant association signals at rs1278300 (p-value = 0.007) and rs1710287 (p-value = 0.002) of ADAM12 after Bonferroni correction. Immunohistochemistry revealed a significantly decreased expression level of ADAM12 protein in KBD articular cartilage (average positive chondrocyte rate = 47.59 ± 7.79%) compared to healthy articular cartilage (average positive chondrocyte rate = 64.73 ± 5.05%). Our results suggest that ADAM12 is a novel susceptibility gene underlying both the joint destruction and the growth retardation of KBD. PMID:27545300

  10. The Pattern of Variation between Diarrhea and Malaria Coexistence with Corresponding Risk Factors in Chikhwawa, Malawi: A Bivariate Multilevel Analysis

    PubMed Central

    Masangwi, Salule; Ferguson, Neil; Grimason, Anthony; Morse, Tracy; Kazembe, Lawrence

    2015-01-01

    Developing countries face a huge burden of infectious diseases, a number of which co-exist. This paper estimates the pattern and variation of malaria and diarrhea coexistence in Chikhwawa, a district in Southern Malawi using bivariate multilevel modelling with Bayesian estimation. A probit link was employed to examine hierarchically built data from a survey of individuals (n = 6,727) nested within households (n = 1,380) nested within communities (n = 33). Results show significant malaria [σ²_u1 = 0.901 (95% CI: 0.746, 1.056)] and diarrhea [σ²_u2 = 1.009 (95% CI: 0.860, 1.158)] variations with a strong correlation between them [r_u(1,2) = 0.565] at household level. There are significant malaria [σ²_v1 = 0.053 (95% CI: 0.018, 0.088)] and diarrhea [σ²_v2 = 0.099 (95% CI: 0.030, 0.168)] variations at community level but with a small correlation [r_v(1,2) = 0.124] between them. There is also significant correlation between malaria and diarrhea at individual level [r_e(1,2) = 0.241]. These results suggest a close association between reported malaria-like illness and diarrheal illness especially at household and individual levels in Southern Malawi. PMID:26197332
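
    The structure described above can be illustrated with a small, hypothetical simulation (not the authors' model fit): bivariate binary outcomes generated under a probit link with correlated household-level random effects, using made-up parameters loosely in the spirit of the reported variance components. Fitting such a model would normally require Bayesian MCMC software, which is not shown.

        import numpy as np

        rng = np.random.default_rng(3)

        n_households, members = 1000, 5
        # Household random effects for (malaria, diarrhea): variances and correlation
        # chosen roughly in the spirit of the reported estimates (illustrative only).
        sd = np.array([0.95, 1.0])
        corr = 0.57
        cov = np.array([[sd[0]**2, corr * sd[0] * sd[1]],
                        [corr * sd[0] * sd[1], sd[1]**2]])
        u = rng.multivariate_normal([0.0, 0.0], cov, size=n_households)

        intercepts = np.array([-0.5, -0.8])                       # made-up fixed effects
        eta = intercepts + np.repeat(u, members, axis=0)          # linear predictor per person
        y = (rng.standard_normal(eta.shape) < eta).astype(int)    # probit: P(y=1) = Phi(eta)

        print("prevalence (malaria, diarrhea):", y.mean(axis=0).round(3))
        # Household-level correlation of outcome rates reflects the latent correlation.
        rates = y.reshape(n_households, members, 2).mean(axis=1)
        print("household-level correlation of rates:", np.corrcoef(rates.T)[0, 1].round(3))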

  11. Bivariate and multivariate analyses of the correlations between stability of the erythrocyte membrane, serum lipids and hematological variables.

    PubMed

    Bernardino Neto, M; de Avelar, E B; Arantes, T S; Jordão, I A; da Costa Huss, J C; de Souza, T M T; de Souza Penha, V A; da Silva, S C; de Souza, P C A; Tavares, M; Penha-Silva, N

    2013-01-01

    The observation that the fluidity must remain within a critical interval, outside which the stability and functionality of the cell tend to decrease, shows that stability, fluidity and function are related and that the measure of erythrocyte stability allows inferences about the fluidity or functionality of these cells. This study determined the biochemical and hematological variables that are directly or indirectly related to erythrocyte stability in a population of 71 volunteers. Data were evaluated by bivariate and multivariate analysis. Erythrocyte stability showed a greater association with the hematological variables than with the biochemical variables. The RDW stands out for its strong correlation with the stability of the erythrocyte membrane, without being heavily influenced by other factors. Among the biochemical variables, erythrocyte stability was most sensitive to LDL-C. Erythrocyte stability was significantly associated with RDW and LDL-C. Thus, the level of LDL-C is a consistent link between stability and functionality, suggesting that a measure of stability could be one more indirect parameter for assessing the risk of degenerative processes associated with high levels of LDL-C.

  12. Bivariate genome-wide association analyses identified genes with pleiotropic effects for femoral neck bone geometry and age at menarche.

    PubMed

    Ran, Shu; Pei, Yu-Fang; Liu, Yong-Jun; Zhang, Lei; Han, Ying-Ying; Hai, Rong; Tian, Qing; Lin, Yong; Yang, Tie-Lin; Guo, Yan-Fang; Shen, Hui; Thethi, Inderpal S; Zhu, Xue-Zhen; Deng, Hong-Wen

    2013-01-01

    Femoral neck geometric parameters (FNGPs), which include cortical thickness (CT), periosteal diameter (W), buckling ratio (BR), cross-sectional area (CSA), and section modulus (Z), contribute to bone strength and may predict hip fracture risk. Age at menarche (AAM) is an important risk factor for osteoporosis and bone fractures in women. Some FNGPs are genetically correlated with AAM. In this study, we performed a bivariate genome-wide association study (GWAS) to identify new candidate genes responsible for both FNGPs and AAM. In the discovery stage, we tested 760,794 SNPs in 1,728 unrelated Caucasian subjects, followed by replication analyses in independent samples of US Caucasians (with 501 subjects) and Chinese (with 826 subjects). We found six SNPs that were associated with FNGPs and AAM. These SNPs are located in three genes (i.e., NRCAM, IDS and LOC148145), suggesting these three genes may co-regulate FNGPs and AAM. Our findings may help improve the understanding of the genetic architecture and pathophysiological mechanisms underlying both osteoporosis and AAM.

  13. The Pattern of Variation between Diarrhea and Malaria Coexistence with Corresponding Risk Factors in Chikhwawa, Malawi: A Bivariate Multilevel Analysis.

    PubMed

    Masangwi, Salule; Ferguson, Neil; Grimason, Anthony; Morse, Tracy; Kazembe, Lawrence

    2015-07-01

    Developing countries face a huge burden of infectious diseases, a number of which co-exist. This paper estimates the pattern and variation of malaria and diarrhea coexistence in Chikhwawa, a district in Southern Malawi using bivariate multilevel modelling with Bayesian estimation. A probit link was employed to examine hierarchically built data from a survey of individuals (n = 6,727) nested within households (n = 1,380) nested within communities (n = 33). Results show significant malaria [σ²_u1 = 0.901 (95% CI: 0.746, 1.056)] and diarrhea [σ²_u2 = 1.009 (95% CI: 0.860, 1.158)] variations with a strong correlation between them [r_u(1,2) = 0.565] at household level. There are significant malaria [σ²_v1 = 0.053 (95% CI: 0.018, 0.088)] and diarrhea [σ²_v2 = 0.099 (95% CI: 0.030, 0.168)] variations at community level but with a small correlation [r_v(1,2) = 0.124] between them. There is also significant correlation between malaria and diarrhea at individual level [r_e(1,2) = 0.241]. These results suggest a close association between reported malaria-like illness and diarrheal illness especially at household and individual levels in Southern Malawi.

  14. On the sources of the height-intelligence correlation: new insights from a bivariate ACE model with assortative mating.

    PubMed

    Beauchamp, Jonathan P; Cesarini, David; Johannesson, Magnus; Lindqvist, Erik; Apicella, Coren

    2011-03-01

    A robust positive correlation between height and intelligence, as measured by IQ tests, has been established in the literature. This paper makes several contributions toward establishing the causes of this association. First, we extend the standard bivariate ACE model to account for assortative mating. The more general theoretical framework provides several key insights, including formulas to decompose a cross-trait genetic correlation into components attributable to assortative mating and pleiotropy and to decompose a cross-trait within-family correlation. Second, we use a large dataset of male twins drawn from Swedish conscription records and examine how well genetic and environmental factors explain the association between (i) height and intelligence and (ii) height and military aptitude, a professional psychologist's assessment of a conscript's ability to deal with wartime stress. For both traits, we find suggestive evidence of a shared genetic architecture with height, but we demonstrate that point estimates are very sensitive to assumed degrees of assortative mating. Third, we report a significant within-family correlation between height and intelligence (ρ̂ = 0.10), suggesting that pleiotropy might be at play.

  15. Submaximal exercise VO2 and Qc during 30-day 6 degrees head-down bed rest with isotonic and isokinetic exercise training

    NASA Technical Reports Server (NTRS)

    Greenleaf, J. E.; Ertl, A. C.; Bernauer, E. M.

    1996-01-01

    BACKGROUND: Maintaining intermediary metabolism is necessary for the health and well-being of astronauts on long-duration spaceflights. While peak oxygen uptake (VO2) is consistently decreased during prolonged bed rest, submaximal VO2 is either unchanged or decreased. METHODS: Submaximal exercise metabolism (61 +/- 3% peak VO2) was measured during ambulation (AMB day-2) and on bed rest days 4, 11, and 25 in 19 healthy men (32-42 yr) allocated into no exercise (NOE, N = 5) control, and isotonic exercise (ITE, N = 7) and isokinetic exercise (IKE, N = 7) training groups. Exercise training was conducted supine for two 30-min periods per day for 6 d per week: ITE training was intermittent at 60-90% peak VO2; IKE training was 10 sets of 5 repetitions of peak knee flexion-extension force at a velocity of 100 degrees s-1. Cardiac output was measured with the indirect Fick CO2 method, and plasma volume with Evans blue dye dilution. RESULTS: Supine submaximal exercise VO2 decreased significantly (*p < 0.05) by 10.3%* with ITE and by 7.3%* with IKE; similar to the submaximal cardiac output decrease of 14.5%* (ITE) and 20.3%* (IKE), but different from change in peak VO2 (+1.4% with ITE and -10.2%* with IKE) and decrease in plasma volume of -3.7% (ITE) and -18.0%* (IKE). Reduction of submaximal VO2 during bed rest correlated 0.79 (p < 0.01) with submaximal Qc, but was not related to change in peak VO2 or plasma volume. CONCLUSION: Reduction in submaximal oxygen uptake during prolonged bed rest is related to decrease in exercise but not resting cardiac output; perturbations in active skeletal muscle metabolism may be involved.

  16. Slope Estimation for Bivariate Longitudinal Outcomes Adjusting for Informative Right Censoring Using Discrete Survival Model: Application to the Renal Transplant Cohort.

    PubMed

    Jaffa, Miran A; Woolson, Robert F; Lipsitz, Stuart R

    2011-04-01

    Patients undergoing renal transplantation are prone to graft failure, which causes loss of follow-up measures on their blood urea nitrogen and serum creatinine levels. These two outcomes are measured repeatedly over time to assess renal function following transplantation. Loss of follow-up on these bivariate measures results in informative right censoring, a common problem in longitudinal data that should be adjusted for so that valid estimates are obtained. In this study, we propose a bivariate model that jointly models these two longitudinal correlated outcomes and generates population and individual slopes adjusting for informative right censoring using a discrete survival approach. The proposed approach is applied to the clinical dataset of patients who had undergone renal transplantation. A simulation study validates the effectiveness of the approach.

  17. Cosmetic Plastic Surgery Statistics

    MedlinePlus

    2014 Cosmetic Plastic Surgery Statistics Cosmetic Procedure Trends 2014 Plastic Surgery Statistics Report Please credit the AMERICAN SOCIETY OF PLASTIC SURGEONS when citing statistical data or using ...

  18. Monotone Bivariate Interpolation Code

    SciTech Connect

    Fritsch, Fred

    1992-08-27

    BIMOND is a FORTRAN 77 subroutine for piecewise bicubic interpolation to data on a rectangular mesh, which reproduces the monotonicity of the data. A driver program, BIMOND1, is provided which reads data, computes the interpolating surface parameters, and evaluates the function on a mesh suitable for plotting.
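
    BIMOND itself is FORTRAN 77 and handles the bivariate case; as a loose, one-dimensional illustration of what monotonicity-preserving piecewise-cubic interpolation buys, the Python sketch below contrasts SciPy's shape-preserving PCHIP interpolant with an ordinary cubic spline on monotone data. It is not a reimplementation of BIMOND.

        import numpy as np
        from scipy.interpolate import CubicSpline, PchipInterpolator

        # Monotone data with a flat stretch; a plain cubic spline overshoots,
        # whereas the shape-preserving PCHIP interpolant does not.
        x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
        y = np.array([0.0, 0.1, 0.2, 2.0, 2.0, 2.1])

        xx = np.linspace(0, 5, 501)
        spline = CubicSpline(x, y)(xx)
        pchip = PchipInterpolator(x, y)(xx)

        print("cubic spline monotone? ", bool(np.all(np.diff(spline) >= -1e-12)))
        print("PCHIP monotone?        ", bool(np.all(np.diff(pchip) >= -1e-12)))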

  19. Bivariate and multivariate analyses of the influence of blood variables of patients submitted to Roux-en-Y gastric bypass on the stability of erythrocyte membrane against the chaotropic action of ethanol.

    PubMed

    de Arvelos, Leticia Ramos; Rocha, Vanessa Custódio Afonso; Felix, Gabriela Pereira; da Cunha, Cleine Chagas; Bernardino Neto, Morun; da Silva Garrote Filho, Mario; de Fátima Pinheiro, Conceição; Resende, Elmiro Santos; Penha-Silva, Nilson

    2013-03-01

    The stability of the erythrocyte membrane, which is essential for the maintenance of cell functions, occurs in a critical region of fluidity, which depends largely on its composition and on the composition and characteristics of the medium. As the composition of the erythrocyte membrane is influenced by several blood variables, the stability of the erythrocyte membrane must have relations with them. The present study aimed to evaluate, by bivariate and multivariate statistical analyses, the correlations and causal relationships between hematologic and biochemical variables and the stability of the erythrocyte membrane against the chaotropic action of ethanol. The validity of this type of analysis depends on the homogeneity of the population and on the variability of the studied parameters, conditions that are satisfied by patients who undergo bariatric surgery by the Roux-en-Y gastric bypass technique, since they face dietary restrictions that have a great impact on their blood composition. Pathway analysis revealed that an increase in hemoglobin leads to decreased stability of the cell, probably through a process mediated by an increase in mean corpuscular volume. Furthermore, an increase in the mean corpuscular hemoglobin (MCH) leads to an increase in erythrocyte membrane stability, probably because higher values of MCH are associated with smaller quantities of red blood cells and a larger contact area between the cell membrane and the ethanol present in the medium.

  20. Can cholesterol be used to distinguish pleural exudates from transudates? evidence from a bivariate meta-analysis

    PubMed Central

    2014-01-01

    Background Many studies have investigated whether pleural cholesterol levels can aid in diagnosis of pleural exudates, and the results have varied considerably. To gain a more reliable answer to this question, we meta-analyzed the literature on using pleural cholesterol or the ratio of cholesterol in pleural fluid to cholesterol in serum (P/S cholesterol ratio) as diagnostic tests to help identify pleural exudates. Methods Literature databases were systematically searched for studies examining accuracy of pleural cholesterol or P/S cholesterol ratios for diagnosing pleural exudates. Data on sensitivity, specificity, positive/negative likelihood ratio (PLR/NLR), and diagnostic odds ratio (DOR) were pooled using bivariate-effects models. Summary receiver operating characteristic (SROC) curves and area under the curve (AUC) were used to summarize overall test performance. Results Our meta-analysis included up to 20 studies involving 3,496 subjects. Summary estimates for pleural cholesterol in the diagnosis of pleural exudates were as follows: sensitivity, 0.88 (95%CI 0.84 to 0.92); specificity, 0.96 (95% CI 0.92 to 0.98); PLR, 20.31 (95% CI 11.21 to 36.78); NLR, 0.12 (95% CI 0.09 to 0.17); DOR, 167.06 (95% CI 76.79 to 363.95); and AUC 0.97 (95% CI 0.95 to 0.98). The corresponding summary performance estimates for using the P/S cholesterol ratio were as follows: sensitivity, 0.94 (95% CI 0.92 to 0.96); specificity, 0.87 (95% CI 0.83 to 0.91); PLR 7.46 (95% CI, 5.47 to 10.19); NLR, 0.07 (95% CI 0.05 to 0.10); DOR, 107.74 (95% CI 60.91 to 190.60); and AUC 0.97 (95% CI 0.95 to 0.98). Conclusions Both pleural cholesterol level and the P/S cholesterol ratio are helpful for the diagnosis of pleural exudates. Nevertheless, the results of pleural cholesterol assays should be interpreted in parallel with the results of traditional tests and clinical information. PMID:24731290

  1. Predict! Teaching Statistics Using Informational Statistical Inference

    ERIC Educational Resources Information Center

    Makar, Katie

    2013-01-01

    Statistics is one of the most widely used topics for everyday life in the school mathematics curriculum. Unfortunately, the statistics taught in schools focuses on calculations and procedures before students have a chance to see it as a useful and powerful tool. Researchers have found that a dominant view of statistics is as an assortment of tools…

  2. Isotonic Glycerol and Sodium Hyaluronate Containing Artificial Tear Decreases Conjunctivochalasis after One and Three Months: A Self-Controlled, Unmasked Study

    PubMed Central

    Kiss, Huba J.; Németh, János

    2015-01-01

    Dry eye complaints are ranked as the most frequent symptoms of patients visiting ophthalmologists. Conjunctivochalasis is a common dry eye disorder, which can cause an unstable tear film and ocular discomfort. The severe conjunctivochalasis characterized by high LId-Parallel COnjunctival Folds (LIPCOF) degree usually requires surgical intervention, where a conservative therapy would be highly desirable. Here we examined the efficacy of a preservative-free, inorganic salt-free unit-dose artificial tear, called Conheal containing isotonic glycerol and 0.015% sodium hyaluronate in a prospective, unmasked, self-controlled study involving 20 patients. The regular use of the glycerol/hyaluronate artificial tear in three months caused a significant improvement in the recorded parameters. Conjunctivochalasis decreased from a mean LIPCOF degree of 2.9±0.4 on both eyes to 1.4±0.6 on the right (median decrease of -2 points, 95% CI from -2.0 to -1.0), and to 1.4±0.7 on the left eye (median decrease of -1 points, 95% CI from -2.0 to -1.0) (p<0.001 for both sides). The tear film breakup time (TFBUT) lengthened from 4.8±1.9 seconds on both eyes to 5.9±2.3 seconds (mean increase of 1.1 seconds, 95% CI from 0.2 to 2.0) and 5.7±1.8 seconds (mean increase of 0.9 seconds, 95% CI from 0.3 to 1.5) on the right and left eyes, respectively (p = 0.020 for right eyes, p = 0.004 for left eyes). The corneal lissamine staining (Oxford Scheme grade) was reduced from 1.3±0.6 on the right and 1.4±0.6 on the left eye significantly (p<0.001) to 0.3±0.4 and 0.2±0.4 on the right and the left eyes. The Ocular Surface Disease Index (OSDI) questionnaire score indicating the subjective complaints of the patients also decreased from a mean value of 36.2±25.3 to 15.6±16.7 (p<0.001). In this study, the artificial tear, Conheal decreased the grade of the conjunctivochalasis significantly after one month of regular use already, from the LIPCOF degree 3, considered as indication of conjunctival surgery, to a

  3. Statistical Reference Datasets

    National Institute of Standards and Technology Data Gateway

    Statistical Reference Datasets (Web, free access)   The Statistical Reference Datasets is also supported by the Standard Reference Data Program. The purpose of this project is to improve the accuracy of statistical software by providing reference datasets with certified computational results that enable the objective evaluation of statistical software.

  4. Explorations in statistics: statistical facets of reproducibility.

    PubMed

    Curran-Everett, Douglas

    2016-06-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This eleventh installment of Explorations in Statistics explores statistical facets of reproducibility. If we obtain an experimental result that is scientifically meaningful and statistically unusual, we would like to know that our result reflects a general biological phenomenon that another researcher could reproduce if (s)he repeated our experiment. But more often than not, we may learn this researcher cannot replicate our result. The National Institutes of Health and the Federation of American Societies for Experimental Biology have created training modules and outlined strategies to help improve the reproducibility of research. These particular approaches are necessary, but they are not sufficient. The principles of hypothesis testing and estimation are inherent to the notion of reproducibility in science. If we want to improve the reproducibility of our research, then we need to rethink how we apply fundamental concepts of statistics to our science.

  5. Developments in Statistical Education.

    ERIC Educational Resources Information Center

    Kapadia, Ramesh

    1980-01-01

    The current status of statistics education at the secondary level is reviewed, with particular attention focused on the various instructional programs in England. A description and preliminary evaluation of the Schools Council Project on Statistical Education is included. (MP)

  6. Mathematical and statistical analysis

    NASA Technical Reports Server (NTRS)

    Houston, A. Glen

    1988-01-01

    The goal of the mathematical and statistical analysis component of RICIS is to research, develop, and evaluate mathematical and statistical techniques for aerospace technology applications. Specific research areas of interest include modeling, simulation, experiment design, reliability assessment, and numerical analysis.

  7. Pitfalls in statistical landslide susceptibility modelling

    NASA Astrophysics Data System (ADS)

    Schröder, Boris; Vorpahl, Peter; Märker, Michael; Elsenbeer, Helmut

    2010-05-01

    lack of explanatory information in the chosen set of predictor variables, the model residuals need to be checked for spatial autocorrelation. Therefore, we calculate spline correlograms. In addition to this, we investigate partial dependency plots and bivariate interaction plots considering possible interactions between predictors to improve model interpretation. Aiming at presenting this toolbox for model quality assessment, we investigate the influence of strategies in the construction of training datasets for statistical models on model quality.
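
    A hedged sketch of the partial-dependence diagnostics mentioned above, using scikit-learn on synthetic data (not the landslide dataset); a recent scikit-learn version in which sklearn.inspection.partial_dependence returns a Bunch with an "average" array is assumed.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.ensemble import GradientBoostingClassifier
        from sklearn.inspection import partial_dependence

        # Synthetic stand-in for a presence/absence (landslide / no landslide) dataset.
        X, y = make_classification(n_samples=2000, n_features=6, n_informative=4,
                                   random_state=0)
        model = GradientBoostingClassifier(random_state=0).fit(X, y)

        # Univariate partial dependence of the response on feature 0,
        # and a bivariate grid for the interaction between features 0 and 3.
        pd_single = partial_dependence(model, X, features=[0], grid_resolution=20)
        pd_pair = partial_dependence(model, X, features=[(0, 3)], grid_resolution=10)

        print(pd_single["average"].shape)   # (1, 20): one curve on a 20-point grid
        print(pd_pair["average"].shape)     # (1, 10, 10): a 2-D interaction surface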

  8. Finding Statistical Data.

    ERIC Educational Resources Information Center

    Bopp, Richard E.; Van Der Laan, Sharon J.

    1985-01-01

    Presents a search strategy for locating time-series or cross-sectional statistical data in published sources which was designed for undergraduate students who require 30 units of data for five separate variables in a statistical model. Instructional context and the broader applicability of the search strategy for general statistical research is…

  9. Avoiding Statistical Mistakes

    ERIC Educational Resources Information Center

    Strasser, Nora

    2007-01-01

    Avoiding statistical mistakes is important for educators at all levels. Basic concepts will help you to avoid making mistakes using statistics and to look at data with a critical eye. Statistical data is used at educational institutions for many purposes. It can be used to support budget requests, changes in educational philosophy, changes to…

  10. Ethics in Statistics

    ERIC Educational Resources Information Center

    Lenard, Christopher; McCarthy, Sally; Mills, Terence

    2014-01-01

    There are many different aspects of statistics. Statistics involves mathematics, computing, and applications to almost every field of endeavour. Each aspect provides an opportunity to spark someone's interest in the subject. In this paper we discuss some ethical aspects of statistics, and describe how an introduction to ethics has been…

  11. Statistical quality management

    NASA Astrophysics Data System (ADS)

    Vanderlaan, Paul

    1992-10-01

    Some aspects of statistical quality management are discussed. Quality has to be defined as a concrete, measurable quantity. The concepts of Total Quality Management (TQM), Statistical Process Control (SPC), and inspection are explained. In most cases SPC is better than inspection. It can be concluded that statistics has great possibilities in the field of TQM.

  12. Validation of a flow cytometric (FCM) in vitro rat hepatocyte DNA repair assay employing the bivariate BrdUrd-FITC/PI immunocytochemical technique

    SciTech Connect

    Selden, J.R.; Dolbeare, F.; Clair, J.H.; DeLuca, J.G. (Lawrence Livermore National Lab., CA)

    1993-01-01

    An in vitro FCM DNA repair assay has been developed. Cultures of rat hepatocytes were exposed to a battery of chemicals for 18-20 hrs. Compounds were selected based upon both their genotoxic and carcinogenic characteristics. Evidence of DNA repair was noted by detecting BrdUrd uptake. Low intensity BrdUrd-FITC fluorescent signals from repairing cells were visualized by use of linear uni- or bi-variate histograms. This assay's sensitivity was directly compared to that of autoradiography. Results revealed the following: (1) A high correlation exists between genotoxicity and DNA repair; (2) The results of these assays were generally in agreement; and, (3) The sensitivity of this FCM DNA repair assay compares favorably to that of autoradiography. Thus, this assay provides a sensitive and reliable means of identifying agents which induce DNA repair in mammalian cells.

  13. Association of Supply Type with Fecal Contamination of Source Water and Household Stored Drinking Water in Developing Countries: A Bivariate Meta-analysis

    PubMed Central

    Bain, Robert E.S.; Cronk, Ryan; Wright, Jim A.; Bartram, Jamie

    2015-01-01

    Background Access to safe drinking water is essential for health. Monitoring access to drinking water focuses on water supply type at the source, but there is limited evidence on whether quality differences at the source persist in water stored in the household. Objectives We assessed the extent of fecal contamination at the source and in household stored water (HSW) and explored the relationship between contamination at each sampling point and water supply type. Methods We performed a bivariate random-effects meta-analysis of 45 studies, identified through a systematic review, that reported either the proportion of samples free of fecal indicator bacteria and/or individual sample bacteria counts for source and HSW, disaggregated by supply type. Results Water quality deteriorated substantially between source and stored water. The mean percentage of contaminated samples (noncompliance) at the source was 46% (95% CI: 33, 60%), whereas mean noncompliance in HSW was 75% (95% CI: 64, 84%). Water supply type was significantly associated with noncompliance at the source (p < 0.001) and in HSW (p = 0.03). Source water (OR = 0.2; 95% CI: 0.1, 0.5) and HSW (OR = 0.3; 95% CI: 0.2, 0.8) from piped supplies had significantly lower odds of contamination compared with non-piped water, potentially due to residual chlorine. Conclusions Piped water is less likely to be contaminated compared with other water supply types at both the source and in HSW. A focus on upgrading water services to piped supplies may help improve safety, including for those drinking stored water. Citation Shields KF, Bain RE, Cronk R, Wright JA, Bartram J. 2015. Association of supply type with fecal contamination of source water and household stored drinking water in developing countries: a bivariate meta-analysis. Environ Health Perspect 123:1222–1231; http://dx.doi.org/10.1289/ehp.1409002 PMID:25956006

  14. Statistical Conclusion Validity: Some Common Threats and Simple Remedies

    PubMed Central

    García-Pérez, Miguel A.

    2012-01-01

    The ultimate goal of research is to produce dependable knowledge or to provide the evidence that may guide practical decisions. Statistical conclusion validity (SCV) holds when the conclusions of a research study are founded on an adequate analysis of the data, generally meaning that adequate statistical methods are used whose small-sample behavior is accurate, besides being logically capable of providing an answer to the research question. Compared to the three other traditional aspects of research validity (external validity, internal validity, and construct validity), interest in SCV has recently grown on evidence that inadequate data analyses are sometimes carried out which yield conclusions that a proper analysis of the data would not have supported. This paper discusses evidence of three common threats to SCV that arise from widespread recommendations or practices in data analysis, namely, the use of repeated testing and optional stopping without control of Type-I error rates, the recommendation to check the assumptions of statistical tests, and the use of regression whenever a bivariate relation or the equivalence between two variables is studied. For each of these threats, examples are presented and alternative practices that safeguard SCV are discussed. Educational and editorial changes that may improve the SCV of published research are also discussed. PMID:22952465
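
    One of the threats described, repeated testing with optional stopping, is easy to demonstrate numerically. The following illustrative Python simulation (not from the paper) peeks at a one-sample t test after every batch of data under a true null and stops at the first nominally significant result; the realized Type I error rate ends up well above the nominal 5%.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        n_sim, n_max, batch, alpha = 2000, 100, 10, 0.05

        false_positives = 0
        for _ in range(n_sim):
            data = []
            for _ in range(n_max // batch):
                data.extend(rng.standard_normal(batch))   # the null is true: mean = 0
                p = stats.ttest_1samp(data, 0.0).pvalue
                if p < alpha:                             # peek and stop at first "significant" p
                    false_positives += 1
                    break

        print(f"nominal alpha = {alpha}, realized Type I error with optional "
              f"stopping = {false_positives / n_sim:.3f}")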

  15. Statistical analysis of single-trial Granger causality spectra.

    PubMed

    Brovelli, Andrea

    2012-01-01

    Granger causality analysis is becoming central for the analysis of interactions between neural populations and oscillatory networks. However, it is currently unclear whether single-trial estimates of Granger causality spectra can be used reliably to assess directional influence. We addressed this issue by combining single-trial Granger causality spectra with statistical inference based on general linear models. The approach was assessed on synthetic and neurophysiological data. Synthetic bivariate data was generated using two autoregressive processes with unidirectional coupling. We simulated two hypothetical experimental conditions: the first mimicked a constant and unidirectional coupling, whereas the second modelled a linear increase in coupling across trials. The statistical analysis of single-trial Granger causality spectra, based on t-tests and linear regression, successfully recovered the underlying pattern of directional influence. In addition, we characterised the minimum number of trials and coupling strengths required for significant detection of directionality. Finally, we demonstrated the relevance for neurophysiology by analysing two local field potentials (LFPs) simultaneously recorded from the prefrontal and premotor cortices of a macaque monkey performing a conditional visuomotor task. Our results suggest that the combination of single-trial Granger causality spectra and statistical inference provides a valuable tool for the analysis of large-scale cortical networks and brain connectivity.
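
    The synthetic part of the study can be paralleled with a standard time-domain Granger causality test; the hedged Python sketch below generates two autoregressive signals with unidirectional coupling and applies statsmodels' grangercausalitytests. The paper's single-trial spectral estimates and GLM-based inference are not reproduced, and all coefficients are invented.

        import numpy as np
        from statsmodels.tsa.stattools import grangercausalitytests

        rng = np.random.default_rng(5)
        n = 2000
        x = np.zeros(n)
        y = np.zeros(n)
        for t in range(2, n):
            x[t] = 0.55 * x[t - 1] - 0.2 * x[t - 2] + rng.standard_normal()
            # Unidirectional coupling: y is driven by past x, but not vice versa.
            y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.standard_normal()

        # statsmodels tests whether the second column Granger-causes the first.
        res = grangercausalitytests(np.column_stack([y, x]), maxlag=2, verbose=False)
        p_x_to_y = res[2][0]["ssr_ftest"][1]
        res_rev = grangercausalitytests(np.column_stack([x, y]), maxlag=2, verbose=False)
        p_y_to_x = res_rev[2][0]["ssr_ftest"][1]
        print(f"p(x -> y) = {p_x_to_y:.2g}, p(y -> x) = {p_y_to_x:.2g}")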

  16. Online use statistics.

    PubMed

    Tannery, Nancy Hrinya; Silverman, Deborah L; Epstein, Barbara A

    2002-01-01

    Online use statistics can provide libraries with a tool to be used when developing an online collection of resources. Statistics can provide information on overall use of a collection, individual print and electronic journal use, and collection use by specific user populations. They can also be used to determine the number of user licenses to purchase. This paper focuses on the issue of use statistics made available for one collection of online resources.

  17. Statistical distribution sampling

    NASA Technical Reports Server (NTRS)

    Johnson, E. S.

    1975-01-01

    Determining the distribution of statistics by sampling was investigated. Characteristic functions, the quadratic regression problem, and the differential equations for the characteristic functions are analyzed.

  18. Multidimensional Visual Statistical Learning

    ERIC Educational Resources Information Center

    Turk-Browne, Nicholas B.; Isola, Phillip J.; Scholl, Brian J.; Treat, Teresa A.

    2008-01-01

    Recent studies of visual statistical learning (VSL) have demonstrated that statistical regularities in sequences of visual stimuli can be automatically extracted, even without intent or awareness. Despite much work on this topic, however, several fundamental questions remain about the nature of VSL. In particular, previous experiments have not…

  19. Statistics and Measurements

    PubMed Central

    Croarkin, M. Carroll

    2001-01-01

    For more than 50 years, the Statistical Engineering Division (SED) has been instrumental in the success of a broad spectrum of metrology projects at NBS/NIST. This paper highlights fundamental contributions of NBS/NIST statisticians to statistics and to measurement science and technology. Published methods developed by SED staff, especially during the early years, endure as cornerstones of statistics not only in metrology and standards applications, but as data-analytic resources used across all disciplines. The history of statistics at NBS/NIST began with the formation of what is now the SED. Examples from the first five decades of the SED illustrate the critical role of the division in the successful resolution of a few of the highly visible, and sometimes controversial, statistical studies of national importance. A review of the history of major early publications of the division on statistical methods, design of experiments, and error analysis and uncertainty is followed by a survey of several thematic areas. The accompanying examples illustrate the importance of SED in the history of statistics, measurements and standards: calibration and measurement assurance, interlaboratory tests, development of measurement methods, Standard Reference Materials, statistical computing, and dissemination of measurement technology. A brief look forward sketches the expanding opportunity and demand for SED statisticians created by current trends in research and development at NIST. PMID:27500023

  20. Explorations in Statistics: Regression

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2011-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This seventh installment of "Explorations in Statistics" explores regression, a technique that estimates the nature of the relationship between two things for which we may only surmise a mechanistic or predictive connection.…

  1. On Statistical Testing.

    ERIC Educational Resources Information Center

    Huberty, Carl J.

    An approach to statistical testing, which combines Neyman-Pearson hypothesis testing and Fisher significance testing, is recommended. The use of P-values in this approach is discussed in some detail. The author also discusses some problems which are often found in introductory statistics textbooks. The problems involve the definitions of…

  2. Reform in Statistical Education

    ERIC Educational Resources Information Center

    Huck, Schuyler W.

    2007-01-01

    Two questions are considered in this article: (a) What should professionals in school psychology do in an effort to stay current with developments in applied statistics? (b) What should they do with their existing knowledge to move from surface understanding of statistics to deep understanding? Written for school psychologists who have completed…

  3. Demonstrating Poisson Statistics.

    ERIC Educational Resources Information Center

    Vetterling, William T.

    1980-01-01

    Describes an apparatus that offers a very lucid demonstration of Poisson statistics as applied to electrical currents, and the manner in which such statistics account for shot noise when applied to macroscopic currents. The experiment described is intended for undergraduate physics students. (HM)
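
    A purely numerical analogue of the demonstration (not the apparatus itself): for Poisson-distributed counts the variance equals the mean, which is the property underlying shot noise. The small Python sketch below checks this and the 1/sqrt(mean) relative noise level on simulated counts; the rate is arbitrary.

        import numpy as np

        rng = np.random.default_rng(6)

        rate = 250.0                      # mean number of electrons per counting window (arbitrary)
        counts = rng.poisson(rate, size=100_000)

        print("mean count     :", counts.mean().round(1))
        print("count variance :", counts.var().round(1))   # ~ equal to the mean for Poisson
        # For Poisson statistics std/mean = 1/sqrt(mean): larger currents are relatively quieter.
        print("relative shot-noise level (std/mean):", (counts.std() / counts.mean()).round(4))
        print("1/sqrt(mean)                        :", (1 / np.sqrt(rate)).round(4))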

  4. Statistical Summaries: Public Institutions.

    ERIC Educational Resources Information Center

    Virginia State Council of Higher Education, Richmond.

    This document, presents a statistical portrait of the Virginia's 17 public higher education institutions. Data provided include: enrollment figures (broken down in categories such as sex, residency, full- and part-time status, residence, ethnicity, age, and level of postsecondary education); FTE figures; admissions statistics (such as number…

  5. Statistics 101 for Radiologists.

    PubMed

    Anvari, Arash; Halpern, Elkan F; Samir, Anthony E

    2015-10-01

    Diagnostic tests have wide clinical applications, including screening, diagnosis, measuring treatment effect, and determining prognosis. Interpreting diagnostic test results requires an understanding of key statistical concepts used to evaluate test efficacy. This review explains descriptive statistics and discusses probability, including mutually exclusive and independent events and conditional probability. In the inferential statistics section, a statistical perspective on study design is provided, together with an explanation of how to select appropriate statistical tests. Key concepts in recruiting study samples are discussed, including representativeness and random sampling. Variable types are defined, including predictor, outcome, and covariate variables, and the relationship of these variables to one another. In the hypothesis testing section, we explain how to determine if observed differences between groups are likely to be due to chance. We explain type I and II errors, statistical significance, and study power, followed by an explanation of effect sizes and how confidence intervals can be used to generalize observed effect sizes to the larger population. Statistical tests are explained in four categories: t tests and analysis of variance, proportion analysis tests, nonparametric tests, and regression techniques. We discuss sensitivity, specificity, accuracy, receiver operating characteristic analysis, and likelihood ratios. Measures of reliability and agreement, including κ statistics, intraclass correlation coefficients, and Bland-Altman graphs and analysis, are introduced. PMID:26466186
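
    As a worked illustration of several quantities covered in the review, the following Python snippet computes sensitivity, specificity, accuracy and likelihood ratios from a purely hypothetical 2x2 table; the counts are invented and carry no clinical meaning.

        # Hypothetical 2x2 table for a diagnostic test (counts are illustrative only).
        tp, fn = 90, 10      # diseased patients: test positive / test negative
        fp, tn = 30, 170     # non-diseased patients: test positive / test negative

        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        accuracy = (tp + tn) / (tp + fn + fp + tn)
        plr = sensitivity / (1 - specificity)     # positive likelihood ratio
        nlr = (1 - sensitivity) / specificity     # negative likelihood ratio

        print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}, "
              f"accuracy = {accuracy:.2f}")
        print(f"PLR = {plr:.1f}, NLR = {nlr:.2f}")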

  6. Explorations in Statistics: Power

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2010-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This fifth installment of "Explorations in Statistics" revisits power, a concept fundamental to the test of a null hypothesis. Power is the probability that we reject the null hypothesis when it is false. Four things affect…

  7. Applied Statistics with SPSS

    ERIC Educational Resources Information Center

    Huizingh, Eelko K. R. E.

    2007-01-01

    Accessibly written and easy to use, "Applied Statistics Using SPSS" is an all-in-one self-study guide to SPSS and do-it-yourself guide to statistics. What is unique about Eelko Huizingh's approach is that this book is based around the needs of undergraduate students embarking on their own research project, and its self-help style is designed to…

  8. Application Statistics 1987.

    ERIC Educational Resources Information Center

    Council of Ontario Universities, Toronto.

    Summary statistics on application and registration patterns of applicants wishing to pursue full-time study in first-year places in Ontario universities (for the fall of 1987) are given. Data on registrations were received indirectly from the universities as part of their annual submission of USIS/UAR enrollment data to Statistics Canada and MCU.…

  9. Introduction to Statistical Physics

    NASA Astrophysics Data System (ADS)

    Casquilho, João Paulo; Ivo Cortez Teixeira, Paulo

    2014-12-01

    Preface; 1. Random walks; 2. Review of thermodynamics; 3. The postulates of statistical physics. Thermodynamic equilibrium; 4. Statistical thermodynamics – developments and applications; 5. The classical ideal gas; 6. The quantum ideal gas; 7. Magnetism; 8. The Ising model; 9. Liquid crystals; 10. Phase transitions and critical phenomena; 11. Irreversible processes; Appendixes; Index.

  10. Deconstructing Statistical Analysis

    ERIC Educational Resources Information Center

    Snell, Joel

    2014-01-01

    Using a very complex statistical analysis and research method for the sake of enhancing the prestige of an article, or of making a new product or service seem legitimate, needs to be monitored and questioned for accuracy. 1) The more complicated the statistical analysis and research, the fewer learned readers can understand it. This adds a…

  11. Water Quality Statistics

    ERIC Educational Resources Information Center

    Hodgson, Ted; Andersen, Lyle; Robison-Cox, Jim; Jones, Clain

    2004-01-01

    Water quality experiments, especially the use of macroinvertebrates as indicators of water quality, offer an ideal context for connecting statistics and science. In the STAR program for secondary students and teachers, water quality experiments were also used as a context for teaching statistics. In this article, we trace one activity that uses…

  13. Understanding Undergraduate Statistical Anxiety

    ERIC Educational Resources Information Center

    McKim, Courtney

    2014-01-01

    The purpose of this study was to understand undergraduate students' views of statistics. Results reveal that students with less anxiety have a higher interest in statistics and also believe in their ability to perform well in the course. Also students who have a more positive attitude about the class tend to have a higher belief in their…

  14. Explorations in Statistics: Correlation

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2010-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This sixth installment of "Explorations in Statistics" explores correlation, a familiar technique that estimates the magnitude of a straight-line relationship between two variables. Correlation is meaningful only when the…

  15. Multivariate statistical approach to a data set of dioxin and furan contaminations in human milk

    SciTech Connect

    Lindstrom, G.U.M.; Sjostrom, M.; Swanson, S.E. ); Furst, P.; Kruger, C.; Meemken, H.A.; Groebel, W. )

    1988-05-01

    The levels of chlorinated dibenzodioxins, PCDDs, and dibenzofurans, PCDFs, in human milk have been of great concern after the discovery of the toxic 2,3,7,8-substituted isomers in milk of European origin. As knowledge of environmental contamination of human breast milk increases, questions will continue to be asked about possible risks from breast feeding. Before any recommendations can be made, there must be knowledge of contaminant levels in mothers' breast milk. Researchers have measured PCB and 17 different dioxins and furans in human breast milk samples. To date the data has only been analyzed by univariate and bivariate statistical methods. However to extract as much information as possible from this data set, multivariate statistical methods must be used. Here the authors present a multivariate analysis where the relationships between the polychlorinated compounds and the personalia of the mothers have been studied. For the data analysis partial least squares (PLS) modelling has been used.
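
    Since the abstract names partial least squares (PLS) as the modelling tool but gives no code, the fragment below is only a minimal sketch of a PLS fit in the same spirit; the synthetic congener block and the hypothetical donor variable are placeholders, not the breast-milk data.

      # Minimal PLS sketch: a block of 17 congener levels vs one donor characteristic.
      # All arrays are synthetic placeholders, not the data analysed in the study.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(0)
      n_samples = 60
      congeners = rng.lognormal(mean=0.0, sigma=0.5, size=(n_samples, 17))
      # Hypothetical donor characteristic loosely driven by two congeners plus noise.
      donor_age = (25 + 3.0 * np.log(congeners[:, 0])
                   - 2.0 * np.log(congeners[:, 5])
                   + rng.normal(scale=2.0, size=n_samples))

      pls = PLSRegression(n_components=2).fit(congeners, donor_age)
      print("training R^2:", round(pls.score(congeners, donor_age), 2))
      print("loadings on first latent variable:", np.round(pls.x_loadings_[:, 0], 2))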

  16. The statistical analysis of multivariate serological frequency data.

    PubMed

    Reyment, Richard A

    2005-11-01

    Data occurring in the form of frequencies are common in genetics-for example, in serology. Examples are provided by the AB0 group, the Rhesus group, and also DNA data. The statistical analysis of tables of frequencies is carried out using the available methods of multivariate analysis with usually three principal aims. One of these is to seek meaningful relationships between the components of a data set, the second is to examine relationships between populations from which the data have been obtained, the third is to bring about a reduction in dimensionality. This latter aim is usually realized by means of bivariate scatter diagrams using scores computed from a multivariate analysis. The multivariate statistical analysis of tables of frequencies cannot safely be carried out by standard multivariate procedures because they represent compositions and are therefore embedded in simplex space, a subspace of full space. Appropriate procedures for simplex space are compared and contrasted with simple standard methods of multivariate analysis ("raw" principal component analysis). The study shows that the differences between a log-ratio model and a simple logarithmic transformation of proportions may not be very great, particularly as regards graphical ordinations, but important discrepancies do occur. The divergencies between logarithmically based analyses and raw data are, however, great. Published data on Rhesus alleles observed for Italian populations are used to exemplify the subject. PMID:16024067
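
    To make the contrast between a log-ratio treatment of compositional frequencies and a "raw" principal component analysis concrete, here is a small sketch on synthetic frequency compositions; the centred log-ratio (clr) transform is used as a stand-in for the log-ratio models discussed above, and the data are invented.

      # Sketch: PCA of compositional data, raw proportions vs centred log-ratio (clr).
      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(1)
      counts = rng.multinomial(500, [0.5, 0.3, 0.15, 0.05], size=40)  # synthetic frequency table
      props = counts / counts.sum(axis=1, keepdims=True)
      props = np.clip(props, 1e-6, None)                              # guard against log(0)

      clr = np.log(props) - np.log(props).mean(axis=1, keepdims=True) # centred log-ratio

      raw_scores = PCA(n_components=2).fit_transform(props)
      clr_scores = PCA(n_components=2).fit_transform(clr)
      print("raw-proportion PCA scores (first 3 rows):\n", np.round(raw_scores[:3], 3))
      print("clr-transformed PCA scores (first 3 rows):\n", np.round(clr_scores[:3], 3))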

  17. LED champing: statistically blessed?

    PubMed

    Wang, Zhuo

    2015-06-10

    LED champing (smart mixing of individual LEDs to match the desired color and lumens) and color mixing strategies have been widely used to maintain the color consistency of light engines. Light engines with champed LEDs can easily achieve color consistency within a couple of MacAdam steps, even when the starting LEDs are widely distributed. From a statistical point of view, the distributions of the color coordinates and the flux after champing are studied. The related statistical parameters are derived, which facilitate process improvements such as Six Sigma and are instrumental to statistical quality control in mass production. PMID:26192863

  18. Statistics: A Brief Overview

    PubMed Central

    Winters, Ryan; Winters, Andrew; Amedee, Ronald G.

    2010-01-01

    The Accreditation Council for Graduate Medical Education sets forth a number of required educational topics that must be addressed in residency and fellowship programs. We sought to provide a primer on some of the important basic statistical concepts to consider when examining the medical literature. It is not essential to understand the exact workings and methodology of every statistical test encountered, but it is necessary to understand selected concepts such as parametric and nonparametric tests, correlation, and numerical versus categorical data. This working knowledge will allow you to spot obvious irregularities in statistical analyses that you encounter. PMID:21603381

  19. Informal Statistics Help Desk

    NASA Technical Reports Server (NTRS)

    Young, M.; Koslovsky, M.; Schaefer, Caroline M.; Feiveson, A. H.

    2017-01-01

    Back by popular demand, the JSC Biostatistics Laboratory and LSAH statisticians are offering an opportunity to discuss your statistical challenges and needs. Take the opportunity to meet the individuals offering expert statistical support to the JSC community. Join us for an informal conversation about any questions you may have encountered with issues of experimental design, analysis, or data visualization. Get answers to common questions about sample size, repeated measures, statistical assumptions, missing data, multiple testing, time-to-event data, and when to trust the results of your analyses.

  1. ^110,116Cd(α,α)^110,116Cd elastic scattering and systematic investigation of elastic α scattering cross sections along the Z=48 isotopic and N=62 isotonic chains

    SciTech Connect

    Kiss, G. G.; Fueloep, Zs.; Gyuerky, Gy.; Elekes, Z.; Farkas, J.; Somorjai, E.; Mohr, P.; Yalcin, C.; Galaviz, D.; Gueray, R. T.; Oezkan, N.; Goerres, J.

    2011-06-15

    The elastic scattering cross sections for the reactions ^110,116Cd(α,α)^110,116Cd at energies above and below the Coulomb barrier are presented to provide a sensitive test for the α-nucleus optical potential parameter sets. Additional constraints for the optical potential are taken from the analysis of elastic scattering excitation functions at backward angles which are available in the literature. Moreover, the variation of the elastic α scattering cross sections along the Z=48 isotopic and N=62 isotonic chain is investigated by the study of the ratios of the ^106,110,116Cd(α,α)^106,110,116Cd scattering cross sections at E_c.m. ≈ 15.6 and 18.8 MeV and the ratio of the ^110Cd(α,α)^110Cd and ^112Sn(α,α)^112Sn reaction cross sections at E_c.m. ≈ 18.8 MeV, respectively. These ratios are sensitive probes for the α-nucleus optical potential parametrizations. The potentials under study are a basic prerequisite for the prediction of α-induced reaction cross sections (e.g., for the calculation of stellar reaction rates in the astrophysical p or γ process).

  2. Determining trace amounts and the origin of formaldehyde impurity in Neisseria meningitidis A/C/Y/W-135-DT conjugate vaccine formulated in isotonic aqueous 1× PBS by improved C18-UPLC method.

    PubMed

    Gudlavalleti, Seshu K; Crawford, Erika N; Tran, Nhi N; Orten, Dana J; Harder, Jeffery D; Reddy, Jeeri R

    2015-03-25

    The ability to accurately measure and report trace amounts of residual formaldehyde impurity in a vaccine product is not only critical in the product release but also a regulatory requirement. In many bacterial or viral vaccine manufacturing procedures, formaldehyde is used either at a live culture inactivation step or at a protein de-toxification step or at both. Reported here is a validated and improved C18-UPLC method (developed based on previously published C-8 HPLC method) to determine the traces of formaldehyde process impurity in a liquid form Neisseria meningitidis A/C/Y/W-135-DT conjugate vaccine formulated in isotonic aqueous 1× PBS. UPLC C-18 column and the conditions described distinctly resolved the 2,4-DNPH-HCHO adduct from the un-reacted 2,4-DNPH as detected by TUV detector at 360 nm. This method was shown to be compatible with PBS formulation and extremely sensitive (with a quantitation limit of 0.05 ppm) and aided to determine formaldehyde contamination sources by evaluating the in-process materials as a track-down analysis. Final nanogram levels of formaldehyde in the formulated single dose vialed vaccine mainly originated from the diphtheria toxoid carrier protein used in the production of the conjugate vaccine, whereas relative contribution from polysaccharide API was minimal.

  3. Statistics of the sagas

    NASA Astrophysics Data System (ADS)

    Richfield, Jon; bookfeller

    2016-07-01

    In reply to Ralph Kenna and Pádraig Mac Carron's feature article “Maths meets myths” in which they describe how they are using techniques from statistical physics to characterize the societies depicted in ancient Icelandic sagas.

  4. Brain Tumor Statistics

    MedlinePlus

    ... facts and statistics here include brain and central nervous system tumors (including spinal cord, pituitary and pineal gland ... U.S. living with a primary brain and central nervous system tumor. This year, nearly 17,000 people will ...

  5. Titanic: A Statistical Exploration.

    ERIC Educational Resources Information Center

    Takis, Sandra L.

    1999-01-01

    Uses the available data about the Titanic's passengers to interest students in exploring categorical data and the chi-square distribution. Describes activities incorporated into a statistics class and gives additional resources for collecting information about the Titanic. (ASK)

  6. Elements of Statistics

    NASA Astrophysics Data System (ADS)

    Grégoire, G.

    2016-05-01

    This chapter is devoted to two objectives. The first is to answer the request expressed by attendees of the first Astrostatistics School (Annecy, October 2013) to be provided with an elementary vademecum of statistics that would facilitate understanding of the courses given there. In this spirit we recall very basic notions, that is, definitions and properties that we think are sufficient to benefit from the courses given in the Astrostatistics School. Thus we briefly give definitions and elementary properties of random variables and vectors, distributions, estimation and tests, and maximum likelihood methodology. We intend to present basic ideas in a hopefully comprehensible way. We do not try to give a rigorous presentation and, due to the space devoted to this chapter, can cover only a rather limited field of statistics. The second aim is to focus on some statistical tools that are useful in classification: a basic introduction to Bayesian statistics, maximum likelihood methodology, Gaussian vectors, and Gaussian mixture models.

  7. Plague Maps and Statistics

    MedlinePlus

    ... Plague in the United States: Plague was first introduced ... per year in the United States: 1900-2012. Plague Worldwide: Plague epidemics have occurred in Africa, Asia, ...

  8. Cooperative Learning in Statistics.

    ERIC Educational Resources Information Center

    Keeler, Carolyn M.; And Others

    1994-01-01

    Formal use of cooperative learning techniques proved effective in improving student performance and retention in a freshman level statistics course. Lectures interspersed with group activities proved effective in increasing conceptual understanding and overall class performance. (11 references) (Author)

  9. Purposeful Statistical Investigations

    ERIC Educational Resources Information Center

    Day, Lorraine

    2014-01-01

    Lorraine Day provides us with a great range of statistical investigations using various resources such as maths300 and TinkerPlots. Each of the investigations link mathematics to students' lives and provide engaging and meaningful contexts for mathematical inquiry.

  10. Tuberculosis Data and Statistics

    MedlinePlus

    ... Data and Statistics ... United States publication. PDF [6 MB] Interactive TB Data Tool: Online Tuberculosis Information System (OTIS). OTIS is ...

  11. Understanding Solar Flare Statistics

    NASA Astrophysics Data System (ADS)

    Wheatland, M. S.

    2005-12-01

    A review is presented of work aimed at understanding solar flare statistics, with emphasis on the well known flare power-law size distribution. Although avalanche models are perhaps the favoured model to describe flare statistics, their physical basis is unclear, and they are divorced from developing ideas in large-scale reconnection theory. An alternative model, aimed at reconciling large-scale reconnection models with solar flare statistics, is revisited. The solar flare waiting-time distribution has also attracted recent attention. Observed waiting-time distributions are described, together with what they might tell us about the flare phenomenon. Finally, a practical application of flare statistics to flare prediction is described in detail, including the results of a year of automated (web-based) predictions from the method.

  12. Modelling the vicious circle between obesity and physical activity in children and adolescents using a bivariate probit model with endogenous regressors.

    PubMed

    Yeh, C-Y; Chen, L-J; Ku, P-W; Chen, C-M

    2015-01-01

    The increasing prevalence of obesity in children and adolescents has become one of the most important public health issues around the world. Lack of physical activity is a risk factor for obesity, while being obese could reduce the likelihood of participating in physical activity. Failing to account for the endogeneity between obesity and physical activity would result in biased estimation. This study investigates the relationship between overweight and physical activity by taking endogeneity into consideration. It develops an endogenous bivariate probit model estimated by the maximum likelihood method. The data included 4008 boys and 4197 girls in the 5th-9th grades in Taiwan in 2007-2008. The relationship between overweight and physical activity is significantly negative in the endogenous model, but insignificant in the comparative exogenous model. This endogenous relationship presents a vicious circle in which lower levels of physical activity lead to overweight, while those who are already overweight engage in less physical activity. The results not only reveal the importance of endogenous treatment, but also demonstrate the robust negative relationship between these two factors. An emphasis should be put on overweight and obese children and adolescents in order to break the vicious circle. Promotion of physical activity by appropriate counselling programmes and peer support could be effective in reducing the prevalence of obesity in children and adolescents.
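
    No estimation code accompanies the abstract, so the fragment below is only a schematic of how the log-likelihood of a recursive bivariate probit with an endogenous binary regressor can be written down and maximised with SciPy; the simulated data, the variable names (overweight, low_activity) and the single covariate are assumptions for illustration, not the Taiwanese survey data.

      # Schematic maximum-likelihood fit of a recursive bivariate probit with an
      # endogenous binary regressor (toy data; the fit takes a few seconds).
      import numpy as np
      from scipy.stats import multivariate_normal
      from scipy.optimize import minimize

      rng = np.random.default_rng(2)
      n = 200
      x = rng.normal(size=n)                                   # one exogenous covariate
      e = rng.multivariate_normal([0, 0], [[1, 0.4], [0.4, 1]], size=n)
      overweight = (0.5 * x + e[:, 0] > 0).astype(float)                       # equation 1
      low_activity = (-0.3 * x + 0.8 * overweight + e[:, 1] > 0).astype(float) # equation 2

      def neg_loglik(theta):
          a1, b1, a2, b2, gamma, atanh_rho = theta
          rho = np.tanh(atanh_rho)                             # keeps rho in (-1, 1)
          m1 = a1 + b1 * x
          m2 = a2 + b2 * x + gamma * overweight
          q1, q2 = 2 * overweight - 1, 2 * low_activity - 1
          ll = 0.0
          for s in (-1.0, 1.0):               # sign of q1*q2 fixes the correlation used
              mask = (q1 * q2) == s
              if not mask.any():
                  continue
              pts = np.column_stack([q1[mask] * m1[mask], q2[mask] * m2[mask]])
              p = multivariate_normal.cdf(pts, mean=[0, 0],
                                          cov=[[1, s * rho], [s * rho, 1]])
              ll += np.sum(np.log(np.clip(p, 1e-12, None)))
          return -ll

      res = minimize(neg_loglik, x0=np.zeros(6), method="BFGS")
      est = np.append(res.x[:5], np.tanh(res.x[5]))
      print("estimated (a1, b1, a2, b2, gamma, rho):", np.round(est, 2))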

  13. Statistical process control

    SciTech Connect

    Oakland, J.S.

    1986-01-01

    Addressing the increasing importance for firms to have a thorough knowledge of statistically based quality control procedures, this book presents the fundamentals of statistical process control (SPC) in a non-mathematical, practical way. It provides real-life examples and data drawn from a wide variety of industries. The foundations of good quality management and process control, and control of conformance and consistency during production are given. Offers clear guidance to those who wish to understand and implement modern SPC techniques.

  14. Statistical Physics of Fields

    NASA Astrophysics Data System (ADS)

    Kardar, Mehran

    2006-06-01

    While many scientists are familiar with fractals, fewer are familiar with the concepts of scale-invariance and universality which underlie the ubiquity of their shapes. These properties may emerge from the collective behaviour of simple fundamental constituents, and are studied using statistical field theories. Based on lectures for a course in statistical mechanics taught by Professor Kardar at Massachusetts Institute of Technology, this textbook demonstrates how such theories are formulated and studied. Perturbation theory, exact solutions, renormalization groups, and other tools are employed to demonstrate the emergence of scale invariance and universality, and the non-equilibrium dynamics of interfaces and directed paths in random media are discussed. Ideal for advanced graduate courses in statistical physics, it contains an integrated set of problems, with solutions to selected problems at the end of the book. A complete set of solutions is available to lecturers on a password protected website at www.cambridge.org/9780521873413. Based on lecture notes from a course on Statistical Mechanics taught by the author at MIT; contains 65 exercises, with solutions to selected problems; features a thorough introduction to the methods of Statistical Field theory; ideal for graduate courses in Statistical Physics.

  15. Statistical Physics of Particles

    NASA Astrophysics Data System (ADS)

    Kardar, Mehran

    2006-06-01

    Statistical physics has its origins in attempts to describe the thermal properties of matter in terms of its constituent particles, and has played a fundamental role in the development of quantum mechanics. Based on lectures for a course in statistical mechanics taught by Professor Kardar at Massachusetts Institute of Technology, this textbook introduces the central concepts and tools of statistical physics. It contains a chapter on probability and related issues such as the central limit theorem and information theory, and covers interacting particles, with an extensive description of the van der Waals equation and its derivation by mean field approximation. It also contains an integrated set of problems, with solutions to selected problems at the end of the book. It will be invaluable for graduate and advanced undergraduate courses in statistical physics. A complete set of solutions is available to lecturers on a password protected website at www.cambridge.org/9780521873420. Based on lecture notes from a course on Statistical Mechanics taught by the author at MIT; contains 89 exercises, with solutions to selected problems; contains chapters on probability and interacting particles; ideal for graduate courses in Statistical Mechanics.

  16. Statistical Physics of Fracture

    SciTech Connect

    Alava, Mikko; Nukala, Phani K; Zapperi, Stefano

    2006-05-01

    Disorder and long-range interactions are two of the key components that make material failure an interesting playfield for the application of statistical mechanics. The cornerstone in this respect has been lattice models of fracture in which a network of elastic beams, bonds, or electrical fuses with random failure thresholds are subject to an increasing external load. These models describe on a qualitative level the failure processes of real, brittle, or quasi-brittle materials. This has been particularly important in solving the classical engineering problems of material strength: the size dependence of maximum stress and its sample-to-sample statistical fluctuations. At the same time, lattice models pose many new fundamental questions in statistical physics, such as the relation between fracture and phase transitions. Experimental results point to the existence of an intriguing crackling noise in the acoustic emission and of self-affine fractals in the crack surface morphology. Recent advances in computer power have enabled considerable progress in the understanding of such models. Among these partly still controversial issues are the scaling and size effects in material strength and accumulated damage, the statistics of avalanches or bursts of microfailures, and the morphology of the crack surface. Here we present an overview of the results obtained with lattice models for fracture, highlighting the relations with statistical physics theories and more conventional fracture mechanics approaches.

  17. Helping Alleviate Statistical Anxiety with Computer Aided Statistical Classes

    ERIC Educational Resources Information Center

    Stickels, John W.; Dobbs, Rhonda R.

    2007-01-01

    This study, Helping Alleviate Statistical Anxiety with Computer Aided Statistics Classes, investigated whether undergraduate students' anxiety about statistics changed when statistics is taught using computers compared to the traditional method. Two groups of students were questioned concerning their anxiety about statistics. One group was taught…

  18. Suite versus composite statistics

    USGS Publications Warehouse

    Balsillie, J.H.; Tanner, W.F.

    1999-01-01

    Suite and composite methodologies, two statistically valid approaches for producing statistical descriptive measures, are investigated for sample groups representing a probability distribution where, in addition, each sample is itself a probability distribution. Suite and composite means (first moment measures) are always equivalent. Composite standard deviations (second moment measures) are always larger than suite standard deviations. Suite and composite values for higher moment measures have more complex relationships. Very seldom, however, are they equivalent, and they normally yield statistically significant but different results. Multiple samples are preferable to single samples (including composites) because they permit the investigator to examine sample-to-sample variability. These and other relationships for suite and composite probability distribution analyses are investigated and reported using granulometric data.
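
    The contrast between the two methodologies is easy to reproduce numerically; in the sketch below, "suite" statistics are taken to mean per-sample moment measures averaged over equally sized samples, which is one common reading of the approach, and the granulometric-style samples are synthetic.

      # Sketch: suite vs composite first- and second-moment measures on synthetic samples.
      import numpy as np

      rng = np.random.default_rng(3)
      # Ten equally sized "samples", each itself a distribution with its own mean and spread.
      samples = [rng.normal(loc=mu, scale=sd, size=200)
                 for mu, sd in zip(rng.uniform(1.0, 3.0, 10), rng.uniform(0.2, 0.8, 10))]

      suite_mean = np.mean([s.mean() for s in samples])       # average of per-sample means
      suite_std = np.mean([s.std(ddof=1) for s in samples])   # average of per-sample std devs

      pooled = np.concatenate(samples)                        # composite: pool every grain
      composite_mean, composite_std = pooled.mean(), pooled.std(ddof=1)

      print(f"suite mean    = {suite_mean:.3f}   composite mean = {composite_mean:.3f}")
      print(f"suite std dev = {suite_std:.3f}   composite std  = {composite_std:.3f}")
      # With equal sample sizes the two means coincide, while the composite standard
      # deviation also absorbs sample-to-sample variability and so comes out larger.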

  19. Candidate Assembly Statistical Evaluation

    1998-07-15

    The Savannah River Site (SRS) receives aluminum clad spent Material Test Reactor (MTR) fuel from all over the world for storage and eventual reprocessing. There are hundreds of different kinds of MTR fuels and these fuels will continue to be received at SRS for approximately ten more years. SRS's current criticality evaluation methodology requires the modeling of all MTR fuels utilizing Monte Carlo codes, which is extremely time consuming and resource intensive. Now that a significant number of MTR calculations have been conducted it is feasible to consider building statistical models that will provide reasonable estimations of MTR behavior. These statistical models can be incorporated into a standardized model homogenization spreadsheet package to provide analysts with a means of performing routine MTR fuel analyses with a minimal commitment of time and resources. This became the purpose for development of the Candidate Assembly Statistical Evaluation (CASE) program at SRS.

  20. Perception in statistical graphics

    NASA Astrophysics Data System (ADS)

    VanderPlas, Susan Ruth

    There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.

  1. Deformed Quantum Statistics

    NASA Astrophysics Data System (ADS)

    Inomata, Akira

    1997-03-01

    To understand possible physical consequences of quantum deformation, we investigate the statistical behavior of a quon gas. The quon is an object which obeys the minimally deformed commutator (or q-mutator): a a† - q a†a = 1 with -1 ≤ q ≤ 1. Although q=1 and q=-1 appear to correspond respectively to boson and fermion statistics, it is not easy to create a gas which unifies the boson gas and the fermion gas. We present a model which is able to interpolate between the two limits. The quon gas shows Bose-Einstein condensation near the boson limit in two dimensions.

  2. Inference for the Bivariate and Multivariate Hidden Truncated Pareto(type II) and Pareto(type IV) Distribution and Some Measures of Divergence Related to Incompatibility of Probability Distribution

    ERIC Educational Resources Information Center

    Ghosh, Indranil

    2011-01-01

    Consider a discrete bivariate random variable (X, Y) with possible values x_1, x_2, ..., x_I for X and y_1, y_2, ..., y_J for Y. Further suppose that the corresponding families of conditional distributions, for X given values of Y and for Y given values of X, are available. We…

  3. Statistical insight: a review.

    PubMed

    Vardell, Emily; Garcia-Barcena, Yanira

    2012-01-01

    Statistical Insight is a database that offers the ability to search across multiple sources of data, including the federal government, private organizations, research centers, and international intergovernmental organizations in one search. Two sample searches on the same topic, a basic and an advanced, were conducted to evaluate the database.

  4. Pilot Class Testing: Statistics.

    ERIC Educational Resources Information Center

    Washington Univ., Seattle. Washington Foreign Language Program.

    Statistics derived from test score data from the pilot classes participating in the Washington Foreign Language Program are presented in tables in this report. An index accompanies the tables, itemizing the classes by level (FLES, middle, and high school), grade test, language skill, and school. MLA-Coop test performances for each class were…

  5. Statistical Reasoning over Lunch

    ERIC Educational Resources Information Center

    Selmer, Sarah J.; Bolyard, Johnna J.; Rye, James A.

    2011-01-01

    Students in the 21st century are exposed daily to a staggering amount of numerically infused media. In this era of abundant numeric data, students must be able to engage in sound statistical reasoning when making life decisions after exposure to varied information. The context of nutrition can be used to engage upper elementary and middle school…

  6. Selected Outdoor Recreation Statistics.

    ERIC Educational Resources Information Center

    Bureau of Outdoor Recreation (Dept. of Interior), Washington, DC.

    In this recreational information report, 96 tables are compiled from Bureau of Outdoor Recreation programs and surveys, other governmental agencies, and private sources. Eight sections comprise the document: (1) The Bureau of Outdoor Recreation, (2) Federal Assistance to Recreation, (3) Recreation Surveys for Planning, (4) Selected Statistics of…

  7. Statistics for Learning Genetics

    ERIC Educational Resources Information Center

    Charles, Abigail Sheena

    2012-01-01

    This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and more directly, students' perceptions of, and performance in,…

  8. Spitball Scatterplots in Statistics

    ERIC Educational Resources Information Center

    Wagaman, John C.

    2012-01-01

    This paper describes an active learning idea that I have used in my applied statistics class as a first lesson in correlation and regression. Students propel spitballs from various standing distances from the target and use the recorded data to determine if the spitball accuracy is associated with standing distance and review the algebra of lines…

  9. Geopositional Statistical Methods

    NASA Technical Reports Server (NTRS)

    Ross, Kenton

    2006-01-01

    RMSE based methods distort circular error estimates (up to 50% overestimation). The empirical approach is the only statistically unbiased estimator offered. Ager modification to Shultz approach is nearly unbiased, but cumbersome. All methods hover around 20% uncertainty (@ 95% confidence) for low geopositional bias error estimates. This requires careful consideration in assessment of higher accuracy products.

  10. Learning Statistical Concepts

    ERIC Educational Resources Information Center

    Akram, Muhammad; Siddiqui, Asim Jamal; Yasmeen, Farah

    2004-01-01

    In order to learn the concept of statistical techniques one needs to run real experiments that generate reliable data. In practice, the data from some well-defined process or system is very costly and time consuming. It is difficult to run real experiments during the teaching period in the university. To overcome these difficulties, statisticians…

  11. Education Statistics Quarterly, 2003.

    ERIC Educational Resources Information Center

    Marenus, Barbara; Burns, Shelley; Fowler, William; Greene, Wilma; Knepper, Paula; Kolstad, Andrew; McMillen Seastrom, Marilyn; Scott, Leslie

    2003-01-01

    This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications and data products released in a 3-month period. Each issue also contains a message from the NCES on a timely…

  12. Analogies for Understanding Statistics

    ERIC Educational Resources Information Center

    Hocquette, Jean-Francois

    2004-01-01

    This article describes a simple way to explain the limitations of statistics to scientists and students to avoid the publication of misleading conclusions. Biologists examine their results extremely critically and carefully choose the appropriate analytic methods depending on their scientific objectives. However, no such close attention is usually…

  13. Statistical Significance Testing.

    ERIC Educational Resources Information Center

    McLean, James E., Ed.; Kaufman, Alan S., Ed.

    1998-01-01

    The controversy about the use or misuse of statistical significance testing has become the major methodological issue in educational research. This special issue contains three articles that explore the controversy, three commentaries on these articles, an overall response, and three rejoinders by the first three authors. They are: (1)…

  14. The Statistical Fermi Paradox

    NASA Astrophysics Data System (ADS)

    Maccone, C.

    In this paper is provided the statistical generalization of the Fermi paradox. The statistics of habitable planets may be based on a set of ten (and possibly more) astrobiological requirements first pointed out by Stephen H. Dole in his book Habitable planets for man (1964). The statistical generalization of the original and by now too simplistic Dole equation is provided by replacing a product of ten positive numbers by the product of ten positive random variables. This is denoted the SEH, an acronym standing for “Statistical Equation for Habitables”. The proof in this paper is based on the Central Limit Theorem (CLT) of Statistics, stating that the sum of any number of independent random variables, each of which may be ARBITRARILY distributed, approaches a Gaussian (i.e. normal) random variable (Lyapunov form of the CLT). It is then shown that: 1. The new random variable NHab, yielding the number of habitables (i.e. habitable planets) in the Galaxy, follows the log- normal distribution. By construction, the mean value of this log-normal distribution is the total number of habitable planets as given by the statistical Dole equation. 2. The ten (or more) astrobiological factors are now positive random variables. The probability distribution of each random variable may be arbitrary. The CLT in the so-called Lyapunov or Lindeberg forms (that both do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into the SEH by allowing an arbitrary probability distribution for each factor. This is both astrobiologically realistic and useful for any further investigations. 3. By applying the SEH it is shown that the (average) distance between any two nearby habitable planets in the Galaxy may be shown to be inversely proportional to the cubic root of NHab. This distance is denoted by new random variable D. The relevant probability density function is derived, which was named the "Maccone distribution" by Paul Davies in

  15. The Statistical Drake Equation

    NASA Astrophysics Data System (ADS)

    Maccone, Claudio

    2010-12-01

    We provide the statistical generalization of the Drake equation. From a simple product of seven positive numbers, the Drake equation is now turned into the product of seven positive random variables. We call this "the Statistical Drake Equation". The mathematical consequences of this transformation are then derived. The proof of our results is based on the Central Limit Theorem (CLT) of Statistics. In loose terms, the CLT states that the sum of any number of independent random variables, each of which may be ARBITRARILY distributed, approaches a Gaussian (i.e. normal) random variable. This is called the Lyapunov Form of the CLT, or the Lindeberg Form of the CLT, depending on the mathematical constraints assumed on the third moments of the various probability distributions. In conclusion, we show that: The new random variable N, yielding the number of communicating civilizations in the Galaxy, follows the LOGNORMAL distribution. Then, as a consequence, the mean value of this lognormal distribution is the ordinary N in the Drake equation. The standard deviation, mode, and all the moments of this lognormal N are also found. The seven factors in the ordinary Drake equation now become seven positive random variables. The probability distribution of each random variable may be ARBITRARY. The CLT in the so-called Lyapunov or Lindeberg forms (that both do not assume the factors to be identically distributed) allows for that. In other words, the CLT "translates" into our statistical Drake equation by allowing an arbitrary probability distribution for each factor. This is both physically realistic and practically very useful, of course. An application of our statistical Drake equation then follows. The (average) DISTANCE between any two neighboring and communicating civilizations in the Galaxy may be shown to be inversely proportional to the cubic root of N. Then, in our approach, this distance becomes a new random variable. We derive the relevant probability density
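
    The lognormal claim is straightforward to probe by simulation; the sketch below multiplies seven positive random factors whose distributions are arbitrary choices made purely for illustration, and compares the shape of the product with that of its logarithm.

      # Monte Carlo sketch: a product of seven independent, arbitrarily distributed
      # positive factors is approximately lognormal (so its logarithm is near-Gaussian).
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(4)
      n = 200_000
      factors = [
          np.exp(rng.uniform(np.log(0.1), np.log(10.0), n)),   # log-uniform factor
          rng.lognormal(0.0, 0.8, n),
          rng.gamma(3.0, 2.0, n),
          np.exp(rng.uniform(np.log(0.5), np.log(2.0), n)),
          rng.lognormal(1.0, 0.5, n),
          rng.chisquare(4.0, n),
          rng.uniform(1.0, 3.0, n),
      ]
      N = np.prod(factors, axis=0)            # analogue of the number of civilizations

      print("skewness of N          :", round(float(stats.skew(N)), 1))
      print("skewness of log(N)     :", round(float(stats.skew(np.log(N))), 3))
      print("excess kurtosis, log(N):", round(float(stats.kurtosis(np.log(N))), 3))
      # N itself is extremely right-skewed, while log(N) is nearly symmetric with small
      # excess kurtosis -- the signature of an approximately lognormal distribution.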

  16. Exposure time independent summary statistics for assessment of drug dependent cell line growth inhibition

    PubMed Central

    2014-01-01

    Background: In vitro generated dose-response curves of human cancer cell lines are widely used to develop new therapeutics. The curves are summarised by simplified statistics that ignore the conventionally used dose-response curves’ dependency on drug exposure time and growth kinetics. This may lead to suboptimal exploitation of data and biased conclusions on the potential of the drug in question. Therefore we set out to improve the dose-response assessments by eliminating the impact of time dependency. Results: First, a mathematical model for drug induced cell growth inhibition was formulated and used to derive novel dose-response curves and improved summary statistics that are independent of time under the proposed model. Next, a statistical analysis workflow for estimating the improved statistics was suggested consisting of 1) nonlinear regression models for estimation of cell counts and doubling times, 2) isotonic regression for modelling the suggested dose-response curves, and 3) a resampling-based method for assessing variation of the novel summary statistics. We document that conventionally used summary statistics for dose-response experiments depend on time so that fast growing cell lines compared to slowly growing ones are considered overly sensitive. The adequacy of the mathematical model is tested for doxorubicin and found to fit real data to an acceptable degree. Dose-response data from the NCI60 drug screen were used to illustrate the time dependency and demonstrate an adjustment correcting for it. The applicability of the workflow was illustrated by simulation and application on a doxorubicin growth inhibition screen. The simulations show that under the proposed mathematical model the suggested statistical workflow results in unbiased estimates of the time independent summary statistics. Variance estimates of the novel summary statistics are used to conclude that the doxorubicin screen covers a significantly diverse range of responses ensuring it is
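
    The isotonic-regression step of the workflow (point 2 above) can be sketched in a few lines with scikit-learn; the dose grid and the noisy "relative growth" values below are simulated placeholders, and a non-increasing constraint is imposed because growth is expected to fall with dose.

      # Sketch of a monotone (isotonic) dose-response fit, as in step 2 of the workflow.
      import numpy as np
      from sklearn.isotonic import IsotonicRegression

      rng = np.random.default_rng(5)
      log_dose = np.linspace(-3, 2, 25)                               # e.g. log10 concentration
      true_response = 1.0 / (1.0 + np.exp(2.0 * (log_dose + 0.5)))    # decreasing sigmoid
      observed = true_response + rng.normal(scale=0.05, size=log_dose.size)

      iso = IsotonicRegression(increasing=False, y_min=0.0, y_max=1.0)
      fitted = iso.fit_transform(log_dose, observed)                  # monotone, non-increasing

      # One possible summary: the log-dose at which the fitted curve first drops to 50%.
      crossing = log_dose[np.argmax(fitted <= 0.5)]
      print("fitted monotone curve:", np.round(fitted, 2))
      print("approximate log-dose at 50% response:", round(float(crossing), 2))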

  17. On the joint statistics of stable random processes

    NASA Astrophysics Data System (ADS)

    Hopcraft, K. I.; Jakeman, E.

    2011-10-01

    A utilitarian continuous bi-variate random process whose first-order probability density function is a stable random variable is constructed. Results paralleling some of those familiar from the theory of Gaussian noise are derived. In addition to the joint-probability density for the process, these include fractional moments and structure functions. Although the correlation functions for stable processes other than Gaussian do not exist, we show that there is coherence between values adopted by the process at different times, which identifies a characteristic evolution with time. The distribution of the derivative of the process, and the joint-density function of the value of the process and its derivative measured at the same time are evaluated. These enable properties to be calculated analytically such as level crossing statistics and those related to the random telegraph wave. When the stable process is fractal, the proportion of time it spends at zero is finite and some properties of this quantity are evaluated, an optical interpretation for which is provided.
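
    As a small numerical companion to the fractional-moment discussion, the sketch below draws from a symmetric alpha-stable law with SciPy and looks at sample averages of |X|^p, which are only expected to stabilise for p < alpha; the parameter values are arbitrary.

      # Sketch: fractional absolute moments of a symmetric alpha-stable variable.
      # E|X|^p is finite only for p < alpha, so sample averages with larger p misbehave.
      import numpy as np
      from scipy.stats import levy_stable

      alpha = 1.5
      rng = np.random.default_rng(6)
      x = levy_stable.rvs(alpha, beta=0.0, size=200_000, random_state=rng)

      for p in (0.5, 1.0, 1.4, 2.0):
          half_a, half_b = np.array_split(np.abs(x) ** p, 2)
          print(f"p={p:3.1f}  mean |X|^p on two half samples: "
                f"{half_a.mean():10.3f}  {half_b.mean():10.3f}")
      # For small p the two half-sample averages agree well; as p approaches and exceeds
      # alpha they become dominated by a few extreme values and increasingly unstable.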

  18. Statistical region merging.

    PubMed

    Nock, Richard; Nielsen, Frank

    2004-11-01

    This paper explores a statistical basis for a process often described in computer vision: image segmentation by region merging following a particular order in the choice of regions. We exhibit a particular blend of algorithmics and statistics whose segmentation error is, as we show, limited from both the qualitative and quantitative standpoints. This approach can be efficiently approximated in linear time/space, leading to a fast segmentation algorithm tailored to processing images described using most common numerical pixel attribute spaces. The conceptual simplicity of the approach makes it simple to modify and cope with hard noise corruption, handle occlusion, authorize the control of the segmentation scale, and process unconventional data such as spherical images. Experiments on gray-level and color images, obtained with a short readily available C-code, display the quality of the segmentations obtained.

  19. Modeling cosmic void statistics

    NASA Astrophysics Data System (ADS)

    Hamaus, Nico; Sutter, P. M.; Wandelt, Benjamin D.

    2016-10-01

    Understanding the internal structure and spatial distribution of cosmic voids is crucial when considering them as probes of cosmology. We present recent advances in modeling void density- and velocity-profiles in real space, as well as void two-point statistics in redshift space, by examining voids identified via the watershed transform in state-of-the-art ΛCDM n-body simulations and mock galaxy catalogs. The simple and universal characteristics that emerge from these statistics indicate the self-similarity of large-scale structure and suggest cosmic voids to be among the most pristine objects to consider for future studies on the nature of dark energy, dark matter and modified gravity.

  20. Statistical evaluation of forecasts

    NASA Astrophysics Data System (ADS)

    Mader, Malenka; Mader, Wolfgang; Gluckman, Bruce J.; Timmer, Jens; Schelter, Björn

    2014-08-01

    Reliable forecasts of extreme but rare events, such as earthquakes, financial crashes, and epileptic seizures, would render interventions and precautions possible. Therefore, forecasting methods have been developed which intend to raise an alarm if an extreme event is about to occur. In order to statistically validate the performance of a prediction system, it must be compared to the performance of a random predictor, which raises alarms independent of the events. Such a random predictor can be obtained by bootstrapping or analytically. We propose an analytic statistical framework which, in contrast to conventional methods, allows for validating independently the sensitivity and specificity of a forecasting method. Moreover, our method accounts for the periods during which an event has to remain absent or occur after a respective forecast.

  1. Journey Through Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Yang, C. N.

    2013-05-01

    My first involvement with statistical mechanics and the many body problem was when I was a student at The National Southwest Associated University in Kunming during the war. At that time Professor Wang Zhu-Xi had just come back from Cambridge, England, where he was a student of Fowler, and his thesis was on phase transitions, a hot topic at that time, and still a very hot topic today...

  2. Statistical Methods in Cosmology

    NASA Astrophysics Data System (ADS)

    Verde, L.

    2010-03-01

    The advent of large data sets in cosmology has meant that in the past 10 or 20 years our knowledge and understanding of the Universe has changed not only quantitatively but also, and most importantly, qualitatively. Cosmologists rely on data in which a host of useful information is enclosed, but encoded in a non-trivial way. The challenges in extracting this information must be overcome to make the most of a large experimental effort. Even after having converged to a standard cosmological model (the LCDM model) we should keep in mind that this model is described by 10 or more physical parameters and, if we want to study deviations from it, the number of parameters is even larger. Dealing with such a high-dimensional parameter space and finding parameter constraints is a challenge in itself. Cosmologists want to be able to compare and combine different data sets, both to test for possible disagreements (which could indicate new physics) and to improve parameter determinations. Finally, cosmologists in many cases want to find out, before actually doing the experiment, how much one would be able to learn from it. For all these reasons, sophisticated statistical techniques are being employed in cosmology, and it has become crucial to know some statistical background to understand recent literature in the field. I will introduce some statistical tools that any cosmologist should know about in order to be able to understand recently published results from the analysis of cosmological data sets. I will not present a complete and rigorous introduction to statistics, as there are several good books which are reported in the references. The reader should refer to those.

  3. Statistics of entrance times

    NASA Astrophysics Data System (ADS)

    Talkner, Peter

    2003-07-01

    The statistical properties of the transitions of a discrete Markov process are investigated in terms of entrance times. A simple formula for their density is given and used to measure the synchronization of a process with a periodic driving force. For the McNamara-Wiesenfeld model of stochastic resonance we find parameter regions in which the transition frequency of the process is locked with the frequency of the external driving.

  4. 1979 DOE statistical symposium

    SciTech Connect

    Gardiner, D.A.; Truett T.

    1980-09-01

    The 1979 DOE Statistical Symposium was the fifth in the series of annual symposia designed to bring together statisticians and other interested parties who are actively engaged in helping to solve the nation's energy problems. The program included presentations of technical papers centered around exploration and disposal of nuclear fuel, general energy-related topics, and health-related issues, and workshops on model evaluation, risk analysis, analysis of large data sets, and resource estimation.

  5. Quantum U-statistics

    SciTech Connect

    Guta, Madalin; Butucea, Cristina

    2010-10-15

    The notion of a U-statistic for an n-tuple of identical quantum systems is introduced in analogy to the classical (commutative) case: given a self-adjoint 'kernel' K acting on (C^d)^⊗r with r < n, the sequence of properly normalized U-statistics converges in moments to a linear combination of Hermite polynomials in canonical variables of a canonical commutation relation algebra defined through the quantum central limit theorem. In the special cases of nondegenerate kernels and kernels of order 2, it is shown that the convergence holds in the stronger distribution sense. Two types of applications in quantum statistics are described: testing beyond the two simple hypotheses scenario and quantum metrology with interacting Hamiltonians.

  6. Statistical Inference at Work: Statistical Process Control as an Example

    ERIC Educational Resources Information Center

    Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia

    2008-01-01

    To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…

  7. Bivariate mixture modeling of transferrin saturation and serum ferritin concentration in Asians, African Americans, Hispanics, and whites in the Hemochromatosis and Iron Overload Screening (HEIRS) Study

    PubMed Central

    Mclaren, Christine E.; Gordeuk, Victor R.; Chen, Wen-Pin; Barton, James C.; Acton, Ronald T.; Speechley, Mark; Castro, Oswaldo; Adams, Paul C.; Snively, Beverly M.; Harris, Emily L.; Reboussin, David M.; Mclachlan, Geoffrey J.; Bean, Richard

    2013-01-01

    Bivariate mixture modeling was used to analyze joint population distributions of transferrin saturation (TS) and serum ferritin concentration (SF) measured in the Hemochromatosis and Iron Overload Screening (HEIRS) Study. Four components (C1, C2, C3, and C4) with successively age-adjusted increasing means for TS and SF were identified in data from 26,832 African Americans, 12,620 Asians, 12,264 Hispanics, and 43,254 whites. The largest component, C2, had normal mean TS (21% to 26% for women, 29% to 30% for men) and SF (43–82 μg/L for women, 165–242 μg/L for men), which consisted of component proportions greater than 0.59 for women and greater than 0.68 for men. C3 and C4 had progressively greater mean values for TS and SF with progressively lesser component proportions. C1 had mean TS values less than 16% for women (<20% for men) and SF values less than 28 μg/L for women (<47 μg/L for men). Compared with C2, adjusted odds of iron deficiency were significantly greater in C1 (14.9–47.5 for women, 60.6–3530 for men), adjusted odds of liver disease were significantly greater in C3 and C4 for African-American women and all men, and adjusted odds of any HFE mutation were increased in C3 (1.4–1.8 for women, 1.2–1.9 for men) and in C4 for Hispanic and white women (1.5 and 5.2, respectively) and men (2.8 and 4.7, respectively). Joint mixture modeling identifies a component with lesser SF and TS at risk for iron deficiency and 2 components with greater SF and TS at risk for liver disease or HFE mutations. This approach can identify populations in which hereditary or acquired factors influence metabolism measurement. PMID:18201677
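
    A stripped-down analogue of this kind of joint mixture modelling can be put together with scikit-learn's Gaussian mixtures; the synthetic transferrin-saturation/log-ferritin data and the three components below merely mimic the structure of the analysis and are not the HEIRS data.

      # Sketch: bivariate Gaussian mixture on synthetic (TS %, log SF) measurements.
      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(7)
      groups = [  # three synthetic subpopulations (low, typical, elevated iron measures)
          rng.multivariate_normal([12, np.log(20)],  [[9, 0.5], [0.5, 0.3]], size=300),
          rng.multivariate_normal([25, np.log(80)],  [[16, 1.0], [1.0, 0.4]], size=1200),
          rng.multivariate_normal([45, np.log(400)], [[36, 2.0], [2.0, 0.6]], size=150),
      ]
      data = np.vstack(groups)

      gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0).fit(data)
      for k in np.argsort(gmm.means_[:, 0]):        # order components by mean saturation
          ts_mean, log_sf_mean = gmm.means_[k]
          print(f"weight={gmm.weights_[k]:.2f}  mean TS={ts_mean:5.1f}%  "
                f"mean SF={np.exp(log_sf_mean):6.0f} ug/L")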

  8. The bivariate brightness function of galaxies and a demonstration of the impact of surface brightness selection effects on luminosity function estimations

    NASA Astrophysics Data System (ADS)

    Cross, Nicholas; Driver, Simon P.

    2002-01-01

    In this paper we fit an analytic function to the bivariate brightness distribution (BBD) of galaxies. It is a combination of the classical Schechter Function convolved with a Gaussian distribution in surface brightness: thus incorporating the luminosity-surface brightness correlation as seen in many recent data sets. We fit this function to a recent measurement of the BBD based on 45000 galaxies from the Two-Degree Field Galaxy Redshift Survey. The parameters for the best-fitting model are φ* = (0.0206 +/- 0.0009) h^3 Mpc^-3, M*_bj - 5 log h = (-19.72 +/- 0.04) mag, α = -1.05 +/- 0.02, β_μ = 0.281 +/- 0.007, μ*_e,bj = (22.45 +/- 0.01) mag arcsec^-2 and σ_μ = 0.517 +/- 0.006. φ*, M*_bj and α equate to the conventional Schechter parameters. β_μ is the slope of the luminosity-surface brightness correlation, μ*_e,bj is the characteristic effective surface brightness at M*_bj and σ_μ is the width of the Gaussian. Using a BBF we explore the impact of the limiting detection isophote on classical measures of the galaxy luminosity distribution. We demonstrate that if isophotal magnitudes are used then errors of ΔM*_bj ~ 0.62 mag, Δφ* ~ 26 per cent and Δα ~ 0.04 are likely for μ_lim,bj = 24.0 mag arcsec^-2. If Gaussian corrected magnitudes are used these change to ΔM*_bj ~ 0.38 mag, Δφ* ~ 11 per cent and Δα < 0.01 for μ_lim,bj = 24.0 mag arcsec^-2. Hence while the faint-end slope, α, appears fairly robust to surface brightness issues, both the M* and φ* values are highly dependent. The range over which these parameters were seen to vary is fully consistent with the scatter in the published values, reproducing the range of observed luminosity densities (1.1

  9. Statistical design for microwave systems

    NASA Technical Reports Server (NTRS)

    Cooke, Roland; Purviance, John

    1991-01-01

    This paper presents an introduction to statistical system design. Basic ideas needed to understand statistical design and a method for implementing statistical design are presented. The nonlinear characteristics of the system amplifiers and mixers are accounted for in the given examples. The specification of group delay, signal-to-noise ratio and output power are considered in these statistical designs.

  10. Experimental Mathematics and Computational Statistics

    SciTech Connect

    Bailey, David H.; Borwein, Jonathan M.

    2009-04-30

    The field of statistics has long been noted for techniques to detect patterns and regularities in numerical data. In this article we explore connections between statistics and the emerging field of 'experimental mathematics'. These include both applications of experimental mathematics in statistics and statistical methods applied to computational mathematics.

  11. NASA Pocket Statistics

    NASA Technical Reports Server (NTRS)

    1995-01-01

    NASA Pocket Statistics is published for the use of NASA managers and their staff. Included herein is Administrative and Organizational information, summaries of Space Flight Activity including the NASA Major Launch Record, and NASA Procurement, Financial, and Manpower data. The NASA Major Launch Record includes all launches of Scout class and larger vehicles. Vehicle and spacecraft development flights are also included in the Major Launch Record. Shuttle missions are counted as one launch and one payload, where free flying payloads are not involved. Satellites deployed from the cargo bay of the Shuttle and placed in a separate orbit or trajectory are counted as an additional payload.

  12. NASA Pocket Statistics

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Pocket Statistics is published for the use of NASA managers and their staff. Included herein is Administrative and Organizational information, summaries of Space Flight Activity including the NASA Major Launch Record, and NASA Procurement, Financial, and Manpower data. The NASA Major Launch Record includes all launches of Scout class and larger vehicles. Vehicle and spacecraft development flights are also included in the Major Launch Record. Shuttle missions are counted as one launch and one payload, where free flying payloads are not involved. Satellites deployed from the cargo bay of the Shuttle and placed in a separate orbit or trajectory are counted as an additional payload.

  13. NASA Pocket Statistics

    NASA Technical Reports Server (NTRS)

    1996-01-01

    This booklet of pocket statistics includes the 1996 NASA Major Launch Record, NASA Procurement, Financial, and Workforce data. The NASA Major Launch Record includes all launches of Scout class and larger vehicles. Vehicle and spacecraft development flights are also included in the Major Launch Record. Shuttle missions are counted as one launch and one payload, where free flying payloads are not involved. Satellites deployed from the cargo bay of the Shuttle and placed in a separate orbit or trajectory are counted as an additional payload.

  14. Who Needs Statistics? | Poster

    Cancer.gov

    You may know the feeling. You have collected a lot of new data on an important experiment. Now you are faced with multiple groups of data, a sea of numbers, and a deadline for submitting your paper to a peer-reviewed journal. And you are not sure which data are relevant, or even the best way to present them. The statisticians at Data Management Services (DMS) know how to help. This small group of experts provides a wide array of statistical and mathematical consulting services to the scientific community at NCI at Frederick and NCI-Bethesda.

  15. Statistical physics and ecology

    NASA Astrophysics Data System (ADS)

    Volkov, Igor

    This work addresses the applications of the methods of statistical physics to problems in population ecology. A theoretical framework based on stochastic Markov processes for the unified neutral theory of biodiversity is presented and an analytical solution for the distribution of the relative species abundance distribution both in the large meta-community and in the small local community is obtained. It is shown that the framework of the current neutral theory in ecology can be easily generalized to incorporate symmetric density dependence. An analytically tractable model is studied that provides an accurate description of beta-diversity and exhibits novel scaling behavior that leads to links between ecological measures such as relative species abundance and the species area relationship. We develop a simple framework that incorporates the Janzen-Connell, dispersal and immigration effects and leads to a description of the distribution of relative species abundance, the equilibrium species richness, beta-diversity and the species area relationship, in good accord with data. Also it is shown that an ecosystem can be mapped into an unconventional statistical ensemble and is quite generally tuned in the vicinity of a phase transition where bio-diversity and the use of resources are optimized. We also perform a detailed study of the unconventional statistical ensemble, in which, unlike in physics, the total number of particles and the energy are not fixed but bounded. We show that the temperature and the chemical potential play a dual role: they determine the average energy and the population of the levels in the system and at the same time they act as an imbalance between the energy and population ceilings and the corresponding average values. Different types of statistics (Boltzmann, Bose-Einstein, Fermi-Dirac and one corresponding to the description of a simple ecosystem) are considered. In all cases, we show that the systems may undergo a first or a second order

  16. International petroleum statistics report

    SciTech Connect

    1995-10-01

    The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report presents data on international oil production, demand, imports, exports and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). Section 2 presents an oil supply/demand balance for the world, in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries.

  17. Statistics of Sxy estimates

    NASA Technical Reports Server (NTRS)

    Freilich, M. H.; Pawka, S. S.

    1987-01-01

    The statistics of Sxy estimates derived from orthogonal-component measurements are examined. Based on results of Goodman (1957), the probability density function (pdf) for Sxy(f) estimates is derived, and a closed-form solution for arbitrary moments of the distribution is obtained. Characteristic functions are used to derive the exact pdf of Sxy(tot). In practice, a simple Gaussian approximation is found to be highly accurate even for relatively few degrees of freedom. Implications for experiment design are discussed, and a maximum-likelihood estimator for a posterior estimation is outlined.

  18. Genesis of some tertiary Indian coals from the chemical composition of ash - a statistical approach: Part 1

    NASA Astrophysics Data System (ADS)

    Sharma, Arpita; Saikia, Ananya; Khare, Puja; Baruah, B. P.

    2014-10-01

    In the present investigation, 37 high-sulphur Tertiary coal samples from Meghalaya, India, were studied on the basis of proximate and ash analyses. Statistical tools such as bivariate analysis, principal component analysis (PCA) and hierarchical cluster analysis (HCA), together with geochemical indicators, were applied to determine the dominant detrital or authigenic affinity of the ash-forming elements in these coals. The genetic interpretation of the coal as well as the coal ash was carried out from the chemical composition of the high-temperature ash (HTA) using the Detrital/Authigenic Index. X-ray diffraction (XRD) analysis was also carried out to study the mineralogy of the coal ashes. Both the statistical tools and the geochemical indicators confirm the detrital nature of these coals and of their ash-forming elements.
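
    As an illustration of the statistical workflow named above (principal component analysis followed by hierarchical clustering), the sketch below runs both steps on a hypothetical table of ash-oxide percentages; the oxide columns and the sample values are invented stand-ins for measured HTA compositions, not data from the study.

```python
# Illustrative PCA + hierarchical clustering on a hypothetical coal-ash table.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

oxides = ["SiO2", "Al2O3", "Fe2O3", "CaO", "MgO", "SO3"]   # hypothetical columns
rng = np.random.default_rng(0)
ash = rng.uniform(low=[40, 15, 2, 1, 0.5, 0.2],
                  high=[65, 30, 12, 8, 3.0, 4.0], size=(37, len(oxides)))

# Principal component analysis via SVD on standardized data.
z = (ash - ash.mean(axis=0)) / ash.std(axis=0)
u, s, vt = np.linalg.svd(z, full_matrices=False)
explained = s**2 / np.sum(s**2)
scores = z @ vt.T                       # sample coordinates on the PCs

# Hierarchical (Ward) clustering of the samples in PC space.
tree = linkage(scores[:, :3], method="ward")
groups = fcluster(tree, t=3, criterion="maxclust")

print("variance explained by PC1-PC3:", np.round(explained[:3], 3))
print("cluster sizes:", np.bincount(groups)[1:])
```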

  19. Fragile entanglement statistics

    NASA Astrophysics Data System (ADS)

    Brody, Dorje C.; Hughston, Lane P.; Meier, David M.

    2015-10-01

    If X and Y are independent, Y and Z are independent, and so are X and Z, one might be tempted to conclude that X, Y, and Z are independent. But it has long been known in classical probability theory that, intuitive as it may seem, this is not true in general. In quantum mechanics one can ask whether analogous statistics can emerge for configurations of particles in certain types of entangled states. The explicit construction of such states, along with the specification of suitable sets of observables that have the purported statistical properties, is not entirely straightforward. We show that an example of such a configuration arises in the case of an N-particle GHZ state, and we are able to identify a family of observables with the property that the associated measurement outcomes are independent for any choice of 2, 3, …, N-1 of the particles, even though the measurement outcomes for all N particles are not independent. Although such states are highly entangled, the entanglement turns out to be ‘fragile’, i.e. the associated density matrix has the property that if one traces out the freedom associated with even a single particle, the resulting reduced density matrix is separable.
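
    The classical-probability caveat in the opening sentences has a textbook illustration: two independent fair bits X and Y together with Z = X XOR Y are pairwise independent but not mutually independent. A quick numerical check:

```python
# Classic illustration of the probabilistic point made above: X and Y are
# independent fair bits and Z = X XOR Y.  Every pair is independent, yet the
# triple is not (Z is determined by X and Y).
from itertools import product

outcomes = [(x, y, x ^ y) for x, y in product((0, 1), repeat=2)]  # each prob 1/4

def prob(event):
    return sum(1 for o in outcomes if event(o)) / len(outcomes)

# Pairwise independence: P(A=a, B=b) = P(A=a) P(B=b) for every pair of coordinates.
for i, j in [(0, 1), (0, 2), (1, 2)]:
    assert all(
        abs(prob(lambda o: o[i] == a and o[j] == b)
            - prob(lambda o: o[i] == a) * prob(lambda o: o[j] == b)) < 1e-12
        for a in (0, 1) for b in (0, 1)
    )

# But not mutually independent: P(X=1, Y=1, Z=1) = 0 while the product is 1/8.
print(prob(lambda o: o == (1, 1, 1)), "vs", 0.5 ** 3)
```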

  20. Statistical clumped isotope signatures.

    PubMed

    Röckmann, T; Popa, M E; Krol, M C; Hofmann, M E G

    2016-08-18

    High precision measurements of molecules containing more than one heavy isotope may provide novel constraints on element cycles in nature. These so-called clumped isotope signatures are reported relative to the random (stochastic) distribution of heavy isotopes over all available isotopocules of a molecule, which is the conventional reference. When multiple indistinguishable atoms of the same element are present in a molecule, this reference is calculated from the bulk (≈average) isotopic composition of the involved atoms. We show here that this referencing convention leads to apparent negative clumped isotope anomalies (anti-clumping) when the indistinguishable atoms originate from isotopically different populations. Such statistical clumped isotope anomalies must occur in any system where two or more indistinguishable atoms of the same element, but with different isotopic composition, combine in a molecule. The size of the anti-clumping signal is closely related to the difference of the initial isotope ratios of the indistinguishable atoms that have combined. Therefore, a measured statistical clumped isotope anomaly, relative to an expected (e.g. thermodynamical) clumped isotope composition, may allow assessment of the heterogeneity of the isotopic pools of atoms that are the substrate for formation of molecules.
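
    A minimal numerical sketch of the referencing effect described above, for a hypothetical symmetric molecule X2 whose two indistinguishable sites are filled from pools A and B with different heavy-isotope fractions (the fractions below are invented):

```python
# Schematic version of the bookkeeping described above for a symmetric molecule
# X2 whose two indistinguishable atoms come from pools A and B with different
# heavy-isotope fractions (values are invented for illustration).
p_a, p_b = 0.010, 0.020          # heavy-atom fractions of the two pools

true_clumped = p_a * p_b          # probability that both sites are heavy
p_bulk = 0.5 * (p_a + p_b)        # bulk composition inferred from the molecule
stochastic_ref = p_bulk ** 2      # conventional random-distribution reference

# Apparent clumped anomaly relative to the stochastic reference (per-mil style).
delta = (true_clumped / stochastic_ref - 1.0) * 1000.0
print(f"apparent clumped anomaly: {delta:.1f} per mil (negative = anti-clumping)")
# By the AM-GM inequality p_a*p_b <= ((p_a + p_b)/2)**2, so delta <= 0, with
# equality only when the two pools have identical isotopic composition.
```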

  1. International petroleum statistics report

    SciTech Connect

    1997-05-01

    The International Petroleum Statistics Report is a monthly publication that provides current international oil data. This report is published for the use of Members of Congress, Federal agencies, State agencies, industry, and the general public. Publication of this report is in keeping with responsibilities given the Energy Information Administration in Public Law 95-91. The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995.

  2. Statistical clumped isotope signatures

    PubMed Central

    Röckmann, T.; Popa, M. E.; Krol, M. C.; Hofmann, M. E. G.

    2016-01-01

    High precision measurements of molecules containing more than one heavy isotope may provide novel constraints on element cycles in nature. These so-called clumped isotope signatures are reported relative to the random (stochastic) distribution of heavy isotopes over all available isotopocules of a molecule, which is the conventional reference. When multiple indistinguishable atoms of the same element are present in a molecule, this reference is calculated from the bulk (≈average) isotopic composition of the involved atoms. We show here that this referencing convention leads to apparent negative clumped isotope anomalies (anti-clumping) when the indistinguishable atoms originate from isotopically different populations. Such statistical clumped isotope anomalies must occur in any system where two or more indistinguishable atoms of the same element, but with different isotopic composition, combine in a molecule. The size of the anti-clumping signal is closely related to the difference of the initial isotope ratios of the indistinguishable atoms that have combined. Therefore, a measured statistical clumped isotope anomaly, relative to an expected (e.g. thermodynamical) clumped isotope composition, may allow assessment of the heterogeneity of the isotopic pools of atoms that are the substrate for formation of molecules. PMID:27535168

  3. [Comment on] Statistical discrimination

    NASA Astrophysics Data System (ADS)

    Chinn, Douglas

    In the December 8, 1981, issue of Eos, a news item reported the conclusion of a National Research Council study that sexual discrimination against women with Ph.D.'s exists in the field of geophysics. Basically, the item reported that even when allowances are made for motherhood the percentage of female Ph.D.'s holding high university and corporate positions is significantly lower than the percentage of male Ph.D.'s holding the same types of positions. The sexual discrimination conclusion, based only on these statistics, assumes that there are no basic psychological differences between men and women that might cause different populations in the employment group studied. Therefore, the reasoning goes, after taking into account possible effects from differences related to anatomy, such as women stopping their careers in order to bear and raise children, the statistical distributions of positions held by male and female Ph.D.'s ought to be very similar to one another. Any significant differences between the distributions must be caused primarily by sexual discrimination.

  4. Statistical clumped isotope signatures

    NASA Astrophysics Data System (ADS)

    Röckmann, T.; Popa, M. E.; Krol, M. C.; Hofmann, M. E. G.

    2016-08-01

    High precision measurements of molecules containing more than one heavy isotope may provide novel constraints on element cycles in nature. These so-called clumped isotope signatures are reported relative to the random (stochastic) distribution of heavy isotopes over all available isotopocules of a molecule, which is the conventional reference. When multiple indistinguishable atoms of the same element are present in a molecule, this reference is calculated from the bulk (≈average) isotopic composition of the involved atoms. We show here that this referencing convention leads to apparent negative clumped isotope anomalies (anti-clumping) when the indistinguishable atoms originate from isotopically different populations. Such statistical clumped isotope anomalies must occur in any system where two or more indistinguishable atoms of the same element, but with different isotopic composition, combine in a molecule. The size of the anti-clumping signal is closely related to the difference of the initial isotope ratios of the indistinguishable atoms that have combined. Therefore, a measured statistical clumped isotope anomaly, relative to an expected (e.g. thermodynamical) clumped isotope composition, may allow assessment of the heterogeneity of the isotopic pools of atoms that are the substrate for formation of molecules.

  5. Statistical clumped isotope signatures.

    PubMed

    Röckmann, T; Popa, M E; Krol, M C; Hofmann, M E G

    2016-01-01

    High precision measurements of molecules containing more than one heavy isotope may provide novel constraints on element cycles in nature. These so-called clumped isotope signatures are reported relative to the random (stochastic) distribution of heavy isotopes over all available isotopocules of a molecule, which is the conventional reference. When multiple indistinguishable atoms of the same element are present in a molecule, this reference is calculated from the bulk (≈average) isotopic composition of the involved atoms. We show here that this referencing convention leads to apparent negative clumped isotope anomalies (anti-clumping) when the indistinguishable atoms originate from isotopically different populations. Such statistical clumped isotope anomalies must occur in any system where two or more indistinguishable atoms of the same element, but with different isotopic composition, combine in a molecule. The size of the anti-clumping signal is closely related to the difference of the initial isotope ratios of the indistinguishable atoms that have combined. Therefore, a measured statistical clumped isotope anomaly, relative to an expected (e.g. thermodynamical) clumped isotope composition, may allow assessment of the heterogeneity of the isotopic pools of atoms that are the substrate for formation of molecules. PMID:27535168

  6. Sufficient Statistics: an Example

    NASA Technical Reports Server (NTRS)

    Quirein, J.

    1973-01-01

    The feature selection problem is considered resulting from the transformation x = Bz, where B is a k by n matrix of rank k and k ≤ n. Such a transformation can be considered to reduce the dimension of each observation vector z, and in general, such a transformation results in a loss of information. In terms of the divergence, this information loss is expressed by the fact that the average divergence D_B computed using variable x is less than or equal to the average divergence D computed using variable z. If D_B = D, then B is said to be a sufficient statistic for the average divergence D. If B is a sufficient statistic for the average divergence, then it can be shown that the probability of misclassification computed using variable x (of dimension k ≤ n) is equal to the probability of misclassification computed using variable z. Also included is what is believed to be a new proof of the well-known fact that D ≥ D_B. Using the techniques necessary to prove the above fact, it is shown that the Bhattacharyya distance as measured by variable x is less than or equal to the Bhattacharyya distance as measured by variable z.
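
    The closing statement, that the linear reduction cannot increase the Bhattacharyya distance, can be checked numerically. The sketch below is an illustration rather than the report's derivation: it uses the closed-form Bhattacharyya distance between multivariate normal classes and a random rank-k matrix B.

```python
# Sketch (not from the report): for two multivariate-normal classes, the
# Bhattacharyya distance computed after a linear reduction x = B z never
# exceeds the distance computed on z itself, mirroring the D_B <= D statement.
import numpy as np

def bhattacharyya_normal(m1, s1, m2, s2):
    """Closed-form Bhattacharyya distance between N(m1, s1) and N(m2, s2)."""
    s = 0.5 * (s1 + s2)
    d = m1 - m2
    term1 = 0.125 * d @ np.linalg.solve(s, d)
    term2 = 0.5 * np.log(np.linalg.det(s)
                         / np.sqrt(np.linalg.det(s1) * np.linalg.det(s2)))
    return term1 + term2

rng = np.random.default_rng(0)
n, k = 6, 2                               # original and reduced dimensions
m1, m2 = rng.normal(size=n), rng.normal(size=n)
a1, a2 = rng.normal(size=(n, n)), rng.normal(size=(n, n))
s1, s2 = a1 @ a1.T + np.eye(n), a2 @ a2.T + np.eye(n)   # SPD covariances
B = rng.normal(size=(k, n))               # rank-k reduction matrix

d_full = bhattacharyya_normal(m1, s1, m2, s2)
d_reduced = bhattacharyya_normal(B @ m1, B @ s1 @ B.T, B @ m2, B @ s2 @ B.T)
print(f"D(z) = {d_full:.4f}, D_B(x) = {d_reduced:.4f}, D_B <= D: {d_reduced <= d_full}")
```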

  7. Isotonic And Isokinetic Exercise During Bed Rest

    NASA Technical Reports Server (NTRS)

    Greenleaf, J. E.; Wade, C. E.; Bernauer, E. M.; Trowbridge, T. S.; Ertl, A. C.

    1993-01-01

    Brief, intense activity prevents deterioration of peak oxygen uptake, a measure of work capacity. Study intended to explore effectiveness of exercise in maintaining fitness during long missions in microgravity so crewmembers able to keep up arduous work of building and expanding Space Station. Showed that intermittent, intense exercise of short duration more effective than similar exercise at lower intensity for longer times measured in previous studies. Intense short-term exercise seems to maintain volumes of plasma and red blood cells at normal levels.

  8. Relationship between Graduate Students' Statistics Self-Efficacy, Statistics Anxiety, Attitude toward Statistics, and Social Support

    ERIC Educational Resources Information Center

    Perepiczka, Michelle; Chandler, Nichelle; Becerra, Michael

    2011-01-01

    Statistics plays an integral role in graduate programs. However, numerous intra- and interpersonal factors may lead to successful completion of needed coursework in this area. The authors examined the extent of the relationship between self-efficacy to learn statistics and statistics anxiety, attitude towards statistics, and social support of 166…

  9. Statistics of entrance times

    NASA Astrophysics Data System (ADS)

    Talkner, Peter

    2003-03-01

    The statistical properties of discrete Markov processes are investigated in terms of entrance times. Simple relations are given for their density and higher order distributions. These quantities are used for introducing a generalized Rice phase and for characterizing the synchronization of a process with an external driving force. For the McNamara-Wiesenfeld model of stochastic resonance, parameter regions (spanned by the noise strength, driving frequency and strength) are identified in which the process is locked with the frequency of the external driving and in which the diffusion of the Rice phase becomes minimal. At the same time the Fano factor of the number of entrances per period of the driving force has a minimum.
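
    The Fano factor mentioned at the end can be pictured with a toy two-state chain whose switching probabilities are weakly modulated at the driving period. The rate form and parameter values below are invented for illustration and are not the McNamara-Wiesenfeld rates themselves.

```python
# Toy illustration: a two-state Markov chain with periodically modulated
# switching probabilities; we count entrances into state +1 per driving period
# and estimate the Fano factor (variance/mean) of that count.
import math, random

def fano_of_entrances(p0=0.05, amp=0.04, T=100, n_periods=2000, seed=3):
    rng = random.Random(seed)
    state, counts = -1, []
    for _ in range(n_periods):
        entrances = 0
        for t in range(T):
            drive = amp * math.sin(2 * math.pi * t / T)
            p_up, p_down = p0 + drive, p0 - drive   # modulated switching prob.
            if state == -1 and rng.random() < p_up:
                state, entrances = +1, entrances + 1   # an "entrance" event
            elif state == +1 and rng.random() < p_down:
                state = -1
        counts.append(entrances)
    mean = sum(counts) / len(counts)
    var = sum((c - mean) ** 2 for c in counts) / (len(counts) - 1)
    return var / mean

print("estimated Fano factor:", round(fano_of_entrances(), 3))
```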

  10. Statistical crack mechanics

    SciTech Connect

    Dienes, J.K.

    1983-01-01

    An alternative to the use of plasticity theory to characterize the inelastic behavior of solids is to represent the flaws by statistical methods. We have taken such an approach to study fragmentation because it offers a number of advantages. Foremost among these is that, by considering the effects of flaws, it becomes possible to address the underlying physics directly. For example, we have been able to explain why rocks exhibit large strain-rate effects (a consequence of the finite growth rate of cracks), why a spherical explosive imbedded in oil shale produces a cavity with a nearly square section (opening of bedding cracks) and why propellants may detonate following low-speed impact (a consequence of frictional hot spots).

  11. Conditional statistical model building

    NASA Astrophysics Data System (ADS)

    Hansen, Mads Fogtmann; Hansen, Michael Sass; Larsen, Rasmus

    2008-03-01

    We present a new statistical deformation model suited for parameterized grids with different resolutions. Our method models the covariances between multiple grid levels explicitly, and allows for very efficient fitting of the model to data on multiple scales. The model is validated on a data set consisting of 62 annotated MR images of the corpus callosum. One fifth of the data set was used as a training set, whose images were non-rigidly registered to each other without a shape prior. From the non-rigidly registered training set, a shape prior was constructed by performing principal component analysis on each grid level and using the results to construct a conditional shape model, conditioning the finer parameters on the coarser grid levels. The remaining shapes were registered with the constructed shape prior. The Dice measures for the registration without a prior and the registration with a prior were 0.875 +/- 0.042 and 0.8615 +/- 0.051, respectively.
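
    For Gaussian shape models, "conditioning the finer parameters on the coarser grid levels" is the standard conditional multivariate normal. The sketch below applies that formula to a synthetic joint covariance of coarse and fine deformation parameters; it is an illustration, not the corpus callosum model itself.

```python
# Minimal sketch of conditioning fine-level parameters on coarse-level ones with
# the standard Gaussian conditional:
#   mean_f|c = mean_f + S_fc S_cc^{-1} (c - mean_c)
#   cov_f|c  = S_ff - S_fc S_cc^{-1} S_cf
# The joint statistics below are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(0)
n_c, n_f = 4, 12                       # coarse and fine parameter counts
a = rng.normal(size=(n_c + n_f, n_c + n_f))
cov = a @ a.T + np.eye(n_c + n_f)      # synthetic SPD joint covariance
mean = np.zeros(n_c + n_f)

S_cc, S_cf = cov[:n_c, :n_c], cov[:n_c, n_c:]
S_fc, S_ff = cov[n_c:, :n_c], cov[n_c:, n_c:]

def condition_fine_on_coarse(coarse_params):
    gain = S_fc @ np.linalg.inv(S_cc)                  # regression of fine on coarse
    mean_f = mean[n_c:] + gain @ (coarse_params - mean[:n_c])
    cov_f = S_ff - gain @ S_cf
    return mean_f, cov_f

mean_f, cov_f = condition_fine_on_coarse(rng.normal(size=n_c))
print("conditional fine mean (first 4):", np.round(mean_f[:4], 3))
print("trace of conditional covariance:", round(float(np.trace(cov_f)), 3))
```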

  12. Statistical design controversy

    SciTech Connect

    Evans, L.S.; Hendrey, G.R.; Thompson, K.H.

    1985-02-01

    This article is a response to criticisms that Evans, Hendrey, and Thompson received, claiming that their article was biased because of omissions and misrepresentations. The authors contend that their conclusion that experimental designs having only one plot per treatment "were, from the outset, not capable of differentiating between treatment effects and field-position effects" remains valid and is supported by decades of agronomic research. Several commenters, Irving, Troiano, and McCune, read the article as a review of all studies of acidic rain effects on soybeans. It was not. The article was written out of concern over the comparisons being made among studies that purport to evaluate effects of acid deposition on field-grown crops, comparisons that implicitly assume all of the studies are of equal scientific value. They are not. Only experimental approaches that are well focused and designed with appropriate agronomic and statistical procedures should be used for credible regional and national assessments of crop inventories. 12 references.

  13. BIG DATA AND STATISTICS

    PubMed Central

    Rossell, David

    2016-01-01

    Big Data brings unprecedented power to address scientific, economic and societal issues, but also amplifies the possibility of certain pitfalls. These include using purely data-driven approaches that disregard understanding the phenomenon under study, aiming at a dynamically moving target, ignoring critical data collection issues, summarizing or preprocessing the data inadequately and mistaking noise for signal. We review some success stories and illustrate how statistical principles can help obtain more reliable information from data. We also touch upon current challenges that require active methodological research, such as strategies for efficient computation, integration of heterogeneous data, extending the underlying theory to increasingly complex questions and, perhaps most importantly, training a new generation of scientists to develop and deploy these strategies. PMID:27722040

  14. Statistical physics ""Beyond equilibrium

    SciTech Connect

    Ecke, Robert E

    2009-01-01

    The scientific challenges of the 21st century will increasingly involve competing interactions, geometric frustration, spatial and temporal intrinsic inhomogeneity, nanoscale structures, and interactions spanning many scales. We will focus on a broad class of emerging problems that will require new tools in non-equilibrium statistical physics and that will find application in new material functionality, in predicting complex spatial dynamics, and in understanding novel states of matter. Our work will encompass materials under extreme conditions involving elastic/plastic deformation, competing interactions, intrinsic inhomogeneity, frustration in condensed matter systems, scaling phenomena in disordered materials from glasses to granular matter, quantum chemistry applied to nano-scale materials, soft-matter materials, and spatio-temporal properties of both ordinary and complex fluids.

  15. Wide Wide World of Statistics: International Statistics on the Internet.

    ERIC Educational Resources Information Center

    Foudy, Geraldine

    2000-01-01

    Explains how to find statistics on the Internet, especially international statistics. Discusses advantages over print sources, including convenience, currency of information, cost effectiveness, and value-added formatting; sources of international statistics; United Nations agencies; search engines and power searching; and evaluating sources. (LRW)

  16. Understanding Statistics and Statistics Education: A Chinese Perspective

    ERIC Educational Resources Information Center

    Shi, Ning-Zhong; He, Xuming; Tao, Jian

    2009-01-01

    In recent years, statistics education in China has made great strides. However, there still exists a fairly large gap with the advanced levels of statistics education in more developed countries. In this paper, we identify some existing problems in statistics education in Chinese schools and make some proposals as to how they may be overcome. We…

  17. Statistical Literacy: Developing a Youth and Adult Education Statistical Project

    ERIC Educational Resources Information Center

    Conti, Keli Cristina; Lucchesi de Carvalho, Dione

    2014-01-01

    This article focuses on the notion of literacy--general and statistical--in the analysis of data from a fieldwork research project carried out as part of a master's degree that investigated the teaching and learning of statistics in adult education mathematics classes. We describe the statistical context of the project that involved the…

  18. Electrical Breakdown In Nitrogen At Low Pressure - Physical Processes And Statistics

    NASA Astrophysics Data System (ADS)

    Gocic, S.

    2010-07-01

    The results of an investigation of electrical breakdown in nitrogen, obtained with a combined approach based on measurement of the current-voltage characteristic, modeling of the basic physical processes, and statistical analysis of the breakdown time delay, are presented in this report. Measurement of the current-voltage characteristics, with additional monitoring of the spatial and temporal distribution of the emission from the discharge, provides information on the development of different regimes of the low-pressure gas discharge and on the processes of electrical breakdown and discharge maintenance. The presented model of the gas discharge includes the kinetics of the main constituents of the nitrogen plasma: charged particles, the vibrational manifold of the molecular ground state, molecular singlet and triplet states, and nitrogen atoms. The model is applied in the case of a homogeneous electric field, at an electric field to gas density ratio E/N of 1000 Td (1 Td = 10^-17 V cm^2). The obtained results show that the main mechanism of nitrogen atom production in this case is molecular dissociation by direct electron impact, while the influence of highly excited vibrational states can be neglected. In addition, two new distributions of the statistical time delay of electrical breakdown in nitrogen, the Gaussian and Gauss-exponential ones, are presented. The distributions are theoretically founded on a binomial distribution for the occurrence of initiating electrons and are described using analytical and numerical models. Moreover, the correlation coefficient between the statistical and formative time delays of electrical breakdown in nitrogen is determined. Starting from a bivariate normal (Gaussian) distribution of the two random variables, the analytical distribution of the electrical breakdown time delay is theoretically founded on the correlation of the dependent statistical and formative time delays. The Gaussian density distribution of the electrical breakdown time delay goes to a Gaussian of the formative time or…
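
    The joint treatment of the statistical and formative delays can be pictured with a small simulation: correlated (t_s, t_f) pairs are drawn from an assumed bivariate normal and the total breakdown time delay t_d = t_s + t_f is compared with its theoretical moments. All parameter values below are invented.

```python
# Sketch of the bivariate-normal picture described above: correlated statistical
# (t_s) and formative (t_f) delays are drawn jointly and the total breakdown
# time delay t_d = t_s + t_f is examined.  All parameters are invented.
import numpy as np

rng = np.random.default_rng(1)
mu = np.array([50.0, 20.0])            # mean statistical / formative delay (us)
sd = np.array([15.0, 4.0])
rho = 0.6                              # assumed correlation between t_s and t_f
cov = np.array([[sd[0]**2,            rho * sd[0] * sd[1]],
                [rho * sd[0] * sd[1], sd[1]**2           ]])

samples = rng.multivariate_normal(mu, cov, size=100_000)
t_total = samples.sum(axis=1)

# For jointly Gaussian delays the total is Gaussian with these moments:
mean_theory = mu.sum()
var_theory = sd[0]**2 + sd[1]**2 + 2 * rho * sd[0] * sd[1]
print(f"empirical mean/var:   {t_total.mean():.2f} / {t_total.var():.2f}")
print(f"theoretical mean/var: {mean_theory:.2f} / {var_theory:.2f}")
```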

  19. Heart Disease and Stroke Statistics

    MedlinePlus

    Fact sheets on heart disease and stroke statistics are available as PDFs on topics including Nutrition, Obesity, and Peripheral Artery Disease. For questions about these statistics, please contact the American Heart Association National Center, Office of Science & Medicine, at statistics@heart.org.

  20. Muscular Dystrophy: Data and Statistics

    MedlinePlus

    MD STARnet Data and Statistics: data and statistics on muscular dystrophy drawn from MD STARnet research. For more information on MD STARnet, see Research and Tracking.

  1. Thoughts About Theories and Statistics.

    PubMed

    Fawcett, Jacqueline

    2015-07-01

    The purpose of this essay is to share my ideas about the connection between theories and statistics. The essay content reflects my concerns about some researchers' and readers' apparent lack of clarity about what constitutes appropriate statistical testing and conclusions about the empirical adequacy of theories. The reciprocal relation between theories and statistics is emphasized and the conclusion is that statistics without direction from theory is no more than a hobby.

  2. Springer Handbook of Engineering Statistics

    NASA Astrophysics Data System (ADS)

    Pham, Hoang

    The Springer Handbook of Engineering Statistics gathers together the full range of statistical techniques required by engineers from all fields to gain sensible statistical feedback on how their processes or products are functioning and to give them realistic predictions of how these could be improved.

  3. Statistical log analysis made practical

    SciTech Connect

    Mitchell, W.K.; Nelson, R.J. )

    1991-06-01

    This paper discusses the advantages of a statistical approach to log analysis. Statistical techniques use inverse methods to calculate formation parameters. The use of statistical techniques has been limited, however, by the complexity of the mathematics and lengthy computer time required to minimize traditionally used nonlinear equations.
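
    A generic sketch of the inverse-method idea, not the authors' implementation: each log reading is modeled as a linear mixture of end-member tool responses, and the component volume fractions are recovered by constrained least squares. The component names and response values below are invented for illustration.

```python
# Generic sketch of statistical (inverse-method) log analysis: log readings are
# modeled as mixtures of end-member responses, and volume fractions are found by
# least squares with a "volumes sum to one" constraint.  End-member values are
# invented; real workflows also weight each tool by its measurement uncertainty.
import numpy as np
from scipy.optimize import nnls

components = ["quartz", "shale", "water"]
# Rows: tool responses (density g/cc, neutron porosity, sonic us/ft) per component.
R = np.array([[2.65, 2.45, 1.00],
              [-0.02, 0.30, 1.00],
              [55.5, 90.0, 189.0]])

logs = np.array([2.28, 0.25, 89.1])      # one depth level's measurements

# Append a heavily weighted material-balance row so volumes sum to ~1,
# then solve with nonnegativity enforced.
w = 100.0
A = np.vstack([R, w * np.ones(len(components))])
b = np.append(logs, w * 1.0)
volumes, residual = nnls(A, b)

for name, v in zip(components, volumes):
    print(f"{name:>6}: {v:.3f}")
print("sum of volumes:", round(float(volumes.sum()), 3))
```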

  4. Invention Activities Support Statistical Reasoning

    ERIC Educational Resources Information Center

    Smith, Carmen Petrick; Kenlan, Kris

    2016-01-01

    Students' experiences with statistics and data analysis in middle school are often limited to little more than making and interpreting graphs. Although students may develop fluency in statistical procedures and vocabulary, they frequently lack the skills necessary to apply statistical reasoning in situations other than clear-cut textbook examples.…

  5. Explorations in Statistics: the Bootstrap

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2009-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This fourth installment of Explorations in Statistics explores the bootstrap. The bootstrap gives us an empirical approach to estimate the theoretical variability among possible values of a sample statistic such as the…

  6. Teaching Statistics Online Using "Excel"

    ERIC Educational Resources Information Center

    Jerome, Lawrence

    2011-01-01

    As anyone who has taught or taken a statistics course knows, statistical calculations can be tedious and error-prone, with the details of a calculation sometimes distracting students from understanding the larger concepts. Traditional statistics courses typically use scientific calculators, which can relieve some of the tedium and errors but…

  7. Statistics Anxiety and Instructor Immediacy

    ERIC Educational Resources Information Center

    Williams, Amanda S.

    2010-01-01

    The purpose of this study was to investigate the relationship between instructor immediacy and statistics anxiety. It was predicted that students receiving immediacy would report lower levels of statistics anxiety. Using a pretest-posttest-control group design, immediacy was measured using the Instructor Immediacy scale. Statistics anxiety was…

  8. Statistics: It's in the Numbers!

    ERIC Educational Resources Information Center

    Deal, Mary M.; Deal, Walter F., III

    2007-01-01

    Mathematics and statistics play important roles in peoples' lives today. A day hardly passes that they are not bombarded with many different kinds of statistics. As consumers they see statistical information as they surf the web, watch television, listen to their satellite radios, or even read the nutrition facts panel on a cereal box in the…

  9. Statistics of indistinguishable particles.

    PubMed

    Wittig, Curt

    2009-07-01

    The wave function of a system containing identical particles takes into account the relationship between a particle's intrinsic spin and its statistical property. Specifically, the exchange of two identical particles having odd-half-integer spin results in the wave function changing sign, whereas the exchange of two identical particles having integer spin is accompanied by no such sign change. This is embodied in a term (-1)^(2s), which has the value +1 for integer s (bosons), and -1 for odd-half-integer s (fermions), where s is the particle spin. All of this is well-known. In the nonrelativistic limit, a detailed consideration of the exchange of two identical particles shows that exchange is accompanied by a 2π reorientation that yields the (-1)^(2s) term. The same bookkeeping is applicable to the relativistic case described by the proper orthochronous Lorentz group, because any proper orthochronous Lorentz transformation can be expressed as the product of spatial rotations and a boost along the direction of motion. PMID:19552474

  10. International petroleum statistics report

    SciTech Connect

    1996-05-01

    The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1984 through 1994.

  11. International petroleum statistics report

    SciTech Connect

    1995-11-01

    The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994.

  12. International petroleum statistics report

    SciTech Connect

    1995-07-27

    The International Petroleum Statistics Report presents data on international oil production, demand, imports, exports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1994; OECD stocks from 1973 through 1994; and OECD trade from 1984 through 1994.

  13. Topics in statistical mechanics

    SciTech Connect

    Elser, V.

    1984-05-01

    This thesis deals with four independent topics in statistical mechanics: (1) the dimer problem is solved exactly for a hexagonal lattice with general boundary using a known generating function from the theory of partitions. It is shown that the leading term in the entropy depends on the shape of the boundary; (2) continuum models of percolation and self-avoiding walks are introduced with the property that their series expansions are sums over linear graphs with intrinsic combinatorial weights and explicit dimension dependence; (3) a constrained SOS model is used to describe the edge of a simple cubic crystal. Low and high temperature results are derived as well as the detailed behavior near the crystal facet; (4) the microscopic model of the lambda-transition involving atomic permutation cycles is reexamined. In particular, a new derivation of the two-component field theory model of the critical behavior is presented. Results for a lattice model originally proposed by Kikuchi are extended with a high temperature series expansion and Monte Carlo simulation. 30 references.

  14. Statistical mechanics of nucleosomes

    NASA Astrophysics Data System (ADS)

    Chereji, Razvan V.

    Eukaryotic cells contain long DNA molecules (about two meters for a human cell) which are tightly packed inside the micrometric nuclei. Nucleosomes are the basic packaging unit of the DNA which allows this millionfold compactification. A longstanding puzzle is to understand the principles which allow cells to both organize their genomes into chromatin fibers in the crowded space of their nuclei, and also to keep the DNA accessible to many factors and enzymes. With the nucleosomes covering about three quarters of the DNA, their positions are essential because these influence which genes can be regulated by the transcription factors and which cannot. We study physical models which predict the genome-wide organization of the nucleosomes and also the relevant energies which dictate this organization. In the last five years, the study of chromatin has seen many important advances. In particular, in the field of nucleosome positioning, new techniques for identifying nucleosomes and the competing DNA-binding factors appeared, such as chemical mapping with hydroxyl radicals and ChIP-exo, among others; the resolution of the nucleosome maps increased by using paired-end sequencing, and the price of sequencing an entire genome decreased. We present a rigorous statistical mechanics model which is able to explain the recent experimental results by taking into account nucleosome unwrapping, competition between different DNA-binding proteins, and both the interaction between histones and DNA, and between neighboring histones. We show a series of predictions of our new model, all in agreement with the experimental observations.

  15. International petroleum statistics report

    SciTech Connect

    1997-07-01

    The International Petroleum Statistics Report is a monthly publication that provides current international data. The report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent 12 months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1996; OECD stocks from 1973 through 1996; and OECD trade from 1986 through 1996.

  16. International petroleum statistics report

    SciTech Connect

    1996-10-01

    The International Petroleum Statistics Report presents data on international oil production, demand, imports, and stocks. The report has four sections. Section 1 contains time series data on world oil production, and on oil demand and stocks in the Organization for Economic Cooperation and Development (OECD). This section contains annual data beginning in 1985, and monthly data for the most recent two years. Section 2 presents an oil supply/demand balance for the world. This balance is presented in quarterly intervals for the most recent two years. Section 3 presents data on oil imports by OECD countries. This section contains annual data for the most recent year, quarterly data for the most recent two quarters, and monthly data for the most recent twelve months. Section 4 presents annual time series data on world oil production and oil stocks, demand, and trade in OECD countries. World oil production and OECD demand data are for the years 1970 through 1995; OECD stocks from 1973 through 1995; and OECD trade from 1985 through 1995.

  17. A statistical mechanical problem?

    PubMed Central

    Costa, Tommaso; Ferraro, Mario

    2014-01-01

    The problem of deriving the processes of perception and cognition or the modes of behavior from states of the brain appears to be unsolvable in view of the huge numbers of elements involved. However, neural activities are not random, nor independent, but constrained to form spatio-temporal patterns, and thanks to these restrictions, which in turn are due to connections among neurons, the problem can at least be approached. The situation is similar to what happens in large physical ensembles, where global behaviors are derived by microscopic properties. Despite the obvious differences between neural and physical systems a statistical mechanics approach is almost inescapable, since dynamics of the brain as a whole are clearly determined by the outputs of single neurons. In this paper it will be shown how, starting from very simple systems, connectivity engenders levels of increasing complexity in the functions of the brain depending on specific constraints. Correspondingly levels of explanations must take into account the fundamental role of constraints and assign at each level proper model structures and variables, that, on one hand, emerge from outputs of the lower levels, and yet are specific, in that they ignore irrelevant details. PMID:25228891

  18. Statistical Mechanics of Zooplankton.

    PubMed

    Hinow, Peter; Nihongi, Ai; Strickler, J Rudi

    2015-01-01

    Statistical mechanics provides the link between microscopic properties of many-particle systems and macroscopic properties such as pressure and temperature. Observations of similar "microscopic" quantities exist for the motion of zooplankton, as well as many species of other social animals. Herein, we propose to take average squared velocities as the definition of the "ecological temperature" of a population under different conditions on nutrients, light, oxygen and others. We test the usefulness of this definition on observations of the crustacean zooplankton Daphnia pulicaria. In one set of experiments, D. pulicaria is infested with the pathogen Vibrio cholerae, the causative agent of cholera. We find that infested D. pulicaria under light exposure have a significantly greater ecological temperature, which puts them at a greater risk of detection by visual predators. In a second set of experiments, we observe D. pulicaria in cold and warm water, and in darkness and under light exposure. Overall, our ecological temperature is a good discriminator of the crustacean's swimming behavior.
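
    A tiny sketch of the proposed definition, computing the "ecological temperature" as the mean squared velocity estimated from tracked positions; the trajectories below are synthetic random walks standing in for real Daphnia tracks.

```python
# Tiny sketch of the "ecological temperature" proposed above: the average
# squared velocity estimated from tracked positions.  The trajectories are
# synthetic random walks, not real zooplankton data.
import numpy as np

def ecological_temperature(positions, dt):
    """Mean squared speed over all animals and time steps (positions: T x N x 2)."""
    velocities = np.diff(positions, axis=0) / dt
    return float(np.mean(np.sum(velocities ** 2, axis=-1)))

rng = np.random.default_rng(0)
dt, n_steps, n_animals = 0.1, 500, 20
calm = np.cumsum(rng.normal(scale=0.5 * dt, size=(n_steps, n_animals, 2)), axis=0)
agitated = np.cumsum(rng.normal(scale=1.5 * dt, size=(n_steps, n_animals, 2)), axis=0)

print("T_eco (calm):    ", round(ecological_temperature(calm, dt), 3))
print("T_eco (agitated):", round(ecological_temperature(agitated, dt), 3))
```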

  19. Statistical Mechanics of Zooplankton

    PubMed Central

    Hinow, Peter; Nihongi, Ai; Strickler, J. Rudi

    2015-01-01

    Statistical mechanics provides the link between microscopic properties of many-particle systems and macroscopic properties such as pressure and temperature. Observations of similar “microscopic” quantities exist for the motion of zooplankton, as well as many species of other social animals. Herein, we propose to take average squared velocities as the definition of the “ecological temperature” of a population under different conditions on nutrients, light, oxygen and others. We test the usefulness of this definition on observations of the crustacean zooplankton Daphnia pulicaria. In one set of experiments, D. pulicaria is infested with the pathogen Vibrio cholerae, the causative agent of cholera. We find that infested D. pulicaria under light exposure have a significantly greater ecological temperature, which puts them at a greater risk of detection by visual predators. In a second set of experiments, we observe D. pulicaria in cold and warm water, and in darkness and under light exposure. Overall, our ecological temperature is a good discriminator of the crustacean’s swimming behavior. PMID:26270537

  20. Petroleum statistics in France

    SciTech Connect

    De Saint Germain, H.; Lamiraux, C.

    1995-08-01

    Thirty-three oil companies, including Elf, Exxon, Agip and Conoco as well as Coparex, Enron, Hadson, Midland, Hunt, Canyon and Union Texas, are active in oil and gas exploration and production in France. French production amounts to some 60,000 bopd of oil and 350 MMcfpd of marketed natural gas, which still cover about 3.5% and 10% of French domestic needs, respectively. To date, 166 fields have been discovered, representing total reserves of 3 billion bbl of crude oil and 13 trillion cf of raw gas. These fields are concentrated in two major onshore sedimentary basins of Mesozoic age, the Aquitaine basin and the Paris basin. The Aquitaine basin is best subdivided into two distinct domains: the Parentis basin, where the largest field, Parentis, was discovered in 1954 and still produces about 3,700 bopd of oil, and where the Les Arbouslers field, discovered at the end of 1991, is currently producing about 10,000 bopd of oil; and the northern Pyrenees and their foreland, where the Lacq field, discovered in 1951, has produced about 7.7 tcf of gas since 1957 and is still producing 138 MMcfpd. In the Paris basin, the two large oil fields are Villeperclue, discovered in 1982 by Triton and Total, and Chaunoy, discovered in 1983 by Essorep, which are still producing about 10,000 and 15,000 bopd, respectively. The last discovery of significant size was Itteville, made in 1990 by Elf Aquitaine and currently producing 4,200 bopd. The poster shows statistical data covering the past 20 years of oil and gas exploration and production in France.

  1. Ideal statistically quasi Cauchy sequences

    NASA Astrophysics Data System (ADS)

    Savas, Ekrem; Cakalli, Huseyin

    2016-08-01

    An ideal I is a family of subsets of N, the set of positive integers, that is closed under taking finite unions and subsets of its elements. A sequence (x_k) of real numbers is said to be S(I)-statistically convergent to a real number L if, for each ε > 0 and each δ > 0, the set {n ∈ N : (1/n)|{k ≤ n : |x_k - L| ≥ ε}| ≥ δ} belongs to I. We introduce S(I)-statistically ward compactness of a subset of R, the set of real numbers, and S(I)-statistically ward continuity of a real function, in the sense that a subset E of R is S(I)-statistically ward compact if any sequence of points in E has an S(I)-statistically quasi-Cauchy subsequence, and a real function is S(I)-statistically ward continuous if it preserves S(I)-statistically quasi-Cauchy sequences, where a sequence (x_k) is said to be S(I)-statistically quasi-Cauchy when (Δx_k) is S(I)-statistically convergent to 0. We obtain results related to S(I)-statistically ward continuity, S(I)-statistically ward compactness, N_θ-ward continuity, and slowly oscillating continuity.
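
    A quick numerical look at the density condition in the definition above, specialized to the simplest ideal (the finite subsets of N, which recovers ordinary statistical convergence); the example sequence is chosen purely for illustration.

```python
# Density condition in the definition above, in the simplest case I = {finite
# subsets of N} (ordinary statistical convergence): x_k = 1 when k is a perfect
# square and 1/k otherwise converges statistically to 0, because the fraction
# of indices k <= n with |x_k| >= eps shrinks roughly like n**-0.5.
import math

def x(k):
    return 1.0 if math.isqrt(k) ** 2 == k else 1.0 / k

def bad_fraction(n, eps):
    """(1/n) * |{k <= n : |x_k - 0| >= eps}|"""
    return sum(1 for k in range(1, n + 1) if abs(x(k)) >= eps) / n

for n in (100, 10_000, 1_000_000):
    print(n, round(bad_fraction(n, eps=0.05), 5))
```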

  2. Basic statistics in cell biology.

    PubMed

    Vaux, David L

    2014-01-01

    The physicist Ernest Rutherford said, "If your experiment needs statistics, you ought to have done a better experiment." Although this aphorism remains true for much of today's research in cell biology, a basic understanding of statistics can be useful to cell biologists to help in monitoring the conduct of their experiments, in interpreting the results, in presenting them in publications, and when critically evaluating research by others. However, training in statistics is often focused on the sophisticated needs of clinical researchers, psychologists, and epidemiologists, whose conclusions depend wholly on statistics, rather than the practical needs of cell biologists, whose experiments often provide evidence that is not statistical in nature. This review describes some of the basic statistical principles that may be of use to experimental biologists, but it does not cover the sophisticated statistics needed for papers that contain evidence of no other kind.

  3. Chemists, Access, Statistics

    NASA Astrophysics Data System (ADS)

    Holmes, Jon L.

    2000-06-01

    IP-number access. Current subscriptions can be upgraded to IP-number access at little additional cost. We are pleased to be able to offer to institutions and libraries this convenient mode of access to subscriber only resources at JCE Online. JCE Online Usage Statistics We are continually amazed by the activity at JCE Online. So far, the year 2000 has shown a marked increase. Given the phenomenal overall growth of the Internet, perhaps our surprise is not warranted. However, during the months of January and February 2000, over 38,000 visitors requested over 275,000 pages. This is a monthly increase of over 33% from the October-December 1999 levels. It is good to know that people are visiting, but we would very much like to know what you would most like to see at JCE Online. Please send your suggestions to JCEOnline@chem.wisc.edu. For those who are interested, JCE Online year-to-date statistics are available. Biographical Snapshots of Famous Chemists: Mission Statement Feature Editor: Barbara Burke Chemistry Department, California State Polytechnic University-Pomona, Pomona, CA 91768 phone: 909/869-3664 fax: 909/869-4616 email: baburke@csupomona.edu The primary goal of this JCE Internet column is to provide information about chemists who have made important contributions to chemistry. For each chemist, there is a short biographical "snapshot" that provides basic information about the person's chemical work, gender, ethnicity, and cultural background. Each snapshot includes links to related websites and to a biobibliographic database. The database provides references for the individual and can be searched through key words listed at the end of each snapshot. All students, not just science majors, need to understand science as it really is: an exciting, challenging, human, and creative way of learning about our natural world. Investigating the life experiences of chemists can provide a means for students to gain a more realistic view of chemistry. In addition students

  4. "Just Another Statistic"

    PubMed

    Machtay; Glatstein

    1998-01-01

    On returning from a medical meeting, we learned that sadly a patient, "Mr. B.," had passed away. His death was a completely unexpected surprise. He had been doing well nine months after a course of intensive radiotherapy for a locally advanced head and neck cancer; in his most recent follow-up notes, he was described as a "complete remission." Nonetheless, he apparently died peacefully in his sleep from a cardiac arrest one night and was found the next day by a concerned neighbor. In our absence, after Mr. B. expired, his death certificate was filled out by a physician who didn't know him in detail, but did know why he recently was treated in our department. The cause of death was listed as head and neck cancer. It wasn't long after his death before we began to receive those notorious "requests for additional information," letters from the statistical office of a well-known cooperative group. Mr. B., as it turns out, was on a clinical trial, and it was "vital" to know further details of the circumstances of his passing. Perhaps this very large cancer had been controlled and Mr. B. succumbed to old age (helped along by the tobacco industry). On the other hand, maybe the residual "fibrosis" in his neck was actually packed with active tumor and his left carotid artery was finally 100% pinched off, or maybe he suffered a massive pulmonary embolism from cancer-related hypercoagulability. The forms and requests were completed with a succinct "cause of death uncertain," adding, "please have the Study Chairs call to discuss this difficult case." Often clinical reports of outcomes utilize and emphasize the endpoint "disease specific survival" (DSS). Like overall survival (OS), the DSS can be calculated by actuarial methods, with patients who have incomplete follow-up "censored" at the time of last follow-up pending further information. In the DSS, however, deaths unrelated to the index cancer of interest are censored at the time of death; thus, a death from intercurrent

  5. Statistics without Tears: Complex Statistics with Simple Arithmetic

    ERIC Educational Resources Information Center

    Smith, Brian

    2011-01-01

    One of the often overlooked aspects of modern statistics is the analysis of time series data. Modern introductory statistics courses tend to rush to probabilistic applications involving risk and confidence. Rarely does the first level course linger on such useful and fascinating topics as time series decomposition, with its practical applications…

  6. SOCR: Statistics Online Computational Resource

    PubMed Central

    Dinov, Ivo D.

    2011-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build student’s intuition and enhance their learning. PMID:21451741

  7. Characterizations of linear sufficient statistics

    NASA Technical Reports Server (NTRS)

    Peters, B. C., Jr.; Redner, R.; Decell, H. P., Jr.

    1976-01-01

    A necessary and sufficient condition is developed such that there exists a continuous linear sufficient statistic T for a dominated collection of totally finite measures defined on the Borel field generated by the open sets of a Banach space X. In particular, corollary necessary and sufficient conditions are given so that there exists a rank K linear sufficient statistic T for any finite collection of probability measures having n-variate normal densities. In this case a simple calculation, involving only the population means and covariances, determines the smallest integer K for which there exists a rank K linear sufficient statistic T (as well as an associated statistic T itself).

  8. Statistical concepts in metrology with a postscript on statistical graphics

    NASA Astrophysics Data System (ADS)

    Ku, Harry H.

    1988-08-01

    Statistical Concepts in Metrology was originally written as Chapter 2 for the Handbook of Industrial Metrology published by the American Society of Tool and Manufacturing Engineers, 1967. It was reprinted as one of 40 papers in NBS Special Publication 300, Volume 1, Precision Measurement and Calibration; Statistical Concepts and Procedures, 1969. Since then this chapter has been used as basic text in statistics in Bureau-sponsored courses and seminars, including those for Electricity, Electronics, and Analytical Chemistry. While concepts and techniques introduced in the original chapter remain valid and appropriate, some additions on recent development of graphical methods for the treatment of data would be useful. Graphical methods can be used effectively to explore information in data sets prior to the application of classical statistical procedures. For this reason additional sections on statistical graphics are added as a postscript.

  9. Research Design and Statistical Design.

    ERIC Educational Resources Information Center

    Szymanski, Edna Mora

    1993-01-01

    Presents fourth editorial in series, this one describing research design and explaining its relationship to statistical design. Research design, validity, and research approaches are examined, quantitative research designs and hypothesis testing are described, and control and statistical designs are discussed. Concludes with section on the art of…

  10. Explorations in Statistics: Confidence Intervals

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2009-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This third installment of "Explorations in Statistics" investigates confidence intervals. A confidence interval is a range that we expect, with some level of confidence, to include the true value of a population parameter…

  11. Book Trade Research and Statistics.

    ERIC Educational Resources Information Center

    Bosch, Stephen; Ink, Gary; Lofquist, William S.

    1998-01-01

    Provides data on prices of U.S. and foreign materials; book title output and average prices, 1996 final and 1997 preliminary figures; book sales statistics, 1997--AAP preliminary estimates; U.S. trade in books, 1997; international book title output, 1990-95; book review media statistics; and number of book outlets in the U.S. and Canada. (PEN)

  12. Representational Versatility in Learning Statistics

    ERIC Educational Resources Information Center

    Graham, Alan T.; Thomas, Michael O. J.

    2005-01-01

    Statistical data can be represented in a number of qualitatively different ways, the choice depending on the following three conditions: the concepts to be investigated; the nature of the data; and the purpose for which they were collected. This paper begins by setting out frameworks that describe the nature of statistical thinking in schools, and…

  13. Statistics Anxiety among Postgraduate Students

    ERIC Educational Resources Information Center

    Koh, Denise; Zawi, Mohd Khairi

    2014-01-01

    Most postgraduate programmes that have research components require students to take at least one course in research statistics. Not all postgraduate programmes are science based; a significant number of postgraduate students from the social sciences will be taking statistics courses as they try to complete their…

  14. Motivating Play Using Statistical Reasoning

    ERIC Educational Resources Information Center

    Cross Francis, Dionne I.; Hudson, Rick A.; Lee, Mi Yeon; Rapacki, Lauren; Vesperman, Crystal Marie

    2014-01-01

    Statistical literacy is essential in everyone's personal lives as consumers, citizens, and professionals. To make informed life and professional decisions, students are required to read, understand, and interpret vast amounts of information, much of which is quantitative. To develop statistical literacy so students are able to make sense of…

  15. Statistical Methods in Psychology Journals.

    ERIC Educational Resources Information Center

    Wilkinson, Leland

    1999-01-01

    Proposes guidelines for revising the American Psychological Association (APA) publication manual or other APA materials to clarify the application of statistics in research reports. The guidelines are intended to induce authors and editors to recognize the thoughtless application of statistical methods. Contains 54 references. (SLD)

  16. Computing contingency statistics in parallel.

    SciTech Connect

    Bennett, Janine Camille; Thompson, David; Pebay, Philippe Pierre

    2010-09-01

    Statistical analysis is typically used to reduce the dimensionality of and infer meaning from data. A key challenge of any statistical analysis package aimed at large-scale, distributed data is to address the orthogonal issues of parallel scalability and numerical stability. Many statistical techniques, e.g., descriptive statistics or principal component analysis, are based on moments and co-moments and, using robust online update formulas, can be computed in an embarrassingly parallel manner, amenable to a map-reduce style implementation. In this paper we focus on contingency tables, through which numerous derived statistics such as joint and marginal probability, point-wise mutual information, information entropy, and χ² independence statistics can be directly obtained. However, contingency tables can become large as data size increases, requiring a correspondingly large amount of communication between processors. This potential increase in communication prevents optimal parallel speedup and is the main difference from moment-based statistics, where the amount of inter-processor communication is independent of data size. Here we present the design trade-offs which we made to implement the computation of contingency tables in parallel. We also study the parallel speedup and scalability properties of our open source implementation. In particular, we observe optimal speedup and scalability when the contingency statistics are used in their appropriate context, namely, when the data input is not quasi-diffuse.
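
    A minimal sketch of the map-reduce pattern the abstract describes: each worker tabulates a local contingency table, the tables are summed (the communication step whose size grows with the table), and derived statistics such as the χ² independence statistic and pointwise mutual information are computed from the merged counts. This illustrates the general idea only; it is not the authors' open-source implementation.

        from collections import Counter
        from itertools import product
        import math

        def local_table(pairs):
            """Map step: count (x, y) co-occurrences in one worker's data chunk."""
            return Counter(pairs)

        def merge(tables):
            """Reduce step: sum the per-worker contingency tables."""
            total = Counter()
            for t in tables:
                total.update(t)
            return total

        def chi2_and_pmi(table):
            """Derive the chi-squared statistic and pointwise mutual information."""
            n = sum(table.values())
            rx, ry = Counter(), Counter()
            for (x, y), c in table.items():
                rx[x] += c
                ry[y] += c
            chi2, pmi = 0.0, {}
            for x, y in product(rx, ry):
                expected = rx[x] * ry[y] / n
                observed = table.get((x, y), 0)
                chi2 += (observed - expected) ** 2 / expected
                if observed:
                    pmi[(x, y)] = math.log(observed * n / (rx[x] * ry[y]))
            return chi2, pmi

        # Two hypothetical data chunks, as if held by two processors.
        chunk1 = [("a", 0), ("a", 0), ("b", 1), ("a", 1)]
        chunk2 = [("b", 1), ("b", 1), ("a", 0), ("b", 0)]
        merged = merge([local_table(chunk1), local_table(chunk2)])
        chi2, pmi = chi2_and_pmi(merged)
        print("chi-squared:", round(chi2, 3), "PMI:", pmi)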

  17. Education Statistics Quarterly, Spring 2001.

    ERIC Educational Resources Information Center

    Education Statistics Quarterly, 2001

    2001-01-01

    The "Education Statistics Quarterly" gives a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications, data products and funding opportunities developed over a 3-month period. Each issue also…

  18. SOCR: Statistics Online Computational Resource

    ERIC Educational Resources Information Center

    Dinov, Ivo D.

    2006-01-01

    The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an…

  19. Book Trade Research and Statistics.

    ERIC Educational Resources Information Center

    Bosch, Stephen; Ink, Gary; Greco, Albert N.

    1999-01-01

    Presents: "Prices of United States and Foreign Published Materials"; "Book Title Output and Average Prices"; "Book Sales Statistics, 1998"; "United States Book Exports and Imports: 1998"; "International Book Title Output: 1990-96"; "Number of Book Outlets in the United States and Canada"; and "Book Review Media Statistics". (AEF)

  20. Book Trade Research and Statistics.

    ERIC Educational Resources Information Center

    Sullivan, Sharon G.; Ink, Gary; Grabois, Andrew; Barr, Catherine

    2001-01-01

    Includes six articles that discuss research and statistics relating to the book trade. Topics include prices of U.S. and foreign materials; book title output and average prices; book sales statistics; book exports and imports; book outlets in the U.S. and Canada; and books and other media reviewed. (LRW)

  1. Book Trade Research and Statistics.

    ERIC Educational Resources Information Center

    Alexander, Adrian W.; And Others

    1994-01-01

    The six articles in this section examine prices of U.S. and foreign materials; book title output and average prices; book sales statistics; U.S. book exports and imports; number of book outlets in the United States and Canada; and book review media statistics. (LRW)

  2. Education Statistics Quarterly, Fall 2000.

    ERIC Educational Resources Information Center

    Dillow, Sally, Ed.

    2000-01-01

    The "Education Statistics Quarterly" gives a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications and data products released during a 3-month period. Each message also contains a message from…

  3. Students' Attitudes toward Statistics (STATS).

    ERIC Educational Resources Information Center

    Sutarso, Toto

    The purposes of this study were to develop an instrument to measure students' attitude toward statistics (STATS), and to define the underlying dimensions that comprise the STATS. The instrument consists of 24 items. The sample included 79 male and 97 female students from the statistics classes at the College of Education and the College of…

  4. Statistical Factors in Complexation Reactions.

    ERIC Educational Resources Information Center

    Chung, Chung-Sun

    1985-01-01

    Four cases which illustrate statistical factors in complexation reactions (where two of the reactants are monodentate ligands) are presented. Included are tables showing statistical factors for the reactions of: (1) square-planar complexes; (2) tetrahedral complexes; and (3) octahedral complexes. (JN)

  5. Students' attitudes towards learning statistics

    NASA Astrophysics Data System (ADS)

    Ghulami, Hassan Rahnaward; Hamid, Mohd Rashid Ab; Zakaria, Roslinazairimah

    2015-05-01

    Positive attitude towards learning is vital in order to master the core content of the subject matters under study. This is no exception in learning statistics, especially at the university level. Therefore, this study investigates the students' attitude towards learning statistics. Six variables or constructs were identified: affect, cognitive competence, value, difficulty, interest, and effort. The instrument used for the study is a questionnaire that was adopted and adapted from the reliable Survey of Attitudes towards Statistics (SATS©) instrument. The study was conducted among engineering undergraduate students at a university on the East Coast of Malaysia. The respondents consist of students who were taking the applied statistics course from different faculties. The results are analysed in terms of descriptive analysis and contribute to the descriptive understanding of students' attitude towards the teaching and learning process of statistics.

  6. Probability, Information and Statistical Physics

    NASA Astrophysics Data System (ADS)

    Kuzemsky, A. L.

    2016-03-01

    In this short survey review we discuss foundational issues of the probabilistic approach to information theory and statistical mechanics from a unified standpoint. Emphasis is on the inter-relations between theories. The basic aim is tutorial, i.e. to carry out a basic introduction to the analysis and applications of probabilistic concepts to the description of various aspects of complexity and stochasticity. We consider probability as a foundational concept in statistical mechanics and review selected advances in the theoretical understanding of the interrelation of probability, information and statistical description with regard to basic notions of statistical mechanics of complex systems. It also includes a synthesis of past and present research and a survey of methodology. The purpose of this terse overview is to discuss and partially describe those probabilistic methods and approaches that are used in statistical mechanics, with the aim of making these ideas easier to understand and to apply.

  7. Statistical Thermodynamics and Microscale Thermophysics

    NASA Astrophysics Data System (ADS)

    Carey, Van P.

    1999-08-01

    Many exciting new developments in microscale engineering are based on the application of traditional principles of statistical thermodynamics. In this text Van Carey offers a modern view of thermodynamics, interweaving classical and statistical thermodynamic principles and applying them to current engineering systems. He begins with coverage of microscale energy storage mechanisms from a quantum mechanics perspective and then develops the fundamental elements of classical and statistical thermodynamics. Subsequent chapters discuss applications of equilibrium statistical thermodynamics to solid, liquid, and gas phase systems. The remainder of the book is devoted to nonequilibrium thermodynamics of transport phenomena and to nonequilibrium effects and noncontinuum behavior at the microscale. Although the text emphasizes mathematical development, Carey includes many examples and exercises to illustrate how the theoretical concepts are applied to systems of scientific and engineering interest. In the process he offers a fresh view of statistical thermodynamics for advanced undergraduate and graduate students, as well as practitioners, in mechanical, chemical, and materials engineering.

  8. Nursing student attitudes toward statistics.

    PubMed

    Mathew, Lizy; Aktan, Nadine M

    2014-04-01

    Nursing is guided by evidence-based practice. To understand and apply research to practice, nurses must be knowledgeable in statistics; therefore, it is crucial to promote a positive attitude toward statistics among nursing students. The purpose of this quantitative cross-sectional study was to assess differences in attitudes toward statistics among undergraduate nursing, graduate nursing, and undergraduate non-nursing students. The Survey of Attitudes Toward Statistics Scale-36 (SATS-36) was used to measure student attitudes, with higher scores denoting more positive attitudes. The convenience sample was composed of 175 students from a public university in the northeastern United States. Statistically significant relationships were found among some of the key demographic variables. Graduate nursing students had a significantly lower score on the SATS-36, compared with baccalaureate nursing and non-nursing students. Therefore, an innovative nursing curriculum that incorporates knowledge of student attitudes and key demographic variables may result in favorable outcomes.

  9. [Big data in official statistics].

    PubMed

    Zwick, Markus

    2015-08-01

    The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and the sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have concluded a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany.

  10. [Big data in official statistics].

    PubMed

    Zwick, Markus

    2015-08-01

    The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and the sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have concluded a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany. PMID:26077871

  11. Clinical statistics: five key statistical concepts for clinicians.

    PubMed

    Choi, Yong-Geun

    2013-10-01

    Statistics is the science of data. As the foundation of scientific knowledge, data refers to evidentiary facts obtained from the nature of reality by human action, observation, or experiment. Clinicians should be aware of the conditions of good data that support the validity of clinical modalities when reading scientific articles, one of the resources for revising or updating their clinical knowledge and skills. The cause-effect link between clinical modality and outcome is ascertained as a pattern statistic. The uniformity of nature guarantees the recurrence of data as the basic scientific evidence. Variation statistics are examined for patterns of recurrence. This provides information on the probability of recurrence of the cause-effect phenomenon. Multiple causal factors of a natural phenomenon need a counterproof of absence in terms of a control group. A pattern of relation between a causal factor and an effect becomes recognizable and, thus, should be estimated as a relation statistic. The type and meaning of each relation statistic should be well understood. A study regarding a sample from a population of wide variation requires clinicians to be aware of error statistics due to random chance. Incomplete human senses, coarse measurement instruments, and preconceived ideas posed as hypotheses all tend to bias research, which gives rise to the necessity of a keen, critical, independent mind with regard to the reported data.

  12. Characterizations of linear sufficient statistics

    NASA Technical Reports Server (NTRS)

    Peters, B. C., Jr.; Redner, R.; Decell, H. P., Jr.

    1977-01-01

    A surjective bounded linear operator T from a Banach space X to a Banach space Y must be a sufficient statistic for a dominated family of probability measures defined on the Borel sets of X. These results were applied to characterize linear sufficient statistics for families of the exponential type, including as special cases the Wishart and multivariate normal distributions. The latter result was used to establish precisely which procedures for sampling from a normal population had the property that the sample mean was a sufficient statistic.

  13. An introduction to statistical finance

    NASA Astrophysics Data System (ADS)

    Bouchaud, Jean-Philippe

    2002-10-01

    We summarize recent research in a rapidly growing field, that of statistical finance, also called ‘econophysics’. There are three main themes in this activity: (i) empirical studies and the discovery of interesting universal features in the statistical texture of financial time series, (ii) the use of these empirical results to devise better models of risk and derivative pricing, of direct interest for the financial industry, and (iii) the study of ‘agent-based models’ in order to unveil the basic mechanisms that are responsible for the statistical ‘anomalies’ observed in financial time series. We give a brief overview of some of the results in these three directions.

  14. Program for standard statistical distributions

    NASA Technical Reports Server (NTRS)

    Falls, L. W.

    1972-01-01

    Development of a procedure to describe frequency distributions involved in statistical theory is discussed. Representation of frequency distributions by a first-order differential equation is presented. Classification of various types of distributions based on Pearson parameters is analyzed.
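
    A hedged sketch of how the Pearson-parameter classification mentioned here is often carried out in practice: estimate the moment ratios β₁ (squared skewness) and β₂ (kurtosis) from data and evaluate Pearson's κ criterion, whose value selects the distribution type. The formula and the reading of κ below follow the textbook convention and are offered only as an illustration of the approach, not as a reproduction of the NASA program.

        import numpy as np
        from scipy import stats

        def pearson_kappa(x):
            """Estimate Pearson's criterion kappa from sample skewness and kurtosis."""
            b1 = stats.skew(x) ** 2                  # beta_1 = (skewness)^2
            b2 = stats.kurtosis(x, fisher=False)     # beta_2 = non-excess kurtosis
            denom = 4.0 * (4.0 * b2 - 3.0 * b1) * (2.0 * b2 - 3.0 * b1 - 6.0)
            return b1, b2, b1 * (b2 + 3.0) ** 2 / denom

        # Invented sample from a beta distribution, which belongs to Pearson Type I.
        x = stats.beta.rvs(2, 5, size=5000, random_state=1)
        b1, b2, kappa = pearson_kappa(x)
        print(f"beta1 = {b1:.3f}, beta2 = {b2:.3f}, kappa = {kappa:.3f}")
        # Common reading: kappa < 0 -> Type I, 0 < kappa < 1 -> Type IV,
        # kappa > 1 -> Type VI, with boundary values pointing to transition types.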

  15. Statistical ecology comes of age

    PubMed Central

    Gimenez, Olivier; Buckland, Stephen T.; Morgan, Byron J. T.; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M.; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M.; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric

    2014-01-01

    The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1–4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data. PMID:25540151

  16. Statistical ecology comes of age.

    PubMed

    Gimenez, Olivier; Buckland, Stephen T; Morgan, Byron J T; Bez, Nicolas; Bertrand, Sophie; Choquet, Rémi; Dray, Stéphane; Etienne, Marie-Pierre; Fewster, Rachel; Gosselin, Frédéric; Mérigot, Bastien; Monestiez, Pascal; Morales, Juan M; Mortier, Frédéric; Munoz, François; Ovaskainen, Otso; Pavoine, Sandrine; Pradel, Roger; Schurr, Frank M; Thomas, Len; Thuiller, Wilfried; Trenkel, Verena; de Valpine, Perry; Rexstad, Eric

    2014-12-01

    The desire to predict the consequences of global environmental change has been the driver towards more realistic models embracing the variability and uncertainties inherent in ecology. Statistical ecology has gelled over the past decade as a discipline that moves away from describing patterns towards modelling the ecological processes that generate these patterns. Following the fourth International Statistical Ecology Conference (1-4 July 2014) in Montpellier, France, we analyse current trends in statistical ecology. Important advances in the analysis of individual movement, and in the modelling of population dynamics and species distributions, are made possible by the increasing use of hierarchical and hidden process models. Exciting research perspectives include the development of methods to interpret citizen science data and of efficient, flexible computational algorithms for model fitting. Statistical ecology has come of age: it now provides a general and mathematically rigorous framework linking ecological theory and empirical data.

  17. Back Pain Facts and Statistics

    MedlinePlus

    Although doctors of chiropractic (DCs) ... time. A few interesting facts about back pain: Low back pain is the single leading cause of disability ...

  18. Statistical description of turbulent dispersion

    NASA Astrophysics Data System (ADS)

    Brouwers, J. J. H.

    2012-12-01

    We derive a comprehensive statistical model for dispersion of passive or almost passive admixture particles such as fine particulate matter, aerosols, smoke, and fumes in turbulent flow. The model rests on the Markov limit for particle velocity. It is in accordance with the asymptotic structure of turbulence at large Reynolds number as described by Kolmogorov. The model consists of Langevin and diffusion equations in which the damping and diffusivity are expressed by expansions in powers of the reciprocal Kolmogorov constant C0. We derive solutions of O(C0^0) and O(C0^(-1)). We truncate at O(C0^(-2)), which is shown to result in an error of a few percent in predicted dispersion statistics for representative cases of turbulent flow. We reveal analogies and remarkable differences between the solutions of classical statistical mechanics and those of statistical turbulence.
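
    To make the Langevin picture concrete, here is a minimal Euler-Maruyama sketch of the classical one-dimensional Langevin model for particle velocity in homogeneous turbulence, in which the damping time is tied to the Kolmogorov constant through T_L = 2σ²/(C0 ε). The parameter values are invented, and this standard model only illustrates the kind of equations the abstract refers to; it is not the authors' expansion in powers of 1/C0.

        import numpy as np

        # Invented flow parameters, for illustration only.
        sigma2 = 0.5       # velocity variance [m^2/s^2]
        eps = 0.01         # mean dissipation rate [m^2/s^3]
        C0 = 6.0           # Kolmogorov (Lagrangian structure-function) constant
        T_L = 2.0 * sigma2 / (C0 * eps)   # Lagrangian integral time scale

        dt = 0.05 * T_L
        n_steps, n_particles = 2000, 5000
        rng = np.random.default_rng(42)

        u = rng.normal(0.0, np.sqrt(sigma2), n_particles)   # initial velocities
        x = np.zeros(n_particles)                           # initial positions
        for _ in range(n_steps):
            dW = rng.normal(0.0, np.sqrt(dt), n_particles)
            u += -u / T_L * dt + np.sqrt(C0 * eps) * dW     # Langevin velocity update
            x += u * dt                                     # particle displacement

        # At long times the displacement variance grows roughly like 2*sigma2*T_L*t
        # (Taylor's classical dispersion result).
        print("displacement variance:", x.var())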

  19. The Malpractice of Statistical Interpretation

    ERIC Educational Resources Information Center

    Fraas, John W.; Newman, Isadore

    1978-01-01

    Problems associated with the use of gain scores, analysis of covariance, multicollinearity, part and partial correlation, and the lack of rectilinearity in regression are discussed. Particular attention is paid to the misuse of statistical techniques. (JKS)

  20. National Center for Health Statistics

    MedlinePlus

    Lists National Center for Health Statistics surveys and data systems, including the National Health and Nutrition Examination Survey, the National Health Interview Survey, the National Survey of Family Growth, vital records, the National Vital Statistics System, and the National Death ...

  1. Spina Bifida Data and Statistics

    MedlinePlus

    ... non-Hispanic white and non-Hispanic black women. Data from 12 state-based birth defects tracking programs ...

  2. Birth Defects Data and Statistics

    MedlinePlus

    ... of birth defects in the United States. For data on specific birth defects, please visit the specific ...

  3. Hidden Statistics of Schroedinger Equation

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    2011-01-01

    Work was carried out in determination of the mathematical origin of randomness in quantum mechanics and creating a hidden statistics of the Schrödinger equation; i.e., to expose the transitional stochastic process as a "bridge" to the quantum world. The governing equations of hidden statistics would preserve such properties of quantum physics as superposition, entanglement, and direct-product decomposability while allowing one to measure its state variables using classical methods.

  4. Statistical Physics, 2nd Edition

    NASA Astrophysics Data System (ADS)

    Mandl, F.

    1989-01-01

    The Manchester Physics Series General Editors: D. J. Sandiford; F. Mandl; A. C. Phillips Department of Physics and Astronomy, University of Manchester Properties of Matter B. H. Flowers and E. Mendoza Optics Second Edition F. G. Smith and J. H. Thomson Statistical Physics Second Edition F. Mandl Electromagnetism Second Edition I. S. Grant and W. R. Phillips Statistics R. J. Barlow Solid State Physics Second Edition J. R. Hook and H. E. Hall Quantum Mechanics F. Mandl Particle Physics Second Edition B. R. Martin and G. Shaw The Physics of Stars Second Edition A. C. Phillips Computing for Scientists R. J. Barlow and A. R. Barnett Statistical Physics, Second Edition develops a unified treatment of statistical mechanics and thermodynamics, which emphasises the statistical nature of the laws of thermodynamics and the atomic nature of matter. Prominence is given to the Gibbs distribution, leading to a simple treatment of quantum statistics and of chemical reactions. Undergraduate students of physics and related sciences will find this a stimulating account of the basic physics and its applications. Only an elementary knowledge of kinetic theory and atomic physics, as well as the rudiments of quantum theory, is presupposed for an understanding of this book. Statistical Physics, Second Edition features: A fully integrated treatment of thermodynamics and statistical mechanics. A flow diagram allowing topics to be studied in different orders or omitted altogether. Optional "starred" and highlighted sections containing more advanced and specialised material for the more ambitious reader. Sets of problems at the end of each chapter to help student understanding. Hints for solving the problems are given in an Appendix.

  5. [Statistical process control in healthcare].

    PubMed

    Anhøj, Jacob; Bjørn, Brian

    2009-05-18

    Statistical process control (SPC) is a branch of statistical science which comprises methods for the study of process variation. Common cause variation is inherent in any process and predictable within limits. Special cause variation is unpredictable and indicates change in the process. The run chart is a simple tool for analysis of process variation. Run chart analysis may reveal anomalies that suggest shifts or unusual patterns that are attributable to special cause variation. PMID:19454196
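
    A minimal sketch of the run chart analysis described here: plot points against their median and look for unusually long runs on one side, a simple signal of special cause variation. The threshold used below (a run of eight or more points) is one common rule of thumb and is an assumption of this sketch, not a value taken from the article.

        import numpy as np

        def longest_run_about_median(values):
            """Length of the longest run of consecutive points on one side of the median."""
            med = np.median(values)
            sides = [v > med for v in values if v != med]   # ignore points on the median
            longest = current = 1
            for prev, cur in zip(sides, sides[1:]):
                current = current + 1 if cur == prev else 1
                longest = max(longest, current)
            return longest

        # Invented weekly counts of an adverse event; the later weeks drift upward.
        counts = [12, 9, 11, 10, 13, 9, 8, 12, 14, 15, 16, 15, 17, 18, 16, 19]
        run = longest_run_about_median(counts)
        verdict = "possible special cause" if run >= 8 else "common cause variation only"
        print("longest run:", run, "->", verdict)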

  6. Statistics.

    PubMed

    1993-02-01

    In 1984, 99% of abortions conducted in Bombay, India, were of female fetuses. In 1986-87, 30,000-50,000 female fetuses were aborted in India. In 1987-88, 7 Delhi clinics conducted 13,000 sex determination tests. Thus, discrimination against females begins before birth in India. Some states (Maharashtra, Goa, and Gujarat) have drafted legislation to prevent the use of prenatal diagnostic tests (e.g., ultrasonography) for sex determination purposes. Families make decisions about an infant's nutrition based on the infant's sex so it is not surprising to see a higher incidence of morbidity among girls than boys (e.g., for respiratory infections in 1985, 55.5% vs. 27.3%). Consequently, they are more likely to die than boys. Even though vasectomy is simpler and safer than tubectomy, the government promotes female sterilizations. The percentage of all sexual sterilizations being tubectomy has increased steadily from 84% to 94% (1986-90). Family planning programs focus on female contraceptive methods, despite the higher incidence of adverse health effects from female methods (e.g., IUD causes pain and heavy bleeding). Some women advocates believe the effects to be so great that India should ban contraceptives and injectable contraceptives. The maternal mortality rate is quite high (460/100,000 live births), equaling a lifetime risk of 1:18 of a pregnancy-related death. 70% of these maternal deaths are preventable. Leading causes of maternal deaths in India are anemia, hemorrhage, eclampsia, sepsis, and abortion. Most pregnant women do not receive prenatal care. Untrained personnel attend about 70% of deliveries in rural areas and 29% in urban areas. Appropriate health services and other interventions would prevent the higher age-specific death rates for females between 0 and 35 years old. Even though the government does provide maternal and child health services, it needs to stop decreasing resource allocation for health and start increasing it. PMID:12286355

  7. Applied extreme-value statistics

    SciTech Connect

    Kinnison, R.R.

    1983-05-01

    The statistical theory of extreme values is a well established part of theoretical statistics. Unfortunately, it is seldom part of applied statistics and is infrequently a part of statistical curricula except in advanced studies programs. This has resulted in the impression that it is difficult to understand and not of practical value. In recent environmental and pollution literature, several short articles have appeared with the purpose of documenting all that is necessary for the practical application of extreme value theory to field problems (for example, Roberts, 1979). These articles are so concise that only a statistician can recognise all the subtleties and assumptions necessary for the correct use of the material presented. The intent of this text is to expand upon several recent articles, and to provide the necessary statistical background so that the non-statistician scientist can recognize an extreme value problem when it occurs in his work, be confident in handling simple extreme value problems himself, and know when the problem is statistically beyond his capabilities and requires consultation.
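
    A hedged sketch of the most common applied workflow such a text explains: take annual maxima of an environmental series and fit a Generalized Extreme Value distribution, then read off a return level. The data below are simulated, and the example shows only the mechanics, not any analysis from the report.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)

        # Simulated 50 years of daily pollutant concentrations (invented units).
        daily = rng.gamma(shape=2.0, scale=10.0, size=(50, 365))
        annual_max = daily.max(axis=1)            # block (annual) maxima

        # Fit the Generalized Extreme Value distribution to the annual maxima.
        shape, loc, scale = stats.genextreme.fit(annual_max)

        # 100-year return level: the value exceeded with probability 1/100 in any year.
        level_100 = stats.genextreme.ppf(1.0 - 1.0 / 100.0, shape, loc=loc, scale=scale)
        print("GEV parameters:", shape, loc, scale)
        print("estimated 100-year return level:", round(level_100, 1))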

  8. On More Sensitive Periodogram Statistics

    NASA Astrophysics Data System (ADS)

    Bélanger, G.

    2016-05-01

    Period searches in event data have traditionally used the Rayleigh statistic, R². For X-ray pulsars, the standard has been the Z² statistic, which sums over more than one harmonic. For γ-rays, the H-test, which optimizes the number of harmonics to sum, is often used. These periodograms all suffer from the same problem, namely artifacts caused by correlations in the Fourier components that arise from testing frequencies with a non-integer number of cycles. This article addresses this problem. The modified Rayleigh statistic is discussed, its generalization to any harmonic, ℛ_k², is formulated, and from the latter, the modified Z² statistic, 𝒵², is constructed. Versions of these statistics for binned data and point measurements are derived, and it is shown that the variance in the uncertainties can have an important influence on the periodogram. It is shown how to combine the information about the signal frequency from the different harmonics to estimate its value with maximum accuracy. The methods are applied to an XMM-Newton observation of the Crab pulsar for which a decomposition of the pulse profile is presented, which shows that most of the power is in the second, third, and fifth harmonics. Statistical detection power of the ℛ_k² statistic is superior to the FFT and equivalent to the Lomb-Scargle (LS). Response to gaps in the data is assessed, and it is shown that the LS does not protect against the distortions they cause. The main conclusion of this work is that the classical R² and Z² should be replaced by ℛ_k² and 𝒵² in all applications with event data, and the LS should be replaced by the ℛ_k² when the uncertainty varies from one point measurement to another.
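
    For orientation, the classical statistics named at the start of the abstract are easy to compute from a list of event arrival times. The sketch below evaluates the Rayleigh statistic R² and the Z²_n statistic (a sum over n harmonics) at a trial frequency; it implements only the classical versions, not the modified ℛ_k² and 𝒵² introduced in the article.

        import numpy as np

        def z2_n(event_times, freq, n_harmonics=2):
            """Classical Z^2_n statistic for event data at a trial frequency.

            With n_harmonics=1 this reduces to the Rayleigh statistic R^2.
            """
            phases = 2.0 * np.pi * ((event_times * freq) % 1.0)
            total = 0.0
            for k in range(1, n_harmonics + 1):
                total += np.cos(k * phases).sum() ** 2 + np.sin(k * phases).sum() ** 2
            return 2.0 * total / len(event_times)

        # Simulated photon arrival times with a weak periodic modulation at 0.5 Hz.
        rng = np.random.default_rng(3)
        t = np.sort(rng.uniform(0.0, 1000.0, 5000))
        keep = rng.uniform(size=t.size) < 0.5 * (1.0 + 0.3 * np.sin(2 * np.pi * 0.5 * t))
        events = t[keep]

        print("R^2 at 0.5 Hz  :", round(z2_n(events, 0.5, 1), 1))
        print("Z^2_2 at 0.5 Hz:", round(z2_n(events, 0.5, 2), 1))
        print("R^2 off-signal :", round(z2_n(events, 0.437, 1), 1))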

  9. A Statistical Argument for the Weak Twin Primes Conjecture

    ERIC Educational Resources Information Center

    Bruckman, P. S.

    2006-01-01

    Certain definitions introduce appropriate concepts, among which are the definitions of the counting functions of the primes and twin primes, along with definitions of the correlation coefficient in a bivariate sample space. It is argued conjecturally that the characteristic functions of the prime "p" and of the quantity "p"+2 are highly…

  10. Integrable matrix theory: Level statistics

    NASA Astrophysics Data System (ADS)

    Scaramazza, Jasen A.; Shastry, B. Sriram; Yuzbashyan, Emil A.

    2016-09-01

    We study level statistics in ensembles of integrable N×N matrices linear in a real parameter x. The matrix H(x) is considered integrable if it has a prescribed number n > 1 of linearly independent commuting partners H_i(x) (integrals of motion), [H(x), H_i(x)] = 0 and [H_i(x), H_j(x)] = 0, for all x. In a recent work [Phys. Rev. E 93, 052114 (2016), 10.1103/PhysRevE.93.052114], we developed a basis-independent construction of H(x) for any n from which we derived the probability density function, thereby determining how to choose a typical integrable matrix from the ensemble. Here, we find that typical integrable matrices have Poisson statistics in the N → ∞ limit provided n scales at least as log N; otherwise, they exhibit level repulsion. Exceptions to the Poisson case occur at isolated coupling values x = x0 or when correlations are introduced between typically independent matrix parameters. However, level statistics cross over to Poisson at O(N^(-0.5)) deviations from these exceptions, indicating that non-Poissonian statistics characterize only subsets of measure zero in the parameter space. Furthermore, we present strong numerical evidence that ensembles of integrable matrices are stationary and ergodic with respect to nearest-neighbor level statistics.
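
    A small illustration of the nearest-neighbor diagnostics used in studies of this kind: compute consecutive level-spacing ratios and compare their mean with the reference values for Poisson statistics (about 0.39) and for the Gaussian orthogonal ensemble (about 0.53). The matrices below are generic random matrices, not the integrable ensembles constructed in the paper.

        import numpy as np

        def mean_spacing_ratio(levels):
            """Mean of r = min(s_i, s_{i+1}) / max(s_i, s_{i+1}) over consecutive spacings."""
            s = np.diff(np.sort(levels))
            r = np.minimum(s[:-1], s[1:]) / np.maximum(s[:-1], s[1:])
            return r.mean()

        rng = np.random.default_rng(0)
        N, trials = 200, 100
        poisson_like, goe_like = [], []
        for _ in range(trials):
            # Uncorrelated levels: spacings follow Poisson statistics.
            poisson_like.append(mean_spacing_ratio(rng.uniform(0.0, 1.0, N)))
            # GOE matrix: eigenvalues show level repulsion (Wigner-Dyson statistics).
            A = rng.standard_normal((N, N))
            goe_like.append(mean_spacing_ratio(np.linalg.eigvalsh((A + A.T) / 2.0)))

        print("mean r, uncorrelated levels:", round(np.mean(poisson_like), 3))  # ~0.39
        print("mean r, GOE eigenvalues    :", round(np.mean(goe_like), 3))      # ~0.53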

  11. Statistical methods in language processing.

    PubMed

    Abney, Steven

    2011-05-01

    The term statistical methods here refers to a methodology that has been dominant in computational linguistics since about 1990. It is characterized by the use of stochastic models, substantial data sets, machine learning, and rigorous experimental evaluation. The shift to statistical methods in computational linguistics parallels a movement in artificial intelligence more broadly. Statistical methods have so thoroughly permeated computational linguistics that almost all work in the field draws on them in some way. There has, however, been little penetration of the methods into general linguistics. The methods themselves are largely borrowed from machine learning and information theory. We limit attention to that which has direct applicability to language processing, though the methods are quite general and have many nonlinguistic applications. Not every use of statistics in language processing falls under statistical methods as we use the term. Standard hypothesis testing and experimental design, for example, are not covered in this article. WIREs Cogni Sci 2011 2 315-322 DOI: 10.1002/wcs.111 For further resources related to this article, please visit the WIREs website.

  12. Statistics for People Who (Think They) Hate Statistics. Third Edition

    ERIC Educational Resources Information Center

    Salkind, Neil J.

    2007-01-01

    This text teaches an often intimidating and difficult subject in a way that is informative, personable, and clear. The author takes students through various statistical procedures, beginning with correlation and graphical representation of data and ending with inferential techniques and analysis of variance. In addition, the text covers SPSS, and…

  13. Writing to Learn Statistics in an Advanced Placement Statistics Course

    ERIC Educational Resources Information Center

    Northrup, Christian Glenn

    2012-01-01

    This study investigated the use of writing in a statistics classroom to learn if writing provided a rich description of problem-solving processes of students as they solved problems. Through analysis of 329 written samples provided by students, it was determined that writing provided a rich description of problem-solving processes and enabled…

  14. Ethical Statistics and Statistical Ethics: Making an Interdisciplinary Module

    ERIC Educational Resources Information Center

    Lesser, Lawrence M.; Nordenhaug, Erik

    2004-01-01

    This article describes an innovative curriculum module the first author created on the two-way exchange between statistics and applied ethics. The module, having no particular mathematical prerequisites beyond high school algebra, is part of an undergraduate interdisciplinary ethics course which begins with a 3-week introduction to basic applied…

  15. The Relationship between Statistics Self-Efficacy, Statistics Anxiety, and Performance in an Introductory Graduate Statistics Course

    ERIC Educational Resources Information Center

    Schneider, William R.

    2011-01-01

    The purpose of this study was to determine the relationship between statistics self-efficacy, statistics anxiety, and performance in introductory graduate statistics courses. The study design compared two statistics self-efficacy measures developed by Finney and Schraw (2003), a statistics anxiety measure developed by Cruise and Wilkins (1980),…

  16. Population and vital statistics, 1981.

    PubMed

    1983-02-01

    "For various reasons some of the data relating to population estimates, vital statistics and causes of death in 1981 were not included in the Statistical Abstract of Israel No. 33, 1982. The purpose of this [article] is to complete the missing data and to revise and update some other data." Statistics are included on population by age, sex, marital status, population group, origin, continent of birth, period of immigration, and religion; marriages, divorces, live births, deaths, natural increase, infant deaths, and stillbirths by religion; characteristics of persons marrying and divorcing, including place of residence, religion, age, previous marital status, and year and duration of marriage; live births, deaths, and infant deaths by district, sub-district, and type of locality of residence; deaths by age, sex, and continent of birth; infant deaths by age, sex, and population group; and selected life table values by population group and sex.

  17. The Statistical Loop Analyzer (SLA)

    NASA Technical Reports Server (NTRS)

    Lindsey, W. C.

    1985-01-01

    The statistical loop analyzer (SLA) is designed to automatically measure the acquisition, tracking and frequency stability performance characteristics of symbol synchronizers, code synchronizers, carrier tracking loops, and coherent transponders. Automated phase lock and system level tests can also be made using the SLA. Standard baseband, carrier and spread spectrum modulation techniques can be accommodated. Through the SLA's phase error jitter and cycle slip measurements the acquisition and tracking thresholds of the unit under test are determined; any false phase and frequency lock events are statistically analyzed and reported in the SLA output in probabilistic terms. Automated signal drop-out tests can be performed in order to troubleshoot algorithms and evaluate the reacquisition statistics of the unit under test. Cycle slip rates and cycle slip probabilities can be measured using the SLA. These measurements, combined with bit error probability measurements, are all that are needed to fully characterize the acquisition and tracking performance of a digital communication system.

  18. Statistical prediction of cyclostationary processes

    SciTech Connect

    Kim, K.Y.

    2000-03-15

    Considered in this study is a cyclostationary generalization of an EOF-based prediction method. While linear statistical prediction methods are typically optimal in the sense that prediction error variance is minimal within the assumption of stationarity, there is some room for improved performance since many physical processes are not stationary. For instance, El Nino is known to be strongly phase locked with the seasonal cycle, which suggests nonstationarity of the El Nino statistics. Many geophysical and climatological processes may be termed cyclostationary since their statistics show strong cyclicity instead of stationarity. Therefore, developed in this study is a cyclostationary prediction method. Test results demonstrate that performance of prediction methods can be improved significantly by accounting for the cyclostationarity of underlying processes. The improvement comes from an accurate rendition of covariance structure both in space and time.
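
    To illustrate what cyclostationary statistics mean in practice, the sketch below estimates a separate mean and covariance for each calendar month of a monthly anomaly series instead of a single stationary covariance; a cyclostationary predictor would then build its regression or EOF basis from these month-dependent statistics. This is a toy construction for intuition, not the prediction scheme developed in the study.

        import numpy as np

        rng = np.random.default_rng(1)
        n_years, n_stations = 40, 10

        # Toy monthly anomalies whose spread is phase locked to the seasonal cycle,
        # mimicking the El Nino-like behavior mentioned in the abstract.
        months = np.tile(np.arange(12), n_years)
        amp = 1.0 + 0.8 * np.cos(2.0 * np.pi * months / 12.0)
        data = amp[:, None] * rng.standard_normal((12 * n_years, n_stations))

        # Cyclostationary statistics: one mean vector and covariance matrix per month.
        monthly_mean = np.array([data[months == m].mean(axis=0) for m in range(12)])
        monthly_cov = np.array([np.cov(data[months == m], rowvar=False) for m in range(12)])

        # A stationary treatment would lump all months together.
        stationary_cov = np.cov(data, rowvar=False)

        print("covariance trace, month 0 / month 6 / stationary:",
              round(np.trace(monthly_cov[0]), 2),
              round(np.trace(monthly_cov[6]), 2),
              round(np.trace(stationary_cov), 2))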

  19. Statistical modeling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1992-01-01

    This working paper discusses the statistical simulation part of a controlled software development experiment being conducted under the direction of the System Validation Methods Branch, Information Systems Division, NASA Langley Research Center. The experiment uses guidance and control software (GCS) aboard a fictitious planetary landing spacecraft: real-time control software operating on a transient mission. Software execution is simulated to study the statistical aspects of reliability and other failure characteristics of the software during development, testing, and random usage. Quantification of software reliability is a major goal. Various reliability concepts are discussed. Experiments are described for performing simulations and collecting appropriate simulated software performance and failure data. This data is then used to make statistical inferences about the quality of the software development and verification processes as well as inferences about the reliability of software versions and reliability growth under random testing and debugging.

  20. Illustrating the practice of statistics

    SciTech Connect

    Hamada, Christina A; Hamada, Michael S

    2009-01-01

    The practice of statistics involves analyzing data and planning data collection schemes to answer scientific questions. Issues often arise with the data that must be dealt with and can lead to new procedures. In analyzing data, these issues can sometimes be addressed through the statistical models that are developed. Simulation can also be helpful in evaluating a new procedure. Moreover, simulation coupled with optimization can be used to plan a data collection scheme. The practice of statistics as just described is much more than just using a statistical package. In analyzing the data, it involves understanding the scientific problem and incorporating the scientist's knowledge. In modeling the data, it involves understanding how the data were collected and accounting for limitations of the data where possible. Moreover, the modeling is likely to be iterative by considering a series of models and evaluating the fit of these models. Designing a data collection scheme involves understanding the scientist's goal and staying within his/her budget in terms of time and the available resources. Consequently, a practicing statistician is faced with such tasks and requires skills and tools to do them quickly. We have written this article for students to provide a glimpse of the practice of statistics. To illustrate the practice of statistics, we consider a problem motivated by some precipitation data that our relative, Masaru Hamada, collected some years ago. We describe his rain gauge observational study in Section 2. We describe modeling and an initial analysis of the precipitation data in Section 3. In Section 4, we consider alternative analyses that address potential issues with the precipitation data. In Section 5, we consider the impact of incorporating additional information. We design a data collection scheme to illustrate the use of simulation and optimization in Section 6. We conclude this article in Section 7 with a discussion.

  1. Assessment of Coastal and Urban Flooding Hazards Applying Extreme Value Analysis and Multivariate Statistical Techniques: A Case Study in Elwood, Australia

    NASA Astrophysics Data System (ADS)

    Guimarães Nobre, Gabriela; Arnbjerg-Nielsen, Karsten; Rosbjerg, Dan; Madsen, Henrik

    2016-04-01

    Traditionally, flood risk assessment studies have been carried out from a univariate frequency analysis perspective. However, statistical dependence between hydrological variables, such as extreme rainfall and extreme sea surge, is likely to exist, since both variables are to some extent driven by common meteorological conditions. Multivariate statistical techniques have the potential to overcome this limitation by combining different sources of flooding in the investigation. The aim of this study was to apply a range of statistical methodologies for analyzing combined extreme hydrological variables that can lead to coastal and urban flooding. The study area is the Elwood Catchment, which is a highly urbanized catchment located in the city of Port Phillip, Melbourne, Australia. The first part of the investigation dealt with the marginal extreme value distributions. Two approaches to extract extreme value series were applied (Annual Maximum and Partial Duration Series), and different probability distribution functions were fit to the observed sample. Results obtained by using the Generalized Pareto distribution demonstrate the ability of the Pareto family to model the extreme events. Advancing into multivariate extreme value analysis, an investigation of the asymptotic properties of extremal dependence was first carried out. As a weak positive asymptotic dependence between the bivariate extreme pairs was found, the Conditional method proposed by Heffernan and Tawn (2004) was chosen. This approach is suitable to model bivariate extreme values, which are relatively unlikely to occur together. The results show that the probability of an extreme sea surge occurring during a one-hour intensity extreme precipitation event (or vice versa) can be twice as great as what would occur when assuming independent events. Therefore, presuming independence between these two variables would result in severe underestimation of the flooding risk in the study area.
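
    A minimal sketch of the marginal step described in the abstract: extract a partial duration (peaks-over-threshold) series and fit a Generalized Pareto distribution to the exceedances. The synthetic series and the threshold choice are assumptions of this sketch, and the dependence modelling of Heffernan and Tawn (2004) is beyond this illustration.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2016)
        n_hours = 20 * 365 * 24

        # Synthetic hourly rainfall intensities (mm/h); most hours are dry.
        rain = rng.exponential(scale=1.5, size=n_hours) * (rng.uniform(size=n_hours) < 0.1)

        # Partial duration series: exceedances over a high threshold.
        threshold = np.quantile(rain[rain > 0], 0.95)
        excess = rain[rain > threshold] - threshold

        # Fit the Generalized Pareto distribution to the excesses (location fixed at 0).
        shape, loc, scale = stats.genpareto.fit(excess, floc=0.0)
        print(f"GPD shape = {shape:.3f}, scale = {scale:.3f}, threshold = {threshold:.2f} mm/h")

        # Example: intensity exceeded on average once per year, given the exceedance rate.
        events_per_year = len(excess) / 20.0
        p = 1.0 / events_per_year
        level_1yr = threshold + stats.genpareto.ppf(1.0 - p, shape, loc=0.0, scale=scale)
        print("approximate 1-year return intensity:", round(level_1yr, 1), "mm/h")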

  2. Ranald Macdonald and statistical inference.

    PubMed

    Smith, Philip T

    2009-05-01

    Ranald Roderick Macdonald (1945-2007) was an important contributor to mathematical psychology in the UK, as a referee and action editor for British Journal of Mathematical and Statistical Psychology and as a participant and organizer at the British Psychological Society's Mathematics, statistics and computing section meetings. This appreciation argues that his most important contribution was to the foundations of significance testing, where his concern about what information was relevant in interpreting the results of significance tests led him to be a persuasive advocate for the 'Weak Fisherian' form of hypothesis testing. PMID:19351454

  3. The Statistics of Visual Representation

    NASA Technical Reports Server (NTRS)

    Jobson, Daniel J.; Rahman, Zia-Ur; Woodell, Glenn A.

    2002-01-01

    The experience of retinex image processing has prompted us to reconsider fundamental aspects of imaging and image processing. Foremost is the idea that a good visual representation requires a non-linear transformation of the recorded (approximately linear) image data. Further, this transformation appears to converge on a specific distribution. Here we investigate the connection between numerical and visual phenomena. Specifically the questions explored are: (1) Is there a well-defined consistent statistical character associated with good visual representations? (2) Does there exist an ideal visual image? And (3) what are its statistical properties?

  4. Key China Energy Statistics 2012

    SciTech Connect

    Levine, Mark; Fridley, David; Lu, Hongyou; Fino-Chen, Cecilia

    2012-05-01

    The China Energy Group at Lawrence Berkeley National Laboratory (LBNL) was established in 1988. Over the years the Group has gained recognition as an authoritative source of China energy statistics through the publication of its China Energy Databook (CED). The Group has published seven editions to date of the CED (http://china.lbl.gov/research/chinaenergy-databook). This handbook summarizes key statistics from the CED and is expressly modeled on the International Energy Agency’s “Key World Energy Statistics” series of publications. The handbook contains timely, clearly-presented data on the supply, transformation, and consumption of all major energy sources.

  5. Key China Energy Statistics 2011

    SciTech Connect

    Levine, Mark; Fridley, David; Lu, Hongyou; Fino-Chen, Cecilia

    2012-01-15

    The China Energy Group at Lawrence Berkeley National Laboratory (LBNL) was established in 1988. Over the years the Group has gained recognition as an authoritative source of China energy statistics through the publication of its China Energy Databook (CED). In 2008 the Group published the Seventh Edition of the CED (http://china.lbl.gov/research/chinaenergy-databook). This handbook summarizes key statistics from the CED and is expressly modeled on the International Energy Agency’s “Key World Energy Statistics” series of publications. The handbook contains timely, clearly-presented data on the supply, transformation, and consumption of all major energy sources.

  6. Statistical mechanics of prion diseases.

    PubMed

    Slepoy, A; Singh, R R; Pázmándi, F; Kulkarni, R V; Cox, D L

    2001-07-30

    We present a two-dimensional, lattice-based, protein-level statistical mechanical model for prion diseases (e.g., mad cow disease) with concomitant prion protein misfolding and aggregation. Our studies lead us to the hypothesis that the observed broad incubation time distribution in epidemiological data reflects fluctuation-dominated growth seeded by a few nanometer scale aggregates, while much narrower incubation time distributions for inoculated lab animals arise from statistical self-averaging. We model "species barriers" to prion infection and assess a related treatment protocol. PMID:11497806

  7. On statistical aspects of Qjets

    NASA Astrophysics Data System (ADS)

    Ellis, Stephen D.; Hornig, Andrew; Krohn, David; Roy, Tuhin S.

    2015-01-01

    The process by which jet algorithms construct jets and subjets is inherently ambiguous and equally well motivated algorithms often return very different answers. The Qjets procedure was introduced by the authors to account for this ambiguity by considering many reconstructions of a jet at once, allowing one to assign a weight to each interpretation of the jet. Employing these weighted interpretations leads to an improvement in the statistical stability of many measurements. Here we explore in detail the statistical properties of these sets of weighted measurements and demonstrate how they can be used to improve the reach of jet-based studies.

  8. Statistical parameters for gloss evaluation

    SciTech Connect

    Peiponen, Kai-Erik; Juuti, Mikko

    2006-02-13

    The measurement of minute changes in local gloss has not been presented in international standards due to a lack of suitable glossmeters. The development of a diffractive-element-based glossmeter (DOG) made it possible to detect local variation of gloss from planar and complex-shaped surfaces. Hence, a demand for proper statistical gloss parameters for classifying surface quality by gloss, similar to the standardized surface roughness classification, has become necessary. In this letter, we define statistical gloss parameters and utilize them as an example in the characterization of gloss from metal surface roughness standards by the DOG.
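
    As a rough illustration of what statistical gloss parameters analogous to the standardized roughness parameters might look like, the sketch below computes the mean and RMS deviation of a map of local gloss values, by analogy with Ra and Rq. The parameter names G_a and G_q and the synthetic gloss map are assumptions of this sketch, not the definitions given in the letter.

        import numpy as np

        rng = np.random.default_rng(5)

        # Synthetic map of local gloss values measured over a surface (arbitrary units).
        gloss_map = 70.0 + 3.0 * rng.standard_normal((128, 128))

        mean_gloss = gloss_map.mean()
        # Hypothetical analogues of the roughness parameters Ra and Rq:
        G_a = np.mean(np.abs(gloss_map - mean_gloss))           # mean absolute deviation
        G_q = np.sqrt(np.mean((gloss_map - mean_gloss) ** 2))   # RMS deviation

        print(f"mean gloss = {mean_gloss:.1f}, G_a = {G_a:.2f}, G_q = {G_q:.2f}")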

  9. Statistical Mechanics of Prion Diseases

    NASA Astrophysics Data System (ADS)

    Slepoy, A.; Singh, R. R.; Pázmándi, F.; Kulkarni, R. V.; Cox, D. L.

    2001-07-01

    We present a two-dimensional, lattice-based, protein-level statistical mechanical model for prion diseases (e.g., mad cow disease) with concomitant prion protein misfolding and aggregation. Our studies lead us to the hypothesis that the observed broad incubation time distribution in epidemiological data reflects fluctuation-dominated growth seeded by a few nanometer scale aggregates, while much narrower incubation time distributions for inoculated lab animals arise from statistical self-averaging. We model "species barriers" to prion infection and assess a related treatment protocol.

  10. Statistical Constraints on Joy's Law

    NASA Astrophysics Data System (ADS)

    Amouzou, Ernest C.; Munoz-Jaramillo, Andres; Martens, Petrus C.; DeLuca, Edward E.

    2014-06-01

    Using sunspot data from the observatories at Mt. Wilson and Kodaikanal, active region tilt angles are analyzed for different active region sizes and latitude bins. A number of similarly-shaped statistical distributions were fitted to the data using maximum likelihood estimation. In all cases, we find that the statistical distribution best describing the number of active regions at a given tilt angle is a Laplace distribution with the form (2β)^(-1) exp(-|x-μ|/β), with 2° ≤ μ ≤ 11°, and 10° ≤ β ≤ 40°.
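
    For the Laplace model quoted in the abstract, maximum likelihood estimation has a closed form: the location estimate is the sample median and the scale estimate is the mean absolute deviation from it. The sketch below applies this to invented tilt-angle data, not to the Mt. Wilson or Kodaikanal catalogs.

        import numpy as np

        def laplace_mle(tilt_angles):
            """Closed-form MLE for the Laplace density (2*beta)^(-1) exp(-|x - mu| / beta)."""
            mu = np.median(tilt_angles)
            beta = np.mean(np.abs(tilt_angles - mu))
            return mu, beta

        # Invented active-region tilt angles in degrees.
        rng = np.random.default_rng(11)
        tilts = rng.laplace(loc=6.0, scale=20.0, size=2000)

        mu_hat, beta_hat = laplace_mle(tilts)
        print(f"mu_hat = {mu_hat:.1f} deg, beta_hat = {beta_hat:.1f} deg")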

  11. Probability, statistics, and computational science.

    PubMed

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.

  12. Statistical Mechanics of Prion Diseases

    SciTech Connect

    Slepoy, A.; Singh, R. R. P.; Pazmandi, F.; Kulkarni, R. V.; Cox, D. L.

    2001-07-30

    We present a two-dimensional, lattice-based, protein-level statistical mechanical model for prion diseases (e.g., mad cow disease) with concomitant prion protein misfolding and aggregation. Our studies lead us to the hypothesis that the observed broad incubation time distribution in epidemiological data reflects fluctuation-dominated growth seeded by a few nanometer scale aggregates, while much narrower incubation time distributions for inoculated lab animals arise from statistical self-averaging. We model "species barriers" to prion infection and assess a related treatment protocol.

  13. Directory of Michigan Library Statistics. 1994 Edition. Reporting 1992 and 1993 Statistical Activities including: Public Library Statistics, Library Cooperative Statistics, Regional/Subregional Statistics.

    ERIC Educational Resources Information Center

    Leaf, Donald C., Comp.; Neely, Linda, Comp.

    This edition focuses on statistical data supplied by Michigan public libraries, public library cooperatives, and those public libraries which serve as regional or subregional outlets for blind and physically handicapped services. Since statistics in Michigan academic libraries are typically collected in odd-numbered years, they are not included…

  14. Teaching Statistics in Integration with Psychology

    ERIC Educational Resources Information Center

    Wiberg, Marie

    2009-01-01

    The aim was to revise a statistics course in order to get the students motivated to learn statistics and to integrate statistics more throughout a psychology course. Further, we wished to make students more interested in statistics and to help them see the importance of using statistics in psychology research. To achieve this goal, several…

  15. 20 CFR 634.4 - Statistical standards.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    § 634.4 Statistical standards. Recipients shall agree to provide required data following the statistical standards prescribed by the Bureau of Labor Statistics for cooperative statistical programs.

  16. ARL Statistics 2007-2008

    ERIC Educational Resources Information Center

    Kyrillidou, Martha, Comp.; Bland, Les, Comp.

    2009-01-01

    "ARL Statistics 2007-2008" is the latest in a series of annual publications that describe collections, staffing, expenditures, and service activities for the 123 members of the Association of Research Libraries (ARL). Of these, 113 are university libraries; the remaining 10 are public, governmental, and nonprofit research libraries. Data reported…

  17. Education Statistics Quarterly, Spring 2002.

    ERIC Educational Resources Information Center

    Dillow, Sally, Ed.

    2002-01-01

    This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications, data products, and funding opportunities developed over a 3-month period. Each issue also contains a message…

  18. Statistics by Example, Finding Models.

    ERIC Educational Resources Information Center

    Mosteller, Frederick; And Others

    This booklet, part of a series of four which provide problems in probability and statistics for the secondary school level, is aimed at aiding the student in developing models as structure for data and in learning how to change models to fit real-life problems. Twelve different problem situations arising from biology, business, English, physical…

  19. Statistics by Example, Weighing Chances.

    ERIC Educational Resources Information Center

    Mosteller, Frederick; And Others

    Part of a series of four pamphlets providing problems in probability and statistics taken from real-life situations, this booklet develops probability methods through random numbers, simulations, and simple probability models, and presents the idea of scatter and residuals for analyzing complex data. The pamphlet was designed for a student having…

  20. Statistics by Example, Exploring Data.

    ERIC Educational Resources Information Center

    Mosteller, Frederick; And Others

    Part of a series of four pamphlets providing real-life problems in probability and statistics for the secondary school level, this booklet shows how to organize data in tables and graphs in order to get and to exhibit messages. Elementary probability concepts are also introduced. Fourteen different problem situations arising from biology,…

  1. Statistics of premixed flame cells

    NASA Technical Reports Server (NTRS)

    Noever, David A.

    1991-01-01

    The statistics of random cellular patterns in premixed flames are analyzed. Agreement is found with a variety of topological relations previously found for other networks, namely, Lewis's law and Aboav's law. Despite the diverse underlying physics, flame cells are shown to share a broad class of geometric properties with other random networks: metal grains, soap foams, bioconvection, and Langmuir monolayers.
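
    For reference, the two topological relations mentioned are usually stated as follows (standard textbook forms, not quoted from the paper): Lewis's law says the mean area of n-sided cells grows linearly with n, and the Aboav-Weaire law relates the mean number of sides of the neighbors of an n-sided cell to n.

        % Standard forms (assumed); alpha, beta, a and mu_2 are empirical constants
        % of the network, with mu_2 the second moment of the side-number distribution.
        \text{Lewis's law:}\qquad \langle A_n \rangle = \alpha + \beta\, n
        \text{Aboav--Weaire law:}\qquad m(n) = 6 - a + \frac{6a + \mu_2}{n}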

  2. Education Statistics Quarterly, Fall 2001.

    ERIC Educational Resources Information Center

    Dillow, Sally, Ed.

    2001-01-01

    The publication gives a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications, data products, and funding opportunities developed over a 3-month period. Each issue also contains a message from…

  3. China's Statistical System and Resources

    ERIC Educational Resources Information Center

    Xue, Susan

    2004-01-01

    As the People's Republic of China plays an increasingly important role in international politics and trade, countries with economic interests there find they need to know more about this nation. Access to primary information sources, including official statistics from China, however, is very limited, as little exploration has been done into this…

  4. A Simple Statistical Thermodynamics Experiment

    ERIC Educational Resources Information Center

    LoPresto, Michael C.

    2010-01-01

    Comparing the predicted and actual rolls of combinations of both two and three dice can help to introduce many of the basic concepts of statistical thermodynamics, including multiplicity, probability, microstates, and macrostates, and demonstrate that entropy is indeed a measure of randomness, that disordered states (those of higher entropy) are…
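
    A direct way to see these ideas (a minimal sketch in the spirit of the dice experiment, not the article's own materials) is to enumerate every ordered roll (microstate) of two or three dice and count how many microstates share each total (macrostate); the most probable totals are those with the largest multiplicity.

        from collections import Counter
        from itertools import product

        # Enumerate all microstates (ordered rolls) for two and three dice and count
        # the multiplicity of each macrostate (the total shown on the dice).
        for n_dice in (2, 3):
            multiplicity = Counter(sum(roll) for roll in product(range(1, 7), repeat=n_dice))
            n_microstates = 6 ** n_dice
            for total, count in sorted(multiplicity.items()):
                print(f"{n_dice} dice, total {total:2d}: "
                      f"multiplicity {count:3d}, probability {count / n_microstates:.4f}")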

  5. Education Statistics Quarterly, Winter 2001.

    ERIC Educational Resources Information Center

    Dillow, Sally, Ed.

    2002-01-01

    This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications and data products released in a 3-month period. Each issue also contains a message from the NCES on a timely…

  6. Understanding Statistics Using Computer Demonstrations

    ERIC Educational Resources Information Center

    Dunn, Peter K.

    2004-01-01

    This paper discusses programs that clarify some statistical ideas often discussed yet poorly understood by students. The programs adopt the approach of demonstrating what is happening, rather than using the computer to do the work for the students (and hide the understanding). The programs demonstrate normal probability plots, overfitting of…
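
    As a minimal sketch of one of the demonstrations named (the normal probability plot; this is not the author's program), one can pair the sorted sample values with theoretical normal quantiles and check how close the relationship is to a straight line:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        sample = rng.normal(loc=10.0, scale=2.0, size=200)   # data assumed normal

        ordered = np.sort(sample)
        # Plotting positions (i - 0.5)/n and their standard-normal quantiles.
        probs = (np.arange(1, len(sample) + 1) - 0.5) / len(sample)
        theoretical = stats.norm.ppf(probs)

        # For normal data the points lie close to a line; r near 1 supports normality.
        r = np.corrcoef(theoretical, ordered)[0, 1]
        print(f"probability-plot correlation: {r:.4f}")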

  7. Inverting an Introductory Statistics Classroom

    ERIC Educational Resources Information Center

    Kraut, Gertrud L.

    2015-01-01

    The inverted classroom allows more in-class time for inquiry-based learning and for working through more advanced problem-solving activities than does the traditional lecture class. The skills acquired in this learning environment offer benefits far beyond the statistics classroom. This paper discusses four ways that can make the inverted…

  8. GPS: Geometry, Probability, and Statistics

    ERIC Educational Resources Information Center

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was, say, fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  9. Fit Indices Versus Test Statistics

    ERIC Educational Resources Information Center

    Yuan, Ke-Hai

    2005-01-01

    Model evaluation is one of the most important aspects of structural equation modeling (SEM). Many model fit indices have been developed. It is not an exaggeration to say that nearly every publication using the SEM methodology has reported at least one fit index. Most fit indices are defined through test statistics. Studies and interpretation of…

  10. Introductory Statistics and Fish Management.

    ERIC Educational Resources Information Center

    Jardine, Dick

    2002-01-01

    Describes how fisheries research and management data (available on a website) have been incorporated into an Introductory Statistics course. In addition to the motivation gained from seeing the practical relevance of the course, some students have participated in the data collection and analysis for the New Hampshire Fish and Game Department. (MM)

  11. What Price Statistical Tables Now?

    ERIC Educational Resources Information Center

    Hunt, Neville

    1997-01-01

    Describes the generation of all the tables required for school-level study of statistics using Microsoft's Excel spreadsheet package. Highlights cumulative binomial probabilities, cumulative Poisson probabilities, normal distribution, t-distribution, chi-squared distribution, F-distribution, random numbers, and accuracy. (JRH)
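
    The same tables can be generated outside a spreadsheet; as a hedged illustration of the idea (the article itself works entirely in Excel), a few lines of Python with scipy reproduce, for example, a cumulative binomial table:

        from scipy import stats

        # Cumulative binomial probabilities P(X <= k) for X ~ Binomial(n, p).
        n, p = 10, 0.3
        print(f"Cumulative binomial table, n={n}, p={p}")
        for k in range(n + 1):
            print(f"k = {k:2d}   P(X <= k) = {stats.binom.cdf(k, n, p):.4f}")

        # The same pattern covers the other tables mentioned, e.g. the normal,
        # t, chi-squared and F distributions via stats.norm.cdf, stats.t.cdf,
        # stats.chi2.cdf and stats.f.cdf.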

  12. Teaching Statistics through Learning Projects

    ERIC Educational Resources Information Center

    Moreira da Silva, Mauren Porciúncula; Pinto, Suzi Samá

    2014-01-01

    This paper aims to reflect on the teaching of statistics through student research, in the form of projects carried out by students on self-selected topics. The paper reports on a study carried out with two undergraduate classes using a methodology of teaching that we call "learning projects." Monitoring the development of the various…

  13. Tools for Basic Statistical Analysis

    NASA Technical Reports Server (NTRS)

    Luz, Paul L.

    2005-01-01

    Statistical Analysis Toolset is a collection of eight Microsoft Excel spreadsheet programs, each of which performs calculations pertaining to an aspect of statistical analysis. These programs present input and output data in user-friendly, menu-driven formats, with automatic execution. The following types of calculations are performed: Descriptive statistics are computed for a set of data x(i) (i = 1, 2, 3, …) entered by the user. Normal Distribution Estimates will calculate the statistical value that corresponds to cumulative probability values, given a sample mean and standard deviation of the normal distribution. Normal Distribution from Two Data Points will extend and generate a cumulative normal distribution for the user, given two data points and their associated probability values. Two programs perform two-way analysis of variance (ANOVA) with no replication or generalized ANOVA for two factors with four levels and three repetitions. Linear Regression-ANOVA will curve-fit data to the linear equation y = f(x) and will perform an ANOVA to check its significance.
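
    For a sense of the calculations involved (a Python sketch with invented data, not the Excel programs themselves), descriptive statistics and the significance check behind the "Linear Regression-ANOVA" program can be written as follows; in simple linear regression the regression F statistic equals the square of the slope's t statistic.

        import numpy as np
        from scipy import stats

        # Illustrative data (made up).
        x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
        y = np.array([2.1, 4.3, 5.9, 8.2, 9.8, 12.1])

        # Descriptive statistics.
        print("mean:", np.mean(y), "sample std:", np.std(y, ddof=1))

        # Linear regression y = a + b*x with a significance check.
        res = stats.linregress(x, y)
        F = res.slope ** 2 / res.stderr ** 2        # F = t^2 for the slope
        print(f"slope={res.slope:.3f}, intercept={res.intercept:.3f}, "
              f"F={F:.1f}, p={res.pvalue:.2e}")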

  14. Education Statistics Quarterly, Summer 2001.

    ERIC Educational Resources Information Center

    Dillow, Sally, Ed.

    2001-01-01

    This publication provides a comprehensive overview of work done across all parts of the National Center for Education Statistics (NCES). Each issue contains short publications, summaries, and descriptions that cover all NCES publications and data products released during a 3-month period. Each issue also contains a message from the NCES on a…

  15. American Youth: A Statistical Snapshot.

    ERIC Educational Resources Information Center

    Wetzel, James R.

    This report presents and analyzes statistical data on the status and condition of American youth, ages 16-24. A brief commentary on the problems of collecting data concerning Hispanic youth precedes the report's seven main sections, which deal with the following topics: population; marriage; childbearing and living arrangements; family income and…

  16. Basic HIV/AIDS Statistics

    MedlinePlus

    Basic HIV/AIDS statistics from the CDC, covering questions such as how many people are diagnosed with HIV each year in the United States…

  17. Statistical Prediction in Proprietary Rehabilitation.

    ERIC Educational Resources Information Center

    Johnson, Kurt L.; And Others

    1987-01-01

    Applied statistical methods to predict case expenditures for low back pain rehabilitation cases in proprietary rehabilitation. Extracted predictor variables from case records of 175 workers compensation claimants with some degree of permanent disability due to back injury. Performed several multiple regression analyses resulting in a formula that…
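
    A hedged sketch of the kind of prediction model described (predictor names and values below are invented, not taken from the study): ordinary least squares regression of case expenditures on a few case characteristics, followed by a prediction for a new case.

        import numpy as np

        # Hypothetical predictors: age, weeks disabled, prior claims.
        X = np.array([
            [34, 12, 0],
            [45, 30, 1],
            [29,  8, 0],
            [52, 44, 2],
            [41, 20, 1],
            [38, 16, 0],
        ], dtype=float)
        y = np.array([4200, 9800, 3100, 14500, 7600, 5200], dtype=float)  # expenditures

        # Ordinary least squares with an intercept column.
        A = np.column_stack([np.ones(len(X)), X])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        print("intercept and coefficients:", np.round(coef, 2))

        # Predicted expenditure for a new (hypothetical) case.
        new_case = np.array([1.0, 40, 25, 1])
        print("predicted expenditure:", round(float(new_case @ coef), 2))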

  18. American Youth: A Statistical Snapshot.

    ERIC Educational Resources Information Center

    Wetzel, James R.

    This document presents a statistical snapshot of young people aged 15 to 24 years. It provides a broad overview of trends documenting the direction of changes in social behavior and economic circumstances. The projected decline in the total number of youth from 43 million in 1980 to 35 million in 1995 will affect marriage and childbearing…

  19. The Statistics of a Function

    ERIC Educational Resources Information Center

    Gordon, Sheldon P.; Gordon, Florence S.

    2010-01-01

    One of the most important applications of the definite integral in a modern calculus course is the mean value of a function. Thus, if a function "f" is defined on an interval ["a", "b"], then the mean, or average value, of "f" is given by (1/(b-a)) ∫_a^b f(x) dx. In this note, we will investigate the meaning of other statistics associated with a function…
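
    For completeness, the mean referred to above, together with a natural variance analogue built by the same averaging idea (the latter is an assumed example of the "other statistics" the note explores, not quoted from it), can be written as:

        \bar{f} = \frac{1}{b-a} \int_a^b f(x)\,dx,
        \qquad
        \operatorname{Var}(f) = \frac{1}{b-a} \int_a^b \bigl(f(x) - \bar{f}\bigr)^2\,dx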

  20. Concept Maps in Introductory Statistics

    ERIC Educational Resources Information Center

    Witmer, Jeffrey A.

    2016-01-01

    Concept maps are tools for organizing thoughts on the main ideas in a course. I present an example of a concept map that was created through the work of students in an introductory class and discuss major topics in statistics and relationships among them.